On July 12, the first full-color images from the Webb Space Telescope showed countless nebulae, galaxies, and a gassy exoplanet as they had never been seen before. But Webb only collects infrared and near-infrared light, which the human eye cannot see—so where are these gorgeous colors coming from?
Image developers on the Webb team are tasked with turning the telescope’s infrared image data into some of the most vivid views of the cosmos we’ve ever had. They assign various infrared wavelengths to colors on the visible spectrum—the familiar reds, blues, yellows, and so on. So while the processed images from the Webb team aren’t literally what the telescope saw, they’re hardly inaccurate.
“Something I’ve been trying to change people’s minds about is to stop getting hung up on the idea of ‘is this what this would look like if I could fly out there in a spaceship and look at it?’” said Joe DePasquale, a senior data image developer at the Space Telescope Science Institute, in a phone call with Gizmodo. “You don’t ask a biologist if you can somehow shrink down to the size of a cell and look at the coronavirus.”
Webb’s first test images helped check its mirrors’ alignment and captured an orange-tinted shot of the Large Magellanic Cloud. Those early snapshots were not representative color images; one used a monochromatic filter (its image was grayscale) and the other simply translated infrared light into the red-to-yellow visible color bands, so the team could see certain features of the cloud they imaged. But now, with the telescope up and running, the images that get released are full of blazing color, like a recent portrait of the Cartwheel Galaxy.
Astronomy is often done outside the visible spectrum, because many of the most interesting objects in space shine brightly in ultraviolet, X-rays, and even radio waves (which category light falls into depends on the photon’s wavelength). The Webb Telescope is designed to see infrared light, whose wavelengths are longer than red visible light but shorter than microwaves.
Infrared light can penetrate thick clouds of gas and dust in space, allowing researchers to see previously hidden secrets of the universe. Especially intriguing to scientists is that light from the early universe has been stretched as the universe has expanded, meaning what was once ultraviolet or visible light may now be infrared (what’s known as “redshifted” light).
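The redshifting described above follows a simple relation: the observed wavelength is the emitted wavelength multiplied by (1 + z), where z is the redshift. A minimal sketch (the function name and example values are illustrative, not from the article):

```python
# Sketch: how cosmological redshift moves visible light into Webb's infrared range.
# Relation: lambda_observed = lambda_emitted * (1 + z), wavelengths in nanometers.

def observed_wavelength(emitted_nm: float, z: float) -> float:
    """Wavelength an observer sees today for light emitted at redshift z."""
    return emitted_nm * (1 + z)

# Red H-alpha light (656 nm) emitted by a galaxy at redshift z = 6:
shifted = observed_wavelength(656, 6)
print(f"{shifted:.0f} nm")  # 4592 nm, deep in the mid-infrared
```

This is why light that left early galaxies as visible or ultraviolet arrives at Webb as infrared.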
“These are instruments that we’ve designed to extend the power of our vision, to go beyond what our eyes are capable of doing to see light that our eyes are not sensitive to, and to resolve objects that we couldn’t possibly see with just our eyes,” DePasquale said. “I’m trying to bring out the most detail and the most richness of color and complexity that’s inherent in the data without actually changing anything.”
Webb’s raw images are so loaded with data that they need to be scaled down before they can be translated into visible light. The images also need to be cleaned of artifacts like cosmic rays and reflections from bright stars that hit the telescope’s detectors. If you look at a Webb image before processing work is done, it’ll look like a black rectangle peppered with some white dots.
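That scaling-down step is essentially a dynamic-range stretch: raw pixel values span many orders of magnitude, so a nonlinear stretch compresses them into something a screen can show. Below is a hypothetical sketch using an asinh stretch, a common choice in astronomical imaging; it is not the Webb team’s actual pipeline, and the function and parameter names are assumptions:

```python
import numpy as np

def asinh_stretch(raw: np.ndarray, softening: float = 0.1) -> np.ndarray:
    """Compress a high-dynamic-range image into 0-255 for display.
    An asinh stretch is roughly linear for faint pixels and
    logarithmic for bright ones, so faint structure survives."""
    scaled = np.arcsinh(raw / softening)
    scaled = (scaled - scaled.min()) / (scaled.max() - scaled.min())  # normalize to 0..1
    return (scaled * 255).astype(np.uint8)

# A toy "raw" frame: mostly faint background plus one very bright star.
rng = np.random.default_rng(0)
frame = rng.exponential(scale=1.0, size=(64, 64))
frame[10, 10] = 1e4  # a star that would wash out a linear scaling
display = asinh_stretch(frame)
print(display.dtype, int(display.min()), int(display.max()))
```

Without a stretch like this, a linear mapping would leave the image as the black rectangle with a few white dots described above.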
“I think there’s some connotations that go along with ‘colorizing’ or ‘false color’ that implies there’s some process going on where we’re arbitrarily choosing colors to create a color image,” DePasquale said. “Representative color is the most preferred term for the kind of work that we do, because I think it encompasses the work that we do of translating light to create a true color image, but in a wavelength range that our eyes are not sensitive to.”
Longer infrared waves are assigned redder colors, and the shortest infrared wavelengths are assigned bluer colors. (Blue and violet light have the shortest wavelengths within the visible spectrum, while red has the longest.) The process is called chromatic ordering, and the spectrum is split into as many colors as the team needs to capture the full range of light depicted in the image.
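Chromatic ordering can be sketched as assigning each filter a visible hue in wavelength order: the longest infrared wavelength gets red, the shortest gets blue, and the rest are spaced in between. This is an illustrative sketch, not STScI’s actual tooling; the filter wavelengths are NIRCam-like example values:

```python
import colorsys

def chromatic_ordering(filter_wavelengths_um):
    """Assign each infrared filter a visible RGB color in wavelength order:
    shortest wavelength -> blue (hue ~2/3), longest -> red (hue 0)."""
    ordered = sorted(filter_wavelengths_um)
    lo, hi = ordered[0], ordered[-1]
    assignments = {}
    for wl in ordered:
        frac = (wl - lo) / (hi - lo)   # 0 at the shortest filter, 1 at the longest
        hue = (1 - frac) * (2 / 3)     # shortest -> blue, longest -> red
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        assignments[wl] = (round(r, 2), round(g, 2), round(b, 2))
    return assignments

# Example filter pivot wavelengths in microns (roughly F090W, F200W, F444W):
for wl, rgb in chromatic_ordering([0.90, 2.00, 4.44]).items():
    print(f"{wl} um -> RGB {rgb}")
```

With more filters in play, the same ordering simply divides the visible hues more finely.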
“We have filters on the instruments that collect certain wavelengths of light, which we then apply a color that is most closely what we think it will be on the [visible] spectrum,” said Alyssa Pagan, a science visuals developer at the Space Telescope Science Institute, in a phone call with Gizmodo.
The chromatic ordering also depends on which elements are being imaged. When working with narrow-band filters in optical light that isolate oxygen, ionized hydrogen, and sulfur, Pagan suggests, the latter two both emit in red. So the hydrogen might get shifted to green visible light, in order to give the viewer more information.
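The reassignment Pagan describes matches the well-known “Hubble palette”: hydrogen-alpha (656 nm) and ionized sulfur (672 nm) both land in red, so sulfur keeps red, hydrogen moves to green, and oxygen takes blue. A small sketch under those assumptions:

```python
# Sketch of a narrow-band channel reassignment (the "SHO"/Hubble palette).
# H-alpha (656 nm) and S II (672 nm) both emit in red, so hydrogen is
# shifted to green to keep the two gases visually distinct.

EMISSION_LINES_NM = {"O III": 501, "H-alpha": 656, "S II": 672}

def sho_palette(lines):
    """Map three narrow-band lines to RGB channels by wavelength order:
    longest -> red, middle -> green, shortest -> blue."""
    by_wavelength = sorted(lines, key=lines.get)  # shortest to longest
    shortest, middle, longest = by_wavelength
    return {longest: "red", middle: "green", shortest: "blue"}

print(sho_palette(EMISSION_LINES_NM))
# {'S II': 'red', 'H-alpha': 'green', 'O III': 'blue'}
```

The result is still chromatic ordering, just applied to three emission lines that would otherwise crowd into one color channel.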
“It’s a balance between the art and the science, because you want to showcase science and the features, and sometimes those two things don’t necessarily work together,” Pagan added.
Webb’s first representative color images were released July 12, over six months after the telescope launched from an ESA spaceport in French Guiana. From there, Webb traveled about a million miles to L2, a point in space where gravitational effects allow spacecraft to stay in place without burning much fuel.
The telescope unfolded itself on the way to L2, so once it was there, mission scientists could get started on aligning the $10 billion observatory’s mirrors and commissioning its instruments. The telescope has four instruments: a near-infrared camera (NIRCam), a near-infrared spectrograph, a mid-infrared instrument (MIRI), and a fine guidance sensor and slitless spectrograph for pointing at targets precisely and characterizing exoplanet atmospheres.
The voluminous amounts of dust in some galaxies and nebulae are transparent to NIRCam, allowing it to capture bright stars at shorter wavelengths. MIRI, on the other hand, can observe discs of material that will give rise to planets, as well as dust warmed by starlight.
When telescope images are being assembled, image processors work with instrument scientists to decide which features of a given object should be highlighted in the image: its piping hot gas, perhaps, or a cool dusty tail.
When Webb imaged Stephan’s Quintet, a visual grouping of five galaxies, the finished product was a 150-million-pixel image made up of 1,000 images taken by both MIRI and NIRCam. Seen by MIRI alone, though, hot dust dominates the image. In the background of the MIRI images, distant galaxies glow in different colors; DePasquale said the team calls them “skittles.”
DePasquale and Pagan helped create the Webb images as we would eventually see them, rich in color and cosmic meaning. In the case of the sweeping shot of the Carina Nebula’s cosmic cliffs, different filters captured the ionized blue gas and red dust. Because the gas obscured the dust’s structure in initial passes at the nebula image, scientists asked the image processing team to “tone down the gas” a bit, Pagan said.
Collecting light in Webb’s hexagonal mirrors is only half the battle when it comes to seeing the distant universe. Translating what’s there is another beast entirely.