James Webb telescope pictures didn’t begin as stunning images. Here’s how they started out — and how researchers brought them to life
Erik Rosolowsky was like a kid at Christmas when the James Webb Space Telescope started sending him long-awaited photos of a distant galaxy.
And the first person he wanted to show his new present to was his Uncle Mike.
Rosolowsky, an associate professor of physics at the University of Alberta, was among the first Canadian researchers to put the world's latest and greatest space telescope to use, in his case, by observing star formation in the Triangulum Galaxy, also known as Messier 33.
"When my project was executing, I was pulling (data) out of the archive as soon as it was landing," says Rosolowsky. "I sent a picture of it to my uncle ... 'Hey Uncle Mike! You and I are the first people on Earth to see these pictures!'"
His Uncle Mike, a backyard astronomer, is the person who first set Rosolowsky on the path that ultimately led him to receive images of a galaxy about 2.7 million light years away from a $10-billion telescope orbiting the sun, some 1.5 million kilometres away from Earth.
But the picture Rosolowsky sent to Uncle Mike - significant as it was for them both - was a far cry from the stunning full-colour images released by NASA as the JWST continues to gather acclaim for its high-resolution infrared observations of the universe.
Instead, the picture for Uncle Mike - a few light blobs on a dark background crisscrossed with banding - looked something like this:
That stands in stark contrast - literally - to the famous Hubble image of Triangulum, or even Rosolowsky's partially processed Webb image, both of which show millions of stars, the spiral arms of the galaxy and, in the latter's case, some of the turbulent areas where stars are born.
The difference between the images illustrates the effort that researchers make to translate raw data from the JWST into images that astronomers can use for research and those at which the general public has marvelled.
In truth, Rosolowsky's initial data - part of an actual astronomical image as it appears in his computer code - looked something like this:
What he sent to his uncle, he says, was a "quick and dirty look" at the Triangulum Galaxy.
But before those images - or that data - can become remotely useful to him as a researcher, they have to be corrected, not least for quirks inherent in the equipment.
For all the meticulous precision that went into the JWST's design and construction, the data coming from it, in its rawest form, is uneven.
Images have to be corrected for imperfections inherent in the cameras themselves.
Cosmic rays hitting the telescope can create static in the detectors of its cameras, which is corrected for, in part, by capturing multiple versions of the same image.
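To see why multiple exposures help, consider a toy example - a Python sketch with made-up numbers, not the actual JWST software. A cosmic ray brightens one pixel in one exposure only, so taking the per-pixel median across several exposures discards the outlier:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Three hypothetical exposures of the same 100x100 patch of sky.
exposures = np.stack([rng.normal(loc=10.0, scale=1.0, size=(100, 100))
                      for _ in range(3)])

# Inject a fake cosmic-ray hit into the second exposure only.
exposures[1, 50, 50] += 500.0

# The per-pixel median across exposures rejects the single outlier.
combined = np.median(exposures, axis=0)
print(combined[50, 50])  # back to roughly 10 - the cosmic ray is gone
```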
And even the pixels themselves - the smallest photosensitive units of the telescope's detectors - have different sensitivities; one of the more than one million pixels in the JWST's Mid-Infrared Instrument (MIRI), for example, might be more sensitive than its neighbour, and less sensitive than another.
Thankfully, the JWST engineers have a solution for that - a complete calibration map of how to compensate for the variations in each pixel in every instrument on the Webb telescope.
"This is the work of hundreds of people," says Rosolowsky. "When we say 'commissioning a telescope' - it launched and then they went through this period where you heard nothing about the results.
"That's what they were doing up there. It's finally turned on. They're starting to make images and say, 'How do we take out all these corrections?'"
All he has to do, he says, is apply the JWST's map of pixel calibrations to his data to account for those variations in pixel sensitivity.
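In practice, that kind of correction amounts to dividing the raw frame by a per-pixel sensitivity map - often called a flat field. Here is a minimal sketch with invented numbers, not the real calibration data:

```python
import numpy as np

# Raw counts from a hypothetical 2x2 patch of detector pixels,
# all looking at a source that "should" read 100.
raw = np.array([[ 98.0, 105.0],
                [102.0,  95.0]])

# The calibration map: each pixel's relative sensitivity (invented here).
sensitivity = np.array([[0.98, 1.05],
                        [1.02, 0.95]])

# Dividing out the map puts every pixel on the same footing.
calibrated = raw / sensitivity
print(calibrated)  # each pixel now reads ~100.0
```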
For most researchers, much of this technical work takes place in what they universally call the "pipeline."
Think of it as a data-processing highway, where the raw data received from the JWST is refined as it travels from source to destination.
Along the length of that highway, some software processing modules might apply corrections to the data based on inconsistencies or aberrations in the detectors, others might apply corrections based on calibrations for the cameras themselves, and yet others might combine the data from multiple exposures into single images.
But the highway also has a series of off-ramps, from which researchers can extract their information at earlier stages of the processing for their more specific uses.
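Conceptually, the pipeline might be sketched like this - a toy illustration in Python, not NASA's actual pipeline software, with placeholder stage names standing in for the real processing modules:

```python
import numpy as np

def correct_detector(data):
    """Stand-in for detector-level fixes (cosmic rays, bias, bad pixels)."""
    return data - data.min()

def calibrate_camera(data):
    """Stand-in for flat-fielding and other camera calibrations."""
    return data / data.mean()

def combine_exposures(data):
    """Stand-in for merging multiple exposures into one image."""
    return data

STAGES = [correct_detector, calibrate_camera, combine_exposures]

def run_pipeline(data, exit_after=None):
    """Run the stages in order; exit_after names an optional off-ramp."""
    for stage in STAGES:
        data = stage(data)
        if stage.__name__ == exit_after:
            break  # a researcher takes the off-ramp here
    return data

raw = np.abs(np.random.default_rng(0).normal(size=(64, 64)))
early = run_pipeline(raw, exit_after="correct_detector")  # nearly raw data
final = run_pipeline(raw)                                 # fully processed
```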
In Rosolowsky's case, he wants some of that raw data, closer to the beginning of the pipeline.
It's all well and good to have something nice to send to a loved one, says Rosolowsky, but researchers don't do science off of pretty images.
Much of his work is done with the raw, numerical data.
"The pretty pictures are wonderful for interpreting and building up and telling a story," he says. "But then backing up that story is what requires doing the computer processing to figure out how to measure every star in this image, figure out what it is, and then use that to back up your claim."
"Most of my job is programming, so I'm writing the computer code that is going to take this big chunk of numbers and turn this into properties of stars," he says. "That is my day job when you get down to it."
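What turning a big chunk of numbers into properties of stars can look like, in a deliberately simplified sketch (not Rosolowsky's actual code): threshold the image above the noise, group bright pixels into sources, and measure each source's position and total brightness.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = rng.normal(loc=0.0, scale=1.0, size=(200, 200))  # background noise

# Drop in a few fake "stars" as bright pixels.
for y, x, flux in [(40, 60, 50.0), (120, 80, 80.0), (170, 150, 30.0)]:
    image[y, x] += flux

# Detect pixels well above the noise, group them into sources,
# then measure each source's position and total brightness.
mask = image > 5.0 * image.std()
labels, n_sources = ndimage.label(mask)
positions = ndimage.center_of_mass(image, labels, range(1, n_sources + 1))
fluxes = ndimage.sum(image, labels, range(1, n_sources + 1))

for pos, flux in zip(positions, fluxes):
    print(f"star at y={pos[0]:.0f}, x={pos[1]:.0f}, brightness={flux:.1f}")
```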
'It was gorgeous, honestly'
It's a different situation for Alyssa Pagan, though. She works with NASA as a science visuals developer. She's one of the people responsible for processing the stunning images from the Webb telescope released to the public.
In fact, the image of the "Cosmic Cliffs" of the Carina Nebula, one of the first five images released from the JWST, was the first she'd worked on from the Webb.
"It was insane," she says of working on the first images from the Webb. "It was gorgeous, honestly, to me from the get-go."
"The fact that you are one of the first people to see the image and ... process it and put it out for the public - it was a huge honour. And it was also a time to reflect on how far we've come, and all the people involved to make it possible."
For Pagan, the images she began to work with were retrieved from further down the pipeline - with most of the compensations and artifact removal already accomplished.
In fact, the Carina Nebula image was a composite of six monochrome images, captured with six different filters, each showing the nebula in a slightly different wavelength of infrared, which is the Webb telescope's milieu.
But the raw images she began to work on weren't much to look at - just some specks of white on a dark field, no detail at all.
That's because the dynamic range of the JWST cameras - the difference between the darkest blacks and the brightest whites in the images - was far too wide for even Pagan's high-end computer monitors to display. The information - the detail in the dark parts of the image - existed, but her monitors couldn't render it.
The solution was to compress that dynamic range - bring the blacks and whites closer together - so that it would fit within the range of her monitor. Somewhat confusingly, this is called "stretching" the image.
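One common way to do this in astronomical image processing - a sketch of the general technique, not necessarily the exact curve Pagan used - is an arcsinh stretch, which is nearly linear for faint values and roughly logarithmic for bright ones:

```python
import numpy as np

def stretch(data, softening=10.0):
    """Map raw intensities to the 0..1 display range with an asinh curve."""
    scaled = np.arcsinh(data / softening)
    return scaled / scaled.max()

raw = np.array([0.5, 5.0, 500.0, 50000.0])  # intensities spanning 5 decades
print(stretch(raw))  # all values now fall between 0 and 1
```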
After stretching, she was able to see detail in the images - many of the fainter stars, and the clouds of dust and gas that characterize a nebula. But she was still looking at a series of black and white images; the next step was to assign a colour to each of them. And that is not done at random.
How your brain understands light
What we humans perceive as visible light is but a small section of the electromagnetic spectrum - the portion to which our eyes are sensitive. To the end of that section where light has shorter wavelengths, our brain assigns the bluish colours. To the other end, which contains the longer wavelengths, it assigns reddish colours.
On either side of the section of the spectrum to which our eyes are sensitive are wavelengths that our eyes cannot detect, but which, in some cases, other animals - or specially designed instruments - can.
The Webb telescope operates in the infrared range of the spectrum; that is, past what we see as red, in the section of the spectrum with longer wavelengths than we are able to perceive. That has advantages for astronomy: infrared light can often pierce clouds of dust and gas that would block visible light.
But it means that to see what the Webb has observed, we have to create a representation of each image that falls within that portion of the electromagnetic spectrum to which our eyes are sensitive - the visible light section.
And to add colour to those images, Pagan followed roughly the same conventions that our brain does - shorter wavelengths are assigned bluish colours, longer ones get reddish colours.
When that was done, she could stack those coloured images on top of each other to make one full-colour image.
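In code terms, that convention might look like the following sketch - three real NIRCam filter names are used for illustration, though the actual Carina Nebula composite used six filters and Pagan's workflow differs:

```python
import numpy as np

h, w = 256, 256
rng = np.random.default_rng(2)

# Three stretched monochrome images (values 0..1), one per filter,
# ordered here from shortest to longest infrared wavelength.
filters = {
    "F090W": rng.random((h, w)),  # shortest wavelength -> blue
    "F200W": rng.random((h, w)),  # middle wavelength   -> green
    "F444W": rng.random((h, w)),  # longest wavelength  -> red
}

short, middle, long_ = filters.values()
rgb = np.dstack([long_, middle, short])  # (height, width, [R, G, B])
print(rgb.shape)  # (256, 256, 3), ready to display as a colour image
```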
At this point, she says, she begins balancing science with esthetics.
"That's where it becomes quite subjective," she says. "We are working with scientists through this process ... to make sure we're showcasing the data as much as possible and we're being as honest as possible.
"But it is subjective now because we are adjusting, like, tonality and contrast, colour and all these things to really make the features in each filter more prominent or just to showcase and highlight those different regions."
Working in Photoshop, Pagan tweaked the image to produce something both esthetically pleasing - "organic" is the term she likes to use - and scientifically accurate.
On its release by NASA, on July 12, the Carina Nebula image wowed astronomy buffs and scientists alike, many of whom had been waiting decades to see the depths of the universe through the eyes of the most powerful telescope ever built.
It was, in the words of Canada's JWST scientific director Rene Doyon, "the beautiful bridge between science and art."
"It's so cool to see everyone talking about it," says Pagan. "It feels like a nice, connected, universal moment of appreciation for science and humanity."
Steve McKinley is a Halifax-based reporter for the Star. Follow him on Twitter: @smckinley1