Do you ever look up at the sky and wonder how you could take photographs of things that existed billions of years ago, as more than just little glimmers of light? Seeing them in detail, with color and focus, seems impossible when you’re looking up from your backyard. But it is possible. Thanks to the Hubble Space Telescope, named for astronomer Edwin Hubble, we are able to observe the universe in this great detail. This giant super-camera uses many of the same principles as a DSLR (except on a way larger scale), and is able to observe and capture light coming from billions of light-years away. But how exactly does it work?

Cameras

First, you need to know that the Hubble Telescope isn’t just one camera: it carries five different cameras and sensors, each playing its own role in observing the universe. The two responsible for the majority of the photos we see from Hubble are the Advanced Camera for Surveys (ACS) and the Wide Field Camera 3 (WFC3).


Taken with the Advanced Camera for Surveys © NASA Goddard Space Flight Center via Flickr

The ACS is used mostly for peering into the deepest of deep space. It is made up of two working parts: the Wide Field Channel and the Solar Blind Channel. The Wide Field Channel conducts broad surveys of the universe, which astronomers use to study the nature and organization of galaxies, cluing us in to the evolution of the universe. When light travels vast distances, such as billions of light-years, the expansion of the universe stretches it toward longer, redder wavelengths (this is called redshift), so light that left an early galaxy as ultraviolet can arrive here as visible or infrared light. The Solar Blind Channel, meanwhile, blocks the visible spectrum entirely and senses only ultraviolet light, letting it pick out hot, energetic objects whose faint ultraviolet glow would otherwise be washed out.
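To get a feel for how much light stretches, here is a minimal Python sketch of the redshift relation, λ_observed = λ_emitted × (1 + z). The example spectral line and redshift are illustrative, not taken from any specific Hubble observation.

```python
# Redshift stretches light to longer wavelengths:
# lambda_observed = lambda_emitted * (1 + z)

def observed_wavelength(emitted_nm: float, z: float) -> float:
    """Wavelength (nm) at which we observe light emitted at `emitted_nm` by a source at redshift z."""
    return emitted_nm * (1 + z)

# Example: hydrogen's Lyman-alpha line is emitted in the far ultraviolet
# at about 121.6 nm. From a galaxy at redshift z = 6 (illustrative of a
# very early galaxy), it arrives at:
lyman_alpha = 121.6  # nm, ultraviolet
z = 6.0
print(observed_wavelength(lyman_alpha, z))  # ~851 nm: near-infrared
```

So the same photon that started out as ultraviolet can land on a completely different detector by the time it reaches Hubble.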

Taken with Hubble’s Wide Field Camera 3 © Hubble Heritage via Flickr

The WFC3 is the camera responsible for many of the most recent photos of both close and distant galaxies. Its special skill is that it observes three different kinds of light: near-infrared light, visible light, and near-ultraviolet radiation. Covering all three of these regions of the spectrum gives this camera an enormous wavelength range and allows for sharp, detailed images. We can observe galaxies that are closer to us in visible and near-ultraviolet light, and galaxies that are billions of light-years away with the near-infrared sensor, since their light has been redshifted out of the visible range.
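As a rough guide to what those “three kinds of light” cover, here is a small sketch that sorts a wavelength into the regions WFC3 can observe. The boundary values are approximate, not the instrument’s exact specifications.

```python
def wfc3_band(wavelength_nm: float) -> str:
    """Roughly classify a wavelength into the regions WFC3 observes.

    Boundaries are approximate: WFC3's UVIS channel covers roughly
    200-1000 nm and its IR channel roughly 800-1700 nm.
    """
    if 200 <= wavelength_nm < 400:
        return "near-ultraviolet (UVIS channel)"
    if 400 <= wavelength_nm < 700:
        return "visible (UVIS channel)"
    if 700 <= wavelength_nm <= 1700:
        return "near-infrared (IR channel)"
    return "outside WFC3's range"

print(wfc3_band(300))   # near-ultraviolet
print(wfc3_band(550))   # visible
print(wfc3_band(1500))  # near-infrared
```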

Optics

The Hubble Telescope borrows principles from basic film and digital cameras: it uses mirrors to direct light, then senses that light with an ultra-sensitive detector. However, its optical system does not use a lens to focus the light; it uses concave and convex mirrors to bounce the light back and forth until it converges on a focal point. Light enters through the aperture door and strikes the large, concave primary mirror. The primary reflects it onto a smaller, convex secondary mirror, which sends it back through a hole in the center of the primary to a focal point at the detectors, where the light is processed and recorded. From there, the recorded light goes through many stages of processing before it becomes the photo you see in Science magazine.
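Why fold the light path like this? It packs a very long focal length into a short tube. Here is a back-of-the-envelope Python sketch using approximate published figures for Hubble; the secondary-mirror magnification is a rounded assumption.

```python
# A Cassegrain-style telescope (Hubble is a Ritchey-Chretien variant)
# folds a long focal length into a short tube: the convex secondary
# mirror magnifies the concave primary mirror's focal length.
#
# effective_focal_length = primary_focal_length * secondary_magnification

primary_diameter_m = 2.4        # Hubble's primary mirror diameter
primary_focal_ratio = 2.3       # approximate: the primary is about f/2.3
secondary_magnification = 10.4  # rounded assumption for the secondary

primary_focal_length = primary_diameter_m * primary_focal_ratio  # ~5.5 m
effective_focal_length = primary_focal_length * secondary_magnification

print(f"Primary focal length:   {primary_focal_length:.1f} m")
print(f"Effective focal length: {effective_focal_length:.1f} m")  # ~57 m
print(f"System focal ratio:     f/{effective_focal_length / primary_diameter_m:.0f}")  # ~f/24
```

A roughly 57-meter focal length, folded into a telescope that is only about 13 meters long, is what gives Hubble its magnifying power.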

Processing the Image

After many steps in which the data is transferred and converted into a format that imaging software can read, it can be processed into a visual image. Data from the Hubble comes down as greyscale images, like the one above, and in several chunks taken from different angles (the telescope is, after all, orbiting at upwards of 17,000 mph). Skilled photo editors then connect these puzzle pieces into one cohesive picture.
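To make the puzzle-piece idea concrete, here is a toy Python sketch that pastes greyscale chunks into one canvas at known pixel offsets. The file names, offsets, and tile sizes are hypothetical, and real Hubble pipelines (such as AstroDrizzle) also correct for rotation, optical distortion, and the telescope’s motion.

```python
# Toy mosaicking: paste greyscale "chunks" into one canvas at known offsets.
import numpy as np
from astropy.io import fits

tiles = [
    ("chunk_a.fits", (0, 0)),    # (hypothetical filename, (row, col) offset)
    ("chunk_b.fits", (0, 900)),  # overlaps chunk_a slightly on the right
]

canvas = np.zeros((1024, 1924))  # big enough to hold both 1024x1024 tiles

for path, (r, c) in tiles:
    data = fits.getdata(path)    # 2-D greyscale array from the FITS file
    h, w = data.shape
    # Keep the brighter pixel where tiles overlap (a crude blend)
    canvas[r:r+h, c:c+w] = np.maximum(canvas[r:r+h, c:c+w], data)

fits.writeto("mosaic.fits", canvas, overwrite=True)
```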

Once the picture is assembled and its overall form is in place, the editors turn to another kind of data: which kind of light is coming from where. The Hubble uses filters to separate different forms of light (infrared, visible, and ultraviolet) and narrow them down to the light we can see with our eyes. These filters also split visible light itself into components (i.e. red, green, and blue), much the way a piece of yellow glass passes only a beam of yellow light. This data arrives as separate greyscale images, which are combined in photo-processing software to create what the scene would probably look like if we traveled into deep space and saw it with the naked eye, as sketched below. Above, we see Messier 77, a galaxy, processed two different ways. The version on the left, processed in 2012, was built from different data than the updated picture on the right, processed in 2014: the 2012 data captured a great deal of blue light but not the full spectrum, while the 2014 data covered a much broader range. Though we obviously can’t know for certain what the galaxy truly looks like, building images from these processed, filtered chunks of data is the most accurate picture we can produce today.
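Here is a minimal Python sketch of that final combining step, assigning three filtered greyscale exposures to the red, green, and blue channels of one composite. The filter file names are hypothetical stand-ins for whatever exposures an editor is working with.

```python
# Combine three filtered exposures into one color image: each greyscale
# frame becomes one channel of an RGB composite.
import numpy as np
from astropy.io import fits
import matplotlib.pyplot as plt

def normalize(img):
    """Scale an exposure to the 0-1 range so channels can be compared."""
    img = img - img.min()
    return img / img.max()

red   = normalize(fits.getdata("f658n.fits"))  # e.g. a red/H-alpha filter
green = normalize(fits.getdata("f555w.fits"))  # e.g. a visual (green) filter
blue  = normalize(fits.getdata("f435w.fits"))  # e.g. a blue filter

rgb = np.dstack([red, green, blue])            # shape: (height, width, 3)
plt.imsave("composite.png", rgb)
```

Change which filter feeds which channel and you get a noticeably different picture from the same raw data, which is exactly why the two Messier 77 images above don’t match.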

Curious how these images are processed and prepared for public distribution? Check out our detailed conversation and editorial featuring Robert Hurt, an astronomer and visualization scientist at Caltech’s Infrared Processing and Analysis Center.