The Mars 2020 rover launched on July 30th, with a planned landing in February 2021. Every time we go to Mars, decisions about what to bring become a balance between what scientists would like and what engineers feel will work reliably. This makes each mission part of a continuing chain of innovation, with lessons from past missions informing the proven technologies and innovations of future missions.
Over the last twenty years, one perfect example has been the CCD sensor, the go-to standard for cameras in space. “A CMOS sensor would probably die very rapidly as soon as it went into space,” points out Robert Groulx, Product Manager for CCD technology at Teledyne Dalsa Semiconductor. Extremes of heat and cold can wear out electronics, and radiation and cosmic rays can wreak havoc on integrated circuits by altering the states of the elements inside them. On five of NASA’s missions to Mars, the CCD image sensors on its cameras were made by Teledyne, in collaboration with NASA’s Jet Propulsion Laboratory (JPL).

The cameras on the InSight lander helped scientists and engineers choose where to place instruments on the surface of Mars. These were the Instrument Context Camera, mounted under the lander and pointed with a 120° fisheye-type lens, and the Instrument Deployment Camera, on the lander’s robotic arm, looking at a 45° angle. Apart from their lenses, both cameras are identical, collecting 1MP images to help with the placement of other instruments. The InSight cameras were monochrome frame-transfer CCD models left over from an earlier program. “But this time JPL wanted to take color photos. Color cameras are much larger, and new models couldn’t be qualified in time for launch,” says Groulx. The solution was to put color filters over the monochrome sensors. “We knew from the users of those color filters that the potential for qualification was good. We had used them on Teledyne Dalsa CCD sensors for many years, but we had never sent them to space! JPL had to make sure that those color filters would endure the trip to Mars.”

“For the color filters, we use a technology that is unique to Teledyne Dalsa that we’ve been doing since 2002/03,” says Groulx. “It uses color pigments that are evaporated onto the CCD.
It’s more expensive but produces very good results in terms of color separation and reproduction, better than the typical dyed-resist method. For this mission, our colleagues at Teledyne e2v have contributed their CCD42-10 image sensor for SuperCam and Sherloc. These instruments will help to determine evidence of water as an influence on the Martian environment.”
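The idea of laying color filters over a monochrome sensor can be sketched in a few lines of code. This is a minimal illustration only, assuming a conventional RGGB Bayer-style layout; the actual pattern and reconstruction used on Teledyne's filters are not described in the article.

```python
# Toy sketch: recovering color from a monochrome sensor overlaid with a
# color filter array. Assumes an RGGB Bayer-style pattern (hypothetical
# here -- the real Teledyne filter layout may differ).

def demosaic_nearest(mosaic):
    """Nearest-neighbor demosaic of an RGGB mosaic (2D list of ints).

    Each 2x2 cell [[R, G], [G, B]] becomes one full-color pixel, so the
    output is half the resolution in each dimension -- the simplest
    possible reconstruction, trading resolution for color.
    """
    h, w = len(mosaic), len(mosaic[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            r = mosaic[y][x]
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) // 2  # average the two greens
            b = mosaic[y + 1][x + 1]
            row.append((r, g, b))
        out.append(row)
    return out

# One 2x2 sensor patch: R=200, greens 100 and 120, B=50
patch = [[200, 100],
         [120, 50]]
print(demosaic_nearest(patch))  # [[(200, 110, 50)]]
```

Real flight software uses more sophisticated interpolation to preserve resolution, but the principle is the same: the sensor itself only ever records brightness, and color is reconstructed from neighboring filtered pixels.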
Choosing the right camera
The four cameras on Curiosity used a different solution, the KAI-2020 (originally from Kodak, now OnSemi), an interline CCD sensor that shoots at 1600×1200 pixels. The cameras went into the Mars Hand Lens Imager (MAHLI), Mars Descent Imager (MARDI), and two MastCams. Curiosity launched in 2011. Why did it have only a 2MP sensor, a lower resolution than many mobile phones even then? Once again, the sheer logistics and complexity of a Mars mission defined what was possible.

“There’s a popular belief that projects like this are going to be very advanced, but there are things that mitigate against that. These designs were proposed in 2004, and you don’t get to propose one specification and then go off and develop something else. 2MP with 8GB of flash didn’t sound too bad in 2004. But it doesn’t compare well to what you get in an iPhone today,” said Mike Ravine, project manager at Malin Space Science Systems and responsible for the cameras’ development, in a 2012 interview. “And the state of CMOS sensors wasn’t credible in 2004. They’re an interesting option now, but they weren’t then.”

By using the same sensor everywhere, NASA was able to minimize the effort needed for mission qualification and more easily optimize the sensors for their operating environment, such as accounting for the effect of radiation on individual pixels. There’s also the type of photo the rover would be taking: landscapes. With nothing on Mars moving, it was possible to take multi-shot panoramas with the help of the MastCam arm, creating extremely high-resolution results. One disappointment in the Curiosity imaging system was the cancellation of the 6.5–100mm zoom lenses that would have allowed the cameras to capture additional detail, as well as cinematic 3D images. Zoom lenses would have to wait until the next rover in 2020.
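The multi-shot panorama approach boils down to pointing the mast at a grid of overlapping positions and pasting each frame onto a larger canvas. A toy sketch of that assembly step follows; the function name and the fixed-offset model are illustrative only, since real pipelines also correct geometry, exposure, and seams.

```python
# Toy sketch of assembling a panorama from individually pointed frames,
# in the spirit of Curiosity's MastCam mosaics. Each frame is a 2D list
# of pixel values plus the (row, col) offset where the mast pointed it.
# Real pipelines also handle projection and seam blending; this only
# pastes tiles onto a shared canvas.

def assemble_mosaic(frames, height, width):
    """frames: list of (tile, row_offset, col_offset) tuples."""
    canvas = [[0] * width for _ in range(height)]
    for tile, r0, c0 in frames:
        for r, row in enumerate(tile):
            for c, v in enumerate(row):
                canvas[r0 + r][c0 + c] = v  # later frames overwrite overlap
    return canvas

left  = [[1, 1], [1, 1]]
right = [[2, 2], [2, 2]]
# Two 2x2 frames overlapping by one column on a 2x3 canvas
pano = assemble_mosaic([(left, 0, 0), (right, 0, 1)], 2, 3)
print(pano)  # [[1, 2, 2], [1, 2, 2]]
```

Because nothing in the scene moves between exposures, every tile of a Martian landscape can be shot at full sensor resolution, which is how a 2MP camera still produces gigapixel-class panoramas.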
When the Curiosity rover landed on Mars, it recorded the descent and landing with its MARDI camera. This camera shot full-color video of Curiosity’s journey through the atmosphere all the way down to the Martian surface. The view was extremely valuable to engineers; it helped them understand what happens during one of the riskiest parts of the mission. The footage also gave the science team and rover drivers a glimpse of the landing site, helping them accurately identify Curiosity’s landing spot and plan the rover’s first drives to explore Gale Crater. For the Mars 2020 rover, the engineering team is adding several cameras and a microphone to document entry, descent and landing in even greater detail. In addition to providing engineering data, NASA considers the cameras and microphone partly a public-engagement payload, giving onlookers a dramatic sense of the ride down to the surface. Still, there is a lot of competition for payload and limited space. Currently, the cameras and microphone are counted as discretionary payload, which means they’re assets, but not required for the mission. Perhaps as a result, NASA has not revealed the specifications of these sensors, instead simply saying that they are ‘assembled from easily available commercial hardware’.
Mars 2020 uses a new generation of engineering cameras that build on the capabilities of past Mars rover cameras. These 20MP engineering cameras give much more detailed information, in color, about the terrain around the rover. Each of the nine cameras shares the same camera body, but uses a different lens selected for its specific task. They have various functions: they measure the ground around the rover for safe driving (HazCams), support pathfinding (NavCams), and aid sample-gathering (CacheCam). The HazCams and NavCams will help operators on Earth drive the rover more precisely and better target the movements of the arm, drill and other tools. A much wider field of view gives the cameras a much better view of the rover itself. This is important for checking on the health of various rover parts and measuring changes in the amount of dust and sand that may accumulate on rover surfaces. The new cameras are also capable of higher frame rates than previous camera systems, allowing them to take pictures while the rover is moving.
While the additional cameras will help, real mission success lies with the scientific payload. Here we find some of the most significant differences between Curiosity and Mars 2020. The robotic arm on the front of the Mars 2020 rover is bigger than Curiosity’s. This will enable the 2020 rover to collect and store samples, but also manage a host of new functions and new science tools. The turret at the end of the arm carries the coring drill and two science instruments, plus a color camera for close-up surface inspection and selfies for rover health checkups.

Mastcam-Z, an upgraded version of Curiosity’s high-definition, two-camera Mastcam instrument, will serve as Mars 2020’s main set of eyes. With a maximum image size of 1600×1200 pixels, it can still resolve incredible detail, between about 150 microns per pixel (0.15mm) and 7.4mm per pixel, depending on distance. “Camera technology keeps improving,” said Justin Maki of JPL, Mars 2020’s imaging scientist and deputy principal investigator of the Mastcam-Z instrument. The ‘Z’ is for zoom, the capability that was left out of Curiosity’s Mastcam. Mastcam-Z will be able to take more 3D images, potentially allowing mission scientists to examine geologic features in more detail and scout out promising scientific sites from large distances.

The SuperCam was built by a team of hundreds and packs what would typically require several sizable pieces of equipment into something no bigger than a cereal box. It fires a pulsed laser beam out of the rover’s mast to vaporize small portions of rock from up to 7m away at up to 10,000°C. This combination camera and spectroscope then analyzes the chemicals released, in a method called laser-induced breakdown spectroscopy (LIBS). SuperCam will use AI to seek out rock targets worth zapping during and after drives, when humans are out of the loop. In addition, this upgraded AI lets SuperCam point very precisely at small rock features.
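The quoted pixel scales follow from simple geometry: the patch of ground sampled by one pixel grows linearly with distance. A quick sketch, with the per-pixel viewing angle (IFOV) as an assumed, illustrative parameter, since the article doesn't give Mastcam-Z's exact optics figures:

```python
# Ground sample distance for a camera pixel, using the small-angle
# approximation: ground covered ~= range x per-pixel angle (IFOV).
# The 74-microradian IFOV below is an illustrative value chosen to
# reproduce the article's figures, not an official specification.

def pixel_scale_mm(distance_m, ifov_rad):
    """Ground distance covered by one pixel, in mm, at a given range."""
    return distance_m * ifov_rad * 1000.0

# At 2 m, each pixel covers ~0.148 mm (about the quoted 150 microns);
# at 100 m, each pixel covers ~7.4 mm.
print(round(pixel_scale_mm(2, 74e-6), 3))    # 0.148
print(round(pixel_scale_mm(100, 74e-6), 1))  # 7.4
```

This is why the same 1600×1200 sensor resolves sub-millimeter texture on nearby rocks yet only centimeter-scale features across the crater: resolution on the ground is set by distance and zoom, not by pixel count alone.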
Then there are Sherloc and Watson, a pair of cameras focused on searching for signs of life. The backronyms may be a little old-fashioned, but the cameras are very forward-looking. Sherloc is a laser-and-spectroscope combo with a macro camera capturing extreme close-ups of areas under study, to send home so mission scientists can work out how rock samples formed. Watson will help, capturing fine-scale textures of rocks and dust, and can be turned inwards to look at the rover.