It is presented in false color to emphasize differences between materials in the scene. It was assembled from 817 component images taken between Dec. 21, 2011, and May 8, 2012, while Opportunity was stationed on an outcrop informally named "Greeley Haven," on a segment of the rim of ancient Endeavour Crater.
The cameras (at least on the old rovers, and probably similar on Opportunity) are black-and-white cameras that accurately measure the amount of light hitting each part of the sensor. To get a color image, a filter is placed in front of the camera. There are lots of different filters available, and they are chosen for each image to highlight differences in materials. A "true-color" image would have to be planned in advance and taken multiple times with different filters, then recombined to make a color image. http://areo.info/mer/ (edited for correctness)
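As a rough sketch of that recombination (not the actual MER pipeline; the function and array names here are made up), assembling a color frame from three registered exposures taken through red-, green-, and blue-ish filters is essentially just stacking channels:

    import numpy as np

    def combine_filter_exposures(red_band, green_band, blue_band):
        """Stack three co-registered grayscale exposures into one RGB frame."""
        rgb = np.dstack([red_band, green_band, blue_band]).astype(float)
        # Scale to 0..1 for display; a real pipeline would also correct for
        # exposure time and each filter's transmission curve.
        return np.clip(rgb / rgb.max(), 0.0, 1.0)

    # Toy usage with synthetic 4x4 frames standing in for the three exposures.
    frames = [np.random.rand(4, 4) for _ in range(3)]
    color_frame = combine_filter_exposures(*frames)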
All digital cameras are black and white cameras with a filter in front of them. They just differ in how they put filters in front of the sensor.
The digital cameras we use have this fixed filter (most have a Bayer pattern: http://en.wikipedia.org/wiki/Bayer_filter) where every pixel gets its own individual filter.
That's handy for snapping color images without multiple exposures, but not very flexible. The thing is, a sensor without the baked-in filter can be sensitive in parts of the spectrum (infrared or ultraviolet) where the human eye is not. If you were to bake in a Bayer filter you would throw all that information away. So those spacecraft usually have several filters they can combine at will to capture different parts of the spectrum.
In that sense the color is not faked, it’s more a human-readable depiction of which light at which frequencies hit the sensor.
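To make the fixed-filter idea concrete, here is a toy sketch of how a single Bayer-mosaicked exposure yields three channels, in contrast to the filter-wheel approach. An RGGB layout is assumed, and the half-resolution shortcut is just for illustration, not how a real demosaicing pipeline works:

    import numpy as np

    def bayer_to_rgb_halfres(raw):
        """Collapse an RGGB Bayer mosaic into a half-resolution RGB image.

        raw: 2D array (even dimensions) whose 2x2 cells follow the pattern
            R G
            G B
        Real demosaicing interpolates to full resolution; this just groups
        each 2x2 cell to show how one sensor bakes in per-pixel filters.
        """
        r = raw[0::2, 0::2]
        g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # average the two greens
        b = raw[1::2, 1::2]
        return np.dstack([r, g, b])

    # Toy usage: an 8x8 mosaic becomes a 4x4x3 color image.
    rgb = bayer_to_rgb_halfres(np.random.rand(8, 8))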
Right! I guess you could say that technically there’s still a filter involved but that would be cheating. It’s really cool tech, sadly never really commercially successful.
It's supposed to be pretty close. There's a reference marker on the rover with a few different colors that is used to index the black-and-whites to color. There are a bunch of people arguing about how the hue isn't perfect, or this or that shade of red is a few nanometers off, but as I understand it, in terms of color reproduction it's no worse than a cheap cell phone camera.
You can see the reference marker in the panorama. It's on the solar panel to the left of the mast. At the top of the panel is a white disk that looks like it has an old-style arcade joystick sticking out of it. Zoom in on the disk and you can see the (now red-dust-marred) colour calibration indicators on the square surrounding the white circle.
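A minimal sketch of the kind of indexing described above: measure the target's patches in the image, compare against their known colors, fit a 3x3 correction by least squares, and apply it to the whole frame. The patch values below are invented placeholders, not actual MER calibration-target data:

    import numpy as np

    # Measured patch colors pulled from the image (rows = patches) and the
    # patches' known reference colors. These numbers are placeholders only.
    measured = np.array([[0.80, 0.35, 0.30],
                         [0.40, 0.55, 0.25],
                         [0.30, 0.30, 0.60],
                         [0.70, 0.70, 0.65]])
    reference = np.array([[0.90, 0.20, 0.20],
                          [0.20, 0.70, 0.20],
                          [0.20, 0.20, 0.80],
                          [0.85, 0.85, 0.85]])

    # Least-squares fit of a 3x3 matrix M so that measured @ M ~= reference.
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

    def calibrate(image):
        """Apply the fitted correction to an H x W x 3 image in 0..1 range."""
        flat = image.reshape(-1, 3) @ M
        return np.clip(flat, 0.0, 1.0).reshape(image.shape)

    # Toy usage on a synthetic 4x4 color image.
    corrected = calibrate(np.random.rand(4, 4, 3))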
I went to a lecture by Geoffrey Landis of NASA at the Eastercon science fiction convention in the UK. He worked on the last rover and explained how to tell, infallibly, whether the colour is false: there is no blue on Mars. Most false-colour pictures end up making part of the landscape blue, which is both a giveaway and incorrect. What you'd actually see would probably be less colourful and more dusty, judging by the pictures he showed us.
I suspect it depends on whether you see it as just viewing a flat picture, or as actually seeing from the rover's perspective. Personally, I see it as the latter, so I would actually find it unintuitive if the controls were reversed.
It's kind of funny how such a seemingly irrelevant semantic argument gives a completely opposite implementation.
So beautiful and peaceful; there is something magical about this picture -- a planet that human feet have never touched. No human installations, no radio waves (other than from this rover); untouched and clean.
I wonder if it would be possible to GMO-engineer some sort of plant that could survive and grow/evolve in the Martian atmosphere. It would be amazing to see the result -- humans brought life to Mars and it evolved on its own. We definitely should try!!