A network of 1,000 Google computers has achieved a computing milestone: The massive electronic brain taught itself to recognize pictures of cats. Computers remain far behind humans in their ability to identify images, but where do they stand relative to other animals? Can cats recognize pictures of cats?
Probably. Several animals seem drawn to pictures of members of their own species. For example, if you let a Japanese macaque control how long an image remains on a computer screen, it will spend more time gazing at other Japanese macaques than at rhesus or bonnet macaques. When placed in a maze, untrained sheep follow paths labeled with pictures of other sheep more often than paths that feature human faces. If placed in front of television screens with rotating images of dogs, humans, toys, and letters of the alphabet, dogs stare longer at the dog pictures and are more likely to focus on the center of the image. The Explainer isn’t aware of similar research on cats—probably because cats are frustratingly apathetic in laboratory experiments—but nothing about feline neurobiology suggests they would behave differently.
The ability to classify images seems to extend beyond mammals. Pigeons—one of the world’s best-studied animals—have demonstrated their ability to distinguish between pictures of country and urban landscapes, letters of the alphabet, cartoon characters, and paintings by Monet and Picasso. It’s not clear, however, whether the pigeons, like humans and the Google computer, form separate mental concepts for each category of object they gaze upon or whether they rely on some lower-order visual cue. (In one study, the pigeons tricked laboratory scientists by relying on irrelevant features of the background to distinguish between pictures with and without humans.) There is another difficulty in comparing the Google computer to animals. The computer’s cat-recognition ability was notable because it learned to deduce the presence of felines without human training. With animals, by contrast, there’s no way of knowing what they’re thinking unless you subject them to human training.
It’s similarly difficult to know whether animals understand that two-dimensional images are representations of real-world objects, or if they think the pictures are the objects themselves. The heart rate of a chimpanzee has been shown to increase in response to pictures of individuals that it knows, but not when shown pictures of unfamiliar chimps. It’s not clear, however, if the chimp is confusing the image with his real-life companion, or if the mere representation of a friend sets his heart aflutter. (For what it’s worth, human infants’ hearts beat more quickly when shown pictures of their parents.)
The way animals perceive video adds another layer of complexity. Video isn’t a continuous display of motion, but rather a series of frames that transition so quickly the human eye can’t detect the switch. Ordinary film and television do not, however, flicker fast enough to fool a dog, whose eyes register flicker at higher rates than ours. To canines, television probably looks like a series of still images or jerky movements. Cats fall somewhere between humans and dogs on flicker fusion tests, so their appreciation of television probably depends on how bright the room is. (Darker rooms make it harder to detect flickering, which is one of the reasons theater owners turn out the lights.)
Got a question about today’s news? Ask the Explainer.
Explainer thanks Robert Cook of Tufts University, editor of the online book Avian Visual Cognition.