I’ve seen this in a number of places (Slashdot refers me both to the Wired article and to the original press release): Thomas DeMarse, of the University of Florida, has cultivated a culture of 25,000 living rat neurons, and hooked up the culture to a computer running flight simulator software. The neurons have learned, in effect, to fly the simulated plane.
This is fascinating on a number of grounds. The neurons constitute, in effect, an artificial bio-brain. The neurons have connected with one another spontaneously, in the same way that neurons actually do connect in a developing human’s or animal’s brain; and an interface has been successfully established between this bio-brain and the silicon computational machinery of a computer. Strong-AI enthusiasts like Ray Kurzweil fantasize about replacing human neurons with silicon chips, one by one, until the mind has been entirely translated or downloaded into a computer. But neurons and silicon logic chips function in quite different ways, so the idea of interfacing neurons and digital computers, as DeMarse and others have done, is much more plausible. Brains need to be embodied, in a way that electronic computing machines don’t; but an experiment like this suggests a way that this embodiment could be made entirely simulacral, as in the old (updated-Cartesian) ‘brain in a vat’ scenario.
The whole experiment turns on the fact that brains don’t operate the way digital computers do. Brains signal chemically as well as electrically, which makes them different by nature from computer chips; and from what little evidence we have on the subject, it would seem (as Gerald Edelman, among others, argues) that brains are not in fact Turing machines, but operate according to entirely different principles. Indeed, DeMarse’s goal is less to train the neurons to do useful computational work than “to learn how the brain does its computation.”
The SF writer Peter Watts in fact deals with all these questions in his “Rifters” novels Starfish and Maelstrom (I haven’t yet read the just-published third volume in the series, Behemoth: B-Max; a fourth and final volume is scheduled to come out next year). In these novels, neural cultures called “smart gels” do computational tasks — involving pattern recognition, nuanced judgments weighing various qualitative factors, and so on — that digital computers are ill-suited for. But the fact that the “smart gels” are required to make human-style judgments, while being devoid of human personalities and emotions, itself leads to disturbing and disastrous consequences…. It’s always a problem when “intelligence” is divorced from context.