Blue Brain in a Virtual Body


You’re the best man at a wedding. You’re decked out in the finest tuxedo money can buy. Yet to attend the wedding, you don’t even need to leave your chair or step away from your keyboard. Like Neo when he plugs into The Matrix, your body is computer-generated. The bride and groom are real people like you, but they control their virtual bodies from keyboards in Boston and London.

Science fiction? Not really. I’ve attended several such online weddings “wearing” a virtual body and tuxedo. I’ve walked down a virtual wedding aisle and given away a bride. But that’s another story.

Now the "Blue Brain" is being put in a virtual body. Observing it may provide the first indications of the molecular and neural basis of thought and memory.

As reported in a recent h+ roundup of projects working on silicon intelligence (see Resources below), the Blue Brain project is the first comprehensive attempt to reverse-engineer the mammalian brain. It relies on a microcircuit known as the neocortical column, repeated millions of times across an electronic cortex.


Project Director Henry Markram told the recent Science Beyond Fiction conference in Europe that the column is being integrated into a virtual reality agent – “a simulated animal in a simulated environment, so that the researchers will be able to observe the detailed activities in the column as the animal moves around the space.”
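In outline, that coupling is just a sense-act loop: the simulated column receives sensory input from the virtual world, emits a motor command, the world updates in response, and every step of the column’s activity is recorded for inspection. Here is a minimal toy sketch of that loop. The ColumnModel and VirtualEnvironment classes are invented stand-ins for illustration, not anything from the Blue Brain codebase.

```python
# Toy sketch (not Blue Brain's actual code): a sense-act loop coupling a
# simulated "column" to a virtual environment, logging activity each step.
import random


class ColumnModel:
    """Hypothetical stand-in for a simulated neocortical column."""

    def __init__(self, n_neurons=100):
        self.state = [0.0] * n_neurons

    def step(self, sensory_input):
        # Update each "neuron" from the input plus a little noise, then
        # derive a crude motor command from the mean activity level.
        self.state = [0.9 * s + 0.1 * sensory_input + random.gauss(0, 0.01)
                      for s in self.state]
        mean_activity = sum(self.state) / len(self.state)
        return "forward" if mean_activity > 0.5 else "turn"


class VirtualEnvironment:
    """Toy one-dimensional world the simulated animal moves through."""

    def __init__(self):
        self.position = 0

    def sense(self):
        # The sensory signal grows stronger near a "food" spot at position 10.
        return max(0.0, 1.0 - abs(10 - self.position) / 10.0)

    def act(self, command):
        self.position += 1 if command == "forward" else -1


column = ColumnModel()
world = VirtualEnvironment()
activity_log = []  # the trace researchers would later inspect

for t in range(50):
    stimulus = world.sense()
    command = column.step(stimulus)
    world.act(command)
    activity_log.append((t, world.position, sum(column.state) / len(column.state)))
```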

What sort of animal might that be?  Why an animal rather than a human avatar like the ones used in virtual worlds such as World of Warcraft, The Sims, or Second Life?  Professor Markram’s team is still working out the details, and indicated to h+ Magazine that the project is “in the planning phase.” 

Planning a large engineering project is typically an iterative process with many starts and stops. It’s understandable that Markram is reluctant to reveal much at this early stage. But it’s always fun to speculate.

So, what about the Blue Brain’s new virtual body? Having an online body, or “virtual embodiment,” is a concept familiar to users of virtual worlds. You create a 3D graphical representation of yourself known as an avatar. Avatars can change appearance, talk with other avatars, and walk around inside user-generated environments. You control your avatar, the online agent, from your computer keyboard.
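Under the hood, that control is little more than a mapping from keystrokes to avatar commands. The toy sketch below shows the idea with a made-up Avatar class; real virtual-world clients expose far richer, networked interfaces.

```python
# Minimal sketch of keyboard-driven avatar control. The Avatar class and key
# bindings are illustrative assumptions, not any real virtual-world API.

class Avatar:
    def __init__(self, name):
        self.name = name
        self.x, self.y = 0, 0

    def move(self, dx, dy):
        self.x += dx
        self.y += dy

    def say(self, text):
        print(f"{self.name}: {text}")


# Map keystrokes to movement, the way WASD keys steer an avatar.
KEY_BINDINGS = {"w": (0, 1), "s": (0, -1), "a": (-1, 0), "d": (1, 0)}

me = Avatar("BestMan")
for key in "wwwd":            # pretend the user typed these keys
    me.move(*KEY_BINDINGS[key])
me.say("I'm standing at the altar now.")
print((me.x, me.y))           # (1, 3)
```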

As humans, our embodied experience of the world includes the ways in which our actions bring about changes in our understanding of ourselves, our emotional makeup, and our conscious and unconscious behaviors.

Embodied experience is not necessarily associated with our physical body. As one Second Life researcher puts it, “If we act within virtual spaces, especially in a way that is mediated by a virtual body, then we may have a variety of experiences that are experienced as embodied.”

Philosopher Hubert Dreyfus argues that human intelligence is fundamentally situated in a body – trying to separate intelligence from embodiment is like trying to separate cognition from memory. This is the first clue as to why it makes sense to put the Blue Brain in a virtual body.

Tom Boellstorff, a UC Irvine anthropologist who performed the first field work in a virtual world, goes a step further to suggest that online embodiment is “central to online selfhood.” "The experiences of Second Life residents with actual-world physical disabilities throw into stark relief many of [the] issues around virtual embodiment. The potential of virtual worlds to allow disabled persons to be embodied like the nondisabled has been noted from the earliest days of text-only chatrooms."

The Blue Brain may indeed not be that different from a person with a rare disease confined to a wheelchair. Remember the bride I mentioned earlier? The only way she could walk down the wedding aisle with me was to use her virtual body. Her sense of self and experience of the world depend upon that virtual body.

The early days of artificial intelligence (AI) research centered on disembodied intelligence. But if intelligence is truly grounded in embodiment, the idea of embodying an AI like the Blue Brain with a virtual body is a golden research strategy.

Starting with an animal avatar rather than a human one may also make a certain amount of sense from the perspective of evolutionary algorithms, learning strategies, and the complexity of interacting with a virtual environment: building up a database of experiences that might be used for later, human-like embodiment.

Professor Markram says, "It [the Blue Brain] starts to learn things and starts to remember things. We can actually see when it retrieves a memory, and where they retrieved it from because we can trace back every activity of every molecule, every cell, every connection and see how the memory was formed."

Markram believes that, “building up from one neocortical column to the entire neocortex, the ethereal ‘emergent properties’ that characterize human thought will, step-by-step, make themselves apparent.”

h+ contributor Ben Goertzel also uses animal avatars in his research – specifically, parrots. His Novamente artificial general intelligence (AGI) engine follows the research strategy of John Laird at the University of Michigan, who initiated a project in 2002 using the Soar AI system to control agents in 3D gaming environments.

Goertzel’s parrots are being designed for introduction into virtual worlds, where they can learn from human users. They are AGI programs in disguise: "… the AI has one or more bodies but may also have other important sources of knowledge (generally, some sources present in it from creation, some interactively presenting it with new information as it learns)." His parrots share a common knowledge base that accumulates the experiences of all parrots as they interact with human-controlled avatars.
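The shared-knowledge-base idea is easy to caricature in a few lines of code: each parrot writes its experiences into one common store, and any parrot can later draw on them. The class and method names below are illustrative guesses, not Novamente’s actual interfaces.

```python
# Sketch of a shared knowledge base: every parrot agent records experiences
# into one store that all parrots can query. Names are hypothetical.

class SharedKnowledgeBase:
    def __init__(self):
        self.experiences = []

    def record(self, agent, event):
        self.experiences.append((agent, event))

    def lookup(self, keyword):
        return [event for agent, event in self.experiences if keyword in event]


class ParrotAgent:
    def __init__(self, name, kb):
        self.name = name
        self.kb = kb  # all parrots share the same knowledge base

    def observe(self, event):
        self.kb.record(self.name, event)

    def recall(self, keyword):
        # A parrot can draw on what *any* parrot has experienced.
        return self.kb.lookup(keyword)


kb = SharedKnowledgeBase()
polly = ParrotAgent("Polly", kb)
pete = ParrotAgent("Pete", kb)

polly.observe("user Alice offered a cracker")
print(pete.recall("cracker"))  # Pete benefits from Polly's experience
```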

SciFi writer Charles Stross posted a recent blog entry on where he gets his ideas. He explores possible plot lines based on the idea that it is possible today to use CGI-animated characters as protagonists. Video motion capture (in which a computer image-recognition system captures and digitizes the body movements of a living model) and re-skinning a CGI-rendered avatar make it possible to map the likeness of one actor onto the motions of another.
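The retargeting step at the heart of that trick is conceptually simple: take the joint rotations captured from the living performer and copy them onto the corresponding joints of the avatar rig wearing someone else’s likeness. The sketch below uses made-up joint names and toy numbers purely to show the mapping.

```python
# Rough sketch of motion retargeting: captured joint rotations from a living
# performer are copied onto an avatar rig. Joint names and data are invented.

# One captured frame: joint name -> rotation in degrees (toy values).
mocap_frame = {
    "hip": (0.0, 5.0, 0.0),
    "left_knee": (12.0, 0.0, 0.0),
    "right_elbow": (0.0, 0.0, 40.0),
}

# The avatar's rig uses slightly different joint names.
JOINT_MAP = {
    "hip": "Pelvis",
    "left_knee": "L_Knee",
    "right_elbow": "R_Elbow",
}


def retarget(frame, joint_map):
    """Copy each captured rotation onto the corresponding avatar joint."""
    return {joint_map[j]: rot for j, rot in frame.items() if j in joint_map}


avatar_pose = retarget(mocap_frame, JOINT_MAP)
print(avatar_pose)
# {'Pelvis': (0.0, 5.0, 0.0), 'L_Knee': (12.0, 0.0, 0.0), 'R_Elbow': (0.0, 0.0, 40.0)}
```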

Yes, you too can be Spencer Tracy or Katharine Hepburn. And when Ronald Reagan enters the scene in 2025, could his 3D avatar actually be… the Blue Brain’s grandson?
