H+ Magazine
Covering technological, scientific, and cultural trends that are changing, and will change, human beings in fundamental ways.

Editor's Blog

Surf-D
April 29, 2011

Ray Kurzweil has commented that we'll be spending quite a bit of our time in virtual reality environments in the coming decades. They'll be full immersion, and, as in Second Life, you can be someone else if you choose. Or several someones.

“Re-creating Yourself” is but one of the chapters in the new book, Infinite Reality: Avatars, Eternal Life, New Worlds, and the Dawn of the Virtual Revolution, by psychologists Jim Blascovich (UCSB) and Jeremy Bailenson (Stanford). In this chapter, they discuss a series of studies on virtual doppelgangers, exploring the notion of avatar clones of you that can behave independently of your own intentions and actions in virtual space.

Seth Arthur Weisberg
April 28, 2011

Human language did not evolve to enable clear discussion of the subtle nature of physical or experiential reality, let alone of the new realities that advanced technology seems likely to give rise to. Seemingly basic terms like “existence” are OK for everyday discourse about ordinary life, but when one considers them in the context of potential phenomena like mind uploading and simulated universes, they begin to seem unacceptably fuzzy. This article describes part of the author’s quest to craft a clear conceptual vocabulary for discussing the nature of reality, now and after the next decades and centuries of technological advancement. While futurist in thrust, it also touches on some age-old philosophical issues.

Ben Goertzel and Paul Werbos
April 27, 2011

The idea that advanced technology may pose a risk to the very survival of the human race (an “existential risk”) is hardly a new one – but it’s a difficult topic to think about, both because of its emotional impact, and because of its wildly cross-disciplinary nature. It’s a subject demanding thinkers who are ruthless in rationality, polymathic in scope, and sensitive to the multitudinous dimensions of human nature. Dr. Paul Werbos is one such thinker, and this interview probes some of his views on the greatest risks posed by technology to humanity in the foreseeable future.

Natasha Vita-More
April 26, 2011

Theorist and telematic artist Roy Ascott has long established that we have entered the moist zone. It is possible that we are now awaiting the zone of life prolongation. Life prolongation is one anticipated result of human enhancement's continuous modification of human physiology, diversification of our species, and advancement toward substrate-independent minds. These anticipated results rely upon developments in bio-nanotechnology and artificial [general] intelligence. Unfortunately, the use of these media has unspecified and potentially risky outcomes. Nevertheless, unknown outcomes have not restricted or discouraged artists from engaging with human-computer integration and human-synthetic environments. This paper explores human enhancement media, including tools suggested by a technological Singularity, as a penultimate stage of life prolongation.

Tom McCabe and Humanity+
April 25, 2011

This is the third in a series of H+ Magazine articles on local transhumanist clubs and organizations. For more about why local communities are important and how they can improve your life, see the first article, Building and Growing Transhumanist Communities.

In addition to informing people about ideas associated with your group’s themes, speeches and similar events are a group’s primary means of outreach, attracting people from outside the group to join. You should try to host at least one significant event every six months (two or more per year if at all possible).

Hugo de Garis
April 21, 2011

The issue of species dominance is about whether humanity should build godlike, massively intelligent machines this century, with mental capacities trillions of trillions of times above human level. In certain circles, this is widely thought to be the single most important issue of the 21st century, due to its profound consequences for humanity’s survival once these “artilects” (artificial intellects) come into being.

Ben Goertzel and Michael Anissimov
April 20, 2011

“Existential risk” refers to the risk that the human race as a whole might be annihilated. In other words: human extinction risk, or species-level genocide. This is an important concept because, as terrible as it would be if 90% of the human race were annihilated, wiping out 100% is a whole different matter.

Existential risk is not a fully well-defined notion, because as transhumanist technologies advance, the border between human and nonhuman becomes increasingly difficult to distinguish. If humans somehow voluntarily “transcend” their humanity and become superhuman, this seems a different sort of scenario than everyone being nuked to death. However, philosophical concerns aside, there are sufficiently many clear potential avenues to human extinction to make the “existential risk” concept valuable, including nanotech arms races, risks associated with unethical superhuman AIs, and more mundane risks involving biological or nuclear warfare. While one doesn’t wish to approach the future with an attitude of fearfulness, it’s also important to keep our eyes open to the very real dangers that loom.

Eray Özkural
April 19, 2011

The nature of experience is one of those deep philosophical questions on which philosophers and scientists alike have been unable to reach a consensus. In this article, I review a transhumanist variant of a basic question of subjectivity.

In his classic article "What Is It Like to Be a Bat?," Thomas Nagel investigates whether we can give a satisfactory answer to the question in his title. Due to what he considers fundamental barriers, Nagel concludes that it is not something we humans can know.

Humanity+ Student Network
April 18, 2011

This is the second in a series of H+ Magazine articles on local transhumanist clubs and organizations. For more about why local communities are important and how they can improve your life, see the first article, Building and Growing Transhumanist Communities.

Hugo de Garis
April 15, 2011

I’m known for predicting that later this century, there will be a terrible war over the issue of species dominance. More specifically, it will be fought over whether humans should build artilects (artificial intellects), which could become so vastly superior to human beings in intellectual capacity that they may end up treating us as grossly inferior pests, wiping us out. I anticipate billions of casualties resulting from the conflict over the artilect question.

Join the h+ Community