From H+Pedia
Revision as of 10:48, 12 July 2015 by Davidwwood (talk | contribs)

Definitions of transhumanism

"Transhumanism is a way of thinking about the future that is based on the premise that the human species in its current form does not represent the end of our development but rather a comparatively early phase" - Transhumanist FAQ

"Transhumanism is a class of philosophies of life that seek the continuation and acceleration of the evolution of intelligent life beyond its currently human form and human limitations by means of science and technology, guided by life-promoting principles and values" - Max More, 1990

"Transhumanism is the philosophy that we can and should develop to higher levels, both physically, mentally and socially using rational methods" - Anders Sandberg

Criticisms of transhumanism

Disbelief in a particular technology or timescale

Some critics claim that a particular technology (e.g. mind-uploading) is very unlikely to be achieved within a given timescale (e.g. by 2043), and that transhumanism is therefore discredited.

Note, however, that

  • None of the above definitions take for granted any particular attitude towards any particular technology, e.g. mind-uploading, cryonics, or artificial general intelligence.
  • None of these definitions presuppose any given timescale for future developments to take place.

Any such criticisms, therefore, are criticisms of the particular views of particular transhumanists, rather than criticisms of transhumanism itself.

The view that humanity has already reached a final, desirable state

Some adherents of religious viewpoints assert that humanity has been created in what is already a perfect state, and cannot be improved by the means proposed by transhumanism (including science and technology).

Other critics assert that humanity is currently too profligate in its use of planetary resources, and needs to return to a simpler lifestyle that consumes fewer of them.

(The remainder of this article is in a draft status)

The view that science and technology will cause more harm than good

(Existential risks)

The view that there are more urgent priorities for human attention

(Poverty, hygiene, malaria, inequality...)

The dislike of any "ism", and a preference for action

(Talk of changing human nature is unnecessarily scary)