Proactionary Principle

From H+Pedia

The Proactionary Principle is an ethical and decision-making principle formulated by Max More after the 2004 Extropy Institute Vital Progress Summit, which focused on developing an alternative to the Precautionary Principle invoked by bioethicists against transhumanism.[1]


According to its originator, Max More:

The Proactionary Principle emerged out of a critical discussion of the precautionary principle during Extropy Institute’s Vital Progress Summit in 2004. We saw that the precautionary principle is riddled with fatal weaknesses. Not least among these is its strong bias toward the status quo and against the technological progress so vital to the continued survival and well-being of humanity. Participants in the VP Summit understood that we need to develop and deploy new technologies to feed billions more people over the coming decades, to counter natural threats—from pathogens to environmental changes, and to alleviate human suffering from disease, damage, and the ravages of aging. We recognized the need to formulate an alternative, more sophisticated principle incorporating more extensive and accurate assessment of options while protecting our fundamental responsibility and liberty to experiment and innovate. With input from some of those at the Summit, I developed the Proactionary Principle to embody the wisdom of structure. The Principle urges all parties to actively take into account all the consequences of an activity—good as well as bad—while apportioning precautionary measures to the real threats we face. And to do all this while appreciating the crucial role played by technological innovation and humanity’s evolving ability to adapt to and remedy any undesirable side-effects.[2]


People’s freedom to innovate technologically is highly valuable, even critical, to humanity. This implies a range of responsibilities for those considering whether and how to develop, deploy, or restrict new technologies. Assess risks and opportunities using an objective, open, and comprehensive, yet simple decision process based on science rather than collective emotional reactions. Account for the costs of restrictions and lost opportunities as fully as direct effects. Favor measures that are proportionate to the probability and magnitude of impacts, and that have the highest payoff relative to their costs. Give a high priority to people’s freedom to learn, innovate, and advance.[3]


  1. Freedom to innovate: Our freedom to innovate technologically is valuable to humanity. The burden of proof therefore belongs to those who propose restrictive measures. All proposed measures should be closely scrutinized.
  2. Objectivity: Use a decision process that is objective, structured, and explicit. Evaluate risks and generate forecasts according to available science, not emotionally shaped perceptions; use explicit forecasting processes; fully disclose the forecasting procedure; ensure that the information and decision procedures are objective; rigorously structure the inputs to the forecasting procedure; reduce biases by selecting disinterested experts, by using the devil’s advocate procedure with judgmental methods, and by using auditing procedures such as review panels.
  3. Comprehensiveness: Consider all reasonable alternative actions, including no action. Estimate the opportunities lost by abandoning a technology, and take into account the costs and risks of substituting other credible options. When making these estimates, carefully consider not only concentrated and immediate effects, but also widely distributed and follow-on effects.
  4. Openness/Transparency: Take into account the interests of all potentially affected parties, and keep the process open to input from those parties.
  5. Simplicity: Use methods that are no more complex than necessary.
  6. Triage: Give precedence to ameliorating known and proven threats to human health and environmental quality over acting against hypothetical risks.
  7. Symmetrical treatment: Treat technological risks on the same basis as natural risks; avoid underweighting natural risks and overweighting human-technological risks. Fully account for the benefits of technological advances.
  8. Proportionality: Consider restrictive measures only if the potential impact of an activity has both significant probability and severity. In such cases, if the activity also generates benefits, discount the impacts according to the feasibility of adapting to the adverse effects. If measures to limit technological advance do appear justified, ensure that the extent of those measures is proportionate to the extent of the probable effects.
  9. Prioritization: When choosing among measures to ameliorate unwanted side effects, prioritize decision criteria as follows: (a) Give priority to risks to human and other intelligent life over risks to other species; (b) give non-lethal threats to human health priority over threats limited to the environment (within reasonable limits); (c) give priority to immediate threats over distant threats; (d) prefer the measure with the highest expectation value by giving priority to more certain over less certain threats, and to irreversible or persistent impacts over transient impacts.
  10. Renew and Refresh: Create a trigger to prompt decision makers to revisit the decision, far enough in the future that conditions may have changed significantly.
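Points 8 and 9(d) above both rest on an expected-value style of reasoning: a threat's weight is its probability times its severity, discounted by how feasibly its adverse effects can be adapted to. A minimal sketch of that scoring logic might look like the following; the threat names, numbers, and the `adaptability` discount are illustrative assumptions, not part of More's formulation.

```python
# Hypothetical sketch of the proportionality (point 8) and expectation-value
# (point 9d) criteria: score each candidate threat by probability x severity,
# discounted by the feasibility of adapting to its adverse effects, then rank.
# All names and figures below are illustrative, not drawn from More's text.

def expected_impact(probability, severity, adaptability=0.0):
    """Expected impact of a threat, discounted by adaptability in [0, 1]."""
    return probability * severity * (1.0 - adaptability)

threats = [
    # (name, probability, severity on an arbitrary 0-10 scale, adaptability)
    ("proven pathogen outbreak", 0.30, 8.0, 0.2),
    ("hypothetical nano-risk",   0.01, 9.0, 0.5),
    ("known environmental harm", 0.60, 4.0, 0.3),
]

# Rank threats from highest to lowest expected impact.
ranked = sorted(
    threats,
    key=lambda t: expected_impact(t[1], t[2], t[3]),
    reverse=True,
)

for name, p, s, a in ranked:
    print(f"{name}: expected impact {expected_impact(p, s, a):.2f}")
```

Note how this toy ranking also reflects the triage criterion (point 6): the proven, probable threats outrank the hypothetical one despite its higher nominal severity.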
