Proactionary Principle

From H+Pedia

The Proactionary Principle is an ethical and decision-making principle formulated by Max More, developed from Extropianism:

People’s freedom to innovate technologically is highly valuable, even critical, to humanity. This implies a range of responsibilities for those considering whether and how to develop, deploy, or restrict new technologies. Assess risks and opportunities using an objective, open, and comprehensive, yet simple decision process based on science rather than collective emotional reactions. Account for the costs of restrictions and lost opportunities as fully as direct effects. Favor measures that are proportionate to the probability and magnitude of impacts, and that have the highest payoff relative to their costs. Give a high priority to people’s freedom to learn, innovate, and advance.[1]


  1. Freedom to innovate: Our freedom to innovate technologically is valuable to humanity. The burden of proof therefore belongs to those who propose restrictive measures. All proposed measures should be closely scrutinized.
  2. Objectivity: Use a decision process that is objective, structured, and explicit. Evaluate risks and generate forecasts according to available science, not emotionally shaped perceptions; use explicit forecasting processes; fully disclose the forecasting procedure; ensure that the information and decision procedures are objective; rigorously structure the inputs to the forecasting procedure; reduce biases by selecting disinterested experts, by using the devil’s advocate procedure with judgmental methods, and by using auditing procedures such as review panels.
  3. Comprehensiveness: Consider all reasonable alternative actions, including no action. Estimate the opportunities lost by abandoning a technology, and take into account the costs and risks of substituting other credible options. When making these estimates, carefully consider not only concentrated and immediate effects, but also widely distributed and follow-on effects.
  4. Openness/Transparency: Take into account the interests of all potentially affected parties, and keep the process open to input from those parties.
  5. Simplicity: Use methods that are no more complex than necessary.
  6. Triage: Give precedence to ameliorating known and proven threats to human health and environmental quality over acting against hypothetical risks.
  7. Symmetrical treatment: Treat technological risks on the same basis as natural risks; avoid underweighting natural risks and overweighting human-technological risks. Fully account for the benefits of technological advances.
  8. Proportionality: Consider restrictive measures only if the potential impact of an activity has both significant probability and severity. In such cases, if the activity also generates benefits, discount the impacts according to the feasibility of adapting to the adverse effects. If measures to limit technological advance do appear justified, ensure that the extent of those measures is proportionate to the extent of the probable effects.
  9. Prioritization: When choosing among measures to ameliorate unwanted side effects, prioritize decision criteria as follows: (a) Give priority to risks to human and other intelligent life over risks to other species; (b) give non-lethal threats to human health priority over threats limited to the environment (within reasonable limits); (c) give priority to immediate threats over distant threats; (d) prefer the measure with the highest expectation value by giving priority to more certain over less certain threats, and to irreversible or persistent impacts over transient impacts.
  10. Renew and Refresh: Create a trigger to prompt decision makers to revisit the decision, far enough in the future that conditions may have changed significantly.
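The ordering of decision criteria in item 9 can be read as a lexicographic sort over threats. The sketch below is purely illustrative: the Proactionary Principle is an ethical guideline, not an algorithm, and the `Threat` fields, the sample data, and the tie-breaking order within criterion (d) are all assumptions made for the example.

```python
# Illustrative sketch only: orders hypothetical threats by the decision
# criteria in item 9 (a)-(d). All field names and sample data are
# invented; the placement of irreversibility before probability within
# criterion (d) is an assumed simplification.
from dataclasses import dataclass

@dataclass
class Threat:
    name: str
    affects_intelligent_life: bool  # (a) intelligent life over other species
    harms_human_health: bool        # (b) human health over environment-only
    immediate: bool                 # (c) immediate over distant
    irreversible: bool              # (d) persistent over transient
    probability: float              # (d) more certain over less certain

def priority_key(t: Threat):
    # Tuple comparison applies the criteria in the order item 9 lists them.
    return (t.affects_intelligent_life, t.harms_human_health,
            t.immediate, t.irreversible, t.probability)

threats = [
    Threat("distant ecosystem shift", False, False, False, True, 0.9),
    Threat("proven pathogen exposure", True, True, True, False, 0.8),
    Threat("speculative long-term risk", True, True, False, True, 0.1),
]

ranked = sorted(threats, key=priority_key, reverse=True)
for t in ranked:
    print(t.name)
```

Under this toy ordering, the proven and immediate threat to human health outranks the speculative long-term risk, which in turn outranks the distant environment-only threat, matching the triage and prioritization principles above.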
