User:Aryeh M. Friedman/State of Fear Science

From Wikipedia, the free encyclopedia

This Wikipedia article is meant to present as fair and balanced a look as possible at the science, and the political debate, that Michael Crichton's novel State of Fear has created. It is not a place to debate the actual merits of either side of the debate (please do that on the discussion page).


Accuracy and Validity of Citations

Are the citations valid scientific works?

Does Crichton misuse any of the work he cites?

Does scientific consensus exist on the issue of climate change?

Have politicians and other non-scientists misused Crichton's work (or the work he cites)?

Global Warming (or Lack thereof)

Raw Data Reliability

Accuracy of Collection Methods

Can we properly correct for known inaccuracies in the raw data?

What does the raw data say about actual changes (if any) in the climate?

How well do we understand the climate?

Do we have enough historically accurate raw data to say if there is climate change in the long term or not?

Do we understand the primary forces that drive the climate?

Do we have sufficient raw data to support or not support the hypothesis that human activity is driving measurable climate change?

Can we predict future climate change (or lack thereof) based on our current knowledge of the raw data and the forces driving the climate?

Computer Modeling

Can we quantify our knowledge of the climate sufficiently to create realistic mathematical models?

Note: This section assumes that there are no limits on the computational capabilities of computers and no other "known" limitations on the ability to model complex systems.

Theory of Computation Limitations
Basic Theory of Computational Terminology

Decidability: Can a given mathematical property be proved or disproved? If it cannot, and we can prove that proving or disproving it is impossible, the problem is "undecidable".

Deterministic vs. nondeterministic processes: A deterministic process does not require the "machine" to try every possible outcome before selecting the correct solution; at each step, only one move is possible. A nondeterministic process may require the machine to explore two or more candidate solutions before settling on a "correct" one (if a correct one exists).

Universal Turing Machine: The simplest possible machine that can do every task a general-purpose computer can do (efficiency is not a concern here).

Space/Time Complexity: How much memory and/or time a decidable process takes to complete. A problem is "deterministically polynomial" [P] if it completes in time bounded by some polynomial in the input size, and "nondeterministically polynomial" [NP] if a machine that could explore all candidate solutions in parallel (with unlimited memory and processors) would complete it in polynomial time; since such a machine is not physically realizable, the only known exact algorithms for many such problems take exponential time and/or space on a real machine with finite processors and memory.
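
The gap between the two cost classes above can be made concrete with a small sketch (illustrative only, not from Crichton's text): a "P-like" polynomial cost curve versus an exponential cost curve of the kind associated with brute-force solutions to NP problems.

```python
# Illustrative sketch: how polynomial and exponential cost curves
# diverge as the input size n grows. A computation whose cost grows
# like 2**n becomes intractable long before one that grows like n**2.
def poly_cost(n):
    return n ** 2          # a "P-like" (polynomial) cost curve

def exp_cost(n):
    return 2 ** n          # an exponential (brute-force) cost curve

for n in (10, 20, 30, 40):
    print(n, poly_cost(n), exp_cost(n))
```

At n = 40 the polynomial curve is still a four-digit number while the exponential curve has passed one trillion, which is why the distinction matters for large simulations.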

Barber Paradox: A self-referential paradox (a popular form of Russell's paradox) of the kind that underlies many undecidability arguments. In simplest terms: suppose a town where everyone must shave every day, and everyone who does not shave himself is shaved by the town's only barber. Who shaves the barber? If the barber shaves himself, he violates the rule that he shaves only those who do not shave themselves; if he does not shave himself, he violates the rule that everyone must shave every day. No matter how you answer, you create an unsolvable paradox.
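
The two-case contradiction can be checked mechanically. The following minimal sketch (my own illustration, not from the article) encodes the town's rule as applied to the barber and verifies that neither truth assignment is consistent:

```python
# The barber paradox as a two-case check: neither assignment of
# "the barber shaves himself" is consistent with the town's rule.
def consistent(barber_shaves_self: bool) -> bool:
    # Rule: the barber shaves exactly those who do not shave themselves.
    # Applied to the barber himself, shaves_self must equal its own negation.
    return barber_shaves_self == (not barber_shaves_self)

# Both cases fail, so no consistent answer exists.
assert not consistent(True)
assert not consistent(False)
```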

Rice's Theorem: A theorem stating that every non-trivial semantic property of programs is undecidable.

NP-Completeness: A decidable problem that is in NP and to which every other NP problem can be reduced; no polynomial-time algorithm is known for any NP-complete problem.
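
As a concrete example (my own illustration, not from the article), Subset Sum is a classic NP-complete problem: given a set of numbers, is there a subset summing to a target? The only known exact methods examine on the order of 2**n candidate subsets, as this brute-force sketch does:

```python
from itertools import combinations

# Subset Sum, an NP-complete problem: the brute-force exact solver
# below enumerates every subset (2**n candidates in the worst case).
def subset_sum(nums, target):
    for k in range(len(nums) + 1):
        for combo in combinations(nums, k):
            if sum(combo) == target:
                return list(combo)   # first subset hitting the target
    return None                      # no subset sums to the target

print(subset_sum([3, 9, 8, 4, 5, 7], 15))
```

Verifying a proposed subset takes polynomial time, which is what places the problem in NP; it is finding one that (as far as anyone knows) requires exponential search.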

Does the Theory of Computation limit how accurate a climate model can be?

Note: This section assumes that the model itself is completely correct; we ask only whether running it on a computational machine will distort its output because of the limits imposed by the Theory of Computation.

Does Gödel's Incompleteness Theorem limit the accuracy of our models?

Dynamic Complex Systems

Input sensitivity
Computers cannot exactly represent infinitely long/irrational numbers
Do input sensitivity and the need to approximate infinitely long/irrational numbers have a measurable impact on our climate models?

Note: Do we even need to use such numbers in our climate models?
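The combination of input sensitivity and rounded-off inputs is easy to demonstrate (a minimal sketch of my own, not a climate model): iterating the logistic map, a standard chaotic system, from two starting values that differ by 10⁻¹², a stand-in for the truncation a computer applies to an infinitely long decimal.

```python
# Sensitive dependence on initial conditions: count how many
# iterations of the chaotic logistic map x -> r*x*(1-x) it takes
# for a 1e-12 input difference (a stand-in for floating-point
# truncation) to grow into a macroscopic 1e-3 difference.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def steps_to_diverge(x0, eps=1e-12, tol=1e-3, max_steps=200):
    a, b = x0, x0 + eps
    for n in range(1, max_steps + 1):
        a, b = logistic(a), logistic(b)
        if abs(a - b) > tol:
            return n        # iterations until visible divergence
    return None             # orbits stayed close (does not occur here)

print(steps_to_diverge(0.3))
```

The two orbits diverge visibly within a few dozen iterations, even though the model itself is exact; only the input was truncated.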

Is it possible for a "perfect" computer climate model to accurately predict the past climate?

Predictive ability of computer models

How accurate do we need to be?

Summing up

Can we manage dynamic complex systems with incomplete knowledge of their workings and inputs?

What role (if any) does national/international policy making and the politics thereof have on our ability to manage such systems?

Supplemental Examples

Is there undue influence on science from policy makers, the general public, and/or the media, and does incomplete or bad science have undue influence on policy making?

Yellowstone

Eugenics

Cases of complex-system failure due to incomplete knowledge of the system (not in Crichton's work)

Northeastern US 1977 and 2003 blackouts
California water policy
NYNEX System crash of 1998

See Also

External References
