March 19, 2006

The future of science

Stewart Brand of the Long Now Foundation has organized a series of seminars which he hopes will build a "coherent, compelling body of ideas about long-term thinking, to help nudge civilization toward Long Now's goal of making long-term thinking automatic and common instead of difficult and rare."

One such seminar, by Kevin Kelly, has recently been published. In his talk, titled "Speculations on the Future of Science," Kelly tries to predict how science and the scientific method will change over the next 50 years.

As a starting point, he looks at how recursion is at the heart of science; Kelly compiled a list of new recursive devices in the history of science:

2000 BC — First text indexes
200 BC — Cataloged library (at Alexandria)
1000 AD — Collaborative encyclopedia
1590 — Controlled experiment (Francis Bacon)
1600 — Laboratory
1609 — Telescopes and microscopes
1650 — Society of experts
1665 — Repeatability (Robert Boyle)
1665 — Scholarly journals
1675 — Peer review
1687 — Hypothesis/prediction (Isaac Newton)
1920 — Falsifiability (Karl Popper)
1926 — Randomized design (Ronald Fisher)
1937 — Controlled placebo
1946 — Computer simulation
1950 — Double blind experiment
1962 — Study of scientific method (Thomas Kuhn)

Looking to the future, Kelly makes some predictions (details can be found in the article):

1) There will be more change in the next 50 years of science than in the last 400 years
2) This will be a century of biology
3) Computers will keep leading to new ways of science
4) New ways of knowing will emerge
5) Science will create new levels of meaning

More specifically, Kelly comes up with possible breakthroughs in how science is done:

Compiled Negative Results - Negative results are saved, shared, compiled and analyzed, instead of being dumped.

Triple Blind Experiments - In a double blind experiment, neither researcher nor subject is aware of the controls, but both are aware of the experiment. In a triple blind experiment, all participants are blind to the controls and to the very fact of the experiment itself.

Combinatorial Sweep Exploration - Much of the unknown can be explored by systematically creating random varieties of it at a large scale.
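A combinatorial sweep can be pictured as exhaustively generating and scoring every combination of a few parameters rather than testing one guess at a time. The sketch below is a toy illustration; the parameter names and the scoring function are hypothetical stand-ins, not anything from Kelly's talk.

```python
import itertools

# Hypothetical experimental parameters to sweep over.
temperatures = [20, 40, 60]
concentrations = [0.1, 0.5, 1.0]
catalysts = ["A", "B"]

def score(temp, conc, catalyst):
    # Placeholder objective standing in for a real measurement.
    return temp * conc * (2 if catalyst == "B" else 1)

# Systematically generate every combination and score it.
sweep = [(combo, score(*combo))
         for combo in itertools.product(temperatures, concentrations, catalysts)]
best_combo, best_score = max(sweep, key=lambda pair: pair[1])
```

At scale, the same pattern runs over millions of automatically synthesized variants instead of eighteen hand-picked ones.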

Evolutionary Search - If new libraries of variations can be derived from the best of a previous generation of good results, it is possible to evolve solutions.
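The "libraries of variations derived from the best of a previous generation" idea is essentially a simple evolutionary algorithm. Here is a minimal sketch on a toy problem (evolving a bit string toward all ones); the fitness function, mutation rate, and population sizes are illustrative assumptions.

```python
import random

random.seed(0)  # seeded only so the sketch is reproducible

def fitness(candidate):
    return sum(candidate)  # count of 1-bits: higher is better

def mutate(candidate, rate=0.05):
    # Flip each bit with small probability to derive a new variation.
    return [bit ^ (random.random() < rate) for bit in candidate]

def evolve(length=20, pop_size=30, generations=40):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the best third, then derive the next library of
        # variations from those good results.
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 3]
        population = [mutate(random.choice(parents)) for _ in range(pop_size)]
    return max(population, key=fitness)

best = evolve()
```

The loop is the whole point: selection plus variation, repeated, evolves solutions without anyone designing them directly.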

Multiple Hypothesis Matrix - Instead of proposing a series of single hypotheses, in which each hypothesis is falsified and discarded until one theory finally passes and is verified, a matrix of many hypothesis scenarios is proposed and managed simultaneously.

Pattern Augmentation - Pattern-seeking software which recognizes a pattern in noisy results.

Adaptive Real Time Experiments - Results evaluated, and large-scale experiments modified in real time.
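One concrete form of an adaptive experiment is a multi-armed bandit, where the allocation of subjects shifts toward the better-performing treatment while the experiment is still running. The sketch below uses an epsilon-greedy rule with two hypothetical treatment arms; the success rates and parameters are made-up assumptions, not from Kelly's talk.

```python
import random

random.seed(1)  # seeded only so the sketch is reproducible

def run_adaptive_trial(true_rates, rounds=1000, epsilon=0.1):
    counts = [0] * len(true_rates)      # how often each arm was assigned
    successes = [0] * len(true_rates)   # observed successes per arm
    for _ in range(rounds):
        if random.random() < epsilon:
            # Keep exploring: occasionally assign a random arm.
            arm = random.randrange(len(true_rates))
        else:
            # Exploit: assign the arm with the best observed rate so far
            # (untried arms are treated optimistically so each gets sampled).
            arm = max(range(len(true_rates)),
                      key=lambda i: successes[i] / counts[i] if counts[i] else 1.0)
        counts[arm] += 1
        successes[arm] += random.random() < true_rates[arm]
    return counts

counts = run_adaptive_trial([0.3, 0.6])
```

After enough rounds, most assignments flow to the stronger arm: the experiment has modified itself in real time based on its own interim results.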

AI Proofs - Artificial intelligence will derive and check the logic of an experiment.

Wiki-Science - The average number of authors per paper continues to rise.

Defined Benefit Funding - The use of prize money for particular scientific achievements will play greater roles.

Zillionics - Ubiquitous always-on sensors in bodies and environment will transform medical, environmental, and space sciences.

Deep Simulations - As our knowledge of complex systems advances, we can construct more complex simulations of them.

Hyper-analysis Mapping - Just as meta-analysis gathered diverse experiments on one subject and integrated their (sometimes contradictory) results into a large meta-view, hyper-analysis creates an extremely large-scale view by pulling together meta-analyses.

Return of the Subjective - Existence seems to be a paradox of self-causality, and any science exploring the origins of existence will eventually have to embrace the subjective, without becoming irrational.
