People have to go around measuring things. There's no escape from that
for most of that type of work. There's a deep relationship between the
two. No one's going to come up with a model that works without going
and comparing with experiment. But it is the intelligent use of
experimental measurements that we're after there because that goes to
this concept of Bayesian methods. I will perform the right number of
experiments to make measurements of, say, the time-series evolution of
a given set of proteins. From those data, when things are varying in time,
I can map that onto my deterministic Popperian model and infer the
most likely values of the parameters, the Popperian ones, that fit into
the model. It's an intelligent interaction between the two that's
necessary in many complicated situations.
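The Bayesian workflow Coveney sketches here, measuring a time series and then inferring the most likely parameter values of a deterministic model, can be illustrated with a toy example. Everything in the sketch below is a hypothetical stand-in, not his actual pipeline: a single species decaying exponentially with unknown rate k, simulated noisy measurements, and a simple grid-based posterior over k.

```python
import numpy as np

# Hypothetical deterministic ("Popperian") model: exponential decay.
def model(t, k, x0=1.0):
    return x0 * np.exp(-k * t)

# Simulated noisy time-series "measurements" (true rate k = 0.5).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 25)
sigma = 0.05
data = model(t, 0.5) + rng.normal(0.0, sigma, size=t.size)

# Bayesian inference on a grid: flat prior on k, Gaussian likelihood.
k_grid = np.linspace(0.01, 2.0, 2000)
log_like = np.array(
    [-0.5 * np.sum((data - model(t, k)) ** 2) / sigma**2 for k in k_grid]
)
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum() * (k_grid[1] - k_grid[0])  # normalize density

# Most probable parameter value given the data.
k_map = k_grid[np.argmax(posterior)]
print(f"most probable decay rate k = {k_map:.3f}")
```

The grid search stands in for the sampling methods used in practice; the point is only the direction of inference, from measured data back to the parameters of a deterministic model.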
INTRODUCTION
by John Brockman
There's a massive clash of philosophies at the heart of modern science.
One philosophy, called Baconianism after Sir Francis Bacon, neglects
theoretical underpinning and says: just make observations, collect data,
and interrogate them. This approach is widespread in modern biology and
medicine, where it's often called informatics. But there's a quite different
philosophy, traditionally used in physics, formulated by another British
Knight, Sir Karl Popper. In this approach, we make predictions from
models and we test them, then iterate our theories.
In modern medicine, you might find it strange that many people don't
think in theoretical terms. It's a shock to many physical scientists when
they encounter this attitude, particularly when it is accompanied by a
conflation of correlation with causation. Meanwhile, in physics, it is
extremely hard to go from modeling simple situations consisting of a
handful of particles to the complexity of the real world, and to combine
theories that work at different levels, such as macroscopic theories
(where there is an arrow of time) and microscopic ones (where theories
are indifferent to the direction of time).
At University College London, physical chemist Peter Coveney is using
theory, modeling and supercomputing to predict material properties from
basic chemical information, and to mash up biological knowledge at a
range of levels, from biomolecules to organs, into timely and predictive
clinical information to help doctors. In doing this, he is testing a novel
way to blend the Baconian and Popperian approaches, and he has already
had some success in personalized medicine and in predicting the
properties of next-generation composites.
PETER COVENEY holds a chair in Physical Chemistry, and is director of the
Centre for Computational Science at University College London and co-
If you look at the way we categorize our theories, there are different ways
of analyzing them. Some lie within the domain of physics or even applied
mathematics. We have chemistry, biology, engineering; these usually are
regarded as separate disciplines and historically have comparatively little
to do with one another. When you ask who's doing what in scientific
terms, and whether there is anything like a unified theory of knowledge,
so to speak, it's no surprise that the answer is that knowledge remains
rather fragmented today.
We have people who explore the extremely large scale, which you might
call cosmology, or the very small scales. Again, that's a physical domain:
subatomic theories, going down to extremely short length scales and
timescales.
We can have problems that relate to life, such as where life has come
from on this planet, but we have plenty of reasons to suspect that it's
probably much more widespread than that, and then questions are posed
in rather different ways.
In modern biology and medicine today you would find most people not
even trying to think in theoretical terms. It's quite a shock to many
physical scientists when they encounter this. It's a funny clash between
two philosophies of science that have been around for 500 years
or so. What we call "Baconian theory" says, don't worry about a
theoretical underpinning, just make observations, collect data, and
interrogate the data. This Baconianism, as it's come to be known, is very
widespread in modern biology and medicine. It's sometimes also called
informatics today.
Then there is the other model of the philosophy of science, the physicists'
one, formulated in a nice and concise way by Sir Karl Popper. These are two
curious Knights of the British Realm, in fact, whose descriptions of the
way science works are at complete odds with one another. A Popperian
theory is fundamentally mathematical: you can describe reality in terms
that are somehow out there, objective. We make
predictions from these theories and models and we test them. If the
agreement isn't good enough, it could be that the experimental
observations are wrong. Every now and then we have to change the
theory.
patients and stored in databases and some form of expert system is run
on them. When a new sequence comes in from a virologist, it's matched
up to everything that was done before and someone will infer that the
best treatment now is the same thing as what was done for some group
of people before. This is not a reliable approach to individual medical
treatment.
If you can find a Popperian method, you'd be much better off. What is
that method? That's one of the things I'm interested in, and that is doing
sequence-specific studies of the virus, how it binds to individual drugs.
It's no longer a generic task that a drug company is interested in. The
drug companies have their problems now. They're trying to produce drugs
as blockbusters, one-size-fits-all. This is not going to work in the future
anyway. We have to tailor drugs to individuals. The challenge there is, can
I match drugs to individual sequences? That's quite a demanding thing. It
has quantum mechanics in it, it has classical mechanics, and it connects
up to the way the patient is treated clinically. It too is a multilevel thing.
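As a caricature of the sequence-specific matching described above (not Coveney's actual multiscale method, which couples quantum and classical mechanics), one can imagine scoring each candidate drug against the mutations found in a patient's viral sequence and ranking by predicted binding. The drug names, mutation labels, and penalty values below are all invented for illustration; in practice the scores would come from physics-based binding-affinity calculations.

```python
# Toy illustration: rank candidate drugs for one patient's viral sequence.
# The "resistance mutation" tables are hypothetical placeholders for real,
# physics-based binding predictions.

RESISTANCE = {
    "drugA": {"M46I", "V82A"},  # hypothetical mutations weakening drugA
    "drugB": {"L90M"},          # hypothetical mutation weakening drugB
    "drugC": set(),             # hypothetical drug with no escape mutations
}

def predicted_binding(drug: str, mutations: set[str]) -> float:
    """Higher score = stronger predicted binding; each resistance
    mutation present in the patient's sequence applies a penalty."""
    return 1.0 - 0.4 * len(RESISTANCE[drug] & mutations)

def best_drug(mutations: set[str]) -> str:
    """Pick the candidate with the strongest predicted binding."""
    return max(RESISTANCE, key=lambda d: predicted_binding(d, mutations))

patient_mutations = {"M46I", "L90M"}  # hypothetical sequence variants
print(best_drug(patient_mutations))   # prints "drugC"
```

The one-size-fits-all blockbuster approach corresponds to ignoring `patient_mutations` entirely; the sequence-specific approach makes the ranking a function of each individual's data.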