Collin Heckman
CAS 138
13 April 2018
1. Introduction
Researchers must maintain at least a small foothold in the broader scientific community to prevent their contributions, and thus themselves, from fading into the infinite masses of unread and unappreciated work. In the
unending fight for relevance, academia has created a threat to its very foundation. The
commonplace name for this complex issue, which itself spans many topics, is “Publish or
Perish.” This broad phrase is often used in reference to the constant need to push out new
publications in order to maintain one's standing at a university or research institution.
This specific use of the phrase accompanies a share of challenges that the scientific
community will one day have to face, but the largest threat the “Publish or Perish”
culture poses is that it undermines the very pillars of the scientific method. Dr. J.
Matthias Starck labels these pillars reproducibility, transparency, and honesty.
The three pillars of the scientific method form the basis on which modern
research should be completed. Unfortunately, the pressure to succeed can lead to ignoring
one or more of these pillars. In order to prevent the loss of integrity that such oversights
lead to, a solution to the symptoms of “Publish or Perish” must be found. The basis of
one such solution is the establishment of a committee with the sole purpose of reviewing and
supporting scientific publications with a standardized approach that holds the three pillars
as unwavering guideposts for scientific virtue. Such an organization would need to be
huge in scale and funding if it were to serve the entire scientific community, so instead
each field of science would independently host a committee made up of its own members
inside of their already accepted organizations. For example, within the astronomy and
astrophysics community, the committee could operate as part of the American
Astronomical Society. The specific responsibilities and roles of the committee will be
focused on addressing each of the three pillars and the situations that could arise in
relation to each pillar. Critics of such an organization will immediately point to the
immense cost such a committee would represent. They would be correct in pointing out
that this process would not be without monetary cost, but the greater benefit to the
scientific community far outweighs the cost when considering the value of reputation in
fields where both the public and the scientific community often accept results based
purely on the interpretations of the researchers. Others would argue that such an
institution would slow the progress of science by adding yet another step in the
distribution of scientific information. Again, their concerns are valid, but a committee
dedicated to upholding scientific standards ensures that published information was
gathered and analyzed using accepted methodology and best practices. Overall, such a
committee would strengthen, rather than hinder, the scientific enterprise.
2. Reproducibility
Reproducibility is the first pillar of the scientific method. Dr. Starck defines
reproducibility as meaning a study can be replicated based solely on the documentation
and notes of the original study (4). Despite being a core tenet of science, reproducing a
previous experiment in order to confirm findings is given low priority when applying to
be published (Casadevall & Fang). Not being able to publish disincentivizes confirmation
studies, yet as such studies struggle to find support, their value only becomes more
apparent. An analysis by a major pharmaceutical company found that, out of a sample of
cancer biology papers, only 11% had findings that could be reproduced. A similar
analysis of cancer biology papers found that only 6% of findings were validated. At this
point it is
important to note that not all experiments are repeatable. Some situations such as the
observations made of comet Shoemaker-Levy 9's collision with Jupiter are unique and
were carried out in an uncontrolled natural environment (Casadevall & Fang). Other
experiments, such as those carried out on live animals, cannot be repeated frequently due
to ethical concerns. The third reason reproduction of an experiment may not be possible
is cost. Experiments that involve costly supplies or very specialized equipment may not
be feasible to reproduce after the initial experiment. These exceptions are exactly that:
exceptions. When possible, even these studies should be documented so that they remain
reproducible, ensuring quality in both the analysis and the experimentation.
Reproduction of an experiment also helps isolate any errors in the original work.
Researchers can protect the integrity of the community simply by holding each other to
high standards. This means demanding that
researchers ensure their work is reproducible when applicable and providing proper
funding to those who set out to validate the findings of other experiments. Both aspects of
this standard would fall to the committee, which would ensure that methods are
documented in a clear and precise way. The committee would also have the ability to allocate funding to
researchers who wish to focus on validation where a traditional source of funding may
default to the preferences of scientific journals and shy away from repeated experiments.
Some would argue that the extra steps and the increased number of repeated experiments
would decrease the amount of new science being done and published; however, these two
goals are not mutually exclusive: validated results are ultimately more valuable than a
greater volume of unverified ones.
3. Transparency
The second pillar of the scientific method is transparency. Transparency goes hand in hand with the previous pillar of reproducibility
in that it refers to the accurate recording of all materials and methods used (Starck 4).
Besides the importance of transparency for repeating the experiment at a later time,
similar research is often being done by several teams at once. If the researchers are not
completely clear, any communication between the teams can lead to confusing and often
contradictory results, as happened to one pair of research teams (Henderson). One team at
the University of California, Berkeley and another at Harvard were producing profiles of
human breast cells using the same method, but their results were not matching up. The
two teams could not isolate a reason for the difference until they met in person and
carried out the experiment next to each other.
Only then did they realize a slight difference in the way they agitated the sample. This
small difference should have been caught through an analysis of the two labs’ methods.
Transparency is also frequently lacking in the reporting of statistics. In a study of
250,000 statistical analyses from papers published between 1985 and 2013 in eight
psychology journals, 50% of the papers had incorrect statistics (Starck 12). It is
reasonable to argue that psychology is a very different science than the more concrete
sciences due to the involvement of the human mind, but if improper statistics are being
used in psychology, it can be safely assumed that some level of error exists in other
fields as well.
The solution a committee offers to both the methodology and the statistics
challenges is one of peer review. Peer review already exists as a part of the publishing
process, but there are several barriers that can inhibit its effectiveness. A reviewing board
has the ability to decide whether a method is clearly described and gives enough information to
reproduce the experiment, whereas traditional peer review is primarily focused on the validity of
the science being done based on the experience of a single person. Another challenge
falls in the ethics of peer review. In some fields, such as nursing, it is standard procedure
to have the review be double-blind (neither reviewer nor author knows who the other is),
but it is possible for the reviewer to recognize the work of the author (Schreiber). If that
occurs, the reviewer is obligated to recuse himself or herself from the process, an
obligation enforced only by the reviewer's personal values. Although a peer review process should still occur, a committee
offers a group analysis that prevents any preferential treatment from becoming too
apparent especially when combined with a double-blind process. In order to handle the
issue of statistical error, the committee would take the same approach as the
Board of Reviewing Editors at Science and ensure that members of the reviewing team
included experts in statistics (McNutt). The addition of a formal review committee to the
peer review process fills in the gaps that peer reviewing can often leave.
4. Honesty
The third and final pillar of the scientific method is honesty. Honesty is a crucial
aspect of science in that it applies to every stage of the process. Honesty is vital not only
in collecting data but also in sharing the potential results of that data with the
public. The accurate sharing of data is an integral part of how science moves forward.
However, the lack of any consistent accountability measures has led to a rash of integrity
violations. Social scientist Daniele Fanelli found that 2% of British researchers admitted
to falsifying or modifying data at least once (“Publish or Perish”). Another survey found
that 50% of US scientists claim to know of scientific misconduct (Starck 12). This finding
implies an underlying issue with the current system. Finding faked data is difficult
enough, but because there is no separate body with the sole goal of analyzing research in
order to ensure that it is thoroughly vetted and if necessary reproduced, the chances of
finding these violations of scientific integrity diminish to near zero. This lack of trust
threatens the credibility of science as a whole.
There is already widespread support that all data collected should be shared with
the scientific community (Starck 16). That itself is not an issue, but when that data or a
side comment made by a scientist catches the eye of the broader public, there can be drastic
effects. Public support is an important aspect of science as much of the funding for
research is tied to public opinion. There is, however, an ethical dimension to getting the
public excited for new science or technology that is not yet ready (Master & Resnik 322).
One such occurrence is with biotechnology. Biotechnology is a topic that often inspires a
lot of excitement, but like most science it moves at a relatively slow pace when being
thoroughly tested. The lack of results can disappoint the public and lead to a sense of
betrayal that hurts science as a whole. This interaction with the public is one reason
having central committees that validate and review publications is so important. When a
new piece of research surfaces and the public gets a hold of the information, it is the
responsibility of the committee to either endorse the work or make it clear that the
findings are not yet validated. This role is especially important in topics such as climate
science and genetically modified food, where people
are already more skeptical of real science (Funk). The committee offers not only a face to
a field of science, but also a check on the scientists who put out information and
publications.
5. Conclusion
The pillars of the scientific method are what build trust and consistency into the
practice of research. A committee of experts is a way to protect and reinforce these
pillars. It is insurance against the
constant push to publish that the content being produced by the scientific community will
be accurate and representative of the work done. Such a committee would be the face of
each scientific branch. Yes, this endeavor may slow the output of new published works,
and it will certainly divert funds from research. However, the work being done will be of
top quality. There will be significantly less risk that years of work will need to be redone
because they were built upon incorrect assumptions based on inaccurate studies.
With reproducibility, transparency, and honesty safeguarded by scientists for scientists,
the entire community can move forward and focus on producing the best science possible.
Works Cited
Casadevall, Arturo, and Ferric C. Fang. "Reproducible Science." Infection and Immunity
78.12 (2010): 4972-4975.
Funk, Cary. "Real Numbers: Mixed Messages about Public Trust in Science." Issues in
Science and Technology 34.1 (2017).
Henderson, Dakin. "Why Should Scientific Results Be Reproducible?" PBS (2017).
<www.pbs.org/wgbh/nova/next/body/reproducibility-explainer/>.
Master, Zubin, and David B. Resnik. "Hype and Public Trust in Science." Science and
Engineering Ethics 19.2 (2013): 321-335.
McNutt, Marcia. "Reproducibility." Science 343.6168 (2014): 229.
"Publish or Perish." Nature 521.7552 (2015): 259.
Schreiber, Mary L. "Peer Review." Medsurg Nursing 26.2 (2018): 146-147.
Starck, J. Matthias. Scientific Peer Review: Guidelines for Informative Peer Review.
Wiesbaden: Springer Fachmedien Wiesbaden, 2017.