
research in medical education

Quality of reporting of experimental studies in medical education: a systematic review

David A Cook,1 Thomas J Beckman1 & Georges Bordage2

1 Division of General Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota, USA
2 Department of Medical Education, University of Illinois at Chicago, Chicago, Illinois, USA

Correspondence: David A Cook MD, MHPE, Division of General Internal Medicine, Mayo Clinic College of Medicine, Baldwin 4-A, 200 First Street SW, Rochester, Minnesota 55905, USA. Tel: 00 1 507 266 4156; Fax: 00 1 507 284 5370; E-mail: cook.david33@mayo.edu

Medical Education 2007: 41: 737–745
doi:10.1111/j.1365-2923.2007.02777.x
© Blackwell Publishing Ltd 2007

OBJECTIVE Determine the prevalence of essential elements of reporting in experimental studies in medical education.

DESIGN Systematic review.

DATA SOURCES Articles published in 2003 and 2004 in Academic Medicine, Advances in Health Sciences Education, American Journal of Surgery, Journal of General Internal Medicine, Medical Education, and Teaching and Learning in Medicine.

REVIEW METHODS Articles describing education experiments, including evaluation studies with experimental designs, were identified (n = 185) by reviewing titles and abstracts. A random sample (n = 110) was selected for full review. The full text of each article was evaluated for the presence of guideline-based features of quality reporting: a critical literature review, conceptual framework, statement of study intent (e.g. aim, research question or hypothesis), statement of study design, definition of main intervention and comparison intervention or control group, and consideration of human subject rights.

RESULTS Of the 105 articles suitable for review, 47 (45%) contained a critical literature review and 58 (55%) presented a conceptual framework. A statement of study intent was present in 80 articles (76%), among which the independent and dependent variables were operationally defined in 38 (47%) and 26 articles (32%), respectively. A total of 17 articles (16%) contained an explicit study design statement. Among the 48 studies with a comparison group, 35 (73%) clearly defined the comparison intervention or control group. Institutional review board approval or participant consent was reported in 44 articles (42%).

CONCLUSIONS The quality of reporting of experimental studies in medical education was generally poor. Criteria are proposed as a starting point for establishing reporting standards for medical education research.

KEYWORDS review [publication type]; *education, medical; research design/*standards; professional competence/*standards; periodicals/*standards; guidelines; data collection/standards.

INTRODUCTION

Medical education is a rapidly growing field of study,1,2 as evidenced by the presence of 6 English-language journals dedicated to reporting research and issues in medical education, and the increasing number of education reports published in clinical journals.3–5 However, authors have called for greater use of theoretical frameworks,6–8 rigorous and creative study designs,9–11 and meaningful outcomes12–14 in medical education research, which together suggest the need for higher quality research.15 Furthermore, deficiencies in reporting quality have been identified in medical education,16–18 as in other disciplines.19–23 Although reporting quality and methodological rigour are not synonymous,24,25 they are inter-related26 in that high reporting quality is a prerequisite to understanding a
study's intent and methods.27 Guidelines have been published in clinical medicine for reporting randomised clinical trials,19 studies of diagnostic accuracy,28 meta-analyses29,30 and non-randomised trials.27 The goal of such guidelines is to propose reporting criteria that improve the ethical and scientific validity of the reports. The road to creating such reporting criteria is long and arduous and involves identifying a reporting problem, selecting elements of reporting that are shown to affect the valid interpretation of the reports, and developing standards by the consensus of an international panel.

We found only 3 studies evaluating the quality of reporting of medical education research.17,18,31 A review of studies evaluating cultural competence training18 found that important elements (namely, setting, participant demographics, intervention, appropriate comparison group, and reasons for non-inclusion of data) were reported less than half the time. Another study investigated the reporting of ethical safeguards, finding that only 24% of papers reported informed consent, and fewer than 5% reported review by an institutional review board.17 Finally, an analysis of 27 papers presented at the annual Research in Medical Education Conference of the Association of American Medical Colleges31 concluded that although the research question was explicitly stated in 96% of the studies, there was still considerable room for improvement in reporting quality. Yet these studies present an incomplete and fragmented view of the quality of reporting in medical education.

The purpose of the present study was to describe more systematically the problem of reporting in medical education research, as the first step to improving reporting quality. Because reporting criteria are highly specific to the type of study reported, we focused on a single broad category of studies: experimental studies. Experiments were chosen because they represent the gold standard for evaluating the outcomes of instructional interventions in education, and reporting should be relatively straightforward as the rules for experiments are well codified in science in general. However, experiments in education vary widely in their execution, ranging from single-group, post-test studies to static-group comparisons and randomised, controlled trials. Given this heterogeneity, our study focused on how well the intent and nature of each experiment was described, by contrast with, for example, the Consolidated Standards of Reporting Trials (CONSORT) statement,19 which targets randomised trials and includes specific elements of methods, results and discussion. In addition, because of recent concerns about the human rights of study subjects,17,32–34 we evaluated how well human subject rights were acknowledged.

Thus, our intent was to systematically document the quality of reporting the intent and nature of experimental studies, with the realisation that reporting may reflect the way research is conducted. The results from this study can then be used by the medical education community as groundwork for the next step, namely, selecting reporting criteria that affect ethical and scientific validity.

Overview

What is already known on this subject

Little is known about the quality of the reporting of research studies in medical education.

What this study adds

Many essential elements of scientific reporting are frequently absent from articles describing medical education experiments, including a critical literature review, conceptual framework, study design statement, definition of the comparison or control group, and acknowledgement of human subject rights. Journal requirements (such as statements regarding ethical approval) can significantly improve reporting quality.

Suggestions for further research

This study might be replicated for other types of medical education research (observational studies, validity studies or qualitative studies) and other elements of reporting. Formal reporting standards might be developed and evaluated for medical education research.

METHODS

Eligibility criteria

We selected a representative sample of recently published reports of experimental studies in medical education. We defined 'experiment' according to
Fraenkel and Wallen,35 as a study in which researchers manipulate a variable (also known as the treatment, intervention or independent variable) to assess its impact on other (dependent) variables. We included all studies that met this definition. Therefore, our sample was not limited to the classic randomised trial, but embraced the entire spectrum of weak and strong experimental designs, including single-group, post-test only studies, static group comparisons, and non-randomised and randomised trials. Evaluation studies with experimental designs were included. Abstract-length reports were excluded.

Given the difficulties and limitations associated with identifying medical education research using existing databases,36,37 we hand-searched journals known for publishing medical education experiments. We selected the 4 journals with the highest impact factors among journals focusing exclusively on medical education research. After discussion with 5 experienced American medical educators in internal medicine and surgery, we also selected 2 specialty journals that frequently publish education research. To focus on the current state of research, we limited our review to the 2 full calendar years immediately preceding the start of data collection. Consequently, all articles published in 2003 and 2004 in the journals Academic Medicine, Advances in Health Sciences Education, American Journal of Surgery, Journal of General Internal Medicine, Medical Education and Teaching and Learning in Medicine were eligible for inclusion.

Study selection

The titles and abstracts for all eligible articles (n = 1459) were reviewed by the first author (DAC) and all articles describing education experiments were identified (n = 185). Of these, we selected a random sample of 110 articles for full review, with the goal of including at least 100 articles in the final analysis. This was judged an adequate sample size, given previous studies using similar or smaller samples.18,25,31,38–40 Sampling was stratified by journal and weighted by the number of education experiments reported in each journal. Full text review by pairs of authors revealed that 5 studies were not experiments, leaving 105 articles for final analysis (Fig. 1). A complete list of eligible and selected articles is available as online supplementary material.

Figure 1 Flow diagram of article selection:
• 1459 abstracts screened: Academic Medicine (210), Advances in Health Sciences Education (33), American Journal of Surgery (584), Journal of General Internal Medicine (297), Medical Education (258), Teaching and Learning in Medicine (77)
• 1274 articles excluded as not describing experimental research or not involving medical education
• 185 articles selected for further consideration: Academic Medicine (42), Advances in Health Sciences Education (8), American Journal of Surgery (13), Journal of General Internal Medicine (26), Medical Education (65), Teaching and Learning in Medicine (31)
• 75 articles not in random sample
• 110 articles randomly selected for review
• 5 articles excluded as not describing experimental research: Academic Medicine (2), American Journal of Surgery (1), Medical Education (2)
• 105 articles included in final analysis: Academic Medicine (23), Advances in Health Sciences Education (5), American Journal of Surgery (7), Journal of General Internal Medicine (15), Medical Education (37), Teaching and Learning in Medicine (18)

Quality criteria

Based on studies of journal reviewers' comments and guidelines for scientific writing in medical education, characteristics of good reporting for medical education research were identified.15,16,36,41–43 For example, Bordage16 showed that leading reasons for the rejection of manuscripts by external reviewers in medical education included incomplete research questions, incomplete or outdated literature reviews, lack of conceptual frameworks, and inadequately described interventions. Morrison et al.41 and Reed et al.36 each noted the research question, literature review, study design, clear descriptions of interventions and educational context, and appropriate outcomes as key appraisal elements. We thus identified 5 elements essential to reporting the intent and nature of education experiments: critical review of the literature;7,44 conceptual or theoretical framework;7,45 statement of study intent (e.g. aim, purpose, research question or hypothesis);45–47 explicit statement of study design,48 and explicit definition of all study interventions.27 For reasons noted in the Introduction, we added the acknowledgement of human subject rights.17,32–34 These 6 elements

Table 1 Definitions of key elements for reporting experimental studies

1 Review of literature
The literature review cites articles relevant to the topic or study design and critically discusses these articles at the beginning of the paper44
'Critical discussion' requires thoughtful assessment of the quality of the cited literature and/or reflection on how the present study builds on limitations or gaps in prior work
Articles citing literature without a critical discussion were counted separately

2 Conceptual framework
The conceptual framework situates the research question, intervention methods or study design within a model or theoretical framework that facilitates meaningful interpretation of the methods and results7,45

3 Statement of study intent
Study intent can be phrased as a research question, a research hypothesis or a purpose (goal, aim)
The statement of study intent explicitly identifies the independent variable (intervention), the dependent variable (outcome), the relationship between these variables, and the population of interest43
These elements were further rated as follows:
• independent variable: operationally defined (described in sufficient detail as to permit replication), vague or absent
• dependent variable: operationally defined, vague or absent
• relationship between variables: suggests the intent to determine a difference (e.g. 'improve', 'enhance', 'better than', 'similar to')
• population of interest: includes the type of learner (e.g. medical student or internal medicine resident) and represents a population (e.g. 'medical students') rather than a sample (e.g. 'our medical students')

4a Statement of study design
An explicit and accurate statement identifies the nature of the study design

4b Type of study design
Frameworks for experimental study design described by Campbell and Stanley48 and Fraenkel and Wallen35 were used
For each of the following, post-test only studies (assessment carried out only after the intervention) and pretest/post-test studies (assessment carried out both before and after the intervention) were distinguished:
• randomised study: study groups were assigned randomly
• quasi-experimental non-randomised study: study groups were assigned using a non-randomised method
• matched-group study: study groups were systematically assigned by matching participant characteristics
• static-group comparison study: groups were pre-existing, formed for non-study purposes
• single-group study: a single study group

4c Additional design features
Factorial designs are those in which more than 1 independent variable is included in the study design, with combinations of the levels of the independent variables represented in the study interventions
Crossover designs are those in which participants are systematically exposed to more than 1 intervention over the course of the study

5a Study intervention definition
The study intervention is described in sufficient detail as to permit replication27
Interventions were rated as explicit, vague, or not defined

5b Comparison intervention or control group definition
The comparison intervention or control group is described in sufficient detail as to permit replication27
Interventions were rated as explicit, vague, or not defined
This element was further classified as:
• no intervention (no educational exposure; a true control group)
• placebo intervention (exposure to an educational intervention dealing with a topic different from the study intervention)
• comparison intervention (exposure to an educational intervention different than, but on the same topic as, the study intervention), or
• not defined

6 Acknowledgement of human subject rights
Acknowledgement of human subject rights, defined as either informed consent or approval by an institutional review board, is documented33
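Item 4b of Table 1 amounts to a small decision procedure: choose a group-assignment category, then attach the assessment timing. As an illustration only (the function and its output labels are ours, not part of the article), it can be sketched as:

```python
def classify_design(n_groups, assignment, pretest):
    """Label an experimental design using the categories of Table 1, item 4b.

    n_groups: number of study groups.
    assignment: how groups were formed, one of 'randomised', 'non-randomised',
        'matched' or 'pre-existing'; ignored when there is a single group.
    pretest: True if assessment occurred both before and after the
        intervention, False if only after.
    """
    timing = "pretest/post-test" if pretest else "post-test only"
    if n_groups == 1:
        return f"single-group {timing} study"
    kind = {
        "randomised": "randomised study",
        "non-randomised": "quasi-experimental non-randomised study",
        "matched": "matched-group study",
        "pre-existing": "static-group comparison study",
    }[assignment]
    return f"{kind} ({timing})"
```

The factorial and crossover features of item 4c are orthogonal to these categories and would be recorded as separate flags.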

constituted the primary outcomes for this study (Table 1 shows definitions). We elected not to rate outcomes because this issue in medical education has already been addressed.12 The statement of study intent was further evaluated for the presence of 4 basic elements:43 the independent variable; the dependent variable; the relationship between the variables, and the population of interest.

Data extraction

A data collection form was created to rate these elements of reporting. The form was pilot-tested and refined using a sample of articles published in 2002. Each full article included in the study sample was rated independently by pairs of authors. Disagreements were resolved by discussion and consensus was reached on final ratings.

Data analysis

Results were summarised using descriptive statistics. Inter-rater agreement was determined for each set of ratings using intraclass correlation coefficients (ICC).49 Chi-square test or Fisher's exact test was used for subgroup comparisons. Analyses were performed with SAS 9.1 (SAS Institute, Cary, NC, USA).
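The article gives no computational details beyond the SAS reference. The following is a rough, self-contained sketch of two of the statistics named above; the one-way random-effects form of the ICC is an assumption on our part, and the counts fed to Fisher's exact test are the ethics-policy figures reported under Secondary analyses:

```python
from math import comb

def icc_oneway(pairs):
    """One-way random-effects ICC(1,1) for n subjects each rated by 2 raters.
    (The specific ICC variant used in the article is not stated; this form is
    an illustrative assumption.)"""
    n, k = len(pairs), 2
    means = [(a + b) / k for a, b in pairs]
    grand = sum(means) / n
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((a - m) ** 2 + (b - m) ** 2
              for (a, b), m in zip(pairs, means)) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of every table with the same margins that is no
    more probable than the observed one."""
    row1, row2, col1 = a + b, c + d, a + c
    denom = comb(row1 + row2, col1)
    def prob(x):  # hypergeometric probability of x row-1 counts in column 1
        return comb(row1, x) * comb(row2, col1 - x) / denom
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Ethics-policy comparison reported under Secondary analyses: 7 of 23
# pre-policy articles vs all 14 post-policy articles reported approval.
p_policy = fisher_exact_two_sided(7, 16, 14, 0)
```

For that 7/23 versus 14/14 table, the two-sided p-value falls below 0.0001, consistent with the P < 0.0001 reported in the Results.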

RESULTS

A total of 105 articles were reviewed in full. Intraclass correlation coefficients were statistically significant for each set of ratings and ranged from 0.344 to 0.775, suggesting inter-rater agreement that was at least fair and usually good to excellent.49

Prevalence of key elements

Almost half the articles (47 [45%]) presented a critical review of the literature and another 52 (49%) cited literature without discussion. Slightly over half (58 [55%]) reported a conceptual framework. A total of 17 articles (16%) gave an explicit statement of study design. Of the 48 studies (46%) conducted with a comparison group, the definition of the control group or comparison intervention was explicit in 35 (73%). One study reported statistical comparisons, but we were unable to determine whether these reflected pretest/post-test comparisons or a separate comparison group. Fewer than half the studies (44 [42%]) reported IRB approval or participant consent. See online supplementary Table S1 for details.

The single-group pretest/post-test design was the most common study design (34 [32%]), followed by the single-group post-test only design (27 [26%]) and the static-group comparison post-test only design (19 [18%]). A total of 18 studies (17%) reported randomisation. Factorial designs and crossover designs were rare (3 [3%] and 4 [4%] of studies, respectively). We looked at differences in reporting quality for different study designs. Reports of studies with stronger designs (randomised, quasi-experimental and matched-group controlled trials), compared with weaker designs (uncontrolled or static-group controlled trials), more frequently contained critical literature reviews (22/28 [79%] versus 25/76 [33%]; P = 0.0002) and study design statements (11/28 [39%] versus 6/76 [8%]; P = 0.0001). The difference in reporting acknowledgement of human subject rights was not statistically significant, albeit close (16/28 [57%] versus 28/76 [37%]; P = 0.06). Other differences were not significant (P > 0.13).

The statement of study intent was further rated for the presence of 4 essential elements: the adequacy of the independent and dependent variables, respectively; the relationship between the variables, and the intended population (see online supplementary Table S2 for details). Of the 80 articles (76%) with a statement of study intent, the independent and dependent variables were operationally defined in 38 (47%) and 26 (32%), respectively. All 4 elements were fully present in 2 articles (3%) and were either present or vaguely defined in another 34 (43%). Several articles presented objectives for the paper ('The aim of this paper is to raise the profile of patient safety in undergraduate curricula'50) or the study intervention ('Course objectives were to …'), but failed to report the intent of the study itself.

Secondary analyses

There were no significant differences across journals among the elements rated (P > 0.08), except for presence of a conceptual framework (P = 0.015). Conceptual frameworks were present more frequently in Advances in Health Sciences Education (4/5 articles [80%]), Medical Education (26/37 articles [70%]), and Academic Medicine (15/23 articles [65%]) than in Journal of General Internal Medicine (5/15 articles [33%]), Teaching and Learning in Medicine (6/18 articles [33%]), and American Journal of Surgery (2/7 articles [29%]).

In 2004, Medical Education instituted a policy mandating authors to report ethical approval for their study. Of the 23 articles published before this policy went into effect, 7 (30%) contained a statement about human subject rights, compared with all 14 articles (100%) published after the policy took effect (P < 0.0001).

DISCUSSION

This systematic review of reports of experimental studies in medical education revealed that several essential elements of scientific reporting were frequently missing. Although the study intent and intervention definition were reported often, a statement of study design was rare. The other main elements of scientific reporting hovered at around 50%. Overall, the quality of reporting was poor, leaving room for much improvement.

Almost all articles (94%) cited at least some literature, but fewer than half (45%) critically assessed the literature. Many reports cited several articles in sequence without critique, and others cited only textbooks and monographs rather than primary research articles. Such so-called literature reviews generally failed to identify the current knowledge gaps for research on the topic, and also failed to


indicate how the present study would build upon prior work.

Barely half the studies (55%) presented a conceptual framework. The consequence of this frequent omission cannot be overemphasised. Conceptual frameworks serve several purposes.45,51 Firstly, they guide the selection of variables to be included in the study. Secondly, they permit results to be interpreted meaningfully and applied to other settings. Finally, they facilitate the refinement of existing theories or the development of new theories, and provide a basis upon which to design and conduct further research.7,27,52 We found several studies that were otherwise well executed, but the absence of a formal conceptual framework to guide interpretation severely limited the meaning of the results.

Consistent with prior research,31 most studies (76%) presented a statement of study intent (purpose, research question or hypothesis). However, the great majority of these statements were incomplete. A clear statement of study intent sets the stage for the rest of the study.43,53 Subsequent study design, sampling and outcome measures all become clear once the independent and dependent variables, the relationship between these variables, and the intended population have been defined explicitly.

Only a small minority of reports (16%) contained an explicit statement of study design. Although most reports simply omitted the design statement, several authors used general, incorrect or ambiguous terminology, employing terms such as 'naturalistic study' or 'experimental study' without further qualification, or saying that they had conducted a 'quasi-experimental study' or 'cohort study' when they apparently meant a single-group pretest/post-test study. Even when accurate, such general terms fail to convey important details about the timing of assessments (e.g. 'pretest/post-test') and the method of group assignments (e.g. 'existing groups' or 'randomised'). Such distinctions are essential because explicit statements of study design facilitate the reader's understanding of study procedures.27 We also noted the overwhelming prevalence of non-randomised study designs, and more than half the studies had no control or comparison group.

Whereas a previous review found a definition of the study intervention in only 33% of articles,18 most of the reports we reviewed clearly defined this element, and all described the intervention at least in part. However, over a quarter of papers reporting controlled studies failed to clearly define the comparison or control group, including 4 articles that contained no definition at all. Without a clear idea of what happened to the comparison group, it is impossible to make any meaningful interpretation regarding the study intervention.27

Consistent with previous research,17 fewer than half of the articles we reviewed acknowledged human subject rights. This is lower than the prevalence observed in recent non-education studies20,54 and may reflect researchers' ignorance regarding the need to consider learners as research subjects, or variations in standards across cultures and nations. However, recent discussions emphasise that ethical issues can no longer be ignored.32–34,55

Although inter-rater agreement was generally good, disagreements were not uncommon. We found that some elements, such as conceptual framework and critical literature review, were difficult to operationalise, and that our understanding of these concepts matured during the pilot phase and early in the rating of the articles. However, for these and all other elements, disagreements most often arose when a rater failed to locate an element that, once pointed out by the other rater, became obvious. For example, acknowledgement of human subject rights was readily understood, but statements buried in the middle of a long paragraph were easy to miss. If we had difficulty in identifying essential elements of research reporting despite careful scrutiny, then casual readers will be even less likely to discern them.

Our ratings emphasised the quality of reporting rather than the quality of study methods or the presence or absence of errors. Our aim was to instil a more scholarly approach to medical education research, focusing on the fundamentals of clear goals (statement of study intent), adequate preparation (literature review and conceptual framework), appropriate methods (study design), and effective communication (reporting as a whole).56 Although the link between reporting quality and methodological rigour is not direct, deficiencies in reporting may reflect a lack of awareness regarding rules for conducting scientifically valid research, as well as the absence of reporting standards. We call for an international assembly of editors and authors to define reporting standards that can affect the scientific and ethical validity of medical education reports. The criteria in Table 1 may serve as a starting point.

However, guidelines alone will probably be insufficient. Recent research suggests that journal

instructions to authors provide little guidance in human subject rights. Reporting guidelines are
methodology57 and infrequently cite relevant needed (for which the criteria in Table 1 may serve as
guidelines.58 Even when instructions recommend a starting point for discussion). Greater attention to
guidelines such as CONSORT, adherence to these matters by authors, mentors, reviewers and
the guidelines remains sub-optimal.22,23 A more editors will enable researchers to further advance the
proactive approach is needed. Evidence suggests art and science of medical education.
that specific directions or checklists for authors may
have limited impact,59,60 perhaps because they are
not sufficiently enforced. However, the significant Contributors: all authors were involved in the planning
improvement we observed when Medical Education and execution of this study and in the drafting and revising
began to require and enforce the reporting of of this manuscript.
ethical approval suggests that journal-initiated Acknowledgements: none.
interventions can impact reporting quality and Funding: none.
perhaps influence study procedures. Tutoring by Conflicts of interest: GB chairs the editorial board of
editors may also help,61 but studies linking this Medical Education. We are not aware of any other conflicts of
intervention to reporting quality are still interest.
forthcoming. At present, evidence suggests that the Ethical approval: not required.
peer review and editing processes have the best
chances of improving reporting quality.38,39,62–64
REFERENCES
In this study we excluded a vast body of non-
1 Norman GR. Research in medical education: three
experimental research in medical education. We do
decades of progress. BMJ 2002;324:1560–2.
not imply that all research should be experimental, 2 Wass V, Richards T, Cantillon P. Monitoring the
but do believe that all research should be rigorous medical education revolution. BMJ 2003;327:1362.
in its planning, execution and reporting. Many of 3 Petersen S. Time for evidence-based medical
the fundamentals of research reporting that we education. BMJ 1999;318:1223–4.
identified will translate to other research types, 4 Golub RM. Medical education 2005: from allegory to
designs and methods in medical education. We bull moose. JAMA 2005;294:1108–10.
encourage researchers to replicate this study with 5 Branch WT Jr, Kern DE. An emerging renaissance in
other types of medical education research, such as observational studies, validity studies or qualitative studies. In addition, there is a need to address other essential elements of reporting that are likely to affect valid interpretation, such as appropriate study design and procedures, appropriate statistical tests, appropriate interpretation of study findings, and consistency in reporting results (e.g. consistency between text and abstract, or text and tables). Research on methodological details will be facilitated by establishing a consensus on reporting standards.

In addition to limitations noted above, we acknowledge that much medical education research is published in journals not included in our sample, and note that only 1 reviewer screened abstracts for inclusion as experiments.

This review highlighted several major deficiencies in the reporting of experimental studies in medical education. The quality of reporting was generally poor. Although our review did not directly assess the quality of the research conducted, we are concerned that poor reporting may reflect sub-optimal research designs and methods, and a lack of attention to

medical education. J Gen Intern Med 2004;19:606–9.
6 Bligh J, Parsell G. Research in medical education: finding its place. Med Educ 1999;33:162–3.
7 Prideaux D, Bligh J. Research in medical education: asking the right questions. Med Educ 2002;36:1114–5.
8 Cook DA. The research we still are not doing: an agenda for the study of computer-based learning. Acad Med 2005;80:541–8.
9 Hutchinson L. Evaluating and researching the effectiveness of educational interventions. BMJ 1999;318:1267–9.
10 Norman G. RCT = results confounded and trivial: the perils of grand educational experiments. Med Educ 2003;37:582–4.
11 Carney PA, Nierenberg DW, Pipas CF, Brooks WB, Stukel TA, Keller AM. Educational epidemiology: applying population-based design and analytic approaches to study medical education. JAMA 2004;292:1044–50.
12 Prystowsky JB, Bordage G. An outcomes research perspective on medical education: the predominance of trainee assessment and satisfaction. Med Educ 2001;35:331–6.
13 Lurie SJ. Raising the passing grade for studies of medical education. JAMA 2003;290:1210–2.
14 Chen FM, Bauchner H, Burstin H. A call for outcomes research in medical education. Acad Med 2004;79:955–60.
© Blackwell Publishing Ltd 2007. MEDICAL EDUCATION 2007; 41: 737–745


15 Education Group for Guidelines on Evaluation. Guidelines for evaluating papers on educational interventions. BMJ 1999;318:1265–7.
16 Bordage G. Reasons reviewers reject and accept manuscripts: the strengths and weaknesses in medical education reports. Acad Med 2001;76:889–96.
17 Roberts LW, Geppert C, Connor R, Nguyen K, Warner TD. An invitation for medical educators to focus on ethical and policy issues in research and scholarly practice. Acad Med 2001;76:876–85.
18 Price EG, Beach MC, Gary TL et al. A systematic review of the methodological rigor of studies evaluating cultural competence training of health professionals. Acad Med 2005;80:578–86.
19 Moher D, Schulz KF, Altman D, for the CONSORT Group. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. JAMA 2001;285:1987–91.
20 Yank V, Rennie D. Reporting of informed consent and ethics committee approval in clinical trials. JAMA 2002;287:2835–8.
21 Altman DG. Poor-quality medical research: what can journals do? JAMA 2002;287:2765–7.
22 Mills EJ, Wu P, Gagnier J, Devereaux PJ. The quality of randomised trial reporting in leading medical journals since the revised CONSORT statement. Contemp Clin Trials 2005;26:480–7.
23 Plint AC, Moher D, Schulz K, Altman DG, Morrison A. Does the CONSORT checklist improve the quality of reports of randomised controlled trials? A systematic review. Paper presented at the Fifth International Congress on Peer Review and Biomedical Publication, Chicago, IL, September, 2005.
24 Jüni P, Altman DG, Egger M. Systematic reviews in health care: assessing the quality of controlled clinical trials. BMJ 2001;323:42–6.
25 Huwiler-Muntener K, Jüni P, Junker C, Egger M. Quality of reporting of randomised trials as a measure of methodological quality. JAMA 2002;287:2801–4.
26 Rennie D. CONSORT revised – improving the reporting of randomised trials. JAMA 2001;285:2006–7.
27 Des Jarlais DC, Lyles C, Crepaz N. Improving the reporting quality of non-randomised evaluations of behavioural and public health interventions. The TREND statement. Am J Public Health 2004;94:361–6.
28 Bossuyt PM, Reitsma JB, Bruns DE et al. Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Clin Chem 2003;49(1):1–6.
29 Moher D, Cook DJ, Eastwood S, Olkin I, Rennie D, Stroup DF. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of reporting of meta-analyses. Lancet 1999;354:1896–900.
30 Stroup DF, Berlin JA, Morton SC et al. Meta-analysis of observational studies in epidemiology: a proposal for reporting. JAMA 2000;283:2008–12.
31 Wolf FM. Methodological quality, evidence and Research in Medical Education (RIME). Acad Med 2004;79 (Suppl 10):68–9.
32 Morrison J, Prideaux D. Ethics approval for research in medical education. Med Educ 2001;35:1008.
33 Tomkowiak JM, Gunderson AJ. To IRB or not to IRB? Acad Med 2004;79:628–32.
34 McLachlan JC, McHarg J. Ethical permission for the publication of routinely collected data. Med Educ 2005;39:944–8.
35 Fraenkel JR, Wallen NE. How to Design and Evaluate Research in Education. New York, NY: McGraw-Hill 2003;267–89.
36 Reed D, Price EG, Windish DM et al. Challenges in systematic reviews of educational intervention studies. Ann Intern Med 2005;142:1080–9.
37 Haig A, Dozier M. BEME Guide No. 3: Systematic searching for evidence in medical education. Part 1. Sources of information. Med Teach 2003;25:352–63.
38 Pierie JPEN, Walvoort HC, Overbeke AJPM. Readers' evaluation of effect of peer review and editing on quality of articles in the Nederlands Tijdschrift voor Geneeskunde. Lancet 1996;348:1480–3.
39 Goodman SN, Berlin J, Fletcher SW, Fletcher RH. Manuscript quality before and after peer review and editing at Annals of Internal Medicine. Ann Intern Med 1994;121:11–21.
40 Soares HP, Daniels S, Kumar A et al. Bad reporting does not mean bad methods for randomised trials: observational study of randomised controlled trials performed by the Radiation Therapy Oncology Group. BMJ 2004;328:22–4.
41 Morrison JM, Sullivan F, Murray E, Jolly B. Evidence-based education: development of an instrument to critically appraise reports of educational interventions. Med Educ 1999;33:890–3.
42 Bordage G, Caelleigh AS, Steinecke A et al. Review criteria for research manuscripts. Acad Med 2001;76:897–978.
43 Bordage G, Dawson B. Experimental study design and grant writing in 8 steps and 28 questions. Med Educ 2003;37:376–85.
44 Crandall SJ, Caelleigh AS, Steinecke A. Reference to the literature and documentation. Acad Med 2001;76:925–7.
45 McGaghie WC, Bordage G, Shea JA. Problem statement, conceptual framework and research question. Acad Med 2001;76:923–4.
46 Morrison J. Developing research questions in medical education: the science and the art. Med Educ 2002;36:596–7.
47 Shea JA, Arnold L, Mann KV. A RIME perspective on the quality and relevance of current and future medical education research. Acad Med 2004;79:931–8.
48 Campbell DT, Stanley JC. Experimental and Quasi-experimental Designs for Research. Chicago, IL: Rand McNally 1966.
49 Shrout PE, Fleiss JL. Intraclass correlations: uses in assessing rater reliability. Psychol Bull 1979;86:420–8.

50 Flanagan B, Nestel D, Joseph M. Making patient safety the focus: crisis resource management in the undergraduate curriculum. Med Educ 2004;38:56–66.
51 Pangaro L, McGaghie WC. Relevance. Acad Med 2001;76:927–9.
52 Prideaux D. Researching the outcomes of educational interventions: a matter of design. BMJ 2002;324:126–7.
53 Inouye SK, Fiellin DA. An evidence-based guide to writing grant proposals for clinical research. Ann Intern Med 2005;142:274–82.
54 Schroter S, Plowman R, Hutchings A, Gonzalez A. Reporting of ethical committee approval and patient consent by study design in 5 general medical journals. Paper presented at the Fifth International Congress on Peer Review and Biomedical Publication, Chicago, IL, September, 2005.
55 Henry RC, Wright DE. When do medical students become human subjects of research? The case of programme evaluation. Acad Med 2001;76:871–5.
56 Glassick CE. Boyer's expanded definitions of scholarship, the standards for assessing scholarship, and the elusiveness of the scholarship of teaching. Acad Med 2000;75:877–80.
57 Schriger DL, Arora S, Altman DG. The content of medical journal instructions for authors. Ann Emerg Med 2006;48:743–9.
58 Altman DG, for the CONSORT Group. Endorsement of the CONSORT statement by high-impact medical journals: survey of instructions for authors. BMJ 2005;330:1056–7.
59 Jefferson T, Smith R, Yee Y, Drummond M, Pratt M, Gale R. Evaluating the BMJ guidelines for economic submissions: prospective audit of economic submissions to BMJ and The Lancet. JAMA 1998;280:275–7.
60 Pitkin RM, Branagan MA. Can the accuracy of abstracts be improved by providing specific instructions? A randomised controlled trial. JAMA 1998;280:267–9.
61 Marusic M, Markulin H, Lukic IK, Marusic A. Academic advancement of authors receiving tutoring from a medical journal. Teach Learn Med 2006;18:126–9.
62 Jefferson T, Alderson P, Wager E, Davidoff F. Effects of editorial peer review: a systematic review. JAMA 2002;287:2784–6.
63 Lee KP, Boyd EA, Bero LA. Editorial changes to manuscripts published in major biomedical journals. Paper presented at the Fifth International Congress on Peer Review and Biomedical Publication, Chicago, IL, September, 2005.
64 Pitkin RM, Branagan MA, Burmeister LF. Effectiveness of a journal intervention to improve abstract quality. JAMA 2000;283:481.

Received 7 August 2006; editorial comments to authors 19 December 2006; accepted for publication 2 March 2007

SUPPLEMENTARY MATERIAL

The following supplementary material is available for this article:

Table S1. Prevalence of key elements of reporting experimental studies.
Table S2. Prevalence of the four basic elements of a statement of study intent.
Appendix S1. Experiments in Medical Education in Six Journals, 2003–2004.

The material is available as part of the online article from:
http://www.blackwell-synergy.com/doi/abs/10.1111/j.1365-2923.2007.02777.x
(This link will take you to the article abstract).

Please note: Blackwell Publishing is not responsible for the content or functionality of any supplementary materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.
