
ORIGINAL REPORTS

Transpedicular Approach on a Novel Spine Simulator: A Validation Study

David Fürst, PhD,* Marianne Hollensteiner, PhD,* Stefan Gabauer, MSc,* Benjamin Esterer, MSc,*
Klemens Trieb, MD,† Felix Eckstein, MD,‡ and Andreas Schrempf, PhD*

*Research Group for Surgical Simulators Linz, Upper Austria University of Applied Sciences, Linz, Austria;
†Department of Orthopedics, Klinikum Wels-Grieskirchen, Wels, Austria; and ‡Institute of Anatomy,
Paracelsus Medical University Salzburg & Nuremberg, Salzburg, Austria

OBJECTIVE: The popularity of simulation in the medical field has increased dramatically over the last decades. However, the majority of studies focused on laparoscopic or other endoscopic procedures. In this study, participants performed an image-guided surgery task on a novel spine simulator. Face, content, construct, and concurrent validity were examined.

DESIGN: A surgical access through both pedicles (transpedicular) into the vertebral body of artificial L3 vertebrae was performed. Questionnaires, a simulation-based performance score, and a specialist rating were used to evaluate the various forms of validity.

SETTING: Klinikum Wels-Grieskirchen, Wels, Austria; tertiary hospital

PARTICIPANTS: According to their expertise in image-guided surgery and pedicle tool insertions, 43 participants were subdivided into 3 groups: 22 novices, 12 intermediates, and 9 experts.

RESULTS: Of the novice group, the vast majority were impressed with the attractiveness and the general appearance of the simulator. The majority of intermediates (92%) and experts (89%) would recommend the simulator to others. According to a simulation-based performance score, experts performed significantly better than novices (p = 0.001, d = 1.52) and intermediates (p = 0.01, d = 1.26). The association between the simulation-based performance score and the specialist rating was strong (R = 0.86, p < 0.01).

CONCLUSIONS: The novel spine simulator provides an applicable tool for the training of image-guided surgery skills in a realistic design. Its simulation-based assessment score classifies different levels of expertise accurately. (J Surg Ed ]:]]]-]]]. © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: surgical simulation, validity, transpedicular approach, specialist rating

COMPETENCIES: Patient Care, Medical Knowledge, Practice-Based Learning and Improvement

INTRODUCTION

Simulation covers a set of conditions that are created artificially to experience something that does exist in reality.1 For more than 50 years, the flight industry has used simulation to increase pilot performance and passengers' safety. In the last decades, the popularity of simulation in the medical field increased dramatically. However, the majority of studies and reviews showing a benefit in surgical training focused on laparoscopy or endoscopic procedures,2 while simulation in the field of orthopedics lags behind.3 When discussing different simulator models, criteria such as fidelity, validity, and reliability should be taken into account.4 Bench models or box trainers are summarized under the expression "low-fidelity" simulators. They allow operating on physical objects (e.g., artificial anatomical structures) with real surgical instruments, which offers more or less realistic haptic feedback. While low-fidelity trainers are relatively inexpensive, their ability to define proficiency-based criteria is limited.5 In contrast, procedural trainers or virtual reality systems are often referred to as "high-fidelity" simulators. They are characterized by more complex technology, a higher degree of realism, and extensive feedback possibilities. However, such systems are very cost intensive.5

This work was supported by the Austrian Research Promotion Agency (FFG) within the program line Cooperation & Innovation (COIN), project number 845436.
Correspondence: Inquiries to David Fürst, Research Group for Surgical Simulators Linz, Upper Austria University of Applied Sciences, Garnisonstr. 21, 4020 Linz, Austria; e-mail: david.fuerst@fh-linz.at

Journal of Surgical Education © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved. 1931-7204/$30.00 https://doi.org/10.1016/j.jsurg.2018.01.002

Regardless of fidelity, a simulation-based assessment should be valid and reliable. This means that it should
measure consistently what it was intended to measure
across repeated assessments.6 To achieve this, various
forms of validity and reliability are defined in the
literature.6-8 Face and content validity of a simulator are
often examined by questionnaires: while novices answer
questions about the general appearance of a simulator,
experienced surgeons assess its content-related suitability.9 To demonstrate construct validity, the simulator
should be able to discriminate between different levels of
expertise.8 Concurrent validity describes the extent to
which the results of the simulation-based assessment
correlate with the gold standard assessment, such as the
Objective Structured Assessment of Technical Skills
(OSATS).10 It consists of an operation- or task-specific
checklist and a global rating scale.7
As a training modality for spine surgery, a novel simulator
was developed. To practice image-guided surgery skills, the
transpedicular approach can be simulated. The objective of
this study was to examine its face, content, construct, and
concurrent validity.

MATERIALS AND METHODS


Simulator

The simulator consisted of a patient dummy, an electromagnetic tracking system (Aurora, NDI, Waterloo, Canada), a foot pedal, and a computer with monitor for operating the simulation software (Fig. 1). The patient dummy was made of silicone rubber and had a rectangular cutout at the thoracolumbar spine. Beneath this cutout, 3 exchangeable artificial vertebrae (in this study L2-L4) were constrained within a semirigid clamping device. The trabecular structure within the artificial vertebrae was mechanically and morphologically validated as published previously.11

Real surgical instruments were equipped with sensor coils, and their position and orientation were determined with the electromagnetic tracking system. By pressing the foot pedal, the simulation software generated simulated fluoroscopic images in anterior-posterior (AP) and mediolateral (ML) direction, presented on the monitor (Fig. 2). For this study, the focal points of the simulated x-ray sources (AP and ML) were set automatically.

FIGURE 1. Simulator for minimally invasive spine surgery consisting of a patient phantom (A), an electromagnetic tracking system (B), a foot pedal (C), and a monitor (D).

Subjects

The simulator was provided to a regional hospital (Klinikum Wels-Grieskirchen) in Wels, Austria, where 43 participants from various medical departments were recruited. First, each participant signed an informed consent form concerning confidential usage of personal data. Then, a short questionnaire assessing the task-specific expertise of each participant was applied. Based on the answers given, the participants were divided into 3 groups: a novice group (n = 22), consisting of participants who had no or little experience in pedicle tool insertions or image-guided surgery; an intermediate group (n = 12), who had at least assisted in pedicle tool insertions as well as image-guided surgery; and an expert group (n = 9), comprising participants with a high level of expertise (active and passive) in pedicle tool insertions and image-guided surgery.

Surgical Procedure

Each participant had to perform a bone access through both pedicles (transpedicular) into the vertebral body of an artificial L3 vertebra. First, all participants received an oral presentation that included detailed instructions on each task. Initially, a 6 degree-of-freedom probe (NDI, Waterloo, Canada) with a rigid metal tip was used to localize the pedicle. Then, an 11-gauge osteo-introducer tool (Medtronic, Minneapolis) with a diamond-tipped stylet was penetrated through artificial skin, muscle, and bone tissue until a sufficient insertion depth was reached. Each participant was free to decide the number and times of acquiring simulated projections as well as the time when a task was finished. Each participant was supported by a member of the development team in operating the software (e.g., taking pictures).

FIGURE 2. Screenshot of the operating software showing the simulated projection images in AP (left) and ML (right) direction.

Assessment

To examine the face and content validity, questionnaires with a 5-point Likert scale (adopted from Xiao et al.9) were used. The participants were asked to complete these immediately after completing the simulation procedure. While novices answered questions about the relative realism of the simulator (face validity—Table 1), intermediates and experts rated its appropriateness as a surgical training tool (content validity—Table 2).

To evaluate the construct validity, a simulation-based performance score was calculated for each insertion side (Table 3) and compared between the 3 groups. For this purpose, a custom analysis software was programmed using Matlab (Matlab R2015b, The MathWorks, Natick). During each of the 4 tasks (localization and insertion on each side), the position and orientation of the particular tool were recorded with a sample rate of 40 Hz. Based on the resulting trajectory, parameters such as the number of instrument movements nAoM and the length of the traveled instrument path xtool were calculated and provided a contribution to the overall performance score. Such proficiency metrics have already been used previously for validating the Imperial College Surgical Assessment Device (ICSAD)12 and are commonly used for evaluating simulators for laparoscopic surgery.13 The number of projections (nAP for both tasks, nML only for insertion) and the duration (t) of each task were also included in computing the score. In the study of Tonetti et al.,14 the number of projections was already used as an assessment parameter. The smaller these metrics, the higher the individual scores. Instrument penetration into the spinal canal was defined as a fatal error.

For the instrument insertion task, an efficiency index was additionally implemented. For calculation, a 3-dimensional region of interest (ROI) with the shape of a bulb was defined for each insertion side (Fig. 3). The index was then calculated by dividing the number of samples within the bulb-shaped ROI by the overall sample number. The more the instrument was moved within the ROI, the higher the calculated index and thus the individual score. In contrast, samples detected within a canal-specific ROI (red zone) set the insertion score to 0. However, the usage of a tracking system always induces slight inaccuracies. Therefore, a visual control of each vertebra was additionally performed to confirm penetrations into the spinal canal.
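The trajectory-derived metrics and the efficiency index lend themselves to a compact computation. The following is an illustrative sketch only: the study used custom Matlab software, so the function name, the input format, and the precomputed ROI membership masks are assumptions, and the exact counting rule for nAoM is not specified in the text.

```python
import numpy as np

def trajectory_metrics(positions, in_bulb_roi, in_canal_roi):
    """Derive proficiency metrics from tool-tip samples (recorded at 40 Hz).

    positions    : (N, 3) array of tool-tip coordinates in metres
    in_bulb_roi  : (N,) boolean mask, True while the tip lies inside the
                   bulb-shaped region of interest
    in_canal_roi : (N,) boolean mask, True while the tip lies inside the
                   canal-specific "red zone"
    """
    positions = np.asarray(positions, dtype=float)
    steps = np.diff(positions, axis=0)                   # per-sample displacement
    x_tool = float(np.linalg.norm(steps, axis=1).sum())  # traveled path length

    # Efficiency index: share of samples recorded inside the bulb ROI.
    ei = float(np.count_nonzero(in_bulb_roi)) / len(positions)

    # A penetration into the spinal canal was defined as a fatal error
    # (subject to visual confirmation of the vertebra in the study).
    fatal = bool(np.any(in_canal_roi))
    return {"x_tool": x_tool, "efficiency_index": ei, "fatal_error": fatal}
```

In the study, ROI membership would come from the bulb- and canal-shaped regions defined per insertion side (Fig. 3); here the masks are simply passed in.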

TABLE 1. Questionnaire for the Assessment of the Simulator's Face Validity (1 = Strongly Disagree/Not Realistic, 3 = Neither Agree nor Disagree/Neutral, and 5 = Strongly Agree/Very Realistic)

Rating

Questions 1 2 3 4 5
How would you rate the attractiveness of this spine simulator? ⃝ ⃝ ⃝ ⃝ ⃝
Do you think that having this spine simulator would encourage you to practice your image-guided surgery skills more often? ⃝ ⃝ ⃝ ⃝ ⃝
I would like to have a spine simulator of my own. ⃝ ⃝ ⃝ ⃝ ⃝
How would you evaluate the general appearance of the spine simulator (proportions of the patient phantom, the structure of the simulator, …)? ⃝ ⃝ ⃝ ⃝ ⃝



TABLE 2. Questionnaire for the Assessment of the Simulator's Content Validity (1 = Strongly Disagree/Not Realistic, 3 = Neither Agree nor Disagree/Neutral, and 5 = Strongly Agree/Very Realistic)

Rating

Questions 1 2 3 4 5
How would you rate this spine simulator for training in an ergonomic manner (e.g., ensuring proper manipulation angle/elevation angle/instrument scaling)? ⃝ ⃝ ⃝ ⃝ ⃝
Do you consider this spine simulator useful for practicing image-guided surgery skills? ⃝ ⃝ ⃝ ⃝ ⃝
Did you consider this spine simulator easy to use? ⃝ ⃝ ⃝ ⃝ ⃝
Would you recommend such a spine simulator to others? ⃝ ⃝ ⃝ ⃝ ⃝
The spine simulator contains all the important procedural steps that are necessary for a surgeon to learn the procedure. ⃝ ⃝ ⃝ ⃝ ⃝

Minor medial breaches were considered correct. The weighting of all performance parameters was determined in coordination with the specialist.

To establish concurrent validity, the performance of each participant was assessed by a specialist (head of the department of orthopedic surgery) who did not participate in the study as an active participant. Using a video camera and screen capture software, the individual fluoroscopic images were recorded and provided for assessment. The resulting film material was anonymized before rating to warrant blinded evaluation. The specialist used a task-specific checklist and a global rating scale (OSATS). While the checklist had 10 items (1 point for each correctly performed action—Table 4), the global rating scale consisted of 7 variables marked on a 5-point Likert scale, corresponding to a maximum of 35 points. According to the specialist rating, a maximum of 45 points could be reached for each insertion side. If the spinal canal was penetrated, the checklist score was set to 0, while the global rating scale resulted in 7 points (= very bad performance).

Statistical Analysis

All statistical analyses were performed using SPSS (SPSS Statistics 22, IBM, Armonk), while all plots were generated with Matlab (Matlab R2015b, The MathWorks, Natick). To compare the performance parameters between both approaches, the paired t-test or a nonparametric equivalent (Wilcoxon signed-rank test) was used. In contrast, differences between groups were detected using the unpaired Student t-test or the nonparametric Mann-Whitney U test. To measure the strength of a statistical relationship, the effect size (Cohen d or r) was calculated. Prior to that, each group's data was checked for normality (Shapiro-Wilk test) and variance homogeneity (Levene test). The association between the simulation-based performance score and the expert rating was evaluated using the Pearson test for linear correlation. Unless otherwise stated, a significance level of 5% was chosen.

RESULTS

Face and Content Validity

The average rating on the simulator's face validity is shown in Table 5. Almost all novices rated the attractiveness (95%) as well as the general appearance (91%) of the simulator as good/realistic (4 points) to excellent/very realistic (5 points). Further, 86% stated they would practice their image-guided surgery skills more often when having access to such a simulator, and more than 59% stated they would like to own such a training modality.
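The group-comparison logic described under Statistical Analysis can be sketched in Python with SciPy. The study used SPSS, so this helper and its name are illustrative assumptions; the effect-size formulas below are the standard textbook ones (pooled-SD Cohen's d, and r from the normal approximation of the U statistic) rather than SPSS output.

```python
import numpy as np
from scipy import stats

def compare_groups(a, b, alpha=0.05):
    """Compare two independent groups: check normality (Shapiro-Wilk) and
    variance homogeneity (Levene), then apply the unpaired t-test or the
    Mann-Whitney U test, and report an effect size (Cohen's d or r)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    equal_var = stats.levene(a, b).pvalue > alpha
    if normal:
        t, p = stats.ttest_ind(a, b, equal_var=equal_var)
        # Cohen's d with the pooled standard deviation
        pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1) +
                          (len(b) - 1) * b.var(ddof=1)) /
                         (len(a) + len(b) - 2))
        effect = (a.mean() - b.mean()) / pooled
        test = "t-test"
    else:
        u, p = stats.mannwhitneyu(a, b, alternative="two-sided")
        # effect size r = |z| / sqrt(n), via the normal approximation of U
        n = len(a) + len(b)
        mu = len(a) * len(b) / 2
        sigma = np.sqrt(len(a) * len(b) * (n + 1) / 12)
        effect = abs((u - mu) / sigma) / np.sqrt(n)
        test = "Mann-Whitney U"
    return test, float(p), float(effect)
```

The concurrent-validity association would analogously be computed with `scipy.stats.pearsonr(performance_score, specialist_rating)`.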

TABLE 3. Structure of the Simulation-Based Performance Score. Depending on Its Importance, a Maximum of 3, 6, or 9 Points Was Awarded for Each Parameter, Which Resulted in a Maximum Subscore of 18 and 30 Points, Respectively, for Localization and Insertion. Thus, a Maximum Performance Score of 48 Points Could Be Achieved for Each Side

Parameter   Localization   Insertion
nAoM        3              3
xtool (m)   6              6
nAP         3              3
nML         -              3
t (s)       6              6
EI          -              9
Overall     max. 18        max. 30

EI, efficiency index.

FIGURE 3. Example of left (novice, EI = 0.47) and right (expert, EI = 0.97) insertion approaches showing the tool tip trajectories (A), the bulb-shaped ROIs (B), and the critical regions for detecting penetrations into the spinal canal (C). EI, efficiency index.
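Table 3 fixes only the maximum points per parameter; how a raw metric maps to awarded points is not stated in the text. One plausible reading (smaller is better for every metric except EI, and a fatal canal penetration zeroes the insertion subscore) could be sketched as follows. All reference values and the linear mapping are assumptions for illustration, not the published weighting.

```python
def parameter_points(value, best, worst, max_points):
    """Map a raw metric to a subscore: `best` or better earns the full
    `max_points`, `worst` or beyond earns 0, linear in between.
    (Illustrative only: the published score fixes the maxima in Table 3
    but does not state the exact mapping.)"""
    if worst == best:
        return float(max_points)
    frac = (worst - value) / (worst - best)
    return max(0.0, min(1.0, frac)) * max_points

def insertion_score(metrics, canal_penetration):
    """Insertion subscore (max 30 points per Table 3); a penetration into
    the spinal canal was defined as a fatal error and sets it to 0."""
    if canal_penetration:
        return 0.0
    # (best, worst, max_points) triples below are assumed reference values
    refs = {
        "n_aom": (10, 60, 3),   # number of instrument movements
        "x_tool": (0.5, 3.0, 6),  # traveled path length in metres
        "n_ap": (4, 25, 3),     # AP projections
        "n_ml": (3, 20, 3),     # ML projections
        "t": (60, 300, 6),      # task duration in seconds
        "ei": (1.0, 0.0, 9),    # efficiency index (larger is better)
    }
    return sum(parameter_points(metrics[k], b, w, m)
               for k, (b, w, m) in refs.items())
```

With this mapping, a flawless insertion reaches the 30-point maximum of Table 3, and any canal penetration collapses the subscore to 0 regardless of the other metrics.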



TABLE 4. Task-Specific Checklist for Transpedicular Bone Access. For Each Correctly Performed Action 1 Point is Scored Resulting in
a Maximum of 10 Points
No. Procedural Step Correct Incorrect
1 Finds spinous process under skin ⃝ ⃝
2 Selects appropriate instrument for pedicle localization ⃝ ⃝
3 Correct localization of the pedicle ⃝ ⃝
4 Selects appropriate insertion instrument ⃝ ⃝
5 Chooses correct starting positions 9 to 11 o'clock (left side) and 1 to 3 o'clock (right side) ⃝ ⃝
6 Advances instrument until getting in contact with bone (controlled by imaging) ⃝ ⃝
7 Advances instrument into first half of pedicle (controlled by imaging) ⃝ ⃝
8 Advances instrument to margin of pedicle (controlled by imaging) ⃝ ⃝
9 Advances instrument through back wall of vertebral body (3 to 4 mm after ⃝ ⃝
back trailing edge, controlled by imaging)
10 Finished transpedicular approach successfully ⃝ ⃝

TABLE 5. Average Rating on the Simulator's Face Validity Assessed by the Novice Group (n = 22)
Questions Average Rating
How would you rate the attractiveness of this spine simulator? 4.59
Do you think that having this spine simulator would encourage you to practice 4.59
your image-guided surgery skills more often?
I would like to have a spine simulator of my own. 3.45
How would you evaluate the general appearance of the spine simulator 4.41
(proportions of the patient phantom, the structure of the simulator, … )?

TABLE 6. Average Rating on the Simulator's Content Validity Assessed by Intermediates (n = 12) and Experts (n = 9)
Average Rating

Questions I E
How would you rate this spine simulator for training in an ergonomic manner? 4.34 4.56
(e.g., ensuring proper manipulation angle/elevation angle/instrument scaling)
Do you consider this spine simulator useful for practicing image-guided surgery skills? 4.50 4.78
Did you consider this spine simulator easy to use? 4.58 4.56
Would you recommend such a spine simulator to others? 4.50 4.67
The simulator contains all the important procedural steps that are necessary 4.08 4.11
for a surgeon to learn the procedure.
I, intermediates; E, experts.

TABLE 7. Comparison of Left and Right Approach Based on Absolute Values (Mean ± SD) of the Performance Parameters

Parameter   Left           Right           p Value   Effect Size
nAoM        36.7 ± 15.1    33.6 ± 29.5     0.024*    0.34
xtool (m)   1.68 ± 0.68    1.45 ± 0.83     0.018*    0.36
nAP         13.0 ± 7.8     13.6 ± 10.7     0.907     -
nML         9.3 ± 6.0      10.3 ± 7.6      0.454     -
t (s)       178.6 ± 82.6   154.2 ± 118.8   0.022*    0.35
EI          0.35 ± 0.12    0.47 ± 0.13     0.000*    0.63

EI, efficiency index.
*p < 0.05.



TABLE 8. Side-Specific Simulation-Based Performance Scores and Specialist Ratings for Novices (n = 22), Intermediates (n = 12), and Experts (n = 9). The Penetrations into the Spinal Canal Are Indicated in Parentheses
Performance Score Specialist Rating

Group Left Right Left Right


Novices 16.1 ± 10.9 (16) 14.0 ± 10.7 (19) 19.7 ± 17.4 11.9 ± 12.5
Intermediates 18.0 ± 11.8 (6) 18.2 ± 10.4 (10) 22.7 ± 16.7 13.3 ± 14.8
Experts 26.4 ± 9.8 (1) 29.2 ± 10.9 (2) 38.0 ± 12.0 34.3 ± 15.6

The responses to asking whether the simulator was an appropriate training tool were entirely positive (Table 2). Almost all intermediates (92%) and experts (100%) were satisfied with the ergonomics of the simulator. They declared the simulator easy to use (92% and 100%) and useful for practicing image-guided surgery skills (92% and 100%), and said they would recommend the simulator to others (92% and 89%). Finally, 83% of the intermediates and 78% of the experts confirmed that the simulator contained all important procedural steps that were necessary to learn the surgical procedure (Table 6).

Left vs Right Approach

When comparing the performance parameters of the second approach with the first (Table 7), participants tended to proceed quicker, making fewer movements. The number of projections, however, remained nearly constant.

Construct Validity

The simulation-based performance scores (penetrations into the spinal canal in parentheses) and the specialist ratings for each insertion side are summarized in Table 8. According to these results, the level of experience of the participants had a great impact on the success of the surgical procedure. The novice group recorded numerous penetrations into the spinal canal on both sides, while the intermediates had severe problems on the right side only. The expert group recorded the fewest violations on both sides.

In view of the overall simulation-based performance score (Fig. 4), the experts performed significantly better than the novices (p = 0.001, d = 1.52) and intermediates (p = 0.01, d = 1.26). The performance was, however, not significantly different (p = 0.301) between the novices and intermediates. The overall specialist rating confirmed these results (Fig. 5), demonstrating a significantly better performance of the experts compared with the novices (p = 0.003, r = 0.55) and intermediates (p = 0.012, r = 0.55), but no significant difference between the performance of the novices and intermediates (p = 0.790).

FIGURE 4. Overall simulation-based performance scores for novices (n = 22), intermediates (n = 12), and experts (n = 9).

FIGURE 5. Overall specialist rating for novices (n = 22), intermediates (n = 12), and experts (n = 9).

Concurrent Validity

Comparing the simulation-based performance score and the specialist rating, the relation between both assessments was apparent (R = 0.86, p < 0.01). The linear regression plot (Fig. 6) further shows that the highest scores were primarily reached by experts (upper right corner), while novices and intermediates shared the lower left.

FIGURE 6. Linear regression relating specialist rating to simulation-based performance score.

DISCUSSION

The purpose of the current study was to validate a novel spine simulator based on a percutaneous transpedicular approach. In contrast to a virtual reality system,15 the tactile experience was generated by validated artificial structures. Based on the ratings of novices, intermediates, and experts, one can conclude that the novel spine simulator provides a realistic design and can be readily applied for training image-guided surgery skills. For the novice group, the wish to own a simulator was rated low, potentially because the novice group included surgeons from different medical departments. Their enthusiasm for owning such a complex device dedicated to orthopedic surgery was therefore limited, although they were convinced of its usefulness.

The number of penetrations into the spinal canal, which was initially defined as a fatal error, was very high amongst novices and intermediates. This suggests that percutaneous transpedicular bone access, and thus image-guided interventions in general, are quite challenging and that simulator training may be very helpful. In our setting, a high-fidelity design with a combination of realistic tactile force-feedback and eye-hand coordination was chosen. Such psychomotor skills effectively develop through repetitive training,3 and therefore participants of the novice group were partially overstrained with the task. Consequently, the overall performance score of many participants was just that given for pedicle localization. The spread of the overall simulation-based performance score was hence relatively large, and the same applied to the specialist rating. A clear majority of novices and intermediates (particularly on their right insertion approach) were rated with only 7 (of 45) points. Since no feedback was provided to participants after their first approach (left), they tended to proceed quicker on the second side (right), maybe with less carefulness. While single penetrations happened almost exclusively during the second approach, the vast majority of participants who penetrated into the spinal canal on the first side did so on the second side as well. Nevertheless, the simulation-based performance score distinguished between different levels of expertise and displayed a relatively high correlation with the independent specialist ratings.

Apart from various types of validity, this study also demonstrated how difficult the combined use of surgical skills actually is. Tactile force-feedback and eye-hand coordination are basic orthopedic skills required for many surgical interventions. Their isolated training on simple modalities16 prior to performance on high-fidelity simulators could be a more efficient approach for improving task-specific psychomotor skills—particularly for novices, since the simulator fidelity should comply with the trainees' level of experience.17 Such a setting would also allow the assessment of a previously performed skills course using a validated simulator.18

The present study has some limitations: First, participants from only 1 institution were included, resulting in a relatively small number of experts. Second, only 1 independent rating was conducted, and therefore interrater reliability could not be assessed. Further, all participants performed the surgical procedure only once, and therefore the documentation of a task-specific learning curve was not possible. Also, the performance of novices and intermediates was similar, so that a different definition of intermediates may have provided a better differentiation between the participant groups.

CONCLUSIONS

The findings of this study show that the novel simulator provides an applicable tool for the training of image-guided surgery skills in a realistic design. Based on a simulation-based performance score, the assessment can distinguish between different levels of expertise. Moreover, surgical trainees can improve essential psychomotor skills without temporal limitations outside the operating room.

REFERENCES

1. Hornby AS. Oxford Advanced Learner's Dictionary. 8th Ed. Oxford University Press; 2013.

2. Davies J, Khatib M, Bello F. Open surgical simulation—a review. J Surg Educ. 2013;70:618-627. http://dx.doi.org/10.1016/j.jsurg.2013.04.007.

3. Thomas GW, Johns BD, Marsh JL, Anderson DD. A review of the role of simulation in developing and assessing orthopaedic surgical skills. Iowa Orthop J. 2014;34:181. http://dx.doi.org/10.1515/folmed-2017-0039.

4. Evgeniou E, Walker H, Gujral S. The role of simulation in microsurgical training. J Surg Educ. 2017. http://dx.doi.org/10.1016/j.jsurg.2017.06.032.

5. Fairhurst K, Strickland A, Maddern GJ. Simulation speak. J Surg Educ. 2011;68:382-386. http://dx.doi.org/10.1016/j.jsurg.2011.03.003.

6. Van Nortwick SS, Lendvay TS, Jensen AR, Wright AS, Horvath KD, Kim S. Methodologies for establishing validity in surgical simulation studies. Surgery. 2010;147:622-630. http://dx.doi.org/10.1016/j.surg.2009.10.068.

7. Moorthy K, Munz Y, Sarker SK, Darzi A. Objective assessment of technical skills in surgery. Br Med J. 2003;327(7422):1032. http://dx.doi.org/10.1136/bmj.327.7422.1032.

8. Tay C, Khajuria A, Gupte C. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework. Int J Surg. 2014;12:626-633. http://dx.doi.org/10.1016/j.ijsu.2014.04.005.

9. Xiao D, Jakimowicz JJ, Albayrak A, Buzink SN, Botden SMBI, Goossens RHM. Face, content, and construct validity of a novel portable ergonomic simulator for basic laparoscopic skills. J Surg Educ. 2014;71:65-72. http://dx.doi.org/10.1016/j.jsurg.2013.05.003.

10. Martin J, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273-278. http://dx.doi.org/10.1046/j.1365-2168.1997.02502.x.

11. Fuerst D, Senck S, Hollensteiner M, et al. Characterization of synthetic foam structures used to manufacture artificial vertebral trabecular bone. Mater Sci Eng C Mater Biol Appl. 2017;76:1103-1111. http://dx.doi.org/10.1016/j.msec.2017.03.158.

12. Hayter MA, Friedman Z, Bould MD, et al. Validation of the Imperial College Surgical Assessment Device (ICSAD) for labour epidural placement. Can J Anesth. 2009;56(6):419. http://dx.doi.org/10.1007/s12630-009-9090-1.

13. Alaker M, Wynn GR, Arulampalam T. Virtual reality training in laparoscopic surgery: a systematic review & meta-analysis. Int J Surg. 2016;29:85-94. http://dx.doi.org/10.1016/j.ijsu.2016.03.034.

14. Tonetti J, Vadcard L, Girard P, Dubois M, Merloz P, Troccaz J. Assessment of a percutaneous iliosacral screw insertion simulator. Orthop Traumatol Surg Res. 2009;95(7):471-477. http://dx.doi.org/10.1016/j.otsr.2009.07.005.

15. Gasco J, Patel A, Ortega-Barnett J, et al. Virtual reality spine surgery simulation: an empirical study of its usefulness. Neurol Res. 2014;36(11):968-973. http://dx.doi.org/10.1179/1743132814y.0000000388.

16. Hohn EA, Brooks AG, Leasure J, et al. Development of a surgical skills curriculum for the training and assessment of manual skills in orthopedic surgical residents. J Surg Educ. 2015;72(1):47-52. http://dx.doi.org/10.1016/j.jsurg.2014.06.005.

17. Aggarwal R, Mytton OT, Derbrew M, et al. Training and simulation for patient safety. Qual Saf Health Care. 2010;19(Suppl 2):i34-i43. http://dx.doi.org/10.1136/qshc.2009.038562.

18. Egol KA, Phillips D, Vongbandith T, Szyld D, Strauss EJ. Do orthopaedic fracture skills courses improve resident performance? Injury. 2015;46(4):547-551. http://dx.doi.org/10.1016/j.injury.2014.10.061.
