Review Article

Assessment tools for minimally invasive surgery simulation programmes: a narrative review

Pablo Guerrero-Antolino, Carmen Gutiérrez-Sánchez, Mónica Millán-Scheiding

Department of General Surgery, University and Polytechnic Hospital La Fe, Valencia, Spain

Contributions: (I) Conception and design: M Millán-Scheiding; (II) Administrative support: None; (III) Provision of study materials or patients: All authors; (IV) Collection and assembly of data: All authors; (V) Data analysis and interpretation: All authors; (VI) Manuscript writing: All authors; (VII) Final approval of manuscript: All authors.

Correspondence to: Dr. Pablo Guerrero-Antolino, MD. Resident of University and Polytechnic Hospital La Fe; Department of General Surgery, University and Polytechnic Hospital La Fe, Av. Fernando Abril Martorell 106, Quatre Carreres, 46026 Valencia, Spain. Email: pab.guerrero11@gmail.com.

Background and Objective: Minimally invasive surgery has become the main surgical approach for a wide range of procedures; a thorough learning process throughout the residency period is therefore essential. Specific training programs in laparoscopic techniques are becoming increasingly popular as a means of developing these skills. However, the lack of uniformity and the absence of universal validation tools raise concerns about their systematic inclusion in residents’ training curricula. Technological advancements in computer models and virtual reality present an opportunity to establish objective evaluation methods for surgical skills. The objective of this review is to present the evaluation methods and validation tools most frequently described in the literature, in order to clarify which of them may be more appropriate when validating a simulation training program in laparoscopic surgery.

Methods: In this narrative review, we report studies that investigated the application of simulation programs for minimally invasive surgery, as well as the different assessment tools used for their validation. We searched several databases, including PubMed, UpToDate, and the Cochrane Library, and included articles published from 2002 to 2023. The search strategy combined MeSH terms and pre-specified ‘free-text’ terms. Evaluation methods, ranging from structured checklists to rating scales, are explored, emphasizing the need for impartial and competence-based assessments.

Key Content and Findings: There is currently great diversity in simulation programs and their evaluation methods. Observational assessment tools, such as the Objective Structured Assessment of Technical Skills (OSATS), the Global Operative Assessment of Laparoscopic Skills (GOALS) and the Fundamentals of Laparoscopic Surgery (FLS) program, remain the most widely used in the published literature. Nevertheless, more sophisticated virtual or hybrid simulators are emerging as a more objective way of assessing laparoscopic skills. Despite this wide variety of methods, the lack of universality, cost considerations, and the need for impartial evaluation remain unresolved issues.

Conclusions: This review underscores the imperative of establishing a universally applicable, accessible, and accredited simulation model for evaluating the capabilities and skills of residents undergoing minimally invasive surgery training. This calls for concerted efforts in the surgical education community to address the current challenges and pave the way for a more standardized and effective approach to surgical training.

Keywords: Simulation; laparoscopic surgery; surgical training; skill assessment; objective evaluation


Received: 29 November 2023; Accepted: 21 March 2024; Published online: 13 June 2024.

doi: 10.21037/ales-23-66


Introduction

Minimally invasive surgery has emerged as the predominant approach in numerous procedures, encompassing both scheduled and urgent interventions. Its well-established benefits, such as reduced postoperative pain, shorter hospital stays, and early postoperative recovery, are widely acknowledged (1). Against this backdrop, the implementation of structured training and educational programs assumes paramount importance in fostering both theoretical comprehension and surgical proficiency during the residency period (2). Consequently, an escalating number of institutions now integrate dedicated training courses within the residency curriculum, focused on the development of laparoscopic skills.

These instructional initiatives play an important role in the improvement of technical competencies for the successful execution of laparoscopic procedures. Furthermore, they afford surgical residents the opportunity to train their skills in controlled environments through simulations and practical exercises before transitioning to live patient scenarios (3,4). The array of tools employed in these simulation programs ranges from rudimentary “box trainers” or “pelvic trainers” to virtual simulators and more complex systems such as animal models (5). The acquisition of such skills has substantial implications for patient safety, surgical outcomes, and the overall economic landscape of healthcare (6,7). Additionally, minimally invasive training programs serve as a conduit for staying abreast of the latest technological innovations and surgical approaches, thereby ensuring that professionals remain updated in this field (8).

Nevertheless, these simulation programs lack universality and standardization. In addition, they are often not part of the resident’s educational plan, and many hospital centers lack the necessary material or cannot afford the considerable expense these programs can entail (9). The absence of universal validation tools has historically resulted in the subjective evaluation of technical skills and competencies, which often fails to meet the standards required for the responsibilities entailed in these surgical procedures (10). An effective evaluation tool must embody reliability, validity, educational impact, reproducibility, and feasibility (11). In response to this imperative, and recognizing the escalating demand for an objective and structured assessment of technical performance, various models have been proposed for the evaluation of both theoretical and practical facets during laparoscopic simulation training (12,13). We present this article in accordance with the Narrative Review reporting checklist (available at https://ales.amegroups.com/article/view/10.21037/ales-23-66/rc).


Methods

This article undertakes an exhaustive analysis of the published literature concerning diverse simulation models and the tools available for their validation (Table 1). Additionally, we delineate their respective advantages and disadvantages, with the aim of clarifying which models may be better suited for accrediting a training program in laparoscopic surgery. Finally, we examine the significant contribution of laparoscopic surgery training programs to fostering continuous enhancements in surgical care, ensuring optimal outcomes for patients, and elevating standards within medical practice.

Table 1

Search strategy summary

Items | Specification
Date of search | From June to December 2023
Databases and other sources searched | PubMed, Cochrane Library, Google Scholar, UpToDate
Search terms used | Simulation training
   Surgical simulation
   Evaluation methods
   Virtual reality training
   Skills assessment
   Objective Structured Clinical Examination (OSCE)
   Global Operative Assessment of Laparoscopic Skills (GOALS)
   Fundamentals of Laparoscopic Surgery (FLS)
   Simulator validation
   Haptic feedback
   Laparoscopic simulation curriculum
Timeframe | 2002 to 2023
Inclusion and exclusion criteria | Inclusion criteria:
   Relevance to surgical training: articles focusing on surgical training methods, especially in minimally invasive surgery
   Simulation technologies: articles discussing various simulation technologies, both virtual and physical, used in surgical training
   Evaluation methods: research on different methods and systems for evaluating surgical skills, including scales, assessments, and simulators
   Comparison of training programs: articles comparing the effectiveness of different training programs and methodologies
   Outcome measures: research presenting outcomes of surgical training programs, including the development of skills and competencies
Exclusion criteria:
   Irrelevance to surgical training: articles not directly related to surgical training or simulation methods
   Non-specific medical education: research not focused on surgical education or training programs
   Non-technical skills: studies primarily focused on non-technical skills without a significant emphasis on technical surgical skills
   Non-English publications: articles not published in English
Selection process | The selection process was performed independently. After all the participating authors established and concurred with the inclusion and exclusion criteria, the selection of the articles was carried out by two different authors, using the information presented in the abstract of each article. A third author reviewed the selection process and excluded those that were not suitable for this narrative review

A systematic search was carried out from June 2023 to December 2023 in PubMed, the Cochrane Library, Google Scholar and UpToDate (Figure 1). We used the following keywords: simulation training, surgical simulation, evaluation methods, virtual reality (VR) training, haptic feedback and laparoscopic curriculum. PRISMA guidelines were adhered to in reporting the results of this study. The search covered publications from 2002 to 2023, and only English-language publications were included. The initial search yielded 87 articles; after applying the inclusion and exclusion criteria, 20 articles were removed. Following removal of duplicates, an initial review of titles and abstracts was conducted to identify articles of potential interest. The lists of included publications were checked by two different authors. A third author then reviewed the selection process and excluded those that were not suitable for this narrative review.

Figure 1 Inclusion process.

Results

Primary objectives of simulation programs

The literature reveals a wide range of designs for minimally invasive surgery training programs for residents. Numerous studies have examined the fundamental aspects of these initiatives, concentrating on improving technical skills and facilitating the transfer of these skills to daily surgical practice. The most frequently evaluated facets of laparoscopic surgery simulation can be grouped as follows:

Improvement of technical skills

A cornerstone of simulation programs resides in the refinement of technical skills. Simulators provide a secure environment conducive to multiple repetitions for practicing laparoscopic techniques without jeopardizing patient safety. These programs evaluate the precision, speed, and coordination of the surgeon’s movements in specific tasks, encompassing the manipulation of instruments, sutures, etc. (14).

Several studies have shown a direct correlation between simulator-based training and a noteworthy improvement in the technical skills of residents. Lovasik et al.’s study, for instance, demonstrated improvement by analyzing a group of surgical gestures in a cohort of residents before and after a 6-week practice period (15). Video recordings of these gestures were analyzed and evaluated by a senior surgeon, who provided formative feedback to guide improvement during the practice period.

Similarly, Gallagher et al. illustrated that training on laparoscopic simulators translated into a significant enhancement in manual dexterity and the precision of movements, manifesting as superior performance in actual surgical procedures (16).

Furthermore, proficiency in teamwork and effective communication with the rest of the team assumes great importance. These aspects can also be addressed through training on diverse simulators. Wheelock et al., for instance, investigated the impact of simulated patients on the development of communication skills in the healthcare domain (17). Results from this study underscored the impact of simulation on improving communication skills in participants, a relevance extending beyond surgery to encompass interactions with patients and teamwork in the operating room (18).

Cognitive development

Simulation extends beyond just technical skills to play a pivotal role in the cognitive development of residents. Inside an operating theater, rapid decision-making and the management of complex, unpredictable scenarios become crucial. Simulators furnish realistic scenarios for residents to acquire these cognitive skills.

A study by Seymour et al. [2002], evaluating the impact of training on a VR simulator in resolving risk situations in the operating room, found that participants receiving simulator training made more effective decisions and handled unexpected situations better during real procedures compared to a control group without simulator training (19). This underscores the importance of simulation in the cognitive development of residents and their ability to confront intraoperative challenges (18).

Transfer of skills to the operating room

One of the most critical aspects evaluated in laparoscopic surgery simulation is the effective transfer of skills acquired in simulators to the real operating room. It is imperative that residents can proficiently apply the skills learned in simulator practice to real clinical situations.

The literature presents promising results regarding skill transfer. Shi et al. highlighted how residents undergoing simulator training achieved greater competence and confidence in the real operating room, leading to improved performance and a reduction in errors (20).

Skill transfer has also been studied in the context of specific procedures, such as laparoscopic cholecystectomy. A study by Kowalewski et al. [2018] compared training on VR simulators with a mixed learning approach and found that participants receiving training on VR simulators demonstrated superior performance in actual laparoscopic cholecystectomies compared to the control group (21).

Feedback

Objective feedback based on data collected through simulators can be especially valuable for measuring progress over time and identifying areas for improvement. Simulators typically provide real-time information on the resident’s performance, enabling the identification of areas for improvement and the correction of errors. Haptic feedback contributes to a shorter learning curve in surgical procedures. Force parameters are measured using force detection systems in the instrument and, in conjunction with movement parameters, furnish instructors with an objective assessment of laparoscopic skills (22). However, most haptic simulators are currently in the experimental phase and additionally pose a significant cost, thus precluding their widespread standardization (23).
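
By way of illustration, the brief sketch below summarizes a stream of instrument-tip force readings into the kind of aggregate force parameters an instructor might review. The sampling rate, the safety threshold, and the chosen summary statistics are assumptions for illustration only and do not correspond to any particular haptic simulator or published standard.

```python
import numpy as np

def force_parameters(forces_n: np.ndarray, dt: float, limit_n: float = 5.0) -> dict:
    """Summarize a stream of instrument-tip force readings (in newtons).

    limit_n is an illustrative safety threshold, not a published standard.
    """
    return {
        "mean_force_n": float(forces_n.mean()),
        "peak_force_n": float(forces_n.max()),
        "time_above_limit_s": float((forces_n > limit_n).sum() * dt),  # duration of excessive force
        "force_impulse_ns": float(forces_n.sum() * dt),                # integral of force over time
    }

# Hypothetical 100 Hz recording during a tissue-handling exercise.
samples = np.abs(np.random.default_rng(1).normal(loc=1.5, scale=1.0, size=3000))
print(force_parameters(samples, dt=0.01))
```

Summary values such as these could be combined with the movement parameters discussed below to give trainees concrete, reviewable feedback after each exercise.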

New simulation models

The effectiveness of simulation programs is closely tied to the level of realism and fidelity offered by simulators. Simulators must accurately replicate anatomical conditions and surgical procedures to ensure a training experience closely mirroring reality. Surgical simulation continues to evolve with technological advancements. VR and augmented reality (AR) simulators have expanded the horizons of training. VR simulators enable residents to practice procedures in highly realistic virtual environments, affording them the opportunity to gain experience in a virtual space before performing surgeries on real patients. AR simulators combine physical elements with virtual elements, offering an even more authentic experience (24).

Different methods and evaluation systems

It is not only essential to develop increasingly realistic simulation models with more advanced resources; we must also establish high-quality, objective evaluation methods that accredit these programs, in order to clarify their true utility. The classic model for assessing surgical skills has been largely subjective due to the difficulty of conducting objective analyses of residents’ performances without incurring observation biases (10).

Various scales and benchmarks have been developed with the aim of rating performance and technical quality as objectively as possible during simulation exercises. Many virtual simulators currently on the market include built-in measurement systems that allow for the assessment of various technical capabilities based on specific parameters. Below, we present some of the most widely used and validated assessment methods in the published literature:

Rating scales (Table 2)

Table 2

Rating scales

Rating scale | Description | Approach | Advantages | Disadvantages
OCRS | Uses a Global Rating Score system supported by videos | Generic (with video assistance) | Increased objectivity | Not focused on technical skills
OSCE | Rotary stations with checklist evaluation | Specific for each station | Structured evaluation | Not focused on technical skills
OSATS | Evaluation of surgical competencies with checklist and global scale | Specific for each station | Focused on surgical skills | High resource requirements
GOALS | Specific version of OSATS for laparoscopic surgery | Focused on laparoscopic procedures | Evaluates depth perception, bimanual dexterity, efficiency, tissue handling and autonomy | Does not evaluate psychomotor skills

OCRS, Operative Component Rating Scale; OSCE, Objective Structured Clinical Examination; OSATS, Objective Structured Assessment of Technical Skills; GOALS, Global Operative Assessment of Laparoscopic Skills.

  • Global Rating System (GRS) (25): a non-blinded evaluation system with general markers for technical skills (such as time, motion, instrument handling, and bimanual dexterity) applicable to a wide range of procedures. However, it does not assess specific skills and requires the presence of an examiner during the evaluation process, making it a subjective tool with notable interobserver variability.
  • Operative Component Rating Scale (OCRS) (25): uses a GRS system accompanied by videos, providing more objective assessment information.
  • Objective Structured Clinical Examination (OSCE) (25): consists of a series of stations where participants rotate. In each station, a specific task is evaluated by an expert surgeon using a checklist of items that must be completed. It is based on clinical stations and could be useful for the assessment of preoperative and postoperative care knowledge as well as physical examination. However, it lacks a specific focus on technical skills and does not assess psychomotor abilities.
  • Objective Structured Assessment of Technical Skills (OSATS) (15,25,26): developed from the OSCE, it focuses on evaluating surgical competencies. Participants must perform a surgical skill within a limited time at each station, assessed by a specific checklist and a global scale. It is one of the most commonly used systems, evaluating seven categories (tissue handling, time and motion, instrument handling, knowledge of instruments, task fluency, use of assistants, and knowledge of the procedure), all rated on a scale of 1 to 5.
  • Global Operative Assessment of Laparoscopic Skills (GOALS) (18,25): a specific version of OSATS, based on global assessment systems during laparoscopic surgery. The GOALS assessment form contains six items. Four items represent laparoscopic surgery skills: depth perception, bimanual dexterity, efficiency, and tissue handling. The fifth item evaluates the autonomy of the trainee and the sixth takes into account the difficulty of the procedure (a minimal scoring sketch based on these items is shown after this list).
  • Fundamentals of Laparoscopic Surgery (FLS) (25,26): currently one of the most widely used systems for evaluating surgical skills, developed by the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES). Based on studies and manuals on laparoscopic skills and training practices, it evaluates basic laparoscopic competencies through the use of box trainers and animal models. It is organized into an introductory module and 13 content modules and evaluates, on one hand, the understanding and application of the basic fundamentals of laparoscopy, focusing on intraoperative decision making, and, on the other hand, skills based on the efficiency and precision of the surgeons’ maneuvers.
    • Module 0: introduction;
    • Module 1: laparoscopic equipment;
    • Module 2: energy sources;
    • Module 3: operating room setup;
    • Module 4: patient selection/preoperative assessment;
    • Module 5: anesthesia and patient positioning;
    • Module 6: pneumoperitoneum establishment & trocar placement;
    • Module 7: physiology of pneumoperitoneum;
    • Module 8: exiting the abdomen;
    • Module 9: current and diagnostic laparoscopic procedures;
    • Module 10: biopsy;
    • Module 11: laparoscopic suturing;
    • Module 12: hemorrhage & hemostasis;
    • Module 13: postoperative care and complications.
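
As noted above for GOALS, structured rating scales reduce an observer’s judgement to a small set of items scored on a fixed scale. The minimal sketch below illustrates this idea for a GOALS-style form: the item names mirror those described in the list, but the data structure, the 1–5 validation, the two-rater averaging, and the omission of the procedure-difficulty item are illustrative assumptions rather than part of the published instrument.

```python
from dataclasses import dataclass, fields

@dataclass
class GoalsRating:
    """Illustrative GOALS-style rating: each item scored on a 1-5 Likert scale."""
    depth_perception: int
    bimanual_dexterity: int
    efficiency: int
    tissue_handling: int
    autonomy: int

    def total(self) -> int:
        # Sum the five item scores after checking they fall on the 1-5 scale.
        scores = [getattr(self, f.name) for f in fields(self)]
        if any(not 1 <= s <= 5 for s in scores):
            raise ValueError("each item must be scored between 1 and 5")
        return sum(scores)

# Hypothetical trainee assessed by two raters; averaging independent raters is
# one simple way to temper the interobserver variability noted for GRS-type tools.
rater_a = GoalsRating(3, 4, 3, 4, 2)
rater_b = GoalsRating(4, 4, 3, 3, 2)
mean_total = (rater_a.total() + rater_b.total()) / 2
print(f"Mean GOALS total: {mean_total}/25")  # e.g., 16.0/25
```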

Surgical simulators (Table 3) (25,27)

Table 3

Surgical simulators

Simulator Examples Feedback Evaluated tasks Advantages Disadvantages
Virutal simulator LapMentor, SINERGIA, LapSIM, MIST-VR, SIMEND, SEP Simulator, LapVR Variable (positive feedback in LapMentor and LapVR) Simple and advanced tasks and procedures Greater versatility and evaluation of procedures Not standardize feedback
Physical simulator CELTS, ZEBRIS, SIMLTS, HUESAD, ADEPT Variable (metric Feedback in CELTS y SIMLTS) Metric movements, suturing, cutting, suture integrity Accurate assessment of specific movements and skills Some of them without available feedback
Hybrid system ProMIS Feedback based on artificial markers Analysis of movements with artificial markers Combines advantages of virtual and physical systems Reliance on artificial markers

MIST-VR, Minimally Invasive Surgery Trainer-Virtual Reality; CELTS, Computer Enhanced Laparoscopic Training System; HUESAD, Hiroshima University Endoscopic Surgical Assessment Device; ADEPT, Advanced Dundee Endoscopic Psychomotor Tester.

  • Virtual simulator:
    • LapMentor (Simbionix): provides feedback and allows training and evaluation of both simple and advanced tasks and certain procedures. Its benefits include tactile feedback, realistic graphics, a range of surgical instruments, and scenarios with real-time complications. Different modules allow the development of a wide range of skills (basic and advanced suturing, camera manipulation, specific surgical interventions, etc.).
    • SINERGIA (SINERGIA Thematic Collaborative Network) (28): a method for training and assessment of motor and perceptual skills in the first stages of surgical training. Its exercises are grouped into seven didactic units (hand-eye coordination, camera manipulation, grasping, pulling, cutting, dissection and suturing). For each exercise the simulator records a set of metrics.
    • LapSIM (Surgical Science): includes a range of exercises from basic tasks to advanced laparoscopic procedures using a 3D screen. It provides tactile feedback, realistic graphics and statistical performance reports. Courses and benchmarks can be customized, and sessions can be recorded. Eleven training modules cover abdominal surgery procedures.
    • Minimally Invasive Surgery Trainer-Virtual Reality (MIST-VR) (Mentice) (29): the system comprises a personal computer (PC) with 32 Mb of random-access memory (RAM) linked to a jig with two laparoscopic instruments. Instrument movement is translated into real-time graphics. During the tasks, errors, accuracy and time are logged in a report. The main disadvantage is the lack of feedback in scenarios for performing simple tasks.
    • SIMENDO and SEP Simulator (DelltaTech): focused on training basic laparoscopic hand-eye coordination skills, and therefore provides no haptic feedback. The system consists of simulation software and a pair of interfaces designed to emulate surgical instruments. A major advantage is that it requires few computational resources.
    • LapVR (CAE Healthcare): provides positive feedback and the ability to evaluate all types of tasks and procedures, with patient cases generated from real data. It allows practice with instruments offering six degrees of freedom, and the instructor can select the instrument for each hand. Each exercise includes an explanation of the training objectives, instructions and a patient history.
  • Physical simulator (30,31):
    • Computer Enhanced Laparoscopic Training System (CELTS) (CIMIT) (32): computer-based laparoscopic training using two laparoscopic instruments in a virtual interface. It incorporates a standardized and task-independent scoring system for performance assessment. Whereas conventional evaluation relies on subjective observation of instrument handling and the overall outcome, CELTS transforms this assessment into quantitative metrics, specifically the following five parameters: time to complete the task, depth perception, path length of the instruments, motion smoothness, and response (instrument) orientation.
    • ZEBRIS (ZEBRIS, GmbH): combines a physical simulator with ultrasound. The CMS10 system tracks the 3D coordinates of miniature ultrasonic transmitters relative to three microphones, allowing assessment of instrument position and rotation via a small transmitter placed on each instrument. Training exercises include suturing and precision tasks. The main disadvantage is its high cost.
    • SurgicalSIM-LTS (SimuLAB): a self-contained, computer-enhanced, interactive laparoscopic physical-reality simulator. It can test and train basic and enabling laparoscopic skills and uses sensors on physical models to evaluate suturing, cutting and suture-integrity skills. A computer is housed at the distal end of the enclosure and a digital camera records video. The administrative software supports enrolling users in a database, selecting and performing exercises, viewing and printing past and present test reports, watching tutorials and shutting down the system.
    • Hiroshima University Endoscopic Surgical Assessment Device (HUESAD) (27): consists of optical scale sensors, micro-encoders, an experimental table and a monitor, connected to a computer. It evaluates the precision and smoothness of laparoscopic instrument movements, measuring two rotation-angle parameters, one distance parameter and the time taken.
    • Advanced Dundee Endoscopic Psychomotor Tester (ADEPT) (27): consists of a dome enclosing a defined workspace that contains a target plate. Trainees are instructed to undertake up to four tasks using the target plate, including flicking a switch and turning a dial. Excessive contact outside of the plate is measured as an error. Total time required to execute a task is also recorded.
  • Hybrid system:
    • ProMIS (Haptica): contains three separate camera tracking systems, arranged to identify any instrument inside the simulator. Uses artificial markers placed on instruments to analyze movements. Its evaluation system is based on time and task monitoring along with parameters such as movement smoothness or trajectory. The simulator records “time”, “path length”, and “smoothness of movement” during each separate task. After completion of the task, it provides statistics on the screen. In addition, a full video and virtual playback of the trainee’s performance are saved.

Motion analysis (Table 4) (25,27)

Table 4

Motion analysis

System Examples Features Advantages Disadvantages
Extracorporeal system ICSAD, MicronTracker, HAWNG Use of markers and external sensors Less invasive, external analysis Possible distortions wit electromagnetic devices
Intracorporeal system LED light patterns Use of internal camera or reference points Increased accuracy in motion analysis. Invasive, limited by the field of vision

ICSAD, Imperial College Surgical Assessment Device; LED, light emitting diode.

  • Extracorporeal system: based on artificial visual markers and sensors located on the external part of the instruments; these systems estimate the position and orientation of the instruments.
    • Imperial College Surgical Assessment Device (ICSAD): uses an electromagnetic sensor on the back of each hand to analyze hand movements, and can be used in both simulated and operating theatre environments; data are processed by custom-built software. Electromagnetic systems may, however, distort position and orientation measurements in the presence of other metallic elements in the field.
    • MicronTracker: system based on the visible light spectrum. Uses a set of points on the instruments as a reference, tracked by a camera.
    • HAWNG: hybrid electromagnetic and passive optical system (reflective marker).
  • Intracorporeal system: uses a camera inside the simulator to provide information about the movement of the instruments. Another option is the use of reference points and subsequent analysis of the distance traveled by the instrument, or of light emitting diode (LED) light patterns (a sketch of how such trajectory data can be reduced to motion metrics is shown below).
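
As referenced in the list above, both extracorporeal and intracorporeal systems ultimately yield a time series of instrument positions. The sketch below shows one way such trajectory data could be reduced to the metrics commonly reported by these devices (task time, path length, motion smoothness); the sampling rate, the jerk-based smoothness measure, and the synthetic trajectory are illustrative assumptions, not the algorithms of any specific commercial system.

```python
import numpy as np

def motion_metrics(positions: np.ndarray, dt: float) -> dict:
    """Compute simple motion-analysis metrics from tracked 3D instrument-tip positions.

    positions: (N, 3) array of x, y, z coordinates sampled every dt seconds,
    e.g., from an extracorporeal marker tracker or an intracorporeal camera.
    """
    steps = np.diff(positions, axis=0)                  # displacement per sample
    path_length = np.linalg.norm(steps, axis=1).sum()   # total distance travelled
    velocity = steps / dt
    jerk = np.diff(velocity, n=2, axis=0) / dt**2       # third derivative of position
    smoothness = np.linalg.norm(jerk, axis=1).mean()    # lower mean jerk = smoother motion
    duration = (len(positions) - 1) * dt
    return {"time_s": duration,
            "path_length_mm": path_length,
            "mean_jerk": smoothness}

# Hypothetical 30 Hz recording of a suturing task (coordinates in mm).
rng = np.random.default_rng(0)
track = np.cumsum(rng.normal(scale=0.5, size=(900, 3)), axis=0)
print(motion_metrics(track, dt=1 / 30))
```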

Analysis of clinical-surgical skills (Table 5) (16,18,25,32,33)

Table 5

Clinical-surgical skills analysis

Program | Description | Approach | Advantages | Disadvantages
SBT | Establishes scenarios for clinical and technical skills | Combined | Integral development of skills | Simulations limited to specific situations
L1RRE | Training based on modules and practical sessions | Specific for laparoscopic surgery | Evaluation over time, progress monitoring | Dependence on access to compact discs for further evaluation

SBT, simulation-based training; L1RRE, Laparoscopy 101-Resource for Resident Education.

Structured box training (SBT)

Establishes training scenarios to develop both technical and non-technical skills, an important combination given that at least 50% of errors during a surgical procedure are attributed to communication problems among medical staff. This system provides a standardized and safe method for clinical practice, showing favorable results in the surgeon’s confidence in risky situations. A simulation is performed in two critical situations in a box trainer (18):

  • Vasovagal episode after the introduction of a Veress needle.
  • Bleeding from the inferior vena cava after dissecting and clipping the renal artery.

The process is recorded on video and subsequently evaluated by experts using the GOALS scale and a checklist of completed tasks. Non-technical skills are assessed by teaching staff through the audiovisual system during their execution (32).

Laparoscopy 101-Resource for Resident Education (L1RRE)

A training program based on nine study modules delivered on compact disc (CD), combined with three practical sessions focused on manual skills with instruments. After each module, competence is assessed through web-based exams to track educational progress. After completing the training process, each participant’s competence is evaluated with an online test that considers the time taken to complete it, the technique used, knowledge of instruments, and the number of errors. Subsequently, after 6 months, competence is reassessed without allowing participants access to the CDs (16,33).


Discussion

Currently, a great number of minimally invasive surgery training programs for residents are available according to the published literature. However, it seems evident that there is significant variability in the design of these programs depending on the center where they are carried out. This lack of universality could be one of the main reasons why these programs are not systematically included in the training plans during the residency of the different surgical specialties (34). On the other hand, the development of computer models, advances in VR, and real-time analysis of different technical parameters during exercises on simulators are making significant progress in establishing new methods of objective evaluation (35).

For the proper validation of these training models, it is essential that residents’ skills can be assessed impartially, establishing competence criteria to guide their progress. As we have observed, there are numerous evaluation methods, although none seems to be the standard in clinical practice. One of the most frequently used tools nowadays is the logbook or activity record. However, these lack content validity, as they assess the overall performance of the resident and do not focus on specific procedures or technical skills (36). In addition, the assessment is often highly subjective and influenced by multiple factors, such as patient conditions, the surgical environment, and the hospital setting.

Most evaluation systems are based on structured checklists and rating scales, with different items that the resident must fulfill to pass the proposed task. These systems are valid for measuring training progress but are not commonly used to accredit training programs. They are easy to implement but consume time and human resources. Among them, the most frequently used according to the published literature are OSATS (and its laparoscopy-specific version “GOALS”) and FLS (25).

The OSATS system was one of the first methods designed for objective and skill-oriented evaluation and remains one of the most used in clinical practice. This evaluative method has been validated for different laparoscopic procedures, such as intestinal anastomoses in live animals (37). It has the advantage of being able to assess the surgeon’s technical ability during the procedure but does not evaluate the final result (38). Furthermore, its application in the operating room can be complex because, on many occasions, the surgical field, visibility, patient conditions and the instruments used can hinder the performance of a given surgical gesture (39). Another drawback is the significant human resources it requires, needing numerous evaluators (40). Lastly, the participant does not receive direct feedback from the evaluator at the time of performing the technical gesture, which may pose a challenge for subsequent improvement of the gesture.

The use of the GOALS method, derived from OSATS, has also been validated to assess residents’ technical skills (41). Among its advantages, it stands out as a specific tool for assessing skills in multiple laparoscopic procedures. It has been used in simulations of different minimally invasive procedures, such as cholecystectomy or nephrectomy, as well as in simulating intraoperative risk situations (42,43).

The FLS training and accreditation program considers only parameters such as the time taken, precision, and performance during the procedure. Some authors advocate for the use of this method, as it employs only objective measurements, considering them sufficient to evaluate surgical performance (25,26,44,45). Other authors argue that, to better understand the learning process, it is not enough to analyze isolated elements of the procedure; rather, it must be analyzed as a whole, considering the different variables that can affect laparoscopic performance (type of intervention, patient anatomy, available instruments, etc.) (46). Moreover, despite being widely used for formative evaluation, the performance of these tools has been questioned by various authors. A systematic review by van Hove et al. demonstrated the limited utility of these measures in procedures of high technical demand, prioritizing their use in simpler tasks performed by novice surgeons (12). Some authors also argue that these methods lack objectivity, considering that their use would lead to biased evaluations (47). This has led to their limited use in simulation training programs, both in open and laparoscopic surgery, in many hospitals.

On the other hand, the use of virtual simulators offers the advantage of carrying out surgical training without the need for in-person supervision by a tutor, as well as the ability to perform multiple repetitions of a specific task before performing it in the operating room, thus reducing the learning curve for various procedures (48,49). An example is found in the study by Sankaranarayanan et al., who demonstrated a clear improvement in both execution time and the number of movements used during laparoscopic sigmoidectomy after prior training on the LapMentor simulator (50). Most simulators have technologies that allow a detailed analysis of different performance variables, providing immediate feedback to the participant. This is achieved through the combination of tracking technologies and computer-generated environments, allowing the tracking of instruments and the analysis of each surgical gesture (51). However, the wide variety of simulators complicates their validation as standardized tools for evaluation in simulation programs. For example, simulators like SIMENDO (Simendo, Rotterdam, Netherlands) focus on training advanced psychomotor skills in simple scenarios, while simulators like LapMentor (Simbionix, Lod, Israel) use more realistic scenarios to improve user interaction (52). Additionally, these instruments are costly, and their availability is often limited to centers with greater economic capacity (53). The major advantage of hybrid simulators lies in providing immediate haptic feedback, in this case using real materials rather than purely virtual simulation (54). The increased realism offered by these simulators implies a greater ability to transfer exercises performed to the real operating room, making them more attractive for trainee surgeons (55,56).

In summary, despite the wide variety of programs and simulation models, there is no evidence in the current literature advocating for the use of one over another. Probably, the fundamental reason underlying this fact is the lack of universal validation and accreditation tools for these training models. Additionally, the absence of specific and pre-established criteria indicating whether a simulation model can be accredited or not further complicates its implementation in these educational centers.


Conclusions

Minimally invasive approaches have spread globally, and training in any surgical specialty is now inconceivable without extensive training in these types of techniques. In this regard, the different types of existing simulators play an important role in the resident’s learning of these procedures without endangering the health of patients. This, coupled with their high technological sophistication, has made them a fundamental tool in the surgical training process.

However, the lack of universality, the absence of evaluative systems and accreditation to endorse these training models, as well as their high cost, mean that in many hospital centers they have not been established as part of the educational program for residents. It is therefore of utmost importance to establish a simulation model applicable in any center, accessible and properly accredited, to assess the capabilities and skills of residents in minimally invasive surgery training.


Acknowledgments

Funding: None.


Footnote

Reporting Checklist: The authors have completed the Narrative Review reporting checklist. Available at https://ales.amegroups.com/article/view/10.21037/ales-23-66/rc

Peer Review File: Available at https://ales.amegroups.com/article/view/10.21037/ales-23-66/prf

Conflicts of Interest: All authors have completed the ICMJE uniform disclosure form (available at https://ales.amegroups.com/article/view/10.21037/ales-23-66/coif). The authors have no conflicts of interest to declare.

Ethical Statement: The authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Open Access Statement: This is an Open Access article distributed in accordance with the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License (CC BY-NC-ND 4.0), which permits the non-commercial replication and distribution of the article with the strict proviso that no changes or edits are made and the original work is properly cited (including links to both the formal publication through the relevant DOI and the license). See: https://creativecommons.org/licenses/by-nc-nd/4.0/.


References

  1. He H, Yang G, Wang S, et al. Fast-track surgery nursing intervention in CRC patients with laparotomy and laparoscopic surgery. Medicine (Baltimore) 2022;101:e30603. [Crossref] [PubMed]
  2. Stefanidis D, Korndorffer JR Jr, Sierra R, et al. Skill retention following proficiency-based laparoscopic simulator training. Surgery 2005;138:165-70. [Crossref] [PubMed]
  3. Cook DA, Zendejas B, Hamstra SJ, et al. What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract 2014;19:233-50. [Crossref] [PubMed]
  4. Stulberg JJ, Huang R, Kreutzer L, et al. Association Between Surgeon Technical Skills and Patient Outcomes. JAMA Surg 2020;155:960-8. [Crossref] [PubMed]
  5. Nepomnayshy D, Whitledge J, Birkett R, et al. Evaluation of advanced laparoscopic skills tasks for validity evidence. Surg Endosc 2015;29:349-54. [Crossref] [PubMed]
  6. Bonrath EM, Dedy NJ, Gordon LE, et al. Comprehensive Surgical Coaching Enhances Surgical Skill in the Operating Room: A Randomized Controlled Trial. Ann Surg 2015;262:205-12. [Crossref] [PubMed]
  7. Bridges M, Diamond DL. The financial impact of teaching surgical residents in the operating room. Am J Surg 1999;177:28-32. [Crossref] [PubMed]
  8. Palter VN, Orzech N, Aggarwal R, et al. Resident perceptions of advanced laparoscopic skills training. Surg Endosc 2010;24:2830-4. [Crossref] [PubMed]
  9. Rasheed F, Bukhari F, Iqbal W, et al. A low-cost unity-based virtual training simulator for laparoscopic partial nephrectomy using HTC Vive. PeerJ Comput Sci 2023;9:e1627. [Crossref] [PubMed]
  10. Ahlberg G, Enochsson L, Gallagher AG, et al. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg 2007;193:797-804. [Crossref] [PubMed]
  11. Schuwirth L, van der Vleuten C. Merging views on assessment. Med Educ 2004;38:1208-10. [Crossref] [PubMed]
  12. van Hove PD, Tuijthof GJ, Verdaasdonk EG, et al. Objective assessment of technical surgical skills. Br J Surg 2010;97:972-87. [Crossref] [PubMed]
  13. Vassiliou MC, Feldman LS, Andrew CG, et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg 2005;190:107-13. [Crossref] [PubMed]
  14. Fernandez GL, Page DW, Coe NP, et al. Boot cAMP: educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship. J Surg Educ 2012;69:242-8. [Crossref] [PubMed]
  15. Lovasik BP, Fay KT, Patel A, et al. Development of a laparoscopic surgical skills simulation curriculum: Enhancing resident training through directed coaching and closed-loop feedback. Surgery 2022;171:897-903. [Crossref] [PubMed]
  16. Gallagher AG, Ritter EM, Champion H, et al. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg 2005;241:364-72. [Crossref] [PubMed]
  17. Wheelock A, Suliman A, Wharton R, et al. The Impact of Operating Room Distractions on Stress, Workload, and Teamwork. Ann Surg 2015;261:1079-84. [Crossref] [PubMed]
  18. Goldenberg MG, Fok KH, Ordon M, et al. Simulation-Based Laparoscopic Surgery Crisis Resource Management Training-Predicting Technical and Nontechnical Skills. J Surg Educ 2018;75:1113-9. [Crossref] [PubMed]
  19. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002;236:458-63; discussion 463-4. [Crossref] [PubMed]
  20. Shi R, Marin-Nevarez P, Hasty B, et al. Operating Room In Situ Interprofessional Simulation for Improving Communication and Teamwork. J Surg Res 2021;260:237-44. [Crossref] [PubMed]
  21. Kowalewski KF, Garrow CR, Proctor T, et al. LapTrain: multi-modality training curriculum for laparoscopic cholecystectomy-results of a randomized controlled trial. Surg Endosc 2018;32:3830-8. [Crossref] [PubMed]
  22. Overtoom EM, Horeman T, Jansen FW, et al. Haptic Feedback, Force Feedback, and Force-Sensing in Simulation Training for Laparoscopy: A Systematic Overview. J Surg Educ 2019;76:242-61. [Crossref] [PubMed]
  23. Rangarajan K, Davis H, Pucher PH. Systematic Review of Virtual Haptics in Surgical Simulation: A Valid Educational Tool? J Surg Educ 2020;77:337-47. [Crossref] [PubMed]
  24. Zhang L, Grosdemouge C, Arikatla VS, et al. The added value of virtual reality technology and force feedback for surgical training simulators. Work 2012;41:2288-92. [Crossref] [PubMed]
  25. Sánchez-Margallo JA, Sánchez-Margallo FM, Oropesa I, et al. Systems and technologies for objective evaluation of technical skills in laparoscopic surgery. Minim Invasive Ther Allied Technol 2014;23:40-51. [Crossref] [PubMed]
  26. Chartrand G, Soucisse M, Dubé P, et al. Self-directed learning by video as a means to improve technical skills in surgery residents: a randomized controlled trial. BMC Med Educ 2021;21:91. [Crossref] [PubMed]
  27. Hattori M, Egi H, Hasunuma N. Conscientiousness Counts: How Personality Traits Impact Laparoscopic Surgical Skill Improvement in Medical Students. J Surg Educ 2023;80:1412-7. [Crossref] [PubMed]
  28. Lamata P, Gómez EJ, Sánchez-Margallo FM, et al. SINERGIA laparoscopic virtual reality simulator: didactic design and technical development. Comput Methods Programs Biomed 2007;85:273-83. [Crossref] [PubMed]
  29. Wilson MS, Middlebrook A, Sutton C, et al. MIST VR: a virtual reality trainer for laparoscopic surgery assesses performance. Ann R Coll Surg Engl 1997;79:403-4. [PubMed]
  30. Kantamaneni K, Jalla K, Renzu M, et al. Virtual Reality as an Affirmative Spin-Off to Laparoscopic Training: An Updated Review. Cureus 2021;13:e17239. [Crossref] [PubMed]
  31. Mason JD, Ansell J, Warren N, et al. Is motion analysis a valid tool for assessing laparoscopic skill? Surg Endosc 2013;27:1468-77. [Crossref] [PubMed]
  32. Stylopoulos N, Cotin S, Maithel SK, et al. Computer-enhanced laparoscopic training system (CELTS): bridging the gap. Surg Endosc 2004;18:782-9. [Crossref] [PubMed]
  33. Grantcharov TP, Funch-Jensen P. Can everyone achieve proficiency with the laparoscopic technique? Learning curve patterns in technical skills acquisition. Am J Surg 2009;197:447-9. [Crossref] [PubMed]
  34. Schell SR, Flynn TC. Web-based minimally invasive surgery training: competency assessment in PGY 1-2 surgical residents. Curr Surg 2004;61:120-4. [Crossref] [PubMed]
  35. Rinewalt D, Du H, Velasco JM. Evaluation of a novel laparoscopic simulation laboratory curriculum. Surgery 2012;152:550-4; discussion 554-6. [Crossref] [PubMed]
  36. Crochet P, Aggarwal R, Dubb SS, et al. Deliberate practice on a virtual reality laparoscopic simulator enhances the quality of surgical technical skills. Ann Surg 2011;253:1216-22. [Crossref] [PubMed]
  37. Cuschieri A, Francis N, Crosby J, et al. What do master surgeons think of surgical competence and revalidation? Am J Surg 2001;182:110-6. [Crossref] [PubMed]
  38. Reznick RK, MacRae H. Teaching surgical skills--changes in the wind. N Engl J Med 2006;355:2664-9. [Crossref] [PubMed]
  39. Toledo Martínez E, Martín Parra JI, Magadán Álvarez C, et al. Influence of previous experience on the benefits of laparoscopic surgical training based on simulation. Cir Esp (Engl Ed) 2019;97:314-9. [Crossref] [PubMed]
  40. Chmarra MK, Grimbergen CA, Jansen FW, et al. How to objectively classify residents based on their psychomotor laparoscopic skills? Minim Invasive Ther Allied Technol 2010;19:2-11. [Crossref] [PubMed]
  41. Sidhu RS, Grober ED, Musselman LJ, et al. Assessing competency in surgery: where to begin? Surgery 2004;135:6-20. [Crossref] [PubMed]
  42. Kramp KH, van Det MJ, Hoff C, et al. Validity and reliability of global operative assessment of laparoscopic skills (GOALS) in novice trainees performing a laparoscopic cholecystectomy. J Surg Educ 2015;72:351-8. [Crossref] [PubMed]
  43. Ohtake S, Makiyama K, Yamashita D, et al. Training on a virtual reality laparoscopic simulator improves performance of live laparoscopic surgery. Asian J Endosc Surg 2022;15:313-9. [Crossref] [PubMed]
  44. van Zwieten TH, Okkema S, Kramp KH, et al. Procedure-based assessment for laparoscopic cholecystectomy can replace global rating scales. Minim Invasive Ther Allied Technol 2022;31:865-71. [Crossref] [PubMed]
  45. Scott DJ, Ritter EM, Tesfay ST, et al. Certification pass rate of 100% for fundamentals of laparoscopic surgery skills after proficiency-based training. Surg Endosc 2008;22:1887-93. [Crossref] [PubMed]
  46. Rosenthal ME, Ritter EM, Goova MT, et al. Proficiency-based Fundamentals of Laparoscopic Surgery skills training results in durable performance improvement and a uniform certification pass rate. Surg Endosc 2010;24:2453-7. [Crossref] [PubMed]
  47. Oropesa I, Sánchez-González P, Lamata P, et al. Methods and tools for objective assessment of psychomotor skills in laparoscopic surgery. J Surg Res 2011;171:e81-95. [Crossref] [PubMed]
  48. Hiemstra E, Kolkman W, Wolterbeek R, et al. Value of an objective assessment tool in the operating room. Can J Surg 2011;54:116-22. [Crossref] [PubMed]
  49. Sherman V, Feldman LS, Stanbridge D, et al. Assessing the learning curve for the acquisition of laparoscopic skills on a virtual reality simulator. Surg Endosc 2005;19:678-82. [Crossref] [PubMed]
  50. Sankaranarayanan G, Parker L, De S, et al. Simulation for Colorectal Surgery. J Laparoendosc Adv Surg Tech A 2021;31:566-9. [Crossref] [PubMed]
  51. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10-28. [Crossref] [PubMed]
  52. Verdaasdonk EG, Stassen LP, Schijven MP, et al. Construct validity and assessment of the learning curve for the SIMENDO endoscopic simulator. Surg Endosc 2007;21:1406-12. [Crossref] [PubMed]
  53. Oussi N, Enochsson L, Henningsohn L, et al. Trainee Performance After Laparoscopic Simulator Training Using a Blackbox versus LapMentor. J Surg Res 2020;250:1-11. [Crossref] [PubMed]
  54. Botden SM, Buzink SN, Schijven MP, et al. Augmented versus virtual reality laparoscopic simulation: what is the difference? A comparison of the ProMIS augmented reality laparoscopic simulator versus LapSim virtual reality laparoscopic simulator. World J Surg 2007;31:764-72. [Crossref] [PubMed]
  55. Korndorffer JR Jr, Kasten SJ, Downing SM. A call for the utilization of consensus standards in the surgical education literature. Am J Surg 2010;199:99-104. [Crossref] [PubMed]
  56. Lamata P. Methodologies for the analysis, design and evaluation of laparoscopic surgical simulators. PhD Thesis. Louvain: Presses univ de Louvain; 2006.
doi: 10.21037/ales-23-66
Cite this article as: Guerrero-Antolino P, Gutiérrez-Sánchez C, Millán-Scheiding M. Assessment tools for minimally invasive surgery simulation programmes: a narrative review. Ann Laparosc Endosc Surg 2024;9:23.
