Chapter 209 Education and Knowledge-Base Acquisition and Retention
Spine surgeons generally train in formal residency programs in either neurosurgery (6–7 years) or orthopaedic surgery (5–6 years), which may be followed by a 1- to 2-year fellowship in spine care. For residency training, the North American Spine Society (NASS) has defined five core categories of education that should be addressed during residency: (1) core knowledge, (2) clinical evaluation, (3) operative management, (4) postoperative care, and (5) rehabilitation.1 The fellowship guidelines address hospital resources, teaching faculty, educational programs, research endeavors, and evaluation of the process in greater depth.2 According to the NASS, the graduating resident should at least have a reasonable degree of comfort in caring for patients and performing surgery for disc herniation, decompressive laminectomy/foraminotomy, noninstrumented posterolateral and posterior spinal fusion, bone graft harvest, management of spinal fractures with appropriate instrumentation and external immobilization, and basic management of spinal deformity.1 As technology advances, the knowledge base one must acquire during a fixed training period expands rapidly, especially because similar advances are taking place in the understanding of brain and nervous system diseases (for neurosurgery residents) and long-bone and joint injuries (for orthopaedic surgery residents), which must also be mastered by the end of residency. The response must be either an increase in the length of training or a greater focus on subspecialty training within a given specialty. Indeed, there has been discussion of changing the traditional residency training system to include 2 to 3 years of general training followed by a 2- to 3-year subspecialty fellowship. This discussion has many implications for the health-care system in general, because it would leave fewer general providers available.
In any case, maximizing the efficiency of education during the residency years is beneficial. Improving learning efficiency requires a more in-depth look at the elements of the learning process.
Elements of Learning
The most rapid learning in humans occurs during childhood. It consists mostly of learning facts, such as language, and of overall exploration of the environment. For adults, learning is intimately tied to the application of knowledge. Adults have more difficulty learning facts and skills that are not put to use in general daily life. The most productive learning occurs when concepts and principles are linked to existing knowledge and experiences.3 All of the human sensory modalities are at work in the learning process, and learning is generally more effective when several modalities are used in a given task. People generally remember 20% of what they hear, 30% of what they see, 50% of what they hear and see, 70% of what they say, and 90% of what they do.3,4 This concept is highly useful when planning a learning program for trainees, and it has greatly influenced teaching as a whole. Presenters generally use audiovisual adjuncts to their lectures to assist in retention. The concept is particularly important in residency training, because residents must be heavily involved in doing their craft. This phenomenon is thought to occur through the development of collateral brain pathways among the multiple sensory systems, which provides for more durable learning and knowledge retention.
Cognitive Domain
Affective Domain
This domain is largely based on feelings, emotions, and the learner’s degree of acceptance or rejection. Unlike the cognitive domain, it deals largely with intangible information; it is extremely hard to assess by objective methods, and evaluation is often based on the subjective and affective experiences of the examiner.5 Skills in this domain are acquired throughout life and are based on a wide range of influences. In medicine, this domain is often referred to as the “healer’s art” and includes empathy and “bedside manner.”
The other important aspect of the affective domain is its ability to affect the other domains of learning. This ability is perhaps best exemplified in the old medical tradition of “pimping,” in which a teacher, presumably an attending physician or other provider, asks questions of medical students or residents in the presence of their peers. The teacher in this case has power over the learner in the teaching process. Although some learners excel in this environment, most feel uncomfortable and this can cause nervousness that detracts from a valuable learning experience.6 On the other hand, embarrassment in front of one’s peers is often an effective mechanism for “driving home” a point, and information learned in this manner may be tied to an emotional response, which is often quite durable.
Psychomotor Domain
In spine surgery, as in most other training environments, the skills learned during residency form a foundation for more complex skills. This may have significant implications for learning in subspecialty fellowship training. An inadequate foundation established early in one’s career can become quite difficult to rectify.
Teaching–Learning Plan
Teaching Strategies
Teacher-based strategies for cognitive learning include lectures and discussion groups. The advantage of these methods is that a larger group of trainees can be efficiently taught by a teacher who may have accumulated expertise in an area not shared by the majority of physicians. The teacher is responsible for assessing the learners’ needs and the most effective means of learning. These methods often place a higher degree of responsibility on the teacher regarding preparation of discussion materials, lectures, and audiovisual aids. Preparation for lectures should account for the average attention span of about 20 minutes. Active learning, in the form of questions and discussion, may be used to maintain interest in the topic. Lectures have the advantage of repetition of core concepts, which can then be elaborated on in smaller discussion groups. Evaluation and analysis are vitally important in these methods because the groups tend to be larger and individual deficiencies may be missed. Questions from the audience should provide some information on content that may not be well understood and suggest areas for future improvement. When used in combination, lectures and smaller discussion groups can be highly effective, as material is repeated several times, often by several different teachers.
It is clear that a relationship exists between the quality of teaching and student performance.7 Residents who rated their attending surgeons as better teachers scored higher on national boards than those who did not think as highly of the quality of teaching. It is also clear that not all surgeons excel in teaching or have expertise in many different areas. These facts should encourage arrangements in which residents and fellows have greater exposure to more effective educators.4
One of the most exciting developments on the frontier for surgical skills training is virtual reality. This technology has been primarily used thus far in minimally invasive general surgical procedures. One randomized, double-blinded study of surgical residents performing a laparoscopic cholecystectomy found that residents trained on a virtual reality system were 29% faster and six times less likely to make a mistake than their conventionally trained counterparts.8 In neurosurgical care, these technologies have been employed in surgical planning as well as the performance of ventriculostomy and endoscopic intracranial surgery. These systems allow trainees to practice tasks in a virtual reality environment and compare their performance with that of their mentors. With the expansion of haptic (tactile) feedback technology, these simulators may indeed become more prevalent in neurosurgical training and integrated into residency and fellowship for spine care.9–11
Evaluation
There must be a means of evaluating the effectiveness of the teaching–learning process and of making adjustments where necessary. This evaluation should be efficient and occur often. The trainee should receive a concise assessment of his or her performance as well as the opportunity to evaluate the training program and communicate if his or her learning needs are not being met. This is essential even in established teaching environments, because needs may change with any individual group of trainees. The NASS guidelines have set forth a method for this evaluation.2 It is also essential that a specific set of goals or benchmarks for each level of training be well known to both the surgical trainee and the teacher before the program begins. If the goals are not clearly defined, the resident and the training program cannot be assessed by any reliable means.
Cognitive Skills
Cognitive skills are by far the easiest skill set to evaluate. Most commonly, this evaluation comes in the form of written examinations. In medical school training, the U.S. Medical Licensing Examination (USMLE) consists of three “steps” that evaluate different levels of training. In neurosurgical training, the American Board of Neurological Surgery (ABNS) mandates a written board examination, taken during residency, that evaluates cognitive learning, and an oral board examination, taken after the completion of residency, that aims to assess cognitive, affective, and psychomotor learning together. These tests often use true-false and multiple-choice questions to assess a certain area of knowledge. Reliability increases with larger numbers of test questions per topic and larger numbers of possible answers.12 Short-answer and essay questions are also useful, but they are quite subjective in scoring and somewhat cumbersome for graders. The longer answer format can be both an advantage and a disadvantage: it allows students to express a large volume of knowledge in a single question, but it may penalize students who know the topic well yet have poor verbal skills. These types of questions, if designed well, can also test synthesis of information and judgment. Oral examinations are also used to test cognitive skills, but they are somewhat difficult to standardize. Oral testing allows the flexibility to probe deeper into areas where the examiner notices deficiency; however, this by nature eliminates a standardized experience and reproducibility in grading. With careful selection of questions and a clear set of grading standards, these examinations can achieve score reliability of up to 88%.12 The common themes to keep in mind in the evaluation of cognitive skills are validity, reliability, and standardization.
Attitudes and Skills
As mentioned previously, the objective and effective assessment of operative skills is difficult. Some suggest that this assessment is easier in the surgical skills laboratory than in the operating room.13 This alternative approach, however, tends to be rather unrealistic, so true skills remain difficult to assess. The expansion of virtual-reality technology will undoubtedly change the evaluation of these skills in the future. Where skills were once assessed by comparing them with those of individual teachers at an institution, virtual-reality technology and telemedicine will allow trainees to compare their skills with those of the world’s experts.
Objective structured clinical examinations (OSCEs) have been used since the 1970s in medical schools,14 and have gained some limited use in residency training. In these examinations, clinical skills are broken down into individual components that can then be evaluated by independent observers. Reliability in these tests appears to be at least 80%. These examinations have been modified into objective structured assessments of technical skills (OSATs),15 which use bench models of surgical tissues to evaluate performance. Still, these methods fall quite short of reality and do not capture the intangible aspects of operating on real patients. Being observed may also provoke anxiety in the examinee, which further deviates from reality. These tests are also costly and difficult to create and administer.
References
1. Herkowitz H.N., Connolly P.J., Gundry C.R., et al. Resident and fellowship guidelines: educational guidelines for resident training in spinal surgery. Spine (Phila Pa 1976). 2000;25:2703-2707.
2. Herkowitz H.N., Connolly P.J., Gundry C.R., et al. Educational guidelines for orthopaedic and neurosurgical spinal fellowship training. Spine (Phila Pa 1976). 2000;25:2704-2705.
3. Douglas K.C., Hosokawa M.C., Lawler F.H. A practical guide to clinical teaching in medicine. New York: Springer; 1988.
4. Benzel E. Teaching and learning the fundamentals. Lecture at the Joint Section on Disorders of the Spine and Peripheral Nerves, Tampa, FL, March 5–8, 2003.
5. Evans A.W. Assessing competence in surgical dentistry. Br Dent J. 2001;190:343-346.
6. Detsky A.S. The art of pimping. JAMA. 2009;301:1379-1381.
7. Blue A.V., Griffith C.H.3rd, Wilson J., et al. Surgical teaching quality makes a difference. Am J Surg. 1999;177:86-89.
8. Seymour N.E., Gallagher A.G., Roman S.A., et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458-463. discussion 463–464
9. Kockro R.A., Stadie A., Schwandt E., et al. A collaborative virtual reality environment for neurosurgical planning and training. Neurosurgery. 2007;61:379-391. discussion 391
10. Lemole G.M.Jr., Banerjee P.P., Luciano C., et al. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback. Neurosurgery. 2007;61:142-148. discussion 148–149
11. Banerjee P.P., Luciano C.J., Lemole G.M.Jr., et al. Accuracy of ventriculostomy catheter placement using a head- and hand-tracked high-resolution virtual reality simulator with haptic feedback. J Neurosurg. 2007;107:515-521.
12. Toolbox of Assessment Methods: Accreditation Council for Graduate Medical Education (ACGME) and American Board of Medical Specialties (ABMS), Version 1.1, September, 2000.
13. Darzi A., Smith S., Taffinder N. Assessing operative skill. Needs to become more objective. BMJ. 1999;318:887-888.
14. Harden R.M., Gleeson F.A. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. 1979;13:41-54.
15. Winckel C.P., Reznick R.K., Cohen R., et al. Reliability and construct validity of a structured technical skills assessment form. Am J Surg. 1994;167:423-427.