ORIGINAL ARTICLES
Year : 2021  |  Volume : 5  |  Issue : 4  |  Page : 239-245

Introduction of direct observation of procedural skills as workplace-based assessment tool in department of anesthesiology: Evaluation of students’ and teachers’ perceptions


Department of Anaesthesiology, Jawaharlal Nehru Medical College, Ajmer, India

Date of Submission25-May-2021
Date of Decision27-Apr-2021
Date of Acceptance05-Aug-2021
Date of Web Publication24-Nov-2021

Correspondence Address:
Dr. Pooja R Mathur
Department of Anaesthesiology, Jawaharlal Nehru Medical College, B-10, Aravali Vihar, Ajmer 305001.
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/bjoa.BJOA_59_21

  Abstract 

Background: The direct observation of procedural skills (DOPS) was introduced for the workplace-based assessment of procedural skills. It offers an opportunity to provide feedback to trainees, which makes DOPS an authentic measure of clinical competence in anesthesiology training. The goal of this study was to assess the perceptions of both trainees and consultants regarding the use of DOPS and to evaluate the performance of anesthesia postgraduate (PG) trainees over consecutive assessments. Materials and Methods: After approval from the ethics committee and a sensitization workshop, each trainee underwent two DOPS encounters for three common anesthesia skills appropriate to their year of training. Thereafter, anonymous feedback was collected from faculty and trainees to gather their perceptions regarding DOPS. Consecutive DOPS scores for trainees were analyzed. Data were presented as percentages, means, and standard deviations. A P value of <0.05 was considered significant. Results: More than 50% of participants were satisfied with the way DOPS was conducted and thought it was feasible for formative assessment. About 80% of participants were of the view that DOPS is helpful for anesthesia training and for improving anesthesia procedural skills. Yet only 40%–50% favored the addition of DOPS to the departmental assessment plan. Significant improvement was observed in the DOPS scores of PG trainees: mean scores of postgraduate trainees in years 1, 2, and 3 (JR 1, JR 2, and JR 3) increased from 2.6 to 4.8, 4 to 5.7, and 5.6 to 7, respectively (P < 0.05). Conclusions: DOPS may be considered a useful tool for workplace-based assessment in anesthesia PG training.

Keywords: Anesthesia, clinical competence, education measurement, training support, workplace


How to cite this article:
Mathur PR. Introduction of direct observation of procedural skills as workplace-based assessment tool in department of anesthesiology: Evaluation of students’ and teachers’ perceptions. Bali J Anaesthesiol 2021;5:239-45

How to cite this URL:
Mathur PR. Introduction of direct observation of procedural skills as workplace-based assessment tool in department of anesthesiology: Evaluation of students’ and teachers’ perceptions. Bali J Anaesthesiol [serial online] 2021 [cited 2021 Nov 28];5:239-45. Available from: https://www.bjoaonline.com/text.asp?2021/5/4/239/330953




  Introduction


Increased concern for patient safety has driven medical education toward a “competency-based” curriculum leading to a demand for a reliable, valid, and feasible method for clinical skills assessment. The direct observation of procedural skills (DOPS) was introduced by the Royal College of Physicians in 2003 as one means of workplace-based assessment (WPBA).[1] DOPS was specifically designed to assess procedural skills involving real patients in a single encounter; thus, in contrast to many other assessments in medical education, it takes place in and as a part of daily work.[2] Another feature of WPBA is that it offers the opportunity to provide trainees with feedback on their performance. In DOPS, focus lies on procedural skills that form an integral part of the postgraduate (PG) anesthesia training curriculum. Here, the trainee is evaluated regarding his or her demonstrated understanding of indications, relevant anatomy, the technique of procedure, obtaining informed consent, demonstrating appropriate preparation preprocedure, technical ability, aseptic technique, seeking help where appropriate, postprocedure management, communication skills, consideration of patient/professionalism, and overall ability to perform the procedure.[3] The opportunity of direct observation of trainees in the clinical workplace makes DOPS an authentic measure of performance and clinical competence.[4] Therefore, it was decided to introduce DOPS as a tool for WPBA in the department of anesthesiology at our tertiary care teaching hospital for PG anesthesia trainees and evaluate the perceptions of participants as well as to assess the feasibility of its use.


  Materials and Methods


This study was designed to assess the opinions and experiences of both trainees and consultants regarding the use of DOPS as a WPBA tool and to evaluate the performance of anesthesia PG trainees over consecutive DOPS assessments in the department of anesthesiology at a teaching hospital, after approval from the institutional ethics committee. A workshop was conducted in the department of anesthesia for faculty and trainees to train them in the use of DOPS. A plan for the introduction of DOPS as a WPBA tool was made with the departmental curriculum committee. At least two DOPS encounters per trainee, for 20 trainees, were planned over a period of four months, involving three common anesthesia skills appropriate for the level of training [Figure 1]. The institutional ethics committee waived the need to obtain consent for the collection, analysis, and publication of data from this noninterventional study.
Figure 1: CONSORT flow diagram



An assessor observed a student performing a practical procedure from start to finish and scored the student against predefined criteria on a 0–9 rating scale taken from the DOPS form of the Australian and New Zealand College of Anesthetists (ANZCA).[5] The assessment took place during the normal course of the student's work. The degree of difficulty and the level of competence expected varied with the experience of the student [Figure 2].
Figure 2: Standard DOPS rating form

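For departments wanting to record these encounters electronically, a single DOPS observation can be captured in a small structure. The sketch below is illustrative only: the domain names paraphrase the eleven domains listed in the Introduction, and the class and field names are our own invention, not taken from the ANZCA form.

```python
from dataclasses import dataclass, field

# Illustrative paraphrase of the 11 DOPS domains; consult the ANZCA form
# for the authoritative wording.
DOMAINS = [
    "understanding of indications", "relevant anatomy", "technique",
    "informed consent", "preprocedure preparation", "technical ability",
    "aseptic technique", "seeking help where appropriate",
    "postprocedure management", "communication skills", "professionalism",
]

@dataclass
class DOPSEncounter:
    trainee: str
    procedure: str
    overall_score: int                     # 0-9 overall rating, as in the study
    domain_notes: dict = field(default_factory=dict)

    def __post_init__(self):
        # Enforce the 0-9 rating scale used on the assessment form
        if not 0 <= self.overall_score <= 9:
            raise ValueError("overall score must be on the 0-9 scale")

enc = DOPSEncounter("JR 1", "spinal anesthesia", 5,
                    {"aseptic technique": "good drape technique"})
print(enc.overall_score)  # 5
```

Keeping free-text notes per domain mirrors the form's structure, where technical ability is only one of eleven assessed aspects.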


Thereafter, feedback was collected anonymously through two appropriately designed, structured, and validated questionnaires, separately from trainees and consultants. A pilot survey questionnaire was used, and feedback from peers was considered before finalizing the study questionnaire. Attitudinal items measuring agreement were written in a five-point Likert scale format. Space was provided for open-ended comments regarding users' experiences [Annexures 1 and 2]. Data were entered into an Excel spreadsheet as percentages and mean ± standard deviation and analyzed using appropriate statistical tests (paired t-test). A P value of <0.05 was considered significant.
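The score analysis described above can be sketched in a few lines of Python. The numbers below are hypothetical paired scores invented for illustration, not the study data, and `paired_t` is a hand-rolled helper rather than a named library routine.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t-statistic: each trainee serves as their own control."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

# Hypothetical first- and second-encounter DOPS scores (0-9 scale) for
# 10 trainees; illustration only, NOT the study data.
first  = [2, 3, 4, 5, 3, 4, 6, 5, 4, 3]
second = [4, 5, 7, 7, 5, 6, 8, 7, 6, 6]

t, df = paired_t(first, second)
# Two-tailed critical value of t for df = 9 at alpha = 0.05
T_CRIT = 2.262
print(f"mean change = {mean(second) - mean(first):.1f}, t = {t:.2f}, df = {df}")
print("significant at P < 0.05" if abs(t) > T_CRIT else "not significant")
```

With df = 9, any |t| above the 2.262 critical value corresponds to P < 0.05; a statistics package such as SciPy's `ttest_rel` would return the exact P value directly.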


  Results


As seen in [Table 1], most participants in our study were female. Most of the faculty were aged above 40 years, whereas most of the trainees were aged between 25 and 30 years. Sixty percent of faculty had more than 10 years of experience, and 50% of the trainees were in the second year [Table 1].
Table 1: Demographic profile and experience of participants



The time taken for making an observation in DOPS was reported to be less than 10 min by 90% of faculty and 70% of trainees. The time taken in providing feedback was reported to be less than 10 min by 100% of faculty and 95% of trainees. Therefore, it may be said that DOPS took less than 10 min in most cases, demonstrating its feasibility. In most cases, timely feedback was provided after DOPS: about 90% of the time, feedback was provided within 30 min of the assessment [Table 2].
Table 2: Time taken to conduct DOPS



Fifty percent of faculty and 65% of trainees were satisfied with the way in which DOPS was conducted. Eighty percent of faculty and 55% of trainees thought it was a feasible tool for formative assessment (FA). Eighty percent of faculty and 70% of trainees were of the view that DOPS is helpful for anesthesia training. Eighty percent of faculty and 75% of trainees felt that DOPS is helpful in improving anesthesia procedural skills [Table 3]. Seventy to eighty percent of all participants felt that DOPS was easy to use in WPBA. Still, only 40%–50% of participants favored the addition of DOPS to the departmental assessment plan [Figure 3].
Table 3: Perception of participants regarding introduction of DOPS on five-point Likert scale

Figure 3: Participants’ perception about DOPS



Significant improvement was observed in DOPS scores across all three PG training years. Mean DOPS scores of postgraduate trainees in years 1, 2, and 3 (JR 1, JR 2, and JR 3) increased from 2.6 to 4.8, 4 to 5.7, and 5.6 to 7, respectively. This demonstrates the potential of DOPS as a tool for teaching anesthesia procedural skills [Table 4].
Table 4: DOPS assessment scores




  Discussion


There has been an increasing emphasis on defining outcomes of medical education in terms of the "performance" of trainees. PG training is directed not merely at the attainment of knowledge, attitude, and skills but also at observable responsiveness and appropriate functioning in real-life situations.[6],[7],[8] A key aspect of the practice of anesthesia is the ability to perform practical procedures efficiently and safely, which requires proficiency in both technical and nontechnical skills. Gaba et al. differentiated between technical performance, the "adequacy of actions taken from a medical and technical perspective," and nontechnical performance, "decision-making and team interaction processes."[9] All over the world, we are witnessing a paradigm shift, with medical education regulatory bodies stating that PG training should be competency-based. However, these regulations do not provide any details of in-training assessments.

Most conventional assessment methods conducted in examination settings fall short of measuring these outcomes. There is a need to assess trainees in real situations so that necessary mid-course corrections can be provided. WPBA is increasingly used to assess trainees by direct observation and to shape their learning; it assesses the optimal and judicious use of competencies in authentic settings.[10] Common methods used for WPBA include the mini-clinical evaluation exercise, DOPS, the mini-peer assessment tool, and multisource feedback.[11]

The major advantages of WPBA are that it conforms to the highest level of Miller's Pyramid; focuses on clinical skills, including the necessary soft skills (communication, behavior, professionalism, ethics, and attitude); combines observation in a real situation with feedback; is context- and content-specific; compensates for some shortcomings of traditional assessment methods; aligns learning with actual working; and encourages reflective practice.[3],[4]

There are five stages in the acquisition of procedural skills: novice, advanced beginner, competent, proficient, and expert. There is an old adage that evaluation drives learning.[12] We observed significant improvement in DOPS scores: mean scores of JR 1, JR 2, and JR 3 increased from 2.6 to 4.8, 4 to 5.7, and 5.6 to 7, respectively, which demonstrates the potential of DOPS as a tool for teaching anesthesia procedural skills. There is evidence in the literature for the educational impact of DOPS assessments.[13],[14] Its educational value lies in the immediate feedback that follows the assessment, including highlighting trainees' strengths and weaknesses and formulating an action plan to meet any learning needs.[15] FAs are used to aid learning and have been described as "assessment for learning." To be useful, feedback from FA needs to occur in a timely manner so that it can influence a trainee's progress. Thus DOPS, because of its feasibility and comprehensiveness, can prove to be a useful tool for FA of anesthesia trainees and, in turn, enhance their learning of procedural skills.

As is clear from [Figure 2], DOPS has 11 domains that are used to assess performance in procedural skills. It is worth noting that DOPS focuses on the context of the procedural skill: nine of the domains describe pre- and postprocedure care and nontechnical skills, while the technical ability to perform the procedure is only one of them.

DOPS was specifically designed to assess procedural skills involving real patients in a single encounter; it can be easily integrated into trainees' and assessors' normal routine and is therefore considered highly feasible.[16] In our study, the time taken for making an observation was less than 10 min according to 90% of faculty and 70% of trainees, and the time taken in providing feedback was less than 10 min according to 100% of faculty and 95% of trainees. Therefore, it may be said that DOPS took less than 10 min in most cases, demonstrating its feasibility even in a busy operating room setup.

In most cases, timely feedback was provided after DOPS; about 90% of the time, feedback was provided within 30 min. Feedback following WPBAs was found to be useful by a cohort of core medical trainees.[17] The anesthetic DOPS specifically requires the assessor to give the trainee feedback on areas of good and bad practice and to identify a focus for future learning.[15] DOPS has excellent reliability with trained observers, and its construct validity has been established in anesthesiology for lumbar epidurals and interscalene blocks.[18] DOPS and other WPBA tools are routinely used in residency training in the United Kingdom, Ireland, Canada, and Australia.[5],[7],[8]

Kundra and Singh concluded in their study that DOPS is a feasible and acceptable tool for skills assessment: direct observation followed by contextual feedback helps PGs learn and improve practical skills.[19] Another study found that DOPS can be incorporated into the in-training assessment of undergraduate dental students with good feasibility and acceptability.[20] In their systematic review, Khanghahi and Azar found that DOPS tests can be used as an effective and efficient evaluation method for medical students because of their appropriate validity and reliability, positive impact on learning, and high student satisfaction.[21]

We also found a positive attitude toward DOPS: 50% of faculty and 65% of trainees were satisfied with the way the DOPS assessment was conducted; 80% and 55%, respectively, considered it a feasible tool for FA; and 80% and 75%, respectively, felt that DOPS helps improve anesthesia procedural skills. Seventy to eighty percent of all participants felt that DOPS was easy to use in WPBA. Still, only 40%–50% of participants favored the addition of DOPS to the departmental assessment plan. Another study found that 88.7% of students thought DOPS was easy to use and administer.[22] Students were also very positive about the opportunity that DOPS creates for feedback (76.1%), and an overwhelming majority (79.6%) agreed that this immediate feedback is helpful to their development. Students also supported the notion (77.3%) that DOPS identifies the developmental needs of a medical student in carrying out a procedural skill.

A recent study explored the perceptions of 12 anesthesiology PG students and 10 faculty members about DOPS. Ten students perceived DOPS as an effective teaching–learning tool and were satisfied with it. Eleven students felt that DOPS had the potential to create more opportunities for learning. The time for feedback was considered adequate by nine students. Eight students felt that DOPS can improve the student–teacher relationship. Six students opined that observation does not affect performance, whereas the remaining six were unsure. All the participating faculty members agreed that DOPS improved their attitude toward teaching and perceived it as an effective teaching–learning tool. Nine faculty members felt that DOPS can assess more aspects of procedural skills than the traditional methods and that it can be a part of FA. They found DOPS easy to carry out. Eight faculty members perceived DOPS as a satisfactory tool that can create more opportunities for learning.[23]

In contrast, the study by Bindal et al. found that trainees and consultants considered the DOPS assessment not a useful training tool within anesthesia and saw it as a checkbox exercise.[1] This shows that research into the assessment of technical skills in anesthesia has been conducted with heterogeneous methodologies, which makes comparison difficult. Perhaps the strongest area of potential pedagogic advantage of the DOPS tool is the provision of rapid feedback in the form of marks and comments. While students' exposure to a required experience does not in itself assess clinical competency, documenting and monitoring those experiences remain a major component of the education and accreditation process.[24],[25] Hence, further research is needed into the use of DOPS in anesthesia specialist training, and improvements are needed to ensure that it is of educational benefit. These include viewing it as a formative rather than summative tool and training all those who participate in DOPS assessments. Greater emphasis is needed on how DOPS is conducted, especially regarding planning and time for these assessments.

Limitations of our study include the lack of adequate time for assessments in a busy operating room schedule, the requirement for participant training, the small sample size, and the lack of standardization of DOPS across the various anesthesiology skills, which makes comparisons difficult. We recommend that further studies on the validity and reliability of DOPS in workplace and simulated contexts be conducted.


  Conclusion


We found that DOPS assessments help in improving the clinical skills of PG anesthesia students. The faculty and PG students had very positive feedback about the usefulness of DOPS. DOPS should be implemented as a method of FA in the regular curriculum of PG students. To conclude, we can state that WPBA is not a replacement for conventional assessment but is a complement to it. The two should be used in judicious combinations as per feasibility and context.

Acknowledgement

We would like to thank the Department of Anesthesia, Jawaharlal Nehru Medical College, Ajmer, for their support and cooperation. We extend our heartfelt gratitude to Dr. Praveen Singh, Dr. Suman Singh, and Dr. Sanjay Gupta for their valuable guidance.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
  References

1. Bindal N, Goodyear H, Bindal T, Wall D. DOPS assessment: A study to evaluate the experience and opinions of trainees and assessors. Med Teach 2013;35:e1230-4.
2. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA 2009;302:1316-26.
3. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE guide no. 31. Med Teach 2007;29:855-71.
4. Bindal T, Wall D, Goodyear HM. Trainee doctors' views on workplace-based assessments: Are they just a tick box exercise? Med Teach 2011;33:919-27.
5. Australian and New Zealand College of Anaesthetists. ANZCA Handbook for Training and Accreditation. Melbourne: Australian and New Zealand College of Anaesthetists; 2012. Available from: https://www.anzca.edu.au/resources/all-handbooks-and-curriculums/anzca-training-handbook. [Last accessed on 20 Aug 2021].
6. Medical Council of India. Postgraduate Medical Education Regulations 2000. Available from: https://www.nmc.org.in/wp-content/uploads/2019/12/Postgraduate-Medical-Education-Regulations-2000.pdf. [Last accessed on 20 Aug 2021].
7. PMETB Assessment Committee. Developing and Maintaining an Assessment System—A PMETB Guide to Good Practice. London: Postgraduate Medical Education and Training Board; 2007. Available from: https://www.gmc-uk.org/-/media/documents/Designing_and_maintaining_postgraduate_assessment_programmes_0517.pdf_70434370.pdf. [Last accessed on 20 Aug 2021].
8. Boker A. Toward competency-based curriculum: Application of workplace-based assessment tools in the National Saudi Arabian Anesthesia Training Program. Saudi J Anaesth 2016;10:417-22.
9. Gaba DM, Howard SK, Fish KJ, Smith BE, Sowb YA. Simulation-based training in anesthesia crisis resource management (ACRM): A decade of experience. Simul Gaming 2001;32:175-93.
10. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65:S63-7.
11. Bould MD, Crabtree NA, Naik VN. Assessment of procedural skills in anaesthesia. Br J Anaesth 2009;103:472-83.
12. Rethans JJ, Norcini JJ, Barón-Maldonado M, Blackmore D, Jolly BC, LaDuca T, et al. The relationship between competence and performance: Implications for assessing practice performance. Med Educ 2002;36:901-9.
13. Wanjari S, Rawekar A. Effectiveness of DOPS "direct observation of procedural skills" as a method of formative assessment for improving the clinical skills of post-graduate students in the Department of Obstetrics and Gynecology. J Educ Technol Health Sci 2019;6:29-34.
14. Kathirgamanathan A, Woods L. Educational tools in the assessment of trainees in anaesthesia. Contin Educ Anaesth Crit Care Pain 2011;11:138-42.
15. Boursicot K, Etheridge L, Setna Z, Sturrock A, Ker J, Smee S, et al. Performance in assessment: Consensus statement and recommendations from the Ottawa conference. Med Teach 2011;33:370-83.
16. Hauer KE, Holmboe ES, Kogan JR. Twelve tips for implementing tools for direct observation of medical trainees' clinical skills during patient encounters. Med Teach 2011;33:27-33.
17. Johnson GJ, Barrett J, Jones M, Wade W. The acute care assessment tool: A workplace-based assessment of the performance of a physician in training on the acute medical take. Clin Teach 2009;6:105-9.
18. Watson MJ, Wong DM, Kluger R, Chuan A, Herrick MD, Ng I, et al. Psychometric evaluation of a direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesthesia 2014;69:604-12.
19. Kundra S, Singh T. Feasibility and acceptability of direct observation of procedural skills to improve procedural skills. Indian Pediatr 2014;51:59-60.
20. Singh G, Kaur R, Mahajan A, Thomas AM, Singh T. Piloting direct observation of procedural skills in dental education in India. Int J Appl Basic Med Res 2017;7:239-42.
21. Erfani Khanghahi M, Ebadi Fard Azar F. Direct observation of procedural skills (DOPS) evaluation method: Systematic review of evidence. Med J Islam Repub Iran 2018;32:45.
22. McLeod R, Mires G, Ker J. Direct observed procedural skills assessment in the undergraduate setting. Clin Teach 2012;9:228-32.
23. Lagoo JY, Joshi SB. Introduction of direct observation of procedural skills (DOPS) as a formative assessment tool during postgraduate training in anaesthesiology: Exploration of perceptions. Indian J Anaesth 2021;65:202-9.
24. Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, et al. The educational impact of mini-clinical evaluation exercise (mini-CEX) and direct observation of procedural skills (DOPS) and its association with implementation: A systematic review and meta-analysis. PLoS One 2018;13:e0198009.
25. Singh T, Sood R. Workplace-based assessment: Measuring and shaping clinical learning. Natl Med J India 2013;26:42-6.

