
Workplace-based assessments

Source journal: Annals of Eye Science | July 2017, Vol. 2, No. 7 | Published: 04 July 2017
Author:
Keywords: rubric; formative feedback; workplace-based assessment
DOI: 10.21037/aes.2017.06.04

Abstract: The goal of ophthalmology residency training is to produce competent ophthalmologists. Appropriate assessments must be employed to ensure this goal is met. Valid and reliable workplace-based assessments are designed to assess competence in the many domains required of a good ophthalmologist. These assessments increase standardization and objectivity as compared to simple observational feedback. When used appropriately, workplace-based assessments not only provide measures of competence but also facilitate effective formative feedback and enhance learning.

Desired physician competencies have been defined by both the Royal College of Physicians and Surgeons of Canada and the United States’ Accreditation Council for Graduate Medical Education (ACGME) (1,2). Competence can be defined as “the ability to do something well.” The goal of ophthalmology residency training is to produce competent ophthalmologists. The Royal College developed the CanMEDS framework, which describes the abilities required of a competent physician (Table 1) (1). The ACGME’s Outcomes Project described six general competencies that every physician should achieve (Table 2) (2).


Table 1

The CanMEDS competencies

Medical expert: possess the knowledge and skills required to provide up-to-date, ethical, and resource efficient clinical care. This is the central role of physicians and requires all of the roles listed below
Communicator: able to effectively manage the doctor-patient relationship
Collaborator: able to work effectively in the health care team to provide optimal patient care
Manager: able to organize practices, allocate resources appropriately, and contribute to the effectiveness of the healthcare system
Health advocate: able to advance the health and well-being of patients, communities and populations
Scholar: able to demonstrate life-long learning principles to enhance professional activities, create and apply new medical information and educate students, patients and peers
Professional: practice ethically and have high standards of personal behavior


Table 2

The core competencies

Patient care: residents must be able to provide patient care that is compassionate, appropriate, and effective for the treatment of health problems and the promotion of health
Medical knowledge: residents must demonstrate knowledge of established and evolving biomedical, clinical, epidemiological and social behavioral sciences, as well as the application of this knowledge to patient care
Practice-based learning and improvement: residents must demonstrate the ability to investigate and evaluate their care of patients, to appraise and assimilate scientific evidence, and to continuously improve patient care based on constant self-evaluation and life-long learning
Interpersonal and communication skills: residents must demonstrate interpersonal and communication skills that result in the effective exchange of information and collaboration with patients, their families, and health professionals
Professionalism: residents must demonstrate a commitment to carrying out professional responsibilities and an adherence to ethical principles
Systems-based practice: residents must demonstrate an awareness of and responsiveness to the larger context and system of health care, as well as the ability to call effectively on other resources in the system to provide optimal health care

Competence cannot be assumed simply because one completes a training program. Medical knowledge is typically assessed with written or oral examinations, but most competencies (e.g., surgical skill) cannot be adequately assessed in this way. Workplace-based assessments (WPBA) are structured assessment tools designed to objectively assess surgical skill, patient care, professionalism, and communication ability. WPBA usually consist of rubrics, ideally with behavioral descriptors at each rating level. A rubric is an explicit set of criteria used to assess a particular skill, and it is a tool that can help one give timely, specific, structured feedback. Good rubrics consist of three parts: (I) dimensions (e.g., steps of a surgical procedure); (II) levels (e.g., a score of 1–5, or novice, beginner, advanced beginner, competent, expert); and (III) behavioral descriptors (what it means to perform at a certain level for each dimension). For example, an assessment rubric for cataract surgery might include a dimension for prepping and draping the patient, levels from 1–5, and descriptions of exactly what behavior is necessary to achieve each score. Types of WPBA include rubrics for directly observed procedural skills, directly observed patient care, and multisource feedback (360-degree evaluation).
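
To make the three-part rubric structure concrete, here is a minimal, hypothetical sketch in Python; the dimension name, level labels, and behavioral descriptors are illustrative assumptions and are not taken from any published WPBA.

```python
from dataclasses import dataclass
from typing import Dict, List

# Levels follow the 1-5 / novice-to-expert convention described in the text.
LEVELS = {1: "novice", 2: "beginner", 3: "advanced beginner", 4: "competent", 5: "expert"}

@dataclass
class Dimension:
    """One rubric dimension (e.g., a step of a surgical procedure)."""
    name: str
    descriptors: Dict[int, str]  # level -> behavior expected at that level

# Hypothetical cataract-surgery rubric fragment (illustrative content only).
rubric: List[Dimension] = [
    Dimension("Prepping and draping", {
        1: "Unable to prep and drape the patient without assistance",
        2: "Preps and drapes with frequent prompting",
        3: "Preps and drapes with occasional prompting",
        4: "Preps and drapes independently with minor lapses in technique",
        5: "Preps and drapes independently with consistent sterile technique",
    }),
]

def describe_score(dimension: Dimension, level: int) -> str:
    """Return the behavioral descriptor for the level the assessor assigned."""
    if level not in LEVELS:
        raise ValueError("level must be an integer from 1 to 5")
    return f"{dimension.name}: {level} ({LEVELS[level]}) - {dimension.descriptors[level]}"

print(describe_score(rubric[0], 4))
```

Writing a descriptor for every level, as sketched here, is what makes the expectation behind each score explicit to both the assessor and the learner.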

WPBA should adhere to important assessment principles, including brevity balanced with reliability and validity, testing the application of knowledge, and covering the spectrum of competence required (3). The majority of completed WPBA should be shared with the resident, together with feedback designed to improve performance; this is called teaching! Fortunately, a variety of ophthalmology-specific WPBA have already been developed to assess the competencies described in Tables 1 and 2.

WPBA for procedural skills

Several WPBAs of surgical skill have been devised. Cremers and associates developed the “Objective Assessment of Skills in Intraocular Surgery” (OASIS), a one-page objective evaluation form to assess residents’ skills in cataract surgery (4). The form is completed by an evaluator who directly observes the surgical procedure and includes objective data such as wound placement and size, phacoemulsification time, and total surgical time. They showed that the OASIS had both face and content validity. To complement this objective assessment, the same group developed a subjective rating of surgical skills named the “Global Rating Assessment of Skills in Intraocular Surgery” (GRASIS) (5). This one-page form allows the evaluator to assign scores from 1–5, based on a behaviorally anchored rubric, to domains such as pre-operative knowledge, microscope use, instrument handling, and tissue treatment, in addition to seven other areas. Thus, the combination of the OASIS and GRASIS provides objective and subjective evaluation of surgical skill. Feldman and Geist described the Subjective Phacoemulsification Skills Assessment, an evaluative instrument designed specifically for intraoperative assessment of resident phacoemulsification cataract extraction (PCE) surgery (6). This form delineates PCE into overall performance and specific steps of the procedure [e.g., capsulorhexis, hydrodelineation, intraocular lens (IOL) implantation]. Performance is graded with a rubric that defines a good outcome at each step and asks the evaluator to rate it on a 1–5 spectrum from strongly agree to strongly disagree. They were able to show a degree of inter-rater reliability. The Royal College of Ophthalmologists in the United Kingdom has developed an extensive number of WPBAs called either direct observation of clinical skills (DOCS) or objective structured assessment of technical skills (OSATS) (7). These WPBAs are designed to cover all important procedures and surgeries in ophthalmology. Similar to the other WPBAs described, the rubrics do not contain behavioral descriptors for every rating, which leaves the assessments significantly subjective in nature.

Saleh and colleagues described an assessment tool called the “Objective Structured Assessment of Cataract Surgical Skill” (OSACSS) (8). This tool breaks down the phacoemulsification procedure into 20 steps that are scored on a 5-point Likert scale. The scale anchors are: 1= “poorly or inadequately performed”, 3= “performed with some errors or hesitation”, and 5= “performed well with no prompting or hesitation”. There are no scale anchors for scores of 2 or 4. An international panel of authors modified the OSACSS, producing a globally applicable rubric with levels based on the Dreyfus model of skill acquisition (novice, beginner, advanced beginner, competent, and expert) and with behavioral anchors for each level in each step of the surgical procedure (9). Once drafted, content and face validity were achieved by having an international panel of 15 experts review the draft instrument and provide feedback. After incorporating suggestions from the international panel, a final document, the ICO-Ophthalmology Surgical Competency Assessment Rubric for phacoemulsification (ICO-OSCAR:phaco), was produced (9). In a similar fashion, internationally applicable assessment tools for extracapsular cataract surgery (ICO-OSCAR:ECCE) (9), small incision cataract surgery (ICO-OSCAR:SICS) (10), lateral tarsal strip surgery (ICO-OSCAR:LTS) (11), strabismus surgery (ICO-OSCAR:strabismus) (12), and pediatric cataract surgery (ICO-OSCAR:pedscat) (13) were developed. Furthermore, the ICO-OSCAR:phaco and ICO-OSCAR:strabismus tools have been shown to have inter-rater reliability (14,15). Similar tools for trabeculectomy (ICO-OSCAR:trab), panretinal photocoagulation (ICO-OSCAR:PRP), and vitrectomy (ICO-OSCAR:Vit) were in press at the time this manuscript was prepared. More recently, another cataract surgery assessment tool was developed in Canada and was shown to have a degree of validity and reliability (16). It was not created by an international panel, nor does it have a specific behaviorally anchored rubric.
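
As a rough illustration of the anchoring difference, the short hypothetical sketch below shows how an OSACSS-style scale leaves scores of 2 and 4 without a defined behavior; the step names and scores are invented, and only the three anchor phrases are quoted from the text above.

```python
from statistics import mean

# OSACSS-style scale: behavioral anchors only at 1, 3, and 5;
# scores of 2 and 4 have no anchor, so the assessor interpolates subjectively.
osacss_anchors = {
    1: "poorly or inadequately performed",
    3: "performed with some errors or hesitation",
    5: "performed well with no prompting or hesitation",
}

# Hypothetical Likert scores (1-5) for a few of the 20 OSACSS steps.
scores = {"capsulorhexis": 4, "hydrodissection": 3, "IOL insertion": 5}

for step, score in scores.items():
    anchor = osacss_anchors.get(score, "no behavioral anchor defined for this score")
    print(f"{step}: {score} - {anchor}")

# A simple summary across the observed steps.
print(f"Mean score: {mean(scores.values()):.1f}")
```

By contrast, the Dreyfus-based ICO-OSCAR rubrics define a behavioral anchor at every level of every step, in the manner of the rubric sketch shown earlier.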

The ICO-OSCAR assessment tools serve a variety of purposes: (I) they are internationally applicable, as comments from an international panel of experts were used to adapt them and make them flexible to any setting; (II) they decrease the subjectivity of the assessment by clearly defining for the assessor what behavior must be observed for each level of proficiency; and (III) the rubrics clearly communicate to the learner what is expected to attain competence, and thus these tools can be used for both assessment and teaching.


WPBA for patient care

The Ophthalmic Clinical Evaluation Exercise (OCEX) is completed by a teaching physician who observes the resident performing a patient history and examination and then listens to the case presentation (17). The teaching physician completes scoring in 33 categories that rate the resident’s ability to communicate effectively, perform a history and examination, and synthesize the information into a differential diagnosis and plan. Importantly, a rubric that describes the behavior necessary to achieve each grade on the OCEX was developed. The OCEX has been shown to have content validity and inter-rater reliability (17,18). It was not developed by an international panel and thus may need to be modified to reflect cultural differences; the ICO is currently modifying the OCEX to be internationally applicable. The OCEX is a valid and reliable WPBA (at least in North America) for assessing the competencies of patient care, medical knowledge, and communication skills. It is available in multiple languages on the ICO website (www.icoph.org). The Royal College’s WPBA handbook referenced above also contains rubrics for assessing a variety of patient care competencies (7). Like the WPBAs for surgical skill, these do not include rubrics with behavioral descriptors for every category.


WPBA for professionalism & communication skill

Professionalism and communication skill can be difficult to assess. Traditionally, the teaching faculty assess these competencies in addition to procedural skill and patient care. However, traditional evaluators may not be best suited for these competencies, as residents are usually on their best behavior around them. Therefore, multisource (360-degree) WPBA are needed to provide residents feedback regarding professionalism and communication skills. “Multisource” refers to who is doing the evaluating: in addition to teaching faculty, nurses, assistants, patients, and peers serve as evaluators. Of course, the questions on a multisource WPBA are tailored to the assessor; a nurse or assistant would not rate a physician’s medical knowledge but rather their professionalism and communication skills. Probyn and associates used a multisource WPBA and also asked for resident self-assessment (19). They found self-assessment scores were significantly lower than multisource scores. Interestingly, but not surprisingly, a teaching physician was more likely than a secretary or program assistant to rate the resident highly in professionalism. This emphasizes the importance of obtaining information about professionalism and communication skills from someone other than the resident’s supervisor. Jagadeesan and associates have shown that their patient satisfaction survey can discriminate levels of resident communication skill and thus may be useful to assess this competency (20). Internationally, cultural differences may produce difficulties with this type of tool. It is crucial that the evaluators using a multisource WPBA believe it is anonymous and that the information will be used to improve the young doctor’s performance. To my knowledge, no internationally valid 360-degree WPBA exists, and thus the ICO has developed one in a manner similar to the development of the ICO-OSCARs (submitted for publication at the time this manuscript was written).
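
As a minimal sketch of how multisource ratings might be tabulated by competency and compared with self-assessment, the hypothetical example below invents assessor roles, competencies, and scores (none are drawn from the cited studies); it simply illustrates the kind of gap reported by Probyn and associates, with self-assessment lower than the multisource mean.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical 360-degree ratings: (assessor role, competency, score on a 1-5 scale).
# None of these values come from the cited studies.
ratings = [
    ("faculty", "professionalism", 5), ("faculty", "communication", 4),
    ("nurse", "professionalism", 4),   ("nurse", "communication", 4),
    ("patient", "communication", 3),   ("peer", "professionalism", 4),
]
self_assessment = {"professionalism": 3, "communication": 3}

# Group multisource scores by competency.
by_competency = defaultdict(list)
for role, competency, score in ratings:
    by_competency[competency].append(score)

# Compare the multisource mean with the resident's own rating.
for competency, scores in by_competency.items():
    gap = mean(scores) - self_assessment[competency]
    print(f"{competency}: multisource mean {mean(scores):.1f}, "
          f"self {self_assessment[competency]}, difference {gap:+.1f}")
```

Reporting results aggregated across raters in this way, rather than rater by rater, also helps preserve the anonymity that the tool depends on.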


Recommendations for use of WPBA


Conclusions

WPBA provide a more objective measure of whether residents have become competent. In addition, they improve performance by facilitating effective, specific, and timely feedback. Ideally, every residency training program would at least utilize WPBAs of procedural skills, patient care, professionalism, and communication skills. WPBA of resident performance is essential both for teaching and for demonstrating that graduating residents are able to function as competent ophthalmologists.


References

1. Available online: http://www.ub.edu/medicina_unitateducaciomedica/documentos/CanMeds.pdf
2. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach 2007;29:648-54.
3. Golnik KC. Assessment Principles and Tools. Middle East Afr J Ophthalmol 2014;21:109-13.
4. Cremers SL, Ciolino JB, Ferrufino-Ponce ZK, et al. Objective assessment of skills in intraocular surgery. Ophthalmology 2005;112:1236-41.
5. Cremers SL, Lora AN, Ferrufino-Ponce ZK. Global rating assessment of skills in intraocular surgery. Ophthalmology 2005;112:1655-60.
6. Feldman BH, Geist CG. Assessing residents in phacoemulsification. Ophthalmology 2007;114:1586-8.
7. Available online: https://www.rcophth.ac.uk/wp-content/uploads/2014/11/WpBA-Handbook-V4-2014.pdf
8. Saleh GM, Gauba V, Mitra A, et al. Objective Structured Assessment of Cataract Surgical Skill. Arch Ophthalmol 2007;125:363-6.
9. Golnik KC, Beaver H, Gauba V, et al. Cataract surgical skill assessment. Ophthalmology 2011;118:427.e1-5.
10. Golnik KC, Haripriya A, Beaver H, et al. Cataract surgery skill assessment. Ophthalmology 2011;118:2094-2094.e2.
11. Golnik KC, Gauba V, Saleh GM, et al. The Ophthalmology Surgical Competency Assessment Rubric for Lateral Tarsal Strip Surgery. Ophthal Plast Reconstr Surg 2012;28:350-4.
12. Golnik KC, Motley WW, Atilla H, et al. The ophthalmology surgical competency rubric for strabismus surgery. J AAPOS 2012;16:318-21.
13. Swaminathan M, Ramasubramanian S, Pilling R, et al. ICO-OSCAR for pediatric cataract surgical skill assessment. J AAPOS 2016;20:364-5.
14. Golnik KC, Beaver H, Gauba V, et al. Development of a new valid, reliable, and internationally applicable assessment tool of residents’ competence in ophthalmic surgery (An American Ophthalmological Society Thesis). Trans Am Ophthalmol Soc 2013;111:24-33.
15. Motley WW, Golnik KC, Anteby I, et al. Validity of ophthalmology surgical competency assessment rubric for strabismus surgery in resident training. J AAPOS 2016;20:184-5.
16. Rootman DB, Lam K, Sit M, et al. Psychometric properties of a new tool to assess task-specific and global competency in cataract surgery. Ophthalmic Surg Lasers Imaging 2012;43:229-34.
17. Golnik KC, Goldenhar LM, Gittinger JW Jr, et al. The Ophthalmic Clinical Evaluation Exercise (OCEX). Ophthalmology 2004;111:1271-4.
18. Golnik KC, Goldenhar L. The Ophthalmic Clinical Evaluation Exercise (OCEX): Interrater reliability determination. Ophthalmology 2005;112:1649-54.
19. Probyn L, Lang C, Thomlinson G, et al. Multisource feedback and self-assessment of the communicator, collaborator, and professional CanMEDS roles for diagnostic radiology residents. Can Assoc Radiol J 2014;65:379-84.
20. Jagadeesan R, Kaylan DN, Lee P, et al. Use of a standardized patient satisfaction questionnaire to assess the quality of care provided by ophthalmology residents. Ophthalmology 2008;115:738-743.e3.