Friday, June 26, 2015

Workplace-based assessments

What is workplace-based assessment (WBA)?

The primary purpose of WBA is to provide short-loop feedback between trainers and their trainees – a formative assessment to support learning. WBAs are designed to be mainly trainee-driven, but may be triggered or guided by the trainer.

What is the purpose of WBA?

Several purposes of WBA have been identified. WBA helps to form a comprehensive assessment system, blueprinted to important curriculum requirements. It provides educational feedback on which to reflect and develop practice, and a reference point against which past, current and future levels of competence can be compared. WBA also supports remedial or targeted training, provides evidence of progression and, ultimately, informs summative assessment.

Benefits of WBA

The main benefit of WBA is its strong educational impact. The availability of clinical material and skilled teachers in the workplace is a further benefit. Other benefits are that WBA:
       Is based on observable performance and specific criteria
       Encompasses skills, knowledge, behaviour and attitudes including judgement and leadership
       Provides descriptors to aid the assessor’s judgement
       Samples across important workplace tasks
       Encourages trainee/trainer dialogue
       Can identify those in need of additional support
       Encourages reflection to improve practice
       Provides a personal trajectory of progress
       Indicates readiness for summative tests

Position of workplace-based assessment in Miller's Pyramid

In Miller's Pyramid, WBA sits at the top ('does') level, because it assesses what the trainee actually does in day-to-day clinical practice rather than what they know or can demonstrate under examination conditions.

Preparation for WBA

The first and most important requirement of WBA is that patient consent and safety must be assured by the assessor. Assessors should also be trained in the tool and have expertise in the area being assessed. The reliability of assessment can be improved by using a range of different assessors and by using WBA in different settings with different cases.

Use of WBA

       Trainee-led and trainer-guided
       Structured forms should inform debriefing
       Feedback immediately after observation
       Written feedback should describe performance
       WBA should be followed by reflection by the trainee
       Use more often for trainees who need remedial support
       Judge the trainee against the standard expected at the end point of training
       The interaction between trainee and trainer is key

Trainee role

       Triggers WBA, in line with the learning agreement (LA)
       Puts the safety of the patient first
       Agrees case and time with assessor in advance
       Ensures sufficient WBAs are completed throughout placement
       Uploads comments to the portfolio accurately within 2 weeks of the assessment
       Respects confidentiality of patients and colleagues
       Reflects on feedback
       Follows up action plans

 

Assessor role

       Must be appropriately qualified in the relevant discipline
       Must be trained in the WBA method
       Ensures consent and safety of patient
       Carries out observation and provides feedback
       Completes / checks online form and signs to validate
       Keeps the AES informed of issues or concerns

Criteria for feedback

There should be a written record describing performance to look back on.
Good quality feedback should:
       Reinforce what was done well
       Explain areas for development
       Suggest appropriate corrective action

Barriers to WBA

       Unintentionally seen as threatening (e.g. as mini-exams)
       Low ratings are seen as failures by trainees (and some trainers)
       Lack of trainer time, especially senior trainers

Actions to overcome barriers

       Provide faculty development and trainee induction
       Promote WBA as opportunities for learning
       Use written feedback to put ratings in context
       Present low scores as the norm early in training
       Provide time in job plans for those in key roles to use WBAs and discuss concerns

Utility of assessment

Utility refers to the relative value of using a particular type of assessment.
The criteria are:
       Reliability
       Validity
       Acceptability to users
       Feasibility of use
       Educational impact
It is unlikely that one assessment type will cover all these areas.
The challenge is to improve the utility of all types of assessment to enhance the overall assessment system.

Reliability


Enhanced by:
       Assessor training
       Use of a range of assessors
       Use of all WBA methods
       Use of WBA frequently
       Triangulation with other assessments

Validity

Enhanced by:
       Blueprinting to the curriculum and Good Medical Practice (GMP)
       Linking WBA with clear objectives within a structured learning agreement
       Direct observation of workplace tasks
       Increasing complexity of tasks in line with progression through the training programme

Acceptability

Enhanced by:
       Providing assessor training and trainee induction to enhance understanding of criteria, standards and methods
       Interaction between trainee and trainer

Feasibility

Enhanced by:
       Linking WBA with clear objectives, standards and a structured learning agreement
       Assessing what trainees would normally do in training situations
       Working feedback into normal dialogue

Educational Impact

Enhanced by:
       Supervised training and appraisal
       Clear objectives and learning agreement
       Learning opportunities
       Good quality feedback
       Reflection on feedback

The Learning Environment

An environment that supports learning will:
       Ensure everyone understands and values their role and that of others in the educational process
       Provide faculty development and trainee induction
       Make time for training and assessment
       Encourage performance beyond competency; an aspiration to excellence
       Encourage the development of reflective practitioners
       Provide professional educational support
       Support trainers in making difficult decisions or negative judgements
       Support trainees in difficulty

Types of workplace-based assessment

Mini Clinical Evaluation Exercise (mini-CEX)
The traditional CEX involved observation of the trainee carrying out a thorough history and physical examination and presenting their findings and diagnosis, with a written report of conclusions for the supervising clinician to evaluate. The mini-CEX is a shorter, focused observation of a single doctor-patient encounter in the workplace, followed by immediate structured feedback.
Case-based discussions
Case-based discussion (CbD) in medical Foundation Training is a structured discussion with an assessor of clinical cases managed by the foundation doctor. Its strength is assessment and discussion of clinical reasoning. The foundation doctor selects two case records from patients they have seen recently, and in whose notes they have made an entry. The assessor selects one of these for the CbD session. The discussion starts from and is centred on the foundation doctor’s own record in the notes. CbD assesses medical record keeping, clinical assessment, investigation and referral, treatment rationale, follow up and future planning, professionalism and overall clinical care. Feedback is provided to the trainee immediately following the discussion.
  
Direct Observation of Procedural Skills (DOPS)
Direct observation of procedural skills (DOPS) has been defined as the observation and evaluation of a procedural skill performed by a trainee on a real patient. Procedural skills are also known as technical or practical skills. Evaluation by an experienced practitioner is carried out using either a checklist of defined tasks, a global rating scale, or a combination of both.
  
360-degree assessment
Multiple assessors, including senior colleagues, nurses and allied health professionals (AHPs), rate the trainee, and a self-assessment is also included. The trainee is assessed on their routine day-to-day performance. The collated feedback is then reviewed with the trainee and their supervisor, and an action plan is agreed.

Portfolios
Snadden (1998) describes a portfolio as “a collection of evidence that learning has taken place which in practice includes documentation of learning and progression, an articulation of what has been learned, and a reflection on these learning events/experiences.” Portfolios are used both as a learning tool to stimulate reflective, experiential and deep learning and as an assessment method to judge progression towards or achievement of specific learning objectives, competencies or fitness to practice. Depending on the specialised purpose of the portfolio, its content including evidence required, and assessment criteria vary from context to context. Any portfolio that is used for assessment purposes should clearly articulate the amount, type and quality of evidence required to establish proof of competence and the marking criteria used to evaluate the quality of the evidence.


