Peer Learning
Capture clinically meaningful learning opportunities to reduce errors and potential harm to patients and to improve our practice through individual and group learning.
Not Punitive
Just-culture principles; participation as a measure of engagement
Safety
Promote a culture of safety
Learning Opportunities
Learning from clinical feedback, great calls, consultation, discrepant opinion capture, etc.
Meet Requirements
Meet institutional requirements based on strong recommendations from The Joint Commission (TJC) for peer learning/review
Accreditation
Replace the ACR peer review required for accreditation (random review of 2% of cases for concordance is clinically inconsequential and wasteful; a single ACR approval allows the move to peer learning)
Structure
Create structure and processes to ensure appropriate use (monitoring to identify and correct unacceptable behavior)
Functional requirements of peer learning
- Ability to identify ad hoc learning opportunities
○ Efficient workflow (submit a case in ‘seconds’; <1 min)
○ Launchable in context from various enterprise IT platforms
– E.g., MGB single sign-on, mobile version, web-enabled
– E.g., Visage, PowerScribe, Epic, etc.
○ Ability to submit cases based on pre-defined, easily customizable classifications (e.g., great call)
○ Diagnostic radiology as well as IR
○ Launch viewer (eUnity, Visage) in context for image/report viewing
- Interfaced with, but independent of, clinical medical record applications to ensure peer review protection
- Ability to provide feedback between multiple stakeholders (see the sketch after this requirements list):
○ Notification by email/page containing no patient identifiers or clinical information, only the alert category
○ Radiologist to Radiologist
○ Radiologist to Technologist
○ Referring provider to Radiologist
○ Enables transparent (identifiable sender and receiver) or anonymized messaging (e.g., Radiologist to Q&S/senior leaders), at the user's discretion
○ Closed-loop communication and feedback between sender and receiver to acknowledge receipt/action
○ Ability for the receiver to rate each peer learning alert (‘uber rate’) to allow system-wide quality assessment
- Administrative functions, including:
○ Easy user setup and management
○ Data visualization and analytics to monitor program use
○ Case files by subspecialty/practice to enable case selection for peer learning conferences
○ Administrative overview to help ensure appropriate behavior
– E.g., identifying and correcting unacceptable behavior
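To make the requirements above concrete, the following is a minimal sketch in Python of how a submission record and its alert might be modeled. It is illustrative only; every class, field, and category name (PeerLearningCase, AlertCategory, to_alert, etc.) is a hypothetical stand-in rather than part of any existing MGB or vendor system. It shows pre-defined classifications, an email/page payload that carries no patient identifiers or clinical information (only the alert category and an opaque case reference), and closed-loop acknowledgment with a receiver rating.

from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional
import uuid


class AlertCategory(Enum):
    # Pre-defined, customizable classifications; the specific values here are
    # illustrative stand-ins for whatever the program actually configures.
    GREAT_CALL = "great_call"
    DISCREPANT_OPINION = "discrepant_opinion"
    CLINICAL_FEEDBACK = "clinical_feedback"
    CONSULTATION = "consultation"


@dataclass
class PeerLearningCase:
    """Record held in the peer learning system, which is interfaced with but
    independent of the clinical record to preserve peer review protection."""
    case_id: str                   # opaque internal identifier
    accession_number: str          # used only to launch the viewer in context
    sender_id: str
    receiver_id: str
    category: AlertCategory
    anonymized: bool = False       # hide sender identity at the user's discretion
    comment: str = ""
    acknowledged_at: Optional[datetime] = None   # closed-loop receipt/action
    receiver_rating: Optional[int] = None        # 'uber rate', e.g., 1-5

    def to_alert(self) -> dict:
        """Build the email/page payload: no patient identifiers or clinical
        information, only the alert category and an opaque case reference."""
        return {
            "case_ref": self.case_id,   # opaque ID, not an MRN or accession number
            "category": self.category.value,
            "sender": None if self.anonymized else self.sender_id,
        }

    def acknowledge(self, rating: Optional[int] = None) -> None:
        """Receiver closes the loop and may rate the alert."""
        self.acknowledged_at = datetime.now()
        if rating is not None:
            self.receiver_rating = rating


# Example: a radiologist submits a 'great call' and the receiver closes the loop.
case = PeerLearningCase(
    case_id=str(uuid.uuid4()),
    accession_number="ACC-0000000",   # placeholder, never placed in the alert
    sender_id="radiologist_a",
    receiver_id="radiologist_b",
    category=AlertCategory.GREAT_CALL,
)
print(case.to_alert())   # contains only the category and opaque references
case.acknowledge(rating=5)

In practice these requirements would be met by the selected vendor or in-house platform; the sketch only pins down what "PHI-free alert" and "closed-loop acknowledgment" mean operationally.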
Program Structure
Oversight structure (policies, practice expectations)
- e.g., an average of one or more submissions per clinical day, measured quarterly (a sketch of this calculation follows this list)
- e.g., RPOMS
- e.g., department-specific incentives
- Radiologist-in-Chief to approve process and decision-making for short- and longer-term solutions
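As an illustration of how the submission-rate expectation above could be computed, here is a minimal sketch in Python; the function name, inputs, and the interpretation of the threshold (a rate of at least 1.0 per clinical day, averaged over the quarter) are assumptions rather than a specification of the actual monitoring tool.

from datetime import date


def quarterly_submission_rate(submission_dates: list[date],
                              clinical_days: list[date]) -> float:
    """Average peer learning submissions per clinical day for one radiologist
    over one quarter; the oversight expectation above is a rate of >= 1.0."""
    if not clinical_days:
        return 0.0
    return len(submission_dates) / len(clinical_days)


# Example with placeholder data: 70 submissions over 60 clinical days in a
# quarter gives a rate of about 1.17, which meets the expectation.
rate = quarterly_submission_rate(
    submission_dates=[date(2024, 1, 2)] * 70,
    clinical_days=[date(2024, 1, 2)] * 60,
)
print(f"{rate:.2f} submissions per clinical day; meets expectation: {rate >= 1.0}")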
Implementation, training, support, monitoring and reporting
- MGB Radiology Q&S (Kristine Burk, Patrick Curley)
- Supported in part by AHRQ grant 1R18HS029348-01 (September 2022 to March 2027)
- Evaluate MGB-wide implementation’s impact on diagnostic error and potential harm
MGB Radiology subspecialty peer learning risk management CME, eight per year
- (Kristine Burk, Taj Qureshi)
‘Ease-in’ implementation to build a culture of safety (e.g., first quarter)
- Recommend users prioritize great calls
- Transparent messaging within department/practice
- Cross-department/practice messaging initially through Vice Chair of quality or practice lead
- Subspecialty MGB-wide conference (risk management CME) to include cases from all practices
- Other