Browsing by Author "Donnon, Tyrone"
Item (Open Access): Comprehensive Video-Module Instruction: An Alternative for Teaching IUD Insertion to Family Medicine Residents (2013-05-13)
Authors: Garcia-Rodriguez, Juan Antonio; Donnon, Tyrone; Kassam, Aliya
The Department of Family Medicine currently faces several challenges in delivering procedural skills instruction to residents. An IUD video-module was created to provide an alternative approach to such training, offering the possibility of its use in any clinical setting or teaching situation. A randomized, two-group experimental research design was used to compare residents' procedural skills performance between the two instruction methods (video-module vs. traditional approach) for teaching IUD insertion. The results showed that both methods were effective in providing procedural skill instruction. Performance scores were significantly higher in the video-module group, but there were no significant differences in residents' satisfaction scores. There was no correlation between the different scores and sex or age, or between performance and level of satisfaction.
In conclusion, video-module instruction is effective for providing IUD training, and it produced significantly higher scores than the gold-standard (traditional) approach on the performance component.

Item (Open Access): Leadership competencies for medical education and healthcare professions: population-based study (BMJ, 2012-03-27)
Authors: Çitaku, Fadil; Violato, Claudio; Beran, Tanya; Donnon, Tyrone; Hecker, Kent; Cawthorpe, David

Item (Open Access): Novice and Expert Differences and Educational Interventions to Improve Veterinary Pathology Visual Diagnostic Reasoning Measured by Eye-tracking Technology (2013-12-13)
Authors: Warren, Amy Louise; Donnon, Tyrone; Hecker, Kent; Beran, Tara
Purpose: There were two objectives: 1) to use eye-tracking to establish baseline quantitative and qualitative differences between novice and expert veterinary pathologists and to explore the dual process theory of clinical reasoning, and 2) to determine whether two educational interventions, the active use of key diagnostic features and image repetition, improved novice visual diagnostic reasoning skills. Method: A pre-experimental static group comparison between novice and expert veterinary pathologists was used. Participants were shown 10 veterinary cytology images and asked to formulate a diagnosis while wearing eye-tracking equipment (10 slides) and while concurrently thinking aloud (5 slides). A quasi-experimental, pre-test and post-test comparison group design was used to compare the two teaching interventions to a comparison group, with eye-tracking as the assessment method. Time to diagnosis and percentage of time spent viewing an area of diagnostic interest (AOI) were compared using independent t-tests (novice vs. expert) and paired t-tests (over time); analysis of covariance (ANCOVA) was used for between-group comparisons of the educational interventions. Diagnostic accuracy, as a dichotomous variable, was compared using chi-square tests.
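As an illustration of the independent-samples t-tests described in the Method above, here is a minimal sketch in plain Python. The data and the helper name `independent_t` are hypothetical, not drawn from the study; a real analysis would typically use a statistics package.

```python
import math
from statistics import mean, variance

def independent_t(a, b):
    """Student's independent-samples t-test (equal variances assumed).

    Returns the t statistic and degrees of freedom for two groups,
    e.g. novice vs. expert time-to-diagnosis scores.
    """
    na, nb = len(a), len(b)
    # Pooled sample variance across the two groups
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical time-to-diagnosis data (seconds); not from the study.
novices = [42.0, 55.0, 48.0, 60.0, 51.0]
experts = [20.0, 25.0, 18.0, 27.0, 22.0]
t, df = independent_t(novices, experts)
print(f"t({df}) = {t:.2f}")
```

The resulting t value would then be compared against a t distribution with the reported degrees of freedom (here using a Bonferroni-style threshold such as the study's p < 0.017).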
Results: Compared to novices, experts demonstrated significantly higher diagnostic accuracy (p < 0.017), shorter time to diagnosis (p < 0.017) and a higher percentage of time spent viewing AOIs (p < 0.017). Experts elicited more key diagnostic features in the think-aloud protocol and had more efficient patterns of eye movement. Students in the extended visual reasoning teaching intervention (active learning with image repetition) behaved most like experts, showing no significant difference from experts in diagnostic accuracy or percentage of time spent in the AOIs, and a significantly faster time to diagnosis than experts (p < 0.017). Discussion: I suggest that experts' fast time to diagnosis, efficient eye-movement patterns, and preference for viewing AOIs support system 1 (pattern-recognition) reasoning and script-inductive knowledge structures, with system 2 (analytic) reasoning used to verify the diagnosis. Our results from the educational interventions suggest the greatest improvement in the eye-tracking measures of students who were taught key diagnostic features in an active learning forum and were shown multiple case examples.

Item (Open Access): Reliability & Validity of the Objective Structured Clinical Examination (OSCE): A Meta-Analysis (2016)
Authors: Al Ghaithi, Ibrahim; Donnon, Tyrone; Oddone Paolucci, Elizabeth; Kassam, Aliya; Felisa Palacios, Maria
Background: The objective structured clinical examination (OSCE) is one of the most commonly used methods for assessing clinical skill competencies in the health professions. Objectives: To investigate the existing published research on the reliability, validity and feasibility of the OSCE in the assessment of physicians and residents in medical education programs. Methods: In addition to MEDLINE, the literature search for peer-reviewed journal publications that used an OSCE assessment method to evaluate clinical skill competence also included the PsycINFO, ERIC and EMBASE databases.
Results: In total, 49 studies met the inclusion and exclusion criteria for the final analysis. The OSCE assessment method had moderate internal reliability [mean alpha coefficient (α) = 0.70], low to moderate criterion validity [mean Pearson correlation (r) = 0.46] and low to moderate construct validity (mean r = 0.42). High heterogeneity was observed, a large part of which was attributed to multiple sources of measurement error. The mean cost per candidate was $353 ± $362 (95% confidence interval: $25-$1083). Conclusions: The OSCE method for the assessment of clinical skill competence was found to be reliable and valid; however, its administration costs are much higher than those of written examinations or direct observation of clinical skill performance in practice.

Item (Open Access): The Effect of Immediate and Delayed Feedback on Knowledge and Performance Development in Athletic Therapy Students during a Simulated Cardiac Emergency (2013-12-09)
Authors: Valdez, Dennis; Donnon, Tyrone
Feedback is intended to reduce the gap between actual and expected performance. Feedback can be provided during or following a learning event. However, in uncontrolled and unpredictable learning environments (e.g., medical residencies), feedback following a learning event may be delayed or absent. Feedback timing strategies have been studied in a variety of disciplines, but such research is lacking in the field of sports medicine. Therefore, this study examined the effectiveness of feedback timing strategies on knowledge acquisition and performance skill development of athletic therapy students using simulated cardiac emergencies. Thirty athletic therapy students were randomly assigned to an immediate feedback (IF), delayed feedback (DF), or no feedback (NF) group. Students completed a baseline performance test, received standardized instruction on cardiac emergency management and completed knowledge and performance pretests. During the intervention period, students managed nine emergency simulations.
The IF group received feedback immediately following each simulation. The DF group did not receive any feedback between simulations; they received all feedback on each of the nine simulations after the ninth simulation. The NF group received all feedback on each simulation at the end of the study. Knowledge and performance posttests were administered after the last feedback session of the intervention period (acquisition), and a follow-up test was administered two weeks later (retention). Several one-way ANOVAs were conducted to compare group knowledge and performance outcome measures from the pretest, posttest, and follow-up test, and Tukey's post hoc analysis was used to examine significant group differences. The IF and DF groups performed significantly better on the knowledge posttest compared to the NF group, F(2, 27) = 5.64, p < 0.05. There were no significant differences between the groups on the performance pretest, posttest, or follow-up tests. However, the IF and DF groups achieved a higher total performance score for automated external defibrillator (AED) application compared to the NF group, F(2, 27) = 6.10, p < 0.05. The results suggest that feedback has a positive impact on learning, regardless of timing strategy. However, previous research has demonstrated that different feedback delay times may have different effects on learning. Regardless, instructors must research and choose optimal feedback strategies to enhance learning.
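The one-way ANOVA comparisons reported above can be sketched in plain Python. The scores below are hypothetical illustrations of a three-group design with 10 students per group (which yields the same F(2, 27) degrees-of-freedom reporting as the abstract); they are not the study's data, and `one_way_anova` is my own helper name.

```python
from statistics import mean

def one_way_anova(groups):
    """One-way ANOVA F statistic for k independent groups.

    Returns (F, df_between, df_within), matching the
    F(df_between, df_within) reporting style used above.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Hypothetical posttest knowledge scores (10 per group); not the study data.
IF = [85, 88, 90, 84, 87, 91, 86, 89, 83, 88]
DF = [84, 86, 89, 85, 88, 90, 83, 87, 86, 85]
NF = [78, 80, 76, 79, 81, 77, 75, 82, 78, 80]
F, df_b, df_w = one_way_anova([IF, DF, NF])
print(f"F({df_b}, {df_w}) = {F:.2f}")
```

A significant omnibus F would then be followed by a pairwise post hoc procedure such as Tukey's HSD, as in the study, to identify which groups differ.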