Participants
Of 1142 GP registrars who were invited to take part in the online survey, 391 (34.2%) completed the survey between November 2016 and February 2020. Fourteen GP registrars participated in interviews. Of the ten participants for whom follow-up interviews were indicated, eight completed the second interview. Details of the recruitment process and reasons for exclusion are provided in Figure 1.
The interview sample comprised nine women and five men; two were enrolled in the rural pathway, and one had completed medical training overseas. The interviewees enrolled in the course during the second or final year of their three-year (full-time equivalent) program. Participants reported spending an average of nine hours completing the whole course.
Results are presented to answer each research question.
RQ1. Participant experience with the course
Registrars’ responses to 11 questions on satisfaction with the course are presented in Figure 2.
Data from the online survey and the interviews indicated that participants’ course experience was generally positive. As illustrated in Table 1, the triangulated methods provided some unique and some common feedback.
Table 1 Participant experience feedback from online survey (n=391) and interviews (n=14)
| | Survey multiple choice | Survey open-ended questions | Qualitative interviews |
|---|---|---|---|
| **Content** | | | |
| Easy to understand | | ✓ | ✓ "I felt like the content was delivered in a way that made it very easy to understand" [#14] |
| Relevant to practice, particularly case studies | | ✓ | ✓ "I think it was quite relevant. Especially when they brought in a case study that is very common in the general practice room" [#12] |
| A good resource | | ✓ | ✓ "I think the main things were resources I could use to look up things in future if I had trouble. So, I saved them as bookmarks" [#12] |
| **Delivery** | | | |
| Flexible | ✓ (Q.5) Flexibility in learning | ✓ Flexible with time and pace | ✓ "The fact that it was online, that I could do it in my own time, and that flexibility was great" [#09] |
| Easy navigation / user friendly | ✓ (Q.6) | Mixed | Mixed: "The navigating the system was fine. It was easy" [#04]; "I did find a problem with it. When you had to move answers into boxes, if it accidentally went into the wrong box" [#08] |
| Individualised learning | ✓ (Q.8) | | |
| Feedback / interactivity | ✓ (Q.11) | ✓ Liked the interactivity, e.g. quizzes and feedback | ✓ "It was good that we were sort of asked to generate a response, to answer a question. I also enjoyed that they [the course] gave you a model answer" [#14] |
| Engaging | | ✓ | Mixed: "So, I thought it was good mix of media which made the presentation interesting and it kept my attention for longer than it may have otherwise" [#07]; "I found that it was a little bit difficult to engage with some of the modules" [#03] |
The responses to online questions on registrar satisfaction were mostly in the upper half of the rating scale, suggesting general satisfaction (Figure 2). Consistent with the qualitative data, the participants highly rated the ease of navigation (Q.6), the flexibility of the online delivery (Q.5) and the feedback (Q.11).
In contrast to the positive responses to Q.6, the open-ended question about what participants would like changed (Table S2) revealed that many participants experienced technical issues, and some thought the interface could have been more user friendly.
Data from the open-ended questions (Additional File 1, Tables S2 and S3) and interviews (Table 1) provided additional feedback on the course content: it was easy to understand, provided relevant examples, was relevant to practice, and offered useful resources for future use.
RQ2. Impacts of the course on attitudes, knowledge, skills and clinical practice
Data from the survey and the interviews indicated that participants thought the course increased their confidence in, and knowledge and skills of, evidence-based medicine (EBM). Data on changes to clinical practice were available only from the interviews, conducted at least three months after completion of the course. The interview data (presented below) suggested the course changed participants’ clinical practice to better incorporate EBM.
Attitudes
Most participants said they were interested only in the application of research to practice. A few participants said that they were interested in doing or participating in research. Of those, few said that this was influenced by taking the course.
Participants mostly found themselves more confident in understanding, interpreting and appraising research evidence after completing the course.
I feel more confident, and I think that I would be able to if I needed to look through a study and make some comments [#08-female].
However, they acknowledged that they might not be able to conduct research, or that there was room for improvement. A few stated that they did not notice any change in their confidence.
I probably feel about the same as I did before. I don’t know that I feel any more or less confident [#10-female].
Knowledge
An increase in knowledge was identified in both the interviews and the online survey. Based on the survey findings, participants’ self-reported understanding of the topic improved substantially, from a mean of 4.4 (out of 10) ± 0.1 (SEM) before the course to 7.2 ± 0.1 after completing it (n=320, p<0.0001), representing a very large effect size (d=1.6). Most participants believed that the course increased or refreshed their understanding of different research skills and their awareness of how to incorporate those skills into their practice.
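The reported effect size can be reproduced from the summary statistics if the ±0.1 is read as the standard error of the mean (an assumption on our part, but the reading that makes d = 1.6 arithmetically consistent with n = 320), since Cohen's d uses the standard deviation and SD = SEM × √n:

```python
import math

# Reported summary statistics (n = 320 paired responses)
pre_mean, post_mean = 4.4, 7.2
sem = 0.1  # the reported +/- 0.1, read here as standard error (assumption)
n = 320

# Back out the standard deviation from the standard error
sd = sem * math.sqrt(n)  # ~1.79

# Cohen's d for the pre/post difference in means
d = (post_mean - pre_mean) / sd
print(round(d, 1))  # 1.6, matching the reported effect size
```

If the ±0.1 were instead the SD, d would be 28, which is implausible; the SEM reading is the only one consistent with the reported d.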
It’s a good idea I think to have an understanding of research when we’re going out into clinical practice [#05-female].
It [the course] makes me more aware to incorporate research into my practice [#07-male].
In addition, participants acknowledged that the course led them to start questioning practices in which clinicians follow others’ experience or opinions without thinking critically.
The way it’s [the course] influenced my management is just always being aware that just because it’s been done as usual practice doesn’t necessarily mean like it’s evidence-based and doesn’t mean that it’s necessarily proven to be effective. So, I don’t always rely on that advice [#01-male].
One participant highlighted an improvement in her understanding of the importance of contextual and environmental factors, including patient preferences in applying evidence.
It’s not just whether or not they’re sick, it’s also about how they approach health providers, and also how their general life impacts on whether or not they’re willing to engage or continue their treatment. So, I just keep it in the back of my mind that we need to be aware of the other socioeconomic and environmental factors which a lot of that qualitative research helps us consider as well [#14_female].
Skills
Some participants said they were now able to critique research evidence and interpret research studies.
It [the course] definitely taught me to be a bit more critical in my thinking of what I’m reading [#14-female].
I think it’s just helpful to …being able to interpret and synthesise how that can apply [research] to your clinical practice [#03-female].
Some participants reported that they learnt how to frame their clinical questions and find evidence-based answers to them.
Instead of thinking that you don’t know something, and letting that overwhelm you, you become a little bit better at devising a clinical question and knowing where the answer is [#13_2-male].
Clinical practice
Some participants reported that the skills they learnt had led to changes in their practice. They used these skills and research evidence to investigate answers to their clinical questions, particularly when dealing with uncertainty. They acknowledged using skills such as critical appraisal, particularly the levels of evidence [#3,4,7,14], to interpret study findings, material presented by drug representatives, and guidelines.
Looking at evidence and working out how reliable this is, and then using that to, you know, guide treatment or, you know, not needing to always rely on guidelines to make a better judgment, depending on the clinical situation after reviewing what evidence is available [#08_female].
… often I will have a lot of different clinical questions that I need to have answered. …and that’s what the course are really good for, for making that clinical question and then finding that answer, and knowing that it’s a trustworthy and reliable kind of answer that I actually apply to the patient [#13_2-Male].
A few participants indicated that the skills they learnt in the course reaffirmed the importance of communication skills with patients. Their improved understanding of research findings, and their ability to explain the difference between high-quality and low-quality evidence to patients, reportedly improved their patient communication.
It [the course] does help to clarify things in your own mind, which then explain it in a simple way to the patient [#13_2-male].
I do bring it back to research and how just because one thing works for someone doesn’t mean that it works with you and just break it down that way, and also talking like, high-quality studies versus poor quality as evidence [#03-female].
RQ3: Mediators of impacts (Reported barriers and facilitators)
Responses from the qualitative interviews indicated that barriers and facilitators to practicing EBM related to the GP (GPs’ perceptions of EBM, comfort, and priorities); the workplace (time, the influence of supervisors, the impact of the system, and access to resources); and patients (treatment expectations differing from the evidence).
GP factors
Most of the participants had positive attitudes towards EBM and acknowledged the value of EBM.
Obviously, we have to practice evidence-based medicine, so in order to do so, we need to be able to understand and interpret and incorporate research into our practice [#07-male].
Almost all participants reported they needed to seek information to inform their decisions on a daily basis. They reported that they tried to choose EBM resources that were recent and relevant to the Australian context and had confidence in the quality of the information.
All of them [the guidelines that I use] are sort of peer-reviewed and accepted by the wider community as factually correct [#11_2-male].
A few participants expressed a different opinion and described research evidence as neither relevant nor transferable into clinical practice.
I think they’re [research evidence] just answering questions that are quite different from the questions that we get in general practice [#06-female].
However, a few participants described an approach based upon trust in the credentials of the source of information:
I do feel if it’s [a research findings] published in a reputable source, I tend to leave it without thinking too critically… I trust my supervisor and feel that they are quite competent [#05-female].
One barrier described was that, although participants were willing to change, they felt more comfortable with resources they had already learnt and become accustomed to than with new evidence-based resources.
Often you’re introduced to something like an UpToDate [a resource for supporting clinical decision] quite early, so you get good at searching and using it, that you know what sort of services are on it [#13_2-male].
Workplace factors
Time was identified by almost all participants as one of the main barriers to accessing and using research evidence in practice. The time-consuming nature of using research evidence was attributed to the way it was accessed (for example, the need to log in to the websites of organisations such as the RACGP), the overwhelming amount of information returned by searches, and the time needed to critically appraise findings.
If I were to go through Cochrane and look up a whole bunch of different articles which would take a lot longer [#05-female].
For almost all registrars, pre-appraised resources that are brief and ready to use, such as guidelines, were preferred over primary research evidence.
I would tend to use resources that incorporate study findings into a summary like eTG [online Therapeutic Guidelines], I don’t read the specific articles and therefore analyse the data [#05-female].
Participants’ responses indicated that using research evidence might not be their priority; work and exams were specified as activities that they prioritised.
Because I do have exams coming up I haven’t been able to do – look into research papers [#11_2].
Participants reported that supervisors could be role models who influence GP registrars’ beliefs about EBM by encouraging and guiding them to practice it. Participants who said their supervisors encouraged them to use research evidence held a stronger belief than other participants in the applicability of research to practice.
Well, I think evidence-based research should inform good clinical practice, and it should always be the starting point for good management… Well, my supervisors have mostly been very evidence-based as well, so I’ve actually just learnt a lot from how they appraise studies [#03-female].
Some participants also indicated that some supervisors expect GP registrars to follow their advice and treatment approach without critically evaluating the relevance of the advice to the clinical situation.
[one of my supervisors is a] real old-school doctor, right, so they are less likely to change or read the literature or – this is the way we’ve always done it, so that’s the way they’ll always do it. So that’s the way they want me to do it as well. Which may not be – it’s not dangerous, but it may not be an optimal solution [#11_2_male].
One participant described an external system barrier as an impediment to practicing EBM. In particular, he noted that the Pharmaceutical Benefits Scheme (PBS) did not always allow the prescription of new medicines.
That’s a good skill [using research to inform the practice], but in most instances, it’s not applicable to general practice because we’re following these well-worn pathways in terms of the treatment and the investigation. An example of that might be, a study might come up with a fantastic new diabetes drug, well unless it’s on the PBS the vast majority of people don’t care because they would have to pay more for it. In terms of the incorporation of that drug into established treatment patterns, it goes through years of iterations before it’s widely accepted [#11-male].
Access to research papers was raised as a problem by some participants who indicated that having access could be costly unless they had free access through RACGP or other affiliations.
Patient factors
Patients’ treatment expectations that differed from the evidence were described as a barrier to practicing EBM.
That’s mainly patient preference, so potentially they don’t want to go with the evidence-based therapy [#04-male].
Summary
Figure 3 illustrates the main themes identified in answer to the study questions and their interactions. The interactions were interpreted from the qualitative data, including whether the reported changes and influences were attributed to the course. In summary, participants had a positive experience of the course and stated that taking it improved their confidence, knowledge, and skills.
Participants specified that they used the skills learnt in the course for interpreting evidence and investigating their clinical questions, and that the course led to better communication with patients in their clinical practice. The factors that influenced the course's impact related to the GP, the workplace, and patients.