Although the impact and effectiveness of accreditation on medical education processes have been reported [1,2], the current study is the first within the chiropractic profession to assess the effects, if any, that accreditation has on how chiropractic programmes evaluated and accredited by a chiropractic-specific accrediting agency develop and function. It was encouraging to discover that 90% of the ECCE-accredited programmes reported having made improvements to their operations based on feedback from the ECCE accreditation reports. The only programme that did not report any improvements or changes stated that, because it is also accredited by an agency within its own country, that agency's feedback takes priority over feedback from the ECCE. Consequently, it acknowledged in the comments section of the questionnaire that some recommendations contained in the ECCE report could not be implemented. This is noteworthy, as six of the other ECCE-accredited programmes also undergo accreditation by their own countries' chiropractic, higher education or medical education accrediting bodies, yet were nevertheless able to address issues raised in the ECCE feedback reports. This difference is most likely due to the varying regulations and laws governing education throughout Europe and South Africa. Furthermore, there is currently considerable collaboration between the ECCE and the national chiropractic accrediting bodies in the United Kingdom and Switzerland, and the accrediting body in South Africa also recognizes the ECCE (personal communication from the ECCE vice president and the South African department head).
The section of the ECCE Standards in which improvements were most commonly made following feedback from the ECCE accreditation reports was 'Educational Resources' (section 6) [7]. This section contains three of the eighteen 'critical' Standards that must be rated at least 'substantially compliant' to achieve the maximum eight-year accreditation period [13]; chiropractic programmes would therefore likely be very cognizant of the relative importance of these particular Standards. Indeed, nine of the ten accredited programmes reported that the ECCE accreditation reports motivated their academic leaders to make needed improvements in this area. This section also dominated the written comments listing the various changes implemented. Thematic analysis of these comments found that the most common specific changes and improvements related to infrastructure, physical facilities and equipment; increasing the number of faculty members with appropriate qualifications and experience; and increasing opportunities for inter-disciplinary teaching and learning.
The second most common section of the ECCE Standards in which respondents reported improvements arising from the ECCE accreditation reports was the 'Educational Programme' itself (section 2) [7]. Thematic analysis of the written comments relating to this topic found that several programmes increased their focus on evidence-based teaching and learning, as well as on evidence-based practice incorporating the bio-psycho-social model of health-care education. Additional themes for this section included improving the integration of subjects, closing the theory-practice gap and increasing self-directed learning, all of which led to decreased contact time for students. Five of the eighteen 'critical' ECCE Standards that must be at least 'substantially compliant' for the maximum accreditation period fall into this category [13], which most likely provided additional impetus for programmes to change and improve.
The researchers also identified a third area, entitled 'Research', in the thematic analysis of the written comments. While this may appear to overlap with the previous theme of evidence-based teaching and learning, the responses in this category concerned more specifically teaching faculty and students how to conduct research studies, appraise the quality of research publications and apply existing research in daily practice with patients. This aligns with ECCE Standard 2.2, 'The Scientific Method', which is also one of the 'critical' Standards requiring at least a 'substantially compliant' rating for the maximum accreditation period [13]. An additional theme identified for this section, 'Standardized Evaluations and Feedback', refers to the use of questionnaires to obtain feedback within a programme through course/class evaluations and faculty performance evaluations. The purpose of this type of feedback is to support internal quality assurance and to facilitate changes and improvements to the programme.
For the ECCE Standards section on 'Students', seven of the ten programmes reported no changes to their programmes based on evaluation report feedback. This is likely because this section of the Standards deals only with how students are selected, admitted, supported and counselled; all other student-related issues are covered in other Standards [7]. Most accredited programmes were confident that their selection criteria were appropriate and that their students were sufficiently supported. Furthermore, the ECCE did not identify these areas as 'critical' Standards in the previous research on this subject [13], and the majority of ECCE-accredited programmes are part of larger universities that provide good student support services.
Strengths and limitations of the study:
The most obvious strength of this study is the documented written feedback from accredited programmes showing that the ECCE accreditation reports resulted in many significant improvements at 90% of the accredited programmes. In particular, improvements in equipment and resources, as well as in the number and quality of faculty, were frequently mentioned. Improving evidence-based teaching and practice was also a frequent theme.
The most obvious limitation of this study is that the respondents were the heads of the ten accredited programmes. Institutional and programme memory of past ECCE accreditation evaluation reports was therefore required to complete the questionnaire accurately, as it did not request information from only the most recent accreditation event. Newer department heads at older programmes would not have experienced several past ECCE accreditations and may not have known about the specific feedback from those events, or the subsequent changes made by the programme, unless they had specifically taken the time to review old evaluation reports. This may have been a factor in the answers received from two of the programmes that have held ECCE accreditation for over 20 years: one reported only one change and the other reported no changes at all based on the accreditation reports.
As with any self-reported survey, there may have been memory lapses, recall bias, or a tendency for respondents to give answers they believed would please the researchers (a halo effect). Further, a level of change that one programme considered 'minor' may have been considered 'substantial' by another; in future studies, these terms should be carefully defined.