Towards a programme theory for fidelity in the evaluation of complex interventions.
Journal article
Masterson-Algar, P., Burton, C., Rycroft-Malone, J., Sackley, C.M. and Walker, M.F. 2014. Towards a programme theory for fidelity in the evaluation of complex interventions. Journal of Evaluation in Clinical Practice. 20 (4), pp. 445-452. https://doi.org/10.1111/jep.12174
Authors | Masterson-Algar, P., Burton, C., Rycroft-Malone, J., Sackley, C.M. and Walker, M.F. |
---|---|
Abstract | Rationale, aims and objectives: This paper addresses the challenge of investigating fidelity in the implementation of a complex rehabilitation intervention designed to increase the level of independence in personal activities of daily living of stroke patients living in UK care homes. A programme theory of intervention fidelity was constructed to underpin a process evaluation running alongside a cluster randomized trial of the rehabilitation intervention. Methods: The programme theory was constructed drawing on principles of realist evaluation. It was developed using data from in-depth semi-structured interviews (n = 17) with all occupational therapists (OTs) and critical incident reports from the trial (n = 20), and drawing on frameworks for implementation. Results: The programme theory incorporates four potential mechanisms through which fidelity within the trial can be investigated. These four programme theory areas are (1) the balancing of research and professional requirements that therapists performed in a number of areas while delivering the study interventions; (2) the OTs' rapport building with care home staff; (3) the work focused on re-engineering the personal environments of care home patients; and (4) the learning about the intervention within the context of the trial and its impacts over time. Conclusions: These findings characterize the real-world nature of fidelity within intervention research, and specifically the negotiated nature of implementation within clinical settings, including responsiveness to individual patients' needs. This research adds to the evidence base because current frameworks for fidelity neglect the importance of learning over time, both by individuals and across the time span of a trial. |
Keywords | Complex interventions; Fidelity; Learning curve; Process evaluation; Programme theory; Realist evaluation |
Year | 2014 |
Journal | Journal of Evaluation in Clinical Practice |
Journal citation | 20 (4), pp. 445-452 |
Publisher | Wiley |
ISSN | 1356-1294 |
Digital Object Identifier (DOI) | https://doi.org/10.1111/jep.12174 |
Official URL | https://doi.org/10.1111/jep.12174 |
Publication dates | |
Online | 19 May 2014 |
Print | Aug 2014 |
Publication process dates | |
Accepted | 09 Apr 2014 |
Deposited | 15 Jun 2020 |
Output status | Published |
https://repository.canterbury.ac.uk/item/8vq3z/towards-a-programme-theory-for-fidelity-in-the-evaluation-of-complex-interventions