Citation
Hurst, K. (2014), "Guest editorial", International Journal of Health Care Quality Assurance, Vol. 27 No. 8. https://doi.org/10.1108/IJHCQA-08-2014-0086
Publisher: Emerald Group Publishing Limited
Guest editorial
Getting (QA) research into practice (GRIP)
Readers, immersed in research bids and projects, are aware that getting good research into practice (GRIP) requires several important and interdependent elements and steps:
(i) selecting a topic that adds new knowledge and insights to health and social care policy and practice;
(ii) using a meaningful and comprehensive theoretical framework in which to set the study;
(iii) finding out what's already known about the topic;
(iv) posing problem statements meaningfully as aims and objectives, questions or hypotheses;
(v) choosing the right research design;
(vi) creating or adopting valid and reliable research instruments;
(vii) locating responsive, informative and representative data sources and collecting data;
(viii) complying with ethical rules;
(ix) analysing data accurately and presenting findings meaningfully; and
(x) disseminating and implementing findings and recommendations.
In this special issue, therefore, we explore GRIP (i) to (x) in a QA context.
Element (ii) in our GRIP acronym, adopting the right theoretical framework (i.e. how to structure the topic), is a crucial and often make-or-break first step. Occasionally, off-the-shelf theoretical frameworks make our literature review, data collection and report writing simply and easily fall into place. Donabedian's QA triad (structure, process and outcome), for example, is among the better-known QA frameworks and one that has guided data collection and report writing in several projects. In this issue, Nancy Bouranta and colleagues triangulate two competing theoretical frameworks, which clearly adds weight and importance to their patient satisfaction study. Similarly, Paul Hong and colleagues explore (among other things) how existing QA models can be adopted for healthcare's benefit. Their paper's value lies in highlighting the bear traps that await naïve researchers who think tried-and-tested QA models are healthcare's silver bullets.
I'm sure my figures are dated but, the last time I checked, there were 600 health and social care journals, each published several times a year. While these printed and web-based data mines are invaluable, keeping track of and locating relevant information challenges all busy health and social care professionals; GRIP element (iii), therefore, is explored by Joanne Gard Marshall and colleagues in this issue. Their study shows how busy clinicians used traditional and electronic libraries to gather and implement evidence-based practice (EBP), with encouraging results.
As we've explored in previous IJHCQA issues, gathering data is the most fragile and expensive part of GRIP (elements (vi) and (vii)). Labour-intensive activities, e.g. interviewing patients, are probably the most time-consuming and costly research phase. In this issue, however, James Stahl and colleagues describe how staff-patient contact time, often recorded using time-consuming personal diaries or time-and-motion studies, was efficiently and effectively gathered by electronically tagging staff and patients so that face-to-face consultation time (an important service-quality variable) could be easily and accurately measured. The authors' work therefore covers important GRIP elements, i.e. gathering meaningful and accurate data efficiently, effectively and unobtrusively (element (vii)), which overlaps with ethical rigour (element (viii)), i.e. protecting research participants, explored by Michael Brown and colleagues. The authors remind us that research ethics is more than just informed consent and the right to withdraw; it is also about testing findings before implementing them in case there are hidden side effects, e.g. the Thalidomide disaster. Computer simulation is a relatively risk-free and useful approach, and the authors use the technique effectively to see their findings' impact on workflow, job satisfaction and patient satisfaction.
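By way of illustration only, the short Python sketch below (the event format, identifiers and numbers are hypothetical and are not drawn from Stahl and colleagues' instrumentation) shows how co-location records from electronic tags might be summed into face-to-face time per staff-patient pair:

    # Illustrative sketch (not the authors' method): each hypothetical tag
    # record is (staff_id, patient_id, start_seconds, end_seconds).
    from collections import defaultdict

    def contact_time(events):
        """Sum co-location intervals for each staff-patient pair."""
        totals = defaultdict(float)
        for staff_id, patient_id, start, end in events:
            if end > start:  # skip malformed records
                totals[(staff_id, patient_id)] += end - start
        return totals

    # Example: two separate readings for the same nurse-patient pair.
    events = [("nurse_1", "pat_7", 0, 300), ("nurse_1", "pat_7", 900, 1200)]
    print(contact_time(events))  # {('nurse_1', 'pat_7'): 600.0}, i.e. ten minutes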
Quality assurance datasets are rich, and sometimes we don't capitalise on their value. Off-the-shelf, software-based analytical tools, notably templates that assimilate and model existing QA data, can be a blessing. Maria Filipa Mourão and colleagues model and apply Receiver Operating Characteristic (ROC) curves to see if they can generate new service-quality insights (GRIP element (ix)) using second-hand data.
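For readers less familiar with the technique, a minimal Python sketch follows; it is purely illustrative (the risk scores, outcomes and function name are hypothetical, not Mourão and colleagues' data or analysis) and simply traces how an ROC curve is built by sweeping a decision threshold over scored records:

    # Illustrative sketch only: build ROC points from a hypothetical QA dataset
    # where each record has a risk score and a binary outcome (1 = adverse event).
    def roc_points(scores, outcomes):
        """Return (false-positive rate, true-positive rate) pairs as the
        decision threshold sweeps down through the observed scores."""
        pairs = sorted(zip(scores, outcomes), reverse=True)
        positives = sum(outcomes)
        negatives = len(outcomes) - positives
        tp = fp = 0
        points = [(0.0, 0.0)]
        for _, outcome in pairs:
            if outcome == 1:
                tp += 1
            else:
                fp += 1
            points.append((fp / negatives, tp / positives))
        return points

    # Hypothetical risk scores (e.g. a triage index) and observed outcomes.
    print(roc_points([0.9, 0.8, 0.55, 0.4, 0.2], [1, 1, 0, 1, 0]))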
If the final product has been time-consuming and costly to develop, then doesn't step (x) deserve more effort and financial support? Kate Bak and colleagues, for example, describe a successful Canada-wide strategy to ensure that coal-face staff know about research-based, evidence-based guidelines and what knowledge and skills are needed to ensure implementation. The IMRT Project's impact is impressive and, moreover, replicable. David Munoz and colleagues employ a more systematic and mathematical step (x) technique, which answers the question: "How do you eat an elephant?" Answer: in small pieces. Usefully, the authors underline the GRIP step (x) elements having most impact in different contexts. Marco Santos and colleagues explore the same step, namely how physicians adopt and apply EBP guidelines. Their findings, gathered before EBP processes were tightened, are startling, and they advocate what needs to be done to overcome policy and practice weaknesses. Michael Courtney and colleagues publish an equally startling paper in this issue on the extent to which junior doctors (JDs) understand and implement EBP guidelines (GRIP element (x)). The authors' simple but effective study shows how JDs, often pressured and working alone at nights and weekends, failed (during written tests) to implement EBP, with potentially serious implications for severely ill patients. The authors' work leads to two important step (x)-related recommendations: reviewing JD education and training; and testing other EBP scenarios locally and nationally.
Dr Keith Hurst, Independent Research and Analysis, Mansfield, UK