The impact of story: measuring the impact of story for organisational change

Lisa Rossetti (Lapidus International, Bristol, UK) (Positive Lives, Chester, UK)
Tony Wall (Centre for Work Related Studies, University of Chester, Chester, UK) (International Thriving at Work Research Group, University of Chester, Chester, UK)

Journal of Work-Applied Management

ISSN: 2205-2062

Article publication date: 4 December 2017


Abstract

Purpose

The role of dialogue has recently been identified as being important in generating impact in organisations, but the purposeful use of narrative or story-based approaches to effect organisational change and service improvement is still relatively innovative. The purpose of this paper is to document and examine two projects in health and social care settings which aim to generate organisational development and service improvement.

Design/methodology/approach

The paper evaluates and compares two case studies of story-based organisational development and service improvement projects in the UK. This involved developing an appropriate evaluation framework and assessing the impacts in each case using semi-structured interviews and thematic content analysis.

Findings

This paper reports the diversity of impacts and outcomes that were generated by the projects. Specifically, it is argued that there is a strong indication that story-based projects best achieve their objectives when clearly linked to key organisational strategic drivers or pathways, as evidenced by robust evaluation.

Practical implications

This paper recommends that researchers and practitioners working with story-based methods design credible and robust evaluative practices, in order to evidence how their work supports organisations to meet current sector challenges. It also proposes a flexible evaluation framework for evaluating story-based projects in the workplace.

Originality/value

This paper offers new evidence and insight into the impacts and outcomes of using story-based approaches, and a new evaluation framework for these sorts of projects.

Citation

Rossetti, L. and Wall, T. (2017), "The impact of story: measuring the impact of story for organisational change", Journal of Work-Applied Management, Vol. 9 No. 2, pp. 170-184. https://doi.org/10.1108/JWAM-07-2017-0020

Publisher

Emerald Publishing Limited

Copyright © 2017, Lisa Rossetti and Tony Wall

License

Published in Journal of Work-Applied Management. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial & non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

The impact agenda is a highly contested space and has been criticised for limiting creativity and indeed changes in practice beyond academe (Alvesson and Sandberg, 2013; Johnston and Reeves, 2017). As such, rather than “in the box” thinking, there have been calls for “box changing, jumping or transcendence” for more imaginative approaches (Alvesson and Sandberg, 2014, p. 967) which engage stakeholders in collaborative forms of inquiry (Cunliffe and Scaratti, 2017; Ozanne et al., 2017; Pettigrew and Starkey, 2016; Wall, 2013, 2014, 2015, 2016a, b, 2017a, b, 2018).

Within this context, MacIntosh et al. (2017) highlighted the importance of dialogue and reflexivity, and of narrative, within the impact debate. Alongside this, story-based and narrative approaches are gradually becoming more respected as effective tools for learning and development and for understanding organisational change (McCormack and Milne, 2003; Gabriel, 2008; Gabriel and Connell, 2010; Reissner, 2011; Pässilä and Vince, 2016). Evidence of impact has included: service improvement in health care settings (IDEA, 2009; SCIE, 2010; Ellis et al., 2011); positive impacts on policy in terms of client outcomes (IDEA, 2009; Clark and Purdy, 2007; SROI Network, 2011); improvements in performance indicators (Schalock, 2001); improvements in staff engagement (MacLeod and Clarke, 2009); and improvements in well-being outcomes (Boorman, 2009; Rath and Harter, 2010; NEF, 2011).

However, although there is a diversity of potential methods and strategies to evaluate story-based interventions, there is no agreed standard or process. A practice problem facing the practitioner researcher using story-based methods in the workplace is therefore how to analyse, interpret and present the data in a systematic way that results in credible evidence. As Guest et al. (2012) propose, "good data analysis (and research design, for that matter) combines appropriate elements and techniques from across traditions and epistemological perspectives". In this way, evaluation can not only evidence the project outcomes but also create convincing links to personal learning as well as wider organisational development objectives, thus adding credibility to story-based methods.
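To illustrate one systematic route from coded interview material to countable evidence, the following minimal Python sketch tallies researcher-assigned theme codes by evaluation domain. The coding frame, domain names and example data are hypothetical illustrations rather than the actual codes used in the projects reported below.

from collections import Counter

# Each interview is reduced to a list of researcher-assigned theme codes,
# grouped by the evaluation domain they relate to (illustrative data only).
coded_interviews = [
    {"own_practice": ["team development", "creative working"],
     "service_delivery": ["improved communication"],
     "client_benefit": ["therapeutic benefit"]},
    {"own_practice": ["team development"],
     "service_delivery": ["knowledge sharing", "improved communication"],
     "client_benefit": ["confidence"]},
]

def tally_themes(interviews):
    """Count how often each theme code occurs within each domain."""
    tallies = {}
    for interview in interviews:
        for domain, codes in interview.items():
            tallies.setdefault(domain, Counter()).update(codes)
    return tallies

for domain, counts in tally_themes(coded_interviews).items():
    # Report themes in descending order of frequency, as in Tables I and II.
    print(domain, counts.most_common())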

This paper draws from a practitioner research project in the UK, undertaken as part of a work-applied learning and organisational development project, to evaluate the impacts of two case studies. In order to achieve this, however, the practitioner researcher had to develop an evaluation framework and methodology which was ecologically appropriate to the well-being and narrative nature of the project and the practice setting of the practitioner researcher, and which generated valid results that could then be utilised in practice to support organisational development and service improvement.

This paper is structured as follows. The first section reviews some of the key evaluative methods and tools which are used in practice to measure impact and organisational learning in the context of health and social care organisations. The second section then outlines the methodology adopted as part of this study, exploring the suitability of various evaluative methods in the context of health and social care settings. The following sections then present and compare two case studies, highlighting the key impacts and broader findings from the case studies. Finally, the paper moves to a discussion of some of the challenges of evaluating story-based methods for organisational learning and change, and reflects on the stages of designing robust evaluative frameworks in the context of story and health.

Assessing impact in health care

Over a decade ago, a cross-government and social care sector working party produced the document "Putting People First: Transforming Adult Social Care" (IDEA, 2009), setting out the vision for adult social care and its direction over the next ten years. It was a keystone document in that it set forth a strategic direction generically known as "personalisation", highlighting the importance of the individual experience. Similarly, Shepherd et al. (2010), in their position paper for the Sainsbury Centre for Mental Health, identified people's lived experience as the most potent driver of organisational change within a culture of recovery. This has positioned and framed the work of external providers ever since, with an emphasis on co-production, laying the ground for participatory methods of working and of evaluation.

In terms of approaches to evaluating work within this broader professional context, there is considerable variety in how and why evaluation is done. For example, Trochim (2006) postulates that evaluation strategies fall broadly into four major groups: scientific/experimental, management-orientated systems, qualitative/anthropological and participant-orientated (the last of which seems appropriately aligned to this context). In contrast, Mertens and Wilson (2012) propose four categories of evaluative purpose: to determine inputs and need, to improve or change practices, to assess programme effectiveness and to address issues of social justice. Again, these seem relevant to framing practitioner-oriented evaluation in the above professional context.

Within these broader approaches, there are specific methodologies which are used in contemporary health care settings. One of the most popular, and one which continues to influence many other models, is Kirkpatrick’s (1998) model and toolkit, which was developed for assessing the impact and outcomes of learning and development programmes. Bespoke methodological approaches utilising Kirkpatrick’s thinking have been developed by governments. For example, the Impact Evaluation Model uses principles of outcomes-based accountability, and has been recommended by the UK Government for localised impact evaluation of activities, especially around service and workforce reform. Reio et al. (2017), however, critique Kirkpatrick’s work as being overly focussed on the achievement of training outcomes rather than on the impact on the stakeholder and whether their needs have been met. Reio et al. propose that stakeholders should be able to contribute to design, development and evaluation.

Return on investment (ROI) models have also been adopted to measure impact in a very specific and narrow sense (Wall et al., 2016, 2017). More recently, social return on investment (SROI) methodologies have emerged which are participatory in nature, emphasising outcomes which are valued by people, including stakeholders and beneficiaries of social programmes, and providing a participatory mechanism for their voice or story to be heard. For example, The SROI Network (2011), which promotes the use of SROI methods internationally to address social injustice, claims:

SROI tells the story of how change is being created by measuring social, environmental and economic outcomes […] SROI is a framework to structure thinking and understanding. It’s a story not a number. The story should show how you understand the value created, manage it and can prove it

(SROI Network, 2011, p. 2).

Other forms of participatory evaluation methodologies typically assess progress, performance and impact of a project, but with a primary objective of creating a culture of learning for project staff, beneficiaries and partners. Hasenfeld et al. (2004), as an example, promote the participatory model of evaluation (PME) as a highly collaborative process, relying upon a feedback loop from partners and staff. In their work, Hasenfeld et al. (2004) explored how involving clients in the community in ongoing feedback makes them part of the evaluation process. The validity accorded to case studies by PME lends credence to personal narratives as a methodology in evaluation.

The practical issues of implementing such complex evaluation approaches can stifle widespread use (Wall et al., 2017). In contrast to complex methodologies, Davies and Dart (2005) claim that the most significant change (MSC) technique serves as a legitimate form of participatory monitoring and evaluation. MSC was first developed as a means of auditing changes in overseas development aid projects, but can support organisational learning and service improvement. It is participatory because of the multiple perspectives elicited. As Davies and Dart (2005) explain:

[…] it contributes to evaluation because it provides data on impact and outcomes that can be used to help assess the performance of the program as a whole […] MSC makes use of […] “thick description”, closely textured accounts of events, placed in their local context, and where the role of the observer and their subjectivity is visible. In the world of ordinary people these often take the form of stories or anecdotes

(p. 67).

Methodology

This paper adopts a case study approach to document and examine the impact of story in the context of health care organisations, and was undertaken by a practitioner researcher pursuing the dual aims of contributing to the development of the organisations and generating new practitioner knowledge for the individual (Wall, 2014; Heikkinen et al., 2016). The two case studies relate to two story-based intervention projects focussed on organisational development and service improvement as dual outcomes. The projects were delivered within two public sector organisations in England: an adult social services organisation (referred to here as “Social Care Co.”) and a health care organisation (referred to here as “Recovery Co.”).

The intention was a form of case study which was discovery led and inclined towards emphasising social processes and relationships within a natural phenomenon, rather than restricting attention to outcomes, and which is also suitable for comparative case studies (e.g. of individuals or organisations). In this way, the descriptive case studies focus on contemporary events, explored in their real-life contexts rather than in a controlled environment (Yin, 2013). The use of multiple cases also provides the opportunity to compare and contrast findings across different real-life contexts, in terms of different organisational cultures and story interventions (Yin, 2013). However, it is acknowledged that the case study approach is also vulnerable to criticism regarding the credibility of generalisations from its findings (Denscombe, 2010).

Several options for data collection methods were considered when designing the project. Given the nature of the projects, it was argued that evaluation can be a “sense-making process” in organisations (Weick et al., 2005; Weick, 2016), as well as one that collects and interprets data, and that sharing personal stories could itself be a useful experience for participants in the evaluation. Furthermore, it was also argued that practitioner researchers providing services to health care organisations need to consider how the provider-client relationship might be affected by their choice of methods; for example, a rigorous “root and branch” investigative survey might jeopardise future relationships.

It was therefore decided that the project data would be collected through semi-structured interviews incorporating the MSC method (Davies and Dart, 2005). This was chosen as the most ecologically appropriate approach for the well-being and narrative nature of the project and the practice setting of the practitioner researcher, and because it generated valid results which could then be utilised in practice to support organisational development and service improvement. The interview guide, which formed the initial proposed evaluation framework to be used with story projects, is presented in Box 1. For both case studies, purposive (or purposeful) sampling was used for data collection, with between 6 and 12 staff and service users in each. The evaluation framework (interview questions) was initially trialled outside of the two evaluations, and questions which appeared to prompt repeated answers were adjusted.

Box 1. Initial evaluation framework (interview guide) for evaluating story projects

As a result of participating in the (project):

  1. What were your personal expectations of what the story project would deliver in terms of your own learning? (Prompts: In what way were these realised? In what way were they different?)

  2. What has been your experience of using what you learnt in your everyday environment? (Prompts: new skills, understanding, or behaviours)

  3. What has particularly enabled you to use this learning in your workplace? (Prompts: Opportunities? Particular support?)

  4. What has made it difficult to use this learning in your workplace? (Prompts: Obstacles? Lack of opportunities? Culture?)

  5. Looking back at the last 6 months, i.e. the duration of the current story project, what has been the most significant change for you in your own work as a result of this project? (Prompts: Behaviours? Practices? Team work?)

  6. What were your initial expectations of what the story project would deliver in terms of organisational benefits? (Prompts: In what way were these realised? In what way were they different?)

  7. What have been the actual outcomes and benefits to the organisation? (Prompts: Efficiency. Budgetary. Knowledge. Partnership working)

  8. Looking back at the last [XX] months, i.e. the duration of the current story project, what do you think has been the most significant change in the organisation’s service delivery, as a result of this project? (Prompts: Better delivery of Recovery services. Better teamwork. Better partnership working)

  9. Looking back at the last [XX] months, i.e. the duration of the current story project, what has been the most significant change for your clients (and/or stakeholders and partnership organisations)? (Prompts: Social return on investment. Improvements in well-being or confidence. Better client/organisation relationships. Better take-up of services)

  10. Looking ahead, what are your recommendations to your organisation regarding future story-based projects? (Prompts: More workshops? More training? Sustainability and improvements? Less/None?)

Case study 1: Recovery Co.

Background

The story project was commissioned by a health care organisation which focusses on the recovery of adults who have experienced or are currently experiencing mental health issues (also referred to as “service users”). The project began in October 2012, and explicitly aimed to support culture change, challenge attitudes and practices around “recovery”, improve organisational teamwork, increase the well-being of service users, develop a shared vision for the “recovery” team and improve the team’s profile within the wider organisation. The main intervention involved story-based team-building workshops and “Story Cafes”, which use stories and conversational circles as springboards to new empathetic awareness and learning.

Evaluating the project

The evaluation was conducted through semi-structured interviews using the evaluation framework (Box 1). The evaluation focussed on learning and outcomes, and participants were asked to identify the MSC in the following areas: own practice, service delivery and client benefits. Six people participated, including service users, organisational staff, “recovery” leaders and team members (RIPFA, 2011). The interviews were conducted face to face and recorded. The ethicality of this approach was discussed at length with the organisation and the “recovery” team, and agreed before any data were collected.

Organisational outcomes and impact

Outcomes from the project included: setting the scene for creative team working; encouraging innovative working; creating a sense of community in the team; changes in the team’s experience of itself; changes in behaviour as a team leader and manager; legitimising new ways of reporting incidents; using narrative to support staff in the incident risk review process, in handling difficult emotions and in staff well-being; encouraging the use of anecdotal evidence to inform higher-level management; and significant changes in team practices.

Service delivery

In terms of service delivery, the evaluation identified a number of MSCs. The first area of change was that communications within the team had improved and that this represented a cultural shift. As one participant put it: “Because we’re using it (stories), it’s changing some of the culture already, and the language that we use and the way that we speak to each other”.

The second area of MSC from the story work in the organisation related to developing/finding a community of “recovery”, giving credence to more creative and innovative work, and supporting the promotion of “recovery” principles. One research participant reported a change in knowledge sharing within the “recovery” teams and to higher levels in the organisation (see also reference to the risk procedure above). Exploring the broader impact, the participant further felt that her experience of the Story Café project was helping guide her through leading a piece of work around values across a number of organisational units and processes, for example, revising the annual appraisal and personal development review and supervision templates, to ensure culture change and workforce well-being.

Overall, understanding the importance of using story approaches and seeing the impact of story on the team was reported to have real significance in the context of, for example, very high-profile health care incidents, and of the importance of taking anecdotal evidence seriously and linking this to best practice. Considerable importance was given to the ethics and process of delivery and evaluation: how to collect narrative, use it responsibly and have a process around its collection and use.

Client benefits

The evaluation found that engagement with clients was improved, as was their relationship with the “recovery” teams, in addition to the level of trust in the team. It seemed that the joint participation of service users and staff in the Story Café prompted a change in attitudes towards service users, their capabilities and the respect shown towards them. Although no baseline evaluation of well-being was carried out, there was positive feedback from service users in the Story Cafes (informal storytelling and conversation circles). It was reported that the Story Cafes enabled service users to be seen as having more capabilities, and this was considered to be helpful in creating a culture shift towards more inclusive approaches to “recovery”.

A summary of the outcomes and impacts generated through use of the evaluation framework (Box 1) is outlined in Table I.

Case study 2: Social Care Co.

Background

The second case is based in a public sector social care organisation, and was specifically commissioned by the organisational lead for the “personalisation” agenda. Starting in April 2012, the project aimed to collate evidence of personalisation practices and generate a repository of this evidence, in order to: inform and educate staff, policy makers, other stakeholders and the public about personalisation practices; develop staff skills around gathering and using customer stories for service improvement in training and teams; and improve internal and external communications and engagement.

Evaluation of the project

The evaluation was carried out through semi-structured interviews, in person or by telephone, using the evaluation framework designed for the project (Box 1). The evaluation was agreed with the organisational leaders and complied with the organisation’s own research governance framework. The project involved interviewing ten service users.

Organisational outcomes

The evaluation identified that all of the participants stressed the importance of the following MSCs: the achievement of better engagement with clients, and public education and awareness of personalisation practices. However, there was a sense from all participants that the story-gathering group now needed to be supported and developed for its potential outcomes to be realised fully. As one manager said: “We’ve got to do something strategic to create the space for this”.

In addition to the hard outcome of a media-based repository of stories, the softer outcomes related to partnership working and engagement. Although organisational outcomes could not readily be evaluated or costed in terms of ROI, the project was also considered to have built a platform and a legacy for the future.

Service delivery

The evaluation identified that the story project had successfully supported the “transformation agenda”, enabling more creative support planning as well as challenging resistance to culture change. One participant expressed: “The stories are for me the most powerful thing we can offer in this climate in terms of the Change Agenda”. According to the participants, this has impacted upon service delivery where clients’ needs have been met more effectively through a shift in primary focus towards story listening rather than assessment of a “Category of Need” (a bureaucratic assessment of a specific need). Participants reported seeing the beginnings of meaningful change in service delivery of “personalisation”. For one social worker, the time spent in listening to stories was very significant:

What I’m hearing is different – I’m listening to the words that the person uses and how they describe their experiences and what they’re describing because that could be the most important thing they need help with – rather than the Category of Need.

Client impact

For service users, an “SROI” was identified as a common theme: “Where the […] project has been able to influence the practice of staff, then people who use services are going to get a service that is much more tailored to their individual life histories and experiences”.

Similarly, well-being or a “therapeutic perspective” was a significant outcome for clients (“feeling listened to is very important”), along with greater consideration of what is important to them in their lives, as was raising awareness of the use of personal budgets. Additionally, through the involvement of partnership organisations and by providing a framework for knowledge sharing, better services can be offered through better multi-disciplinary working.

A summary of the outcomes and impacts generated through use of the evaluation framework (Box 1) is outlined in Table II.

Discussion

A cross-case analysis of the findings of both projects indicated similarities across several dimensions: how story work underpins radical organisational culture change, its training application for staff to become better educated around new policies and approaches in health and social care, and its impact on professional relationships, particularly partnership working and relationships with service users. A strong indication from this study is that story work enhances team building and benefits new projects in their early stages, as strong organisational outcomes were demonstrated for both projects.

The benefits to Recovery Co. were significant enough for both strategic level and other staff to extrapolate ways of integrating story work into management practice, such as staff support, knowledge sharing and leadership development. In as much as story work evidences good practice and aligns with the transformation of services, participants in both projects stated that the outcomes of the story projects potentially enhanced the reputation of the organisation as an “honest broker” (Social Care Co.) or as an “innovative” organisation (Recovery Co.).

Yet the two projects differed in their focus and ongoing issues. The Social Care Co. project had a skills development focus to support the evidencing of personalisation, whereas the Recovery Co. project focussed on team building and culture change (towards a “recovery culture”). In the Social Care Co. project, participants further reflected in broad terms on sustainability and developing systems to support their “story gatherers”, whereas in the Recovery Co. project, the reflection was towards further exploration of narrative approaches and how these could improve best practice at all levels.

Key themes and outcomes from the interviews were therefore mapped visually for each project using wordle software. Wordles are visual images easily created from key words emerging from the data; words are “weighted” by frequency of occurrence, so the most frequent words appear largest in the wordle. These were shared with the clients as a thematic illustration of project outcomes to assist with personal and organisational learning. The Social Care Co. wordle highlights that improvement in skills was dominant (story writing skills, creative thinking, better listening), as was improvement in service-related relationships (partnership working, engagement, personalisation) (see Figure 1), whereas the Recovery Co. wordle reflects the recovery team’s current focus on change and on therapeutic relationships (change management, relationships, culture change, well-being) (see Figure 2).
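The original projects used wordle software; the following minimal Python sketch shows a comparable way of producing a frequency-weighted word cloud, assuming the third-party wordcloud package is installed, and using illustrative (not actual) theme frequencies and an illustrative output file name.

from wordcloud import WordCloud

# Illustrative theme frequencies (not the projects' actual counts).
theme_frequencies = {
    "partnership working": 7,
    "engagement": 7,
    "personalisation": 7,
    "creative thinking": 7,
    "better listening": 5,
}

# Words are sized ("weighted") in proportion to how often they occurred.
cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate_from_frequencies(theme_frequencies)
cloud.to_file("social_care_co_wordle.png")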

There are also wider implications of such variability in project impacts and outcomes. Specifically, it was recognised that some of the evaluation framework prompts were not necessarily relevant in both contexts, and reflected the nature of the original scoping of the project (as discussed above). The initial evaluation framework developed for the purposes of evaluating story work in workplaces therefore needed to be adjusted to reflect the diversity of projects that would be developed. Reflections and decisions about this are summarised in Table III.

This reflects the responsive design of evaluation in workplace learning projects. For example, on reflection, questions 2 to 4 in the evaluation framework are most relevant where the project involves skills training and/or mentoring, and less so where the project delivers service user interventions or team-building workshops. Questions 6 and 7 are difficult to answer if the participants are not responsible for, or knowledgeable of, strategic and organisational goals, or where projects involve participation by stakeholder and partnership organisations. Similarly, question 9 presupposes that the project is delivered to those who have a direct relationship with service users. As such, the evaluation framework design needs addressing early in the project design, and needs to be purposively linked to organisational outcomes; this echoes Reio et al.’s (2017) critique of evaluation being overly focussed on the achievement of training outcomes rather than on the impact on the stakeholder and whether their needs have been met.
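As an illustration of how such contextual tailoring might be operationalised, the following Python sketch encodes the Box 1 questions with applicability tags loosely based on the comments in Table III. The tag names, paraphrased question texts and selection function are hypothetical assumptions for illustration, not part of the original framework.

# Hypothetical tags: "generic" questions are always asked; contextual ones
# are included only when the project has the matching characteristic.
GENERIC = "generic"
SKILLS = "skills_training"        # contextual: projects with a skills element
STRATEGIC = "management_level"    # contextual: management or leadership level

framework = [
    (1, "Personal expectations of own learning", GENERIC),
    (2, "Experience of using the learning day to day", GENERIC),
    (3, "Enablers for using the learning in the workplace", SKILLS),
    (4, "Obstacles to using the learning in the workplace", SKILLS),
    (5, "Most significant change in own work", GENERIC),
    (6, "Initial expectations of organisational benefits", STRATEGIC),
    (7, "Actual organisational outcomes and benefits", STRATEGIC),
    (8, "Most significant change in service delivery", STRATEGIC),
    (9, "Most significant change for clients and stakeholders", GENERIC),
    (10, "Recommendations for future story-based projects", GENERIC),
]

def questions_for(project_tags):
    """Select generic questions plus any contextual ones that apply."""
    wanted = {GENERIC} | set(project_tags)
    return [(number, text) for number, text, tag in framework if tag in wanted]

# For example, a project delivered at management or leadership level:
for number, text in questions_for([STRATEGIC]):
    print(number, text)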

Conclusion and implications

This paper concludes that narrative or story-based work is efficacious and credible in generating workplace impacts, especially in the context of service transformation and improvement, and that practitioners can examine such dimensions in participatory ways. The willingness of staff to be involved in the projects examined in this paper further demonstrated that evaluation is regarded as valuable and as a way, in itself, of engaging staff. Significantly, the involvement of service users in the evaluation was also said to have “recovery potential”, which further emphasises the suitability of participatory methods of project design and evaluation, as well as of research more broadly (IDEA, 2009; MacIntosh et al., 2017).

The richness of the evaluation reflects two areas: the reported processes that story activates and shapes, including sense making, team working, re-framing and collective empathy towards workplace impacts (Gabriel and Connell, 2010; Reissner, 2011; Wall and Rossetti, 2013; Pässilä and Vince, 2016); and the reported impacts of practical and participatory forms of MSC-informed evaluation, which facilitate similar processes of sense making, framing, re-framing and collective empathy, but also motivation to act, change and improve (Wall et al., 2017). In addition, the experiential content elicited through the evaluation was found to be persuasive when presenting to higher level managers, as it provided strong links between the story work and organisational strategic priorities and pathways. As a result, there are a number of specific implications for different stakeholder groups, and these are presented in Table IV.

In this way, this paper argues that evaluative frameworks benefit from being designed in conjunction with the client organisation, to align with their outcomes, and from being conducted through participatory forms. The decision to adapt the MSC method and integrate it into the evaluation framework enabled strong links to the use of stories as data and evidence. Moreover, the MSC domains of change can be identified through a top-down or bottom-up process, through participatory consultation; in other words, the framework can be adapted to the specific aims and cultural context of the project, for example, with more or less skills content, or more or less service user involvement.

Findings showed that the MSC-informed questions can generate important stories as data in work-based projects, and can accommodate scaling up. In addition, participatory or co-produced evaluative design has exciting potential, and one which aligns readily with the guiding ethos of health and social care organisational governance and culture. In this way, this paper documents contemporary evidence of the variety of organisational development and service improvement that story work can generate as part of workplace learning projects.

Figures

Figure 1. A wordle-analysis of the outcomes from the Social Care Co. project

Figure 2. A wordle-analysis of the outcomes from the Recovery Co. project

Table I. Recovery Co.’s summary of project outcomes

Personal expectations | Personal learning | Organisational outcomes | Change and/or impact on own work | Change and/or impact on service delivery | Change and/or impact on clients and service users
Check alignment with recovery principles (2)
Deepen understanding of own work
Team development (2)
Therapeutic/well-being benefit (2)
Change management (2)
Theory of story practice
Get to know colleagues
New group energy
Change in staff attitudes towards service users (4)
Team development (2)
Impact on therapeutic relationship (2)
Setting scene for RAG teams to work more creatively (2)
Potential of using Narrative and stories in organisation to support other processes/staff development (2)
Experiencing stories is powerful, connects people (2)
Galvanising
Creative team practices (2)
More supportive management practices (2)
Encouraging creativity in the team (2)
Team development (2)
New ways of knowledge sharing (2)
Culture change (2)
Links to other processes and projects (2)
Improved therapeutic relationship (2)
Stories as a powerful tool (2)
Knowledge sharing (2)
Working in a holistic and supportive way (2)
Improved communications (2)
Confidence to use narrative to support staff in risk review
Networking with community partners
Clarifying thinking
Cultural shift (2)
Alignment with recovery impact assessment (2)
RAG team development as a community (2)
Renewed team purpose (2)
Using narrative and stories in organisation (2)
Model for future narrative projects (2)
Well-being of workforce
Enhanced offer (4)
Change in staff attitudes towards service users (4)
Communication (4)
Self-expression (3)
Therapeutic benefit (2)
Socialising; being part of a group
Concentration
Confidence – “being myself”
Understanding 5 ways of well-being

Note: Numbered themes refer to order of frequency

Table II. Social Care Co.’s summary of project outcomes

Personal expectations | Personal learning | Organisational outcomes | Change and/or impact on own work | Change and/or impact on service delivery | Change and/or impact on clients and service users
Develop story writing skills (7)
Client engagement (7)
Evidence collecting for Personalisation (7)
More sympathetic approach (3)
Tool to promote organisation
A tool for collecting and analysing information
Tap into practical experience
Explicit organisational learning shared in public arena
Developed listening skills (5)
Better listening; listening differently (5)
Confidence to talk to people about their life experience and needs (5)
Confidence to write up stories (5)
Changing ways of thinking about situation (4)
Letting clients have more time to tell their story in their words (3)
Sharing experiences with other story gatherers
Impact of different media on presenting stories
Impact on personal life (listening to children)
A new way of learning about people (behaviours)
Educating public (7)
Educating social workers (7)
Partnership working (7)
Staff skills and knowledge development (5)
Improvements in efficiency (4)
Cultural shift (4)
Assist positive risk taking (2)
New ways of working (2)
Material (stories) for Training
First step in the right direction
Cost-effective
Better understanding of service user’s perspective
Partnership involvement (7)
Listening skills (6)
Transformation of service delivery (5)
Different ways of working (5)
Story awareness (4)
Meeting client’s needs (4)
More effective use of time (3)
Significant contribution to transformation agenda (2)
Better recording (profile, care plan, journal)
Recognition of social workers as champions
Knowledge-sharing skill
Engagement with service users (7)
Creative thinking (7)
Partnership working (7)
Sharing good practice (5)
More person-centred approach (4)
Impact on resistance to culture change (2)
Engendering trust in the profession and co-operation (2)
Transferable knowledge
Supporting creative thinking and practice
Tool for social workers
Supports personalisation (7)
Better personalisation services (6)
Improved awareness of personal budgets (5)
Trust and confidence in services (4)
Better services through multi-disciplinary working (4)
Social return
culture/relationship shift
Engages co-production
Therapeutic perspective
Feeling empowered
Better working

Note: Numbered themes refer to order of frequency

Table III. Reflections on implementing the evaluative framework (EF)

Evaluation framework (EF) interview question | Response: Recovery Co. | Response: Social Care Co. | Comment
 1. What were your personal expectations of what the story project would deliver in terms of your own learning? (Prompts: In what way were these realised? In what way were they different?) | Leads and those involved with the design of the project responded easily; OTTs did not | Cohort participants including partnership organisations responded easily. Not asked of strategic lead | Keep in the generic evaluation framework
 2. What has been your experience of using what you learnt in your everyday environment? (Prompts: new skills, understanding, or behaviours) | Leads and those involved with the design of the project responded easily; OTTs did not | Cohort participants including partnership organisations responded easily. Not asked of strategic lead | Keep in the generic evaluation framework
 3. What has particularly enabled you to use this learning in your workplace? (Prompts: Opportunities? Particular support?) | Leads and those involved with the design of the project responded easily; OTTs did not | Cohort participants including partnership organisations responded easily. Not asked of strategic lead | Contextual: use in evaluation framework for projects with skills training element
 4. What has made it difficult to use this learning in your workplace? (Prompts: Obstacles? Lack of opportunities? Culture?) | Leads and those involved with the design of the project responded easily; OTTs did not | Cohort participants responded easily. Not asked of strategic lead | Contextual: use in evaluation framework for projects with skills training element
 5. Looking back at the last 6 months, i.e. the duration of the current story project, what has been the most significant change for you in your own work as a result of this project? (Prompts: Behaviours? Practices? Team work?) | Leads and those involved with the design of the project responded easily; OTTs made partial response | Cohort participants responded easily. Not asked of strategic lead | Keep in the generic evaluation framework
 6. What were your initial expectations of what the story project would deliver in terms of organisational benefits? (Prompts: In what way were these realised? In what way were they different?) | Leads and those involved with the design of the project responded easily; OTTs did not | Strategic lead responded easily; partnership organisation member did not | Contextual: use in evaluation framework for projects delivered at management or leadership level
 7. What have been the actual outcomes and benefits to the organisation? (Prompts: Efficiency. Budgetary. Knowledge. Partnership working) | Leads and those involved with the design of the project responded easily; OTTs did not | Strategic lead responded easily; partnership organisation member did not. Project manager had difficulty responding | Contextual: use in evaluation framework for projects delivered at management or leadership level; review sample selection
 8. Looking back at the last [XX] months, i.e. the duration of the current story project, what do you think has been the most significant change in the organisation’s service delivery, as a result of this project? (Prompts: Better delivery of Recovery services. Better teamwork. Better partnership working) | Leads and those involved with the design of the project responded easily; OTTs did not | Strategic lead responded easily; partnership organisation member did not. Some difficulty in responding from original project manager (see above) | Contextual: use in evaluation framework for projects delivered at management or leadership level; review sample selection
 9. Looking back at the last [XX] months, i.e. the duration of the current story project, what has been the most significant change for your clients (and/or stakeholders and partnership organisations)? (Prompts: Social return on investment. Improvements in well-being or confidence. Better client/organisation relationships. Better take-up of services) | Leads and those involved with the design of the project responded easily; OTTs made partial response | Strategic lead responded easily; partnership organisation member did not. Some difficulty in responding from original project manager (see above) | Keep in generic evaluation framework; review sample selection
10. Looking ahead, what are your recommendations to your organisation regarding future story-based projects? (Prompts: More workshops? More training? Sustainability and improvements? Less/None?) | All interviewees responded easily | All interviewees responded easily | Keep in the generic evaluation framework

Table IV. Summary of implications for story-practitioners, project evaluators and organisations

Implications for story work in organisational change and for evaluation frameworks, strategies or techniques:

Story-practitioners:
Access and utilise evidence to demonstrate the variety of impacts that can be generated through story work
Utilise case examples to demonstrate the value, richness and possible application areas of story work
Clearly define their own evaluation “toolkit” as a flexible menu of options, which might include formal methodologies (as required by clients) as well as adapted techniques (such as MSC)
Negotiate the evaluation framework and techniques with the project owners, to fit their particular outcomes as well as their requirements

Project evaluators:
Position story work as a way to inform the strategic planning, monitoring and evaluation of strategic change programmes; notice the who, what, why, when elements of the story construction to identify issues or ideas
Collect and analyse a variety of stories (e.g. from different stakeholders) at the various stages of the project process (e.g. design, delivery, decision-gates, evaluation); story listening and recording processes will be important
Involve different parts of the organisation at the evaluation stage to be able to make sense of alternative stories as data/evidence for (1) progression or change and (2) deliverables, impacts and outcomes
Involve partner organisations where possible in the original project to improve the reach and impact of workplace projects at the outset
Adopt MSC-informed questions to enable deeper levels of evidence to emerge

Organisations (e.g. health or social care):
Position story work as a way to inform and evaluate strategic commitments to service improvement; notice the who, what, why, when elements of the story construction to identify issues or ideas
Engage stakeholders across the organisation by capturing their stories, and telling them in planning and feedback contexts rather than confining them to managers or PR
Establish story generation mechanisms across the organisation, establish links to teams and managers, and develop skills in noticing story elements (e.g. storyline, characters, actors, transition stages and morals) (see Wall and Rossetti, 2013)
Utilise real client stories to enrich and “humanise” planning and strategy formulation processes
Establish story curation (collection and display) mechanisms across the organisation to make evaluation part of the culture
Incorporate partnership working and knowledge sharing around aspects of cultural change in the organisation

References

Alvesson, M. and Sandberg, J. (2013), “Has management studies lost its way? Ideas for more imaginative and innovative research”, Journal of Management Studies, Vol. 50 No. 1, pp. 128-152.

Alvesson, M. and Sandberg, J. (2014), “Habitat and habitus: boxed-in versus box-breaking research”, Organization Studies, Vol. 35 No. 7, pp. 967-987.

Boorman, S. (2009), “NHS health and well-being – final report”, Department of Health, Leeds.

Clark, M. and Purdy, R. (2007), Designing for Outcomes: A Practical Resource to Support Effective Design, Delivery and Evaluation of Work in Health and Social Care, CSIP & Department of Health, London.

Cunliffe, A.L. and Scaratti, G. (2017), “Embedding impact in engaged research: developing socially useful knowledge through dialogical sensemaking”, British Journal of Management, Vol. 28 No. 1, pp. 29-44.

Davies, R. and Dart, J. (2005), “The most significant change (MSC) technique – a guide to its use”, Version 1.00, CARE International, London.

Denscombe, M. (2010), The Good Research Guide: For Small-Scale Social Research Projects, McGraw-Hill/Open University Press, Maidenhead.

Ellis, J., Parkinson, D. and Wadia, A. (2011), “Making connections: using a theory of change to develop planning and evaluation”, Charities Evaluation Services, February, London.

Gabriel, Y. (2008), Organizing Words: A Critical Thesaurus for Social and Organization Studies, Oxford University Press, Oxford.

Gabriel, Y. and Connell, N.A.D. (2010), “Co-creating stories, collaborative experiments in storytelling”, Management Learning, Vol. 41 No. 5, pp. 507-523.

Guest, G., MacQueen, K. and Namey, E. (2012), Applied Thematic Analysis, Sage Publications, London.

Hasenfeld, Y., Hill, K. and Weaver, D. (2004), A Participatory Model for Evaluating Social Programs, The James Irvine Foundation, Fresno, CA.

Heikkinen, H.L.T., de Jong, F.P.C.M. and Vanderlinde, R. (2016), “What is (good) practitioner research?”, Vocations and Learning, Vol. 9 No. 1, pp. 1-19.

IDEA (2009), “New routes to better outcomes”, Learning in Health and Social Care, Vol. 7 No. 4, pp. 227-234.

Johnston, J. and Reeves, A. (2017), “Assessing research performance in UK universities using the case of the economics and econometrics unit of assessment in the 1992-2014 research evaluation exercises”, Research Evaluation, Vol. 26 No. 1, pp. 28-40.

Kirkpatrick, D.L. (1998), Evaluating Training Programs: The Four Levels, 2nd ed., Berrett-Koehler, San Francisco, CA.

McCormack, C. and Milne, P. (2003), “Stories create space for understanding organisational change”, Qualitative Research Journal, Vol. 3 No. 2, pp. 46-59.

MacIntosh, R., Beech, N., Bartunek, J., Mason, K., Cooke, B. and Denyer, D. (2017), “Impact and management research: exploring relationships between temporality, dialogue, reflexivity and praxis”, British Journal of Management, Vol. 28 No. 1, pp. 3-13.

MacLeod, D. and Clarke, N. (2009), Engaging for Success: Enhancing Performance through Employee Engagement, Department of Business, Innovation and Skills, London.

Mertens, D.M. and Wilson, A.T. (2012), Program Evaluation Theory and Practice: A Comprehensive Guide, Guilford Press, Guilford.

NEF (2011), Five Ways to Wellbeing, The New Economics Foundation, London.

Ozanne, J.L., Davis, B., Murray, J.B., Grier, S., Benmecheddal, A., Downey, H., Ekpo, A.E., Garnier, M., Hietanen, J., Le Gall-Ely, M., Seregina, A., Thomas, K.D. and Veer, E. (2017), “Assessing the societal impact of research: the relational engagement approach”, Journal of Public Policy & Marketing, Vol. 36 No. 1, pp. 1-14.

Pässilä, A. and Vince, R. (2016), “Critical reflection in management and organization studies”, in Fook, J., Collington, V., Ross, F., Ruch, G. and West, L. (Eds), Researching Critical Reflection Multidisciplinary Perspectives, Routledge, London, pp. 48-62.

Pettigrew, A. and Starkey, K. (2016), “The legitimacy and impact of business schools-key issues and a research agenda”, Academy of Management Learning & Education, Vol. 15 No. 4, pp. 649-664.

Rath, T. and Harter, J.K. (2010), Wellbeing: The Five Essential Elements, Gallup Press, New York, NY.

Reio, T.G., Rocco, T.S., Smith, D.H. and Chang, E. (2017), “A critique of Kirkpatrick’s evaluation model”, New Horizons in Adult Education and Human Resource Development, Vol. 29, pp. 35-53.

Reissner, S.C. (2011), “Patterns of stories of organisational change”, Journal of Organizational Change Management, Vol. 24 No. 5, pp. 593-609.

RIPFA (2011), Reablement: Policy, Research and Practice, RIPFA, Dartington.

SCIE (2010), The Good Practice Framework: A Full Guide to the GPF, Social Care Institute for Excellence, London.

Schalock, R.L. (2001), An Overview of Outcome Based Evaluation, Kluwer Academic/Plenum Publishers, Boston, MA.

Shepherd, G., Boardman, J. and Burns, M. (2010), Implementing Recovery: A Methodology for Organisational Change, Sainsbury Centre for Mental Health, London.

SROI Network (2011), “The seven principles of SROI”, SROI Network, Liverpool.

Trochim, W.M.K. (2006), “Introduction to evaluation”, Web Centre for Research Methods, available at: www.socialresearchmethods.net/kb/intreval.php (accessed 9 November 2017).

Wall, T. (2013), “Diversity through negotiated higher education”, in Bridger, K., Reid, I. and Shaw, J. (Eds), Inclusive Higher Education: An International Perspective on Access and the Challenge of Student Diversity, Libri Publishing, Middlesex, pp. 87-98.

Wall, T. (2014), “Transforming the research-learning performance of professional lifelong learners”, Procedia – Social and Behavioral Sciences, Vol. 116, pp. 189-193.

Wall, T. (2015), “Global perspectives on profound pedagogies”, Higher Education, Skills and Work-Based Learning, Vol. 5 No. 4, pp. 323-338.

Wall, T. (2016a), “Author response: provocative education: from the Dalai Lama’s Cat® to Dismal Land®”, Studies in Philosophy and Education, Vol. 35 No. 6, pp. 649-653.

Wall, T. (2016b), “Žižekian ideas in critical reflection: the tricks and traps of mobilising radical management insight”, Journal of Work-Applied Management, Vol. 8 No. 1, pp. 5-16.

Wall, T. (2017a), “Reciprocal pedagogies – flexible learning exemplar”, in Devitt-Jones, S. (Ed.), Flexible Learning Practice Guide, HEA/QAA, York, pp. 21-22.

Wall, T. (2017b), “A manifesto for higher education, skills and work-based learning: through the lens of the manifesto for work”, Higher Education, Skills and Work-Based Learning, Vol. 7 No. 3, pp. 304-314.

Wall, T. (2018), “Infusing ethics into leadership learning & development”, in Knights, J., Grant, D. and Young, G. (Eds), Leading Beyond the Ego: How to be a Transpersonal Leader, Routledge, London (forthcoming).

Wall, T. and Rossetti, L. (2013), Story Skills for Managers, CreateSpace, Charleston, SC.

Wall, T., Iordanou, I., Hawley, R. and Csigás, Z. (2016), Research Policy and Practice Provocations: Bridging the Gap: Towards Research that Sparks and Connects, the European Mentoring and Coaching Council, Brussels.

Wall, T., Jamieson, M., Csigás, Z. and Kiss, O. (2017), Research Policy and Practice Provocations: Coaching Evaluation in Diverse Landscapes of Practice – Towards Enriching Toolkits and Professional Judgement, the European Mentoring and Coaching Council, Brussels.

Weick, K., Sutcliffe, K. and Obstfeld, D. (2005), “Organizing and the Process of sensemaking”, Organization Science, Vol. 16 No. 4, July-August, pp. 409-421.

Weick, K.E. (2016), “Constrained comprehending: the experience of organizational inquiry”, Administrative Science Quarterly, Vol. 61 No. 3, pp. 333-346.

Yin, R.K. (2013), Case Study Research: Design and Methods, 5th ed., Sage, London.

Further reading

Elliott, J. (2006), Using Narrative in Social Research, Sage Publications, London.

Acknowledgements

The writing of this paper was supported by The University of Chester QR Grant Scheme.

Corresponding author

Lisa Rossetti can be contacted at: story@lapidus.org.uk
