Practical and theoretical judgment in data-driven financial due diligence

Tim Kastrup (Department of Business Studies, Uppsala University, Uppsala, Sweden)
Michael Grant (Department of Business Studies, Uppsala University, Uppsala, Sweden)
Fredrik Nilsson (Department of Business Studies, Uppsala University, Uppsala, Sweden)

Accounting, Auditing & Accountability Journal

ISSN: 0951-3574

Article publication date: 8 July 2024

Abstract

Purpose

New digital technologies are reshaping the business landscape and accounting work. This paper aims to investigate how incorporating more data and new data analytics (DA) tools impacts the role and use of judgment in financial due diligence (FDD).

Design/methodology/approach

The paper reports findings from a field study at a Big Four accounting firm in Sweden (“DealCo”). The primary data includes semi-structured interviews, observations and other meetings. Theoretically, it draws on Dewey’s The Logic of Judgments of Practise and Logic: The Theory of Inquiry and distinguishes between theoretical (what is probably true) and practical judgment (what to do).

Findings

In DealCo’s FDD practice, using more data and new DA tools meant that the realm of possibility had expanded significantly. To manage the newfound abundance and to use DA effectively, DealCo’s advisors invoked practical and theoretical judgments in different stages and areas of the data-driven FDD. The paper identifies four critical uses of judgment: Setting priorities and exercising restraint (practical judgment) and forming hypotheses and doing sense checks (theoretical judgment). In these capacities, practical judgment and theoretical judgment were essential in transforming raw data into actionable insights and, in effect, an indeterminate situation into a determinate one.

Originality/value

The study foregrounds the practical dimension of knowledge production for decision-making and contributes to a better understanding of the role, use and importance of accounting professionals’ judgment in a data-driven world.

Citation

Kastrup, T., Grant, M. and Nilsson, F. (2024), "Practical and theoretical judgment in data-driven financial due diligence", Accounting, Auditing & Accountability Journal, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/AAAJ-11-2022-6167

Publisher

Emerald Publishing Limited

Copyright © 2024, Tim Kastrup, Michael Grant and Fredrik Nilsson

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

The world is going digital, and so is accounting (Knudsen, 2020; Möller et al., 2020). New data-gathering, -processing and -analysis tools and technologies, i.e. data analytics (DA), allow organizations to collect and analyze more data than ever before (Casas-Arce et al., 2022). These possibilities, in turn, may fundamentally reshape, if not disrupt, accounting work (Moll and Yigitbasioglu, 2019; Möller et al., 2020), especially in the context of accounting for decision-making (Davidson and Trueblood, 1961), i.e. how accounting professionals meet their responsibilities for “fiduciary decisions and […] broader management decisions” (Davidson and Trueblood, 1961, p. 582; see also Quattrone, 2016). Although the significance of DA – as opportunity and threat for the accounting profession – has been recognized (see Richins et al., 2017; Schneider et al., 2015), empirical insights into its potentially disruptive effects remain relatively scarce in the broader accounting field (e.g. Casas-Arce et al., 2022; Knudsen, 2020; Marton et al., 2024; Möller et al., 2020; Ruhnke, 2023).

DA is often equated with data-driven, fact-based decision-making (see Elgendy et al., 2022; Nielsen, 2018). This has led to questions about (the future of) accounting professionals’ judgment in information and knowledge production for decision-making (Arnaboldi et al., 2022; Knudsen, 2020; Quattrone, 2016). Some have voiced concerns that accounting information will turn into packaged goods (Quattrone, 2016, 2017) and that data, analytics and automation will crowd out judgment and leave little room for human intervention (see Bhimani and Willcocks, 2014; Scott and Orlikowski, 2012; Sutton et al., 2018). In contrast, many others believe that accounting information must always be judged, questioned and debated (see Quattrone, 2016, 2017), that the potential for automation in accounting has been overestimated (Korhonen et al., 2021) and that organizational decision-making processes, “like never before, will require sound human judgment” (Rikhardsson and Yigitbasioglu, 2018, p. 46; Arnaboldi et al., 2022).

One area of accounting in which digitalization and DA have provoked a silent revolution is financial due diligence (FDD) within mergers and acquisitions (M&A) (Brands and Holtzblatt, 2015; Tiemann and Hartman, 2013). FDD is a detailed inquiry into an acquisition target’s financial performance, financial position, and financial risks and opportunities, usually performed by a large accounting firm (Pomp, 2015). Not long ago, sellers would set up physical data rooms, stacked with yellowed binders. Today, they set up virtual data rooms, stacked with digital files. Furthermore, sellers now share more data than ever. To be able to use and analyze all these data, the FDD providers, i.e. the accounting firms, have invested in “deal analytics” (PwC, 2021) and “M&A technology” (Deloitte, 2021a), and their teams have started using new DA tools. EY (2020) promises “actionable insights at a faster pace and deeper level than ever before”; KPMG (2021) “better information, and smarter decision making.”

Although FDD has been identified as a particularly suitable application of DA in accounting by scholars and practitioners (e.g. Brands and Holtzblatt, 2015; Tiemann and Hartman, 2013), there is little academic literature on FDD and DA. As a consequence, little is known about how DA adoption has impacted the work of accounting professionals, including the use of judgment, in this context. To help fill this research gap and to contribute to an improved understanding of the role, use and importance of accounting professionals’ judgments in accounting for decision-making in the digital age, we carried out a field study at a Big Four accounting firm in Sweden (Pseudonym: “DealCo”) in which we explored the use of judgment in data-driven FDD.

Theoretically, we draw on the works of the pragmatist John Dewey, more specifically, The Logic of Judgments of Practise (1915) and Logic: The Theory of Inquiry (1938). Based on these seminal works, we conceptualize FDD as a data-driven inquiry that seeks to transform an indeterminate situation, in which buyers/clients have an investment hypothesis, into a determinate one, in which buyers take an investment decision. We also mobilize the distinction between theoretical and practical judgment. The former describes an “assertion about what is probably true or correct” (Downie and Macnaughton, 2009, p. 322), the latter a “judgment of what to do” (Dewey, 1915, p. 514).

Dewey’s framework recognizes that knowledge production is highly situational and typically involves practical complications, such as intense time pressure, a need to act and an imperative to justify one’s judgments and decisions (see Svetlova and Dirksen, 2014). At the same time, it allows for the possibility that organizational actors aspire to do the right thing and evaluate means with “ends-in-view” (Dewey, 1915, p. 518). With its joint concern for the situation and rational action within this situation, Dewey’s framework occupies the middle ground between hyper- (e.g. rational choice theory) and never-rational (e.g. political model) perspectives on human behavior – which is a reasonable position on professional(s’) behavior.

In DealCo’s FDD practice, more data and new DA tools meant that the realm of possibility, e.g. what data to request, what analyses to perform, and what results to present, had expanded significantly. To manage this abundance, and to use DA effectively, DealCo’s advisors invoked practical and theoretical judgments in different stages and areas of a data-driven FDD. On the one hand, they invoked practical judgment in setting priorities (e.g. which analyses to perform) and exercising restraint (e.g. when to terminate an analysis). On the other hand, they invoked theoretical judgment in forming hypotheses (e.g. about profit drivers) and doing sense checks (e.g. on management explanations). In these capacities, both practical and theoretical judgment were essential in transforming raw data into actionable insights and, in effect, an indeterminate situation into a determinate one.

This study contributes to the literature on digitalization in accounting, especially knowledge production for decision-making (Arnaboldi et al., 2022; Bhimani and Willcocks, 2014; Casas-Arce et al., 2022; Knudsen, 2020; Quattrone, 2016; Schneider et al., 2015). It shows that FDD, despite the increase in data volumes and computing power, remains a highly practical endeavor in which practical and theoretical judgments are exceedingly important. Our study underlines the need to view and investigate knowledge production as an inherently practical activity that is best understood in connection to the “contextual whole” (see Dewey, 1915, 1938). Previous research on accounting professionals’ judgment, most of which is about professional judgment in financial reporting and auditing, has often lacked detail with regard to the questions of how judgment is exercised and what constitutes good judgment (see West and Buckby, 2023). By framing FDD as a Deweyan inquiry and mobilizing the concepts of practical and theoretical judgment, we add detailed, practice-based insights to these questions and contribute to a better understanding of the role, use and importance of accounting professionals’ judgment in a data-driven world.

The remainder of the paper is structured as follows: First, we build the theoretical framework, which consists of Dewey’s theory of inquiry and a concise discussion of accounting in a digital world. Then, we describe the methodology and the research setting. Thereafter, we present the findings from the field study. Subsequently, we discuss the findings and relate them to the relevant literature. To conclude, we summarize the paper’s contributions, point out practical implications, note research limitations and outline future research opportunities.

2. Theoretical framework

2.1 Dewey’s theory of inquiry

The exercise of judgment can be regarded as “one of the distinguishing features of a profession” (West and Buckby, 2023, p. 121). In accounting, this is perhaps most salient in the context of financial statement preparation and auditing, where preparers and auditors are expected to use professional judgment. Although many definitions (e.g. AICPA, 2023) mention concepts such as experience, circumstances and appropriateness, the discourse on professional judgment in accounting has been dominated by (narrow) conceptualizations that foreground the correct application of accounting standards (West and Buckby, 2023). Accounting standards, however, give only incomplete guidance and do not concretize what constitutes good judgment (Brown et al., 1993). Furthermore, judgment and decision-making research in accounting has been dominated by psychological theories, models and perspectives (e.g. the lens model, the probabilistic judgment model, heuristics and biases) (Trotman et al., 2011; West and Buckby, 2023) that are mostly based on axiomatic rationality, that is, rationality as conformity with the axioms of rational choice theory (Gigerenzer, 2021).

These theories and models are geared toward small worlds in which all possible future states and their consequences are known; outside of small worlds, they have limited normative power (Gigerenzer, 2021). In the real world, professionals frequently encounter uncertain, ambiguous situations in which means are difficult to evaluate and outcomes are hard to foresee (see Daft and Lengel, 1986). Furthermore, in business practice, decision-making usually “imposes conditions […] such as: time pressure, the necessity to act, the fatefulness and immediacy of decisions, and the high demand for legitimization and justification” (Svetlova and Dirksen, 2014, p. 567). In such situations, axioms, calculations and computations can aid decision-making, but they cannot solve the problem; a more practical approach is needed. Different approaches have been suggested. Svetlova and Dirksen (2014, p. 566) list, for example, “‘muddling-through’ (Lindblom, 1959), the process of ‘calculating where we can’ (Keynes, 1936/2012) […] [and] ‘acting sensibly’ (Smith, 2011).” In one way or another, these approaches all propose pragmatic efforts to “carry an incomplete situation to its fulfillment” (Dewey, 1915, p. 514). This pragmatic effort, in turn, is the subject matter of Dewey’s (1938) theory of inquiry. In the following, we present this theory, introduce important analytical concepts and distinctions, and discuss why Dewey’s framework is suitable for analyzing accounting professionals’ use of judgment.

In Dewey’s writing, inquiry refers to the “transformation of an indeterminate situation into one that is so determinate in its constituent distinctions and relations as to convert the elements of the original situation into a unified whole” (Dewey, 1938, pp. 104–105). Moreover, Dewey insists that “we never experience nor form judgments about objects and events in isolation, but only in connection with a contextual whole. This latter is what is called a ‘situation’” (Dewey, 1938, p. 66). In that regard, he distinguishes between indeterminate and determinate situations. In indeterminate situations “habit and practice do not fully determine how the situation will or should proceed” (Brown, 2015, p. 65); they are too doubtful to be “straightened out, cleared up and put in order, by manipulation of our personal states of mind” (Dewey, 1938, p. 106). Only after transformation, i.e. after inquiry, do they become doubtless enough to determine what should be done; at that point, they classify as determinate situations (Dewey, 1938).

An inquiry can, thus, be characterized as “an active method of responding to problems that involve feeling, abstract analysis and practical experimentation” (Hildebrand, 2008, pp. 56–57). Indeed, an “[…] inquiry is not a purely logical process – feeling is a useful and orienting presence throughout each phase” (Hildebrand, 2008). At the same time, Dewey (1938, p. 161) cautions that affect and emotion can “frustrate wise decision.” It follows that feeling can help and hinder the inquiry (Guénin-Paracini et al., 2014, make a similar point about the role of fear in auditing). This shows a nuanced view on (the influence of) affect and emotion in judgment and decision-making that is suitable to study how accounting professionals, i.e. feeling humans, invoke judgment in practice. On a closely related note, Dewey (1938) acknowledges that inquiries and decision-making are subject to cognitive limitations (see “bounded rationality”, Simon, 1947). However, unlike heuristics and biases researchers, Dewey foremost views these limitations as a backdrop against which to analyze inquiry as a “search for solutions under conditions of uncertainty and necessity to act as well as to justify decisions” (Svetlova and Dirksen, 2014, p. 567). Hence, Dewey’s inquiry is a method of practical problem-solving in a specific situation.

To facilitate problem-solving in a specific situation and make this situation determinate, a particular kind of judgment is required: Practical judgment. Unlike theoretical judgment, which may be defined as “an assertion about what is probably true or correct,” practical judgment is about “what we ought to do” (Downie and Macnaughton, 2009, p. 322). Practical judgments can be differentiated from other types of judgments in several ways. Most importantly, practical judgments (1) concern situations that require action, (2) have “existential” consequences, i.e. they will affect the situation and its termination and (3) concern the evaluation of means with “ends-in-view,” that is, a simultaneous appraisal of means and ends (Dewey, 1915). It follows that actions are valuable in relation to the contextual whole and to the extent to which they can affect this whole for the better and help determine the situation (Sinclair, 2014). In FDD, this includes judgments about the value of requesting certain data, performing certain analyses and presenting certain conclusions, among other things. Unlike prizing/valuing, which refers to automatic esteeming and is rooted in routines, habits and personal preference (Dewey, 1915, 1938), appraisal/practical judgment is a matter of deliberate evaluation, justification and proof, as it “involves a judgment likely to be publicly defended” (DeMunck and Zimmermann, 2015, p. 122). Finally, it is worth noting that practical judgment bears some resemblance to “practical wisdom” as described by Aristotle and, later, Aquinas (e.g. an emphasis on action, the context and the joint consideration of acceptable means and worthy outcomes) (see Dunne, 1999; Ferrero et al., 2020; Melé, 2010). Despite these similarities, the concepts differ fundamentally in terms of context and purpose: Practical judgment is about situated problem-solving in the context of knowledge production; practical wisdom is about doing what is right in the context of virtuous living (see Ferrero et al., 2020). This makes the former the more expedient concept for a study on accounting for decision-making.

At this point, a note on the apparent dichotomy between practical judgment and theoretical judgment is in order. Without detailed knowledge of the “constituent distinctions and relations” of a situation, sound practical judgments about how best to handle and transform the situation are hardly possible. Dewey himself points out: “The judgment of what is to be done implies […] a statement of what the given facts of the situation are, taken both as indications of the course to pursue and as furnishing the means to be employed in its pursuit” (Dewey, 1915, pp. 509–510). It follows that practical judgment and theoretical judgment are tightly intertwined. Nonetheless, the distinction is analytically useful, as will become evident later. It should be noted that in Dewey’s original terminology, statements about the “given facts of a situation” are called propositions. Unlike practical judgments, propositions have no direct existential consequences; they do not alter the subject matter they concern (Welchman, 2002). Rather, propositions are intermediate, instrumental devices that can facilitate, orientate and re-orientate an inquiry (Brown, 2015). In contemporary philosophy, however, the notion is often used differently, and few today refer to (their) theoretical judgments as propositions. To avoid unnecessary confusion, we thus use the term theoretical judgment instead of propositions hereinafter.

To make the above easily accessible to accounting professionals, some explanations on the relationship between practical and theoretical judgment and (1) professional judgment and (2) professional skepticism are warranted. In a narrower (accounting) sense, professional judgment refers to the exercise of discretion in applying accounting standards in specific situations, based on relevant training, knowledge and experience (AICPA, 2023). In a wider sense, professional judgment refers to the exercise of discretion in determining the appropriate course of action in specific situations, based on those grounds. Professional skepticism, as defined in the auditing standards, includes a questioning mind and alertness to the possibility of error and fraud (Boyle and Carpenter, 2015). Compared with professional judgment in the narrower sense, practical judgment goes beyond the application of accounting/auditing standards and concerns a wider range of “what-to-do”-judgments implicated in accounting work. With respect to professional judgment in the wider sense, the distinction between practical and theoretical judgment draws attention to two different types of judgment that are part of professional judgment but become obscured in this broader concept. Thus, this distinction enables a sharper analysis of the use of judgment in accounting practice. Finally, (professional) skepticism may inform and/or suspend practical and theoretical judgment, for instance, by motivating the collection of additional data before drawing a conclusion.

To summarize, Dewey’s theory of inquiry offers a comprehensive framework for studying accounting professionals’ use of judgment in accounting for decision-making/data-driven FDD: It foregrounds key complexities, uncertainties and practicalities of knowledge production, it recognizes professionals’ cognitive limitations but maintains a nuanced view on the influence of feeling and affect, and it includes useful analytical distinctions, e.g. practical and theoretical judgment, that are novel to the accounting literature. As such, this framework can contribute to developing a better understanding of how accounting professionals invoke judgment, and how this is affected by ongoing digitalization.

2.2 Accounting in a digital world

New digital technologies are reshaping the business landscape and could have a major impact on the work of accounting professionals (Arnaboldi et al., 2022; Moll and Yigitbasioglu, 2019). One of these technologies is DA, which, in this paper, denotes the extensive use of data and data collection, processing and analysis tools to generate insights for decision-making (Rikhardsson and Yigitbasioglu, 2018; Schneider et al., 2015). According to Schneider et al. (2015, p. 719), DA is fundamentally changing task processes, “particularly those tasks that provide inference, prediction, and assurance to decision-makers.” For instance, DA makes it possible to collect, process and analyze larger and more varied datasets and sources than ever before, including so-called big data (Huerta and Jensen, 2017; Vasarhelyi et al., 2015). This includes internal data (e.g. operations data), external data (e.g. sociodemographic data), structured data (e.g. financial data) and unstructured data (e.g. textual data) (Bhimani and Willcocks, 2014; Richins et al., 2017). To make this more concrete, consider DA use in FDD.

While there is little scholarly research on this precise topic (an exception is Neumann, 2020), accounting firms’ announcements and publications give a glimpse of how DA is used in FDD practice (see Deloitte, 2021b; EY, 2018; KPMG, 2018; PwC, 2018). Three applications, each with its own rationale, stand out: speed, depth and scope. First, DA tools are used to automate and, in effect, significantly reduce time spent on data processing activities, such as cleaning or transforming data. Second, DA tools are used to process, analyze and draw conclusions from more detailed data, especially transaction-level data, which enables highly granular analyses of purchasing trends/patterns and profitability drivers, among other things. Third, DA tools are used to extend the scope of FDD, i.e. do more in the same amount of time. For instance, DA tools make it possible to integrate non-traditional data sources, such as geolocational and sociodemographic data, and to perform novel analyses, e.g. white spot analyses (in retail cases). This is in line with Neumann’s (2020) findings on DA adoption and use in FDD.

The above outlines how incorporating DA might enable accounting professionals to provide better information in support of better (management) decisions. In the public decision-making literature, these opportunities have been discussed under the labels “information optimization” and “decision optimization” (van der Voort et al., 2019). This implies that, as a consequence of DA use, human rationality may no longer be severely constrained by imperfect information, cognitive limitations and insufficient time. Moreover, DA may lessen the need to satisfice (see Simon, 1947), that is, to search for solutions that are good enough rather than optimal (Pittenger et al., 2023). Indeed, some hold that human-machine collaboration “can result in a collaborative rationality, extending beyond the classically defined bounded rationality” (Elgendy et al., 2022, p. 337). As is evident in the above, recent advancements in the area of data-intensive technologies have breathed new life into humans’ age-old quest for perfect information and rational decision-making (Quattrone, 2016). Moreover, they have given rise to the ideal of fact-based/data-driven decision-making (Elgendy et al., 2022; Nielsen, 2018).

According to Quattrone (2017, p. 608), this ideal “will lay the basis for a logical argument in favour of having algorithms replace double entry, coding experts replace accountants, and data scientists replace accounting professors.” Moreover, Quattrone sees a risk that accounting information may no longer be questioned or debated and that automated solutions will limit “the space for judgment to the very last, and short, part of the relationship between knowledge and action” (2016, p. 120). Along similar lines, several others have discussed the potential of new technology to automate the conversion of mass data into managerially relevant knowledge (see Arnaboldi et al., 2017; Bhimani and Willcocks, 2014; Scott and Orlikowski, 2012). These possibilities, in turn, have nourished the concern that big data and algorithms could eventually crowd out accounting professionals’ judgment (replacement scenario). Contrary to that, others believe that “a combination of human judgment [i.e. accounting professionals’ judgment] and business acumen with the extensive use of data and technology are key” (Möller et al., 2020, p. 3) (augmentation scenario). They hold that accounting work is often complex and, thus, not programmable, which, in their view, renders judgment and tacit knowledge indispensable (see Arnaboldi et al., 2022; Bhimani and Willcocks, 2014; Korhonen et al., 2021).

Although many believe that technology cannot replace expert judgment, the question as to why this should be the case is not always fully answered. Often, scholars have underlined the importance of “business acumen” or “domain knowledge” without specifying the areas or tasks that will continue to require accounting professionals’ judgment. Indeed, empirical insights into this have remained scarce, and we still know little about the concrete ways in which accounting professionals’ judgment “is key” when leveraging big data and the new DA tools in accounting for decision-making. With respect to FDD, accounting firms’ own publications, and the limited scholarly research (i.e. Neumann, 2020), offer some insights on DA adoption and use patterns. However, neither says much about accounting professionals’ judgment and how it is implicated in and impacted by DA adoption and use. In summary, despite widespread interest in questions surrounding judgment, and despite the acknowledgment that judgment plays an important role in accounting work, knowledge about how accounting professionals actually invoke judgment, especially in (1) other-than-audit accounting work and (2) in a digitalized world, is still scarce. Connecting back to Dewey, this justifies the following research question: How are practical and theoretical judgments invoked in data-driven FDD?

3. Methodology

3.1 Research strategy

The scarcity of research on DA in FDD presumably stems from the difficulty of gaining access, which relates to two major complications. First, in M&A deals, confidentiality is of utmost importance. Because information leaks are likely to cause serious disruptions, non-essential people are seldom involved in an ongoing transaction process. Moreover, FDD providers sign non-disclosure agreements that bar them from sharing sensitive information with third parties (Wangerin, 2019). Second, FDD teams work long hours and bill by the hour. Because of this, there is a limit to how much time advisors are able and willing to set aside for research-related activities. Despite these complications, thanks to one author’s network and another’s two decades of experience working with M&A, we were able to perform a field study at a Big Four accounting firm in Sweden (Pseudonym: “DealCo”).

Field studies can contribute to a rich understanding of a real-world phenomenon, especially with respect to the influence of contextual factors (Ahrens and Dent, 1998; Hopwood, 1983). For this reason, field studies are “perhaps the best way to understand […] new state-of-the-art accounting developments” (Merchant and Van der Stede, 2006, p. 117), including the adoption of next-generation DA tools and its implications. The setting, that is, a Big Four FDD practice, is “uniquely interesting” (Merchant and Van der Stede, 2006, p. 118) for the following reasons: First, the Big Four are at the forefront of digital innovation in accounting. Second, FDD is not subject to regulation, i.e. it is free from standards and oversight that constrain data and technology use and govern the exercise of judgment. Hence, FDD makes it possible to study how DA use plays out at Big Four accounting firms, in relation to judgment, in the absence of regulatory concerns. What is more, studies of firms at the forefront of developing accounting are likely to contribute to both literature and practice (Kaplan, 2011). In conclusion, given the scarcity of prior research and the aforementioned considerations, a field study at a Big Four firm was deemed well-suited to study how practical and theoretical judgments are invoked in data-driven FDD practice.

3.2 Research setting

In M&As, sellers possess better information than buyers. To reduce this information asymmetry, buyers commission an investigation called due diligence. The overarching aim of an FDD project is to identify decision- and value-relevant information (Wangerin, 2019). In this paper, we focus on the data-intensive stages of an FDD project, namely, data preparation, data analysis and reporting, which allows us to highlight the path from raw data to actionable insights (see Figure 1). Note that this paper examines the judgments, and, in effect, the decisions, made in FDD (by accounting professionals) that go into the generation and delivery of actionable insights for investment decision-making (by executives). In other words, it highlights some of the small but important judgments that precede buyers’ eventual investment decisions. Content-wise, FDD is about identifying historical value drivers, understanding how a business makes money and examining to what extent firm performance can be attributed to good or poor management. In addition, discussions with the target’s management often provide inputs for analyses and offer a means of verifying whether data and DA outputs are accurate and make sense.

At DealCo, about 35–40 people work directly with FDD. Although this varies across deals, FDD teams often comprise around four consultants, almost always with an accounting/finance background (DealCo also employs some data/DA specialists for the development of new DA tools. These specialists do not work in FDD engagements). In light of ever-increasing data volumes, it had become apparent that working exclusively in Excel would not be future-proof. In response, DealCo’s management launched a DA initiative that involved forming an informal DA group, adopting new DA tools and training the junior staff in using these tools. Today, DealCo’s DA tooling landscape includes commercial products, most notably Alteryx (used to automate data preparation activities via so-called workflows) and in-house tools, most notably SalesAnalyzer (used to transform transactional data into a ready-made reporting package) and AutoConverter (used to convert SIE files into well-structured Excel or PowerBI outputs). SalesAnalyzer and AutoConverter are code names for proprietary software; SIE is an open standard for accounting data in Sweden. At the time of data collection, DealCo’s DA initiative was in its second year.

3.3 Data collection

The present study is part of a larger research project on digital transformation in FDD. Due to the aforementioned confidentiality restrictions, semi-structured interviews were chosen as the main primary data source. Interviews are well suited to “capture” judgment, an important concept in our theoretical framework: practical and theoretical judgments are part of conscious human experience, and practical judgments, by definition, relate to situations in which habit and practice do not determine action and that require deliberate appraisal (Brown, 2015). Based on the relevant literature, we developed an interview guide centered on three parts: (1) Analytics, (2) Data and (3) Judgment (see Appendix). While keeping with the main themes, questions were sometimes adapted to match the position and experience of the respective interviewee. While this paper uses data from each part of the interview guide, the quotes and examples given in Tables 2 and 3 mostly relate to parts (2) and (3), especially parts 2.2, 3.1 and 3.3 in the Appendix. Figure 1 shows a theoretical synthesis of the FDD process that is informed by all three parts.

Interviewees had to have operational exposure, possess a thorough understanding of the practice and have the authority to make judgments and decisions in ambiguous situations. Based on these criteria, FDD managers were chosen as the key informants. In addition, to complement and corroborate the managers’ perspectives and accounts, we interviewed several partners (directors), analysts/associates and a data scientist. First, we interviewed the partner and the manager to whom we had initially presented the research project. Thereafter, that manager suggested suitable interviewees: first, those colleagues with the most first-hand experience of using DA in FDD projects (i.e. enthusiastic adopters), plus a data scientist who sat outside the FDD department (see Section 3.2); later, colleagues with some, but less, first-hand experience (i.e. cautious adopters), to give space to alternative perspectives, including skepticism toward DA. Apart from the data scientist, all interviewees worked mainly in FDD. Finally, we conducted three shorter follow-up interviews to seek some clarifications. The main interviews lasted about 60 minutes and the follow-up interviews lasted about 30 minutes. In total, 14 interviews were held. The interviews, conducted in English, were recorded and then transcribed. Since DealCo’s advisors were highly proficient in English, we did not experience any language issues.

After about half of the interviews, our understanding of the phenomenon, i.e. the role and use of practical and theoretical judgment in data-driven FDD, began to stabilize. The later interviews repeated and corroborated much of what had been stated in the earlier interviews (“data saturation”) and did not lead to meaningful new themes (“thematic saturation”) (Saunders et al., 2018). This is best understood in light of clearly defined project roles (i.e. Partner, Manager, Associate/Analyst) and the – in some ways – standardized nature of FDD, which likely contributed to somewhat similar experiences at each hierarchy level and, in effect, to reaching data and thematic saturation relatively quickly. Also, since DealCo was still in the early stages of DA adoption, only some of its advisors could give rich first-hand accounts.

To verify what we had learned during the interviews and to enhance our understanding of how exactly the new DA tools had been used, we also collected observational data during two walkthrough sessions. In these sessions, a manager would demonstrate “on-screen” how they had used the new tools in past FDD engagements. The walkthrough sessions, which were not recorded (notes were taken instead), lasted 60 and 30 minutes, respectively. The first two interviews were conducted by the first and the second author; the other interviews and the walkthrough sessions were conducted and attended by the first author. Due to the COVID-19 situation, the interviews and walkthrough sessions, which took place from November 2021 to March 2022, were conducted online using Zoom. The participants appeared to be comfortable conversing via Zoom and were very willing to share their experiences.

Moreover, we held kickoff and close-out meetings that were critical for forming an initial understanding of the situation and reality-checking our analyses and interpretations. During the kickoff meeting, DealCo’s attendees gave a short but highly insightful overview of their DA journey, the challenges they faced, and the ambitions they had moving forward. During the close-out meeting, in which we presented the results of our study, DealCo’s attendees confirmed that our analyses and conclusions were appropriate and that they, for the most part, closely matched their own experiences and perceptions. Data collection concluded with a final, one-year follow-up interview in May 2023 (in person, not recorded). On that occasion, the interviewee confirmed that this paper’s findings and conclusions were still current and valid. Table 1 gives an overview of the primary data collected for this study.

In addition, we collected various types of secondary materials (quantity in brackets). This included white papers (12), company videos (10), website posts (6), presentations (2), podcasts (2) and a webcast. These materials, all publicly available, were sourced from across the global DealCo network. The main objective for collecting these materials was to gain a thorough understanding of the new contextual whole, including new data sources, new DA tools and new applications.

3.4 Data analysis

In the data analysis, we focused on the interviews. However, the analysis was greatly aided by the observations and notes from the walkthrough sessions, kickoff and close-out meetings. The secondary data contributed to a better background understanding. The data analysis is best described as thematic as it centered around thematic coding (see Flick, 2009). In retrospect, the analysis could, in a simplified way, be described as consisting of three phases.

In phase one, without firm theoretical preoccupations, we wanted to establish how using more data and new DA tools was affecting the need for accounting professionals’ judgment in FDD (our initial research interest). In this phase, the analysis was iterative and involved moving back and forth between data and literature; an approach that is commonly referred to as abductive reasoning (Gehman et al., 2018). When reading the interview transcripts, it became apparent that “judgment” was too broad a concept and needed to be refined. Some judgments had a “what is” form; many others had a “what to do” form. After some literature search, including reading Svetlova and Dirksen (2014), we adopted the concepts “practical judgment” and “theoretical judgment,” derived from Dewey (1915, 1938), to describe and theorize what we were seeing in the data. In addition, we re-conceptualized FDD as a Deweyan inquiry, that is, an empirical investigation that seeks to transform an indeterminate situation (investment hypothesis) into a determinate one (investment decision). Both actions were taken after collecting the primary data.

Phase two involved a round of a priori coding in which we used “practical judgment” and “theoretical judgment” as codes. Practically, this meant coding the transcripts, statement by statement, into “(related to) practical judgment,” “(related to) theoretical judgment” or “not relevant to the research question.” This produced a large table showing all the statements related to either practical or theoretical judgment. Next, for each type of judgment, we grouped similar statements (“open coding,” see Flick, 2009) by developing first-order codes such as “too much information,” “abundant opportunities” or “knowing when to stop” (examples for practical judgment). In this phase, we applied a temporal bracketing strategy, breaking down the FDD process into more homogeneous, easier-to-analyze parts (Gehman et al., 2018). Three subprocesses were identified on the basis of the temporal flow of FDD activities: Data preparation, analysis (including management discussions) and reporting.

Phase three concerned the development of overarching themes to identify and describe the most important uses of practical and theoretical judgment in the data-driven FDD (see research question). This involved elements of “selective coding” and “axial coding” (see Flick, 2009). First, we examined how firmly each first-order code was supported by the data. Codes that did not have firm support or were not central to the research question were then excluded from further analysis. Finally, the remaining codes were grouped into four overarching themes: two for practical judgment (“setting priorities” and “exercising restraint”) and two for theoretical judgment (“forming hypotheses” and “doing sense checks”).

4. Findings

In the following, we present key findings from our study of DealCo advisors’ use of judgment in data-driven FDDs. First, we highlight the use of practical judgment (Section 4.1) and, thereafter, the use of theoretical judgment (Section 4.2).

4.1 Practical judgment

We identified two critical uses of practical judgment in data-driven FDD: Setting priorities and exercising restraint. Table 2 shows abbreviated examples, from different areas of FDD work, that juxtapose big data and DA issues and the need for practical judgment. Below we will focus on analyzing examples one and three.

4.1.1 Setting priorities

I mean, when you get lots of data, that we are getting more and more, and we are applying more and more data analytics tools, there are just more potential analyses that we can do. There is a lot of judgment about what to do. You know, you can do a million things and you can analyze every SKU [stock-keeping unit] in detail. You may need to have a judgment to decide … should you? Or is this not important? (M4)

As illustrated by the quote above, in FDD practice, more data and more DA tools equated with more potential analyses, which meant there was a lot of “judgment about what to do,” i.e. a lot of practical judgment (Dewey, 1915). Consider the following example. Previously, FDD teams would only receive trial balances, i.e. account-level data. More recently, they started to receive transactional details, i.e. transaction-level data, as well. In the former case, it had been quite clear what to do: Make use of all the (little) data and all the (limited) details. In contrast, in the latter case, it often was not clear. Since time was limited, it simply was not possible to analyze large transactional datasets from all angles. Moreover, it was by no means certain that highly granular analyses would generate highly relevant insights. Therefore, more data required more practical judgment in the sense that there now was a greater need for prioritization and focusing on those few analyses that were going to make the biggest difference to the client’s investment decision. Below, P1 gives a vivid example of a situation where they had access to rich transactional data with lots of information that, in the greater scheme of things, was not overly important.

We did one recently, 30, 40 different countries and different products, where we used one of the [analytics] tools. At the end of the day, you can come up with a million questions because you can look at the data from so many different angles. Why is the margin going down in Mozambique, in 2018 compared to 2019, why is the margin doubling in the Philippines as compared to Korea, or whatever? But when you stand back, well, Korea is less than one percent of the business and Mozambique has a revenue of 400 dollars. Who cares?! One can get distracted. Many questions that get asked are not needed or maybe do not matter. When we are looking at the business, we try to focus on the 80-20, what, the Pareto principle […] You get this initial wow, that is really interesting and, then, quickly you get bored of it because you realize that, why? What am I really learning here? (P1)

Ultimately, insights into margin developments in markets that, from a revenue standpoint, were insignificant, were unlikely to have much bearing on the buyer’s decision to invest in the target and its management team. To prevent investigating questions that were “not needed” or “maybe did not matter,” DealCo’s advisors tried to “stand back,” keep an eye on the bigger picture and “focus on the 80-20,” i.e. the vital analyses and conclusions that were going to help the client value the target company and commit to or walk away from the acquisition. Determining this was a matter of practical judgment, of appraising how any given element or product of the inquiry contributed to (the unification of) the contextual whole, that is, transforming the indeterminate situation (investment hypothesis) into a determinate situation (investment decision) (Dewey, 1938). When making these judgments, the advisors asked themselves questions such as: How significant are these accounts? How important would that be to know? What can we learn here? Ultimately, however, it all boiled down to the question of “What is the value of doing this?”, i.e. the valuation performed within every practical judgment (see Dewey, 1915).

Things that have to do with the revenue growth of the company and any conclusions around how much are they making on different parts of the company – that has clear implications on the valuation of the full business. And that has a higher value compared to other things like … costs like the interest expenses of the company. I just know that that’s not interesting for a buyer. (M4)

Notwithstanding the above, judging what information was or would be most valuable to clients was, to some degree, deal-specific. Although some information mattered in almost all deals (e.g. underlying earnings, net debt, net working capital), the value of other information was contingent on the buyer’s value hypothesis or, more generally, the deal rationale. For instance, strategic buyers (i.e. operating firms) and financial buyers (e.g. private equity firms) were typically looking and paying for different information. It follows that the aforementioned value judgment was largely situational and had to consider the deal’s contextual whole. Moreover, understanding and judging what was more and less relevant to clients was important to reduce the risk of information overload. The more insights the client’s management had to attend to, the more difficult they would often find it to form conclusions and commit to a course of action.

They [the target; a retailer] had like 50,000 different products they were selling, and they were selling to different companies and different customers. […] We basically showed them [the client] the data cube and they asked us “ok, can we look at that customer, can we look at that product, can we look at those margins, can we look at that country?” […] basically, you could see everything. But, as I said, it was information overload. Because, since it was so much information, it was hard to summarize everything. To see, ok, this has been a main driver for the company, this is what makes the company profitable. And the client said that it was good but they needed it to be more specific. Like, ok, just summarize it into five bullets [bullet points] or something. (A1)

The clients were not paying for thick reports, they were paying for their advisor’s judgment on what were the five most important things they had to know about the company they were going to buy. Therefore, judging which pieces of information (i.e. means) held the greatest promise for helping the client to commit to or walk away from the acquisition, that is, to determine the situation (i.e. the final end), was critical when reporting and presenting to clients. In relation to that, P1 also emphasized that one needed to see the larger picture and understand that FDD was only one area of due diligence. Clients also had to attend to information relating to matters such as tax, pension, legal or IT. For this reason, prioritization was paramount when communicating the findings of a data-driven FDD. Finally, there has always been a need for some prioritization. However, the availability of more data and the ability to process these data with new DA tools had amplified this need, and, in effect, the need for invoking practical judgment in FDD.

4.1.2 Exercising restraint

Notwithstanding the importance of prioritization in the data-driven FDD, it is equally important to note that a priori prioritization only went so far as to give initial direction. The product (i.e. the due diligence report) may have been fairly standardized. Yet, the process leading up to it was never quite the same. The advisors never knew exactly what they were going to find. Sometimes they ran into unexpected issues that required further scrutiny, sometimes they did not. Either way, given this uncertainty, they constantly needed to re-evaluate if they were still putting time and effort to their best use. Despite constraints such as limited time and tight scopes, the risk of losing sight of what was important grew as datasets grew larger (e.g. transaction-level sales data with millions of data points). Because of this, when analyzing large datasets, knowing when to stop and pull out had become a critical piece of practical judgment in FDD.

There are so many different ways you can slice and dice data. And it is so tempting, especially if you are a numbers person, to really want to analyze something to death, and that’s not what we are meant to do, because, again I am trying to get to that story, is it management that made things better? Because if it is management, then they can continue to do it [laughter]. It’s worth investing in that team. I need to get to that feeling of why things are the way they are. I can lose myself, I can get too stuck in the data. So, having kind of that judgment to know when to pull out and what is important and using it [analytics] effectively and efficiently. (P1)

The quote above not only points toward the importance of knowing when to stop in using DA effectively and efficiently, it also foregrounds that FDD is done by certain people who feel a certain way about their work. Almost everyone in DealCo’s FDD practice was “into numbers” and considered themselves a “numbers person.” Moreover, many were genuinely excited about the possibilities that more data and next-generation DA tools had offered them. When talking about these possibilities, many invoked expressions like “it is so tempting” or “it’s really fun.” At the same time, this was often immediately followed by qualifications such as “but that’s not what we are meant to do” or “I might think it’s fun, but …”. The “but,” in one way or another, always related to putting client interests, rather than personal enjoyment, first. This implies that using more data and new tools involved a greater need for exercising restraint, especially in situations in which curiosity pointed in one direction (e.g. digging deeper) whilst client needs pointed in another direction (e.g. pulling out). By making practical judgments and appraising means with ends-in-view (Dewey, 1915, 1938), DealCo’s advisors exercised restraint. This highlights that a human-led inquiry, inevitably, will involve uniquely human opportunities, e.g. feelings as an orienting presence, and uniquely human challenges, e.g. feelings as a tempting presence.

All of us who work in the FDD are super into numbers and think it’s really fun to play around KPIs [key performance indicators] – it’s actually super hard to know when to stop. (A1)

To know when to stop, it was essential to recognize that some insight or conclusion was “good enough”; that it was sufficient to answer the question or solve the problem at hand. Usually, this was not about solving for x but about presenting a convincing explanation for why things were the way they were, something the advisors called “qualitative work.” It was often not obvious, however, that a conclusion was good enough. The quality of a conclusion depended on the features of the situation, for instance, how much time was available and what objectives the client had. This meant that the advisors had to step back and set each element or product of the inquiry in relation to the contextual whole, i.e. the investment decision of the client.

It’s kind of assessing when we have something that is good enough, which comes with experience, I think. That’s really one of the things that comes with experience, saying that “well, this is a good conclusion.” Now we see which country has driven the performance. We could see which city is driving the performance – but I don’t think that’s interesting enough. I think they [the client] will be happy just noting it’s this country that’s doing it […] For instance, if we’re looking at the company’s EBITDA margin over two years and want to understand why it is going up, I break down the company into five different countries. I see that all the countries have the same margin in each period, but they are growing more in a market that has a higher margin. That would be a conclusion that I think solves the problem […] it’s good enough based on the time you have and the fields you have and what you are trying to achieve. (M4)

In summary, because of more data and new DA tools, the realm of possibility in FDD had expanded significantly. Instead of choosing one out of ten, it was now about “choosing one out of a thousand ways of doing things.” Time frames, however, had not gotten any longer. This meant that setting the right priorities and exercising restraint had become more important than ever in data-driven FDD. Given the available time, the agreed-upon scope, the team’s technical capabilities, and the client’s information needs, what action would make the biggest difference to the situation? Answering this question was a matter of practical judgment, i.e. of appraising means with “ends-in-view.”

4.2 Theoretical judgment

Moreover, we identified two critical uses of theoretical judgment in data-driven FDD: Forming hypotheses and doing sense checks. Table 3 provides abbreviated examples that juxtapose big data and DA issues and the need for theoretical judgment. Below the focus will be on analyzing examples two and three. To showcase the intertwining of theoretical and practical judgments, we selectively refer back to Section 4.1, especially in relation to management discussions.

4.2.1 Forming hypotheses

With larger and larger datasets, to me at least, it becomes more urgent to have an understanding of the business and have some sort of hypothesis of how the business works and what makes it tick […]. The main challenge is to ask the right questions from the dataset. (P3)

As is evident in the quote above, as data volumes were getting larger, the need to form working hypotheses about the target’s business was becoming more urgent too. To cope with ever larger datasets (see Section 4.1), DealCo’s advisors first put forward assertions about what was probably true or correct, thereby exercising theoretical judgment (Downie and Macnaughton, 2009). Then, in the spirit of a scientific inquiry, they turned to the data to see if the assertions were warranted.

The key thing here is that one needs to have a macro hypothesis: I believe that more people go to the cinemas, because of XYZ. And then I try to prove it. If you just gave me all of the cinema admission data from Sweden, for example, I would just drown in the data, because there are so many things to look at, so many things to try to understand. So, having this macro perspective, I believe this is going to happen, or this is that, is kind of key. (P1)

P1’s example of cinema admission data illustrates three important things. First, working hypotheses seemed to be the first line of defense against “drowning” in large datasets. Second, working hypotheses, despite their “theoretical nature,” gave an impetus for action; for instance, to go and check if the data supported XYZ. Third, the example highlights the close link between theoretical and practical judgments: Because I believe X to be true, and Y to be valuable, I should be doing Z. Essentially, working hypotheses connected the theoretical (what to believe) and the practical (what to do) by connecting beliefs about what was true with beliefs about what was (going to be) valuable. The quote below is a case in point.

When working with these types of data [transaction-level data], it’s very important to understand what you want to get out of the data. And setting up what are the questions we want to answer? With that type of data, there is a risk that you can dive in too deep and get lost in the data. I mean, the possibilities to make interesting analyses are endless [laughter]. But you have to set the scope and set the limitations of where you want to look, what you want to understand, and what will generate value for the client. (M1)

While experienced deal advisors possess good business and industry knowledge, their main expertise is in accounting and supporting transactions. In cases in which analyses had produced unexpected results, talking to the target’s management (as part of the so-called Q&A sessions) was often the most effective way to determine if the advisors had a genuine finding or if it was just an error in the data (if data were too erroneous, the advisors often preferred not to use them, so that they would not draw wrong conclusions from bad data). Previously, i.e. in the times of account-level data (see Section 4.1), DealCo’s advisors spent considerable time getting answers to high-level questions. Now, i.e. in the times of transaction-level data, they could ask much more detailed and pointed questions.

We have to ask less about why something is going up and down because we can see it in the transactions. We can see that they had a big invoice from a consultant, for instance. And, then, if we want to get further than that, we can ask “what is the invoice about?” rather than asking them “why did the cost as a whole go up?” You can ask more pointed questions if you have all the data. You get one layer more for your own. But at some point, you have to ask “why is that?” because the data ends somewhere – it doesn’t tell you everything. (M4)

On the flip side, there was now a greater risk of going too deep and asking questions that the discussion partners, often Group CFOs, were not prepared to answer. Quite possibly, they had never seen these data themselves or had not engaged with the data on that level. That, however, did not necessarily deter them from giving spontaneous (and false) answers anyway.

The insight we’re able to pull out from there, they are often news to the target company as well. And, then, of course, they typically provide the first explanations that come to their mind when they see the data for the first time. And, of course, that can lead into the sidetrack. (P2)

In these situations, not getting an answer only came at the cost of wasted time. Getting a wrong answer, which could lead the inquiry off track, was much costlier. To avoid this, the advisors had to carefully plan the management discussions/Q&A sessions and anticipate the right level of detail for their questions. As part of this, they would exercise theoretical judgment and form hypotheses about their discussion partners (e.g. “the CFO is not knowledgeable about topic X”). This, again, illustrates the tight link between theoretical and practical judgment: Decisions on what to do about a situation, e.g. which questions to ask, required knowledge of and judgment on the “constituent distinctions and relations” of that situation, e.g. how much the CFO knew. In fact, these considerations mattered not only for posing questions but also informed decisions on which analyses to perform.

Sometimes, doing the analysis may not provide that much more insight when you can’t really get good explanations for the factors you are seeing. I mean, you can present the results, but it’s always good to have not just the how but [also] the why from management […]. (M1)

What is evident here is that means must be appraised with “ends-in-view” and that neither the means nor the ends can be appraised independently of each other (Dewey, 1915, 1938). The value of the analysis often depended on later getting the “why” from the target’s management, whereas the value of getting the “why” depended on the hours put into the preceding analysis.

4.2.2 Doing sense checks

Apart from generating working or macro hypotheses, exercising theoretical judgment was also a means for sense-checking input data, DA outputs and answers from the target’s management. To sense-check input data and DA outputs, DealCo’s advisors used different “mechanical controls.” On the accounting side, this primarily concerned reconciliations with the audited financial statements; on the technical side, it concerned coding numerical plausibility checks.

On the technical part, you can control certain things. For example, if there is missing data, if you know the specific range that it should not cross, you can code specific controls for it. But usually, it’s up to the analyst [the advisor] to see if the data actually makes sense. (A4)

Mechanical controls, however, were not always feasible, nor did they always suffice. In many cases, it was therefore “up to the analysts to see if the data actually made sense.” On the part of the analysts, in our case, the deal advisors, this required good knowledge about the business, the system architecture and the industry.
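The two kinds of mechanical controls mentioned above lend themselves to a brief sketch. Assuming transactions arrive as simple (account, amount) records (an assumption for illustration; the paper does not describe DealCo's actual implementations), the Python snippet below codes a range/plausibility check like the one A4 describes, plus a reconciliation of summed amounts against an audited figure; all thresholds and numbers are invented:

```python
# Sketch of two "mechanical controls": a coded range/plausibility check on
# incoming data, and a reconciliation against the audited financial statements.
# Record shape, thresholds and figures are invented for illustration.

def plausibility_issues(transactions, max_abs_amount=10_000_000):
    """Flag records that are missing fields or fall outside a plausible range."""
    issues = []
    for i, (account, amount) in enumerate(transactions):
        if account is None or amount is None:
            issues.append((i, "missing field"))
        elif abs(amount) > max_abs_amount:
            issues.append((i, "amount outside plausible range"))
    return issues

def reconciles(transactions, audited_total, tolerance=1.0):
    """Check that summed amounts tie back to the audited figure within tolerance."""
    return abs(sum(amount for _, amount in transactions) - audited_total) <= tolerance
```

As A4 notes, such coded controls only catch what can be specified in advance; whether the data “actually makes sense” remains a matter for the advisor’s judgment.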

Big data […] it forces you to understand the business. Can this actually be in the way it is presented in the data? But at the same time, it probably is the biggest area where you need to make a judgment on the output and the result you get. (P2)

As pointed out in Section 4.1, talking to the target’s management was often the most effective way to check if data, outputs and conclusions were correct and reasonable, respectively. Moreover, numbers alone often did not mean that much to clients if there was no qualitative explanation for why the numbers looked the way they did (as they often looked for “assurance” on assumed trends and underlying value drivers). Thus, whenever possible, the advisors would try to enrich their quantitative analyses with qualitative inputs from the target’s management. Without them, the same means (i.e. an analysis) were less able to accomplish the desired end (i.e. causal understanding) and, ultimately, help clients in determining the situation, that is, taking an investment decision. It follows that, in many cases, management explanations were integral to making sense of complex data and, ultimately, deriving value from the use of big data and DA in FDD. But there was a problem.

Essentially, I can’t trust what people say [when challenging/questioning a management team]. I have to assume that almost, that everyone is lying to me. And then I need them to prove that they are not […] someone could be telling me that something is not an issue when it is because they want to make a million dollars and then walk away. But kind of that judgment, are they lying to me or not […]. Sometimes, the judgment is: Do they know the answer? Because even if they wanted to tell me the truth, maybe they don’t know it. So, kind of, is this person good at what they do? Because, if they are not, then I am going to have to take everything they tell me as potentially wrong. (P1)

These judgments had been important for a long time, i.e. not only after incorporating big data and DA tools. Yet, their continued importance is telling with respect to the need for and nature of judgment in data-driven FDD and, more generally, data-driven inquiries. More granular data gave the advisors more possibilities for challenging and checking management and its answers. More data, however, did not lessen the need for judgment in this. The data may have supported, not supported or even contradicted what management was saying. In effect, the data may have given an indication of whether management was telling the truth and was knowledgeable. Indication, however, did not replace judgment. As Dewey (1910, p. 111) put it, “through judging, confused data are cleared up, and seemingly incoherent and disconnected facts brought together.” When making these judgments, the advisors relied on experience, gut feel, logic and data, which all acted as complementary means of triangulation.

I think it’s generally more experience- and gut-feel-based. But it’s also, to some extent, just logical thinking. And, obviously, that comes with experience as well. When they [the target’s management] are guessing or answering something spontaneously, they could say something that contradicts something else they have said or something we have seen in the data. I had a Q&A session yesterday. We were asking them “why is the margin going down within the project over time?” They said, “well, it’s because we are more conservative at the beginning of the project.” We challenged them on that saying “well, this doesn’t really make sense, because when you set the initial budget, you have said before that you are conservative there.” (M4)

When pressed on this question (“What informs your judgment there?”), DealCo’s advisors had difficulties giving a precise answer. M4, for instance, invoked the circle metaphor to express how the different means of inquiry (e.g. data and DA outputs, judgment, gut feel, management input, logic) iteratively informed each other. Others referred to triangulation and trying to get to a point where the above-mentioned means of inquiry matched each other without any major contradictions. These descriptions match Dewey’s descriptions of judging as bringing together disconnected elements and inquiry as converting “the elements of the original situation into a unified whole” (1938, p. 105). This is interesting as it seems to suggest that Dewey’s theories of judgment and inquiry remain highly applicable in today’s digital and data-driven world.

In summary, DealCo’s advisors invoked practical judgment as a means to set priorities and exercise restraint and theoretical judgment as a means to form hypotheses and do sense checks – all essential when working with large datasets. In these capacities, practical and theoretical judgment were critical in transforming raw data into actionable insights and in transforming an indeterminate situation, in which buyers had an investment hypothesis, into a determinate one, in which buyers took an investment decision.

5. Discussion

DA enables the efficient processing of large data volumes and the production of impressive analyses in little time. That, however, does not ensure effectiveness, i.e. that an inquiry produces desired outcomes. In the following, we discuss how practical and theoretical judgments are implicated in effective data-driven FDDs. In this discussion, we focus on three central aspects: Asking the right questions (Section 5.1), getting qualitative explanations (Section 5.2) and recognizing “good enough” (Section 5.3).

5.1 Asking the right questions

Previous studies on DA in accounting have noted the importance of asking the right questions when working with big data (see Huerta and Jensen, 2017; Spraakman et al., 2021). However, they reveal little about what informs these questions. In contrast, our study gives a detailed and theoretically grounded account of how they are developed and used in practice. To navigate large datasets and not get lost, DealCo’s advisors formed hypotheses that expressed theoretical beliefs (i.e. I believe X causes Y), a feasibility assessment (e.g. I believe the data allows us to test this) and a value judgment (i.e. I believe confirming this relationship will be valuable for the client). Put differently, despite their “theoretical nature,” the hypotheses contained practical substance in the form of an appraisal of means with ends-in-view (Dewey, 1915). Moreover, they gave an impetus for action; for instance, to go and check if the data supported XYZ. This points toward the reciprocal relationship between practical and theoretical judgments. Whilst Dewey stresses that theoretical judgments (“propositions,” see Section 2) inform practical judgments, our study highlights instances in which value judgments are reflected in theoretical judgments (hypotheses) – as illustrated in the example above – thus showing that practical and theoretical judgments are intertwined in more than one way.

This finding has important implications for the organization of data-driven FDDs. The right questions or good hypotheses combined an understanding of the business and industry, the data and DA possibilities and the particulars of the situation (i.e. the contextual whole). This calls into question the idea of building advanced analytics competencies solely by hiring data scientists. In data-driven FDDs, judging was often about synthesizing knowledge, facts and experiences from different domains, including accounting and IT. If knowledge is too siloed, this becomes difficult. To mitigate this issue, accounting firms could make use of secondments as part of which accounting professionals shadow or work with IT professionals. To that end, DealCo’s advisors noted that even a single individual with advanced knowledge of both FDD/accounting and DA can make a big difference to the trajectory and, in consequence, the effectiveness of DA efforts.

It is also worth noting that the advisors’ extensive use of hypotheses – to derive value from big data – goes against the belief that big data and data mining will lessen the importance of theory and hypotheses (e.g. Kitchin, 2014). Indeed, the DA paradigm is said to involve a shift from problem-driven to exploratory analyses (see Richins et al., 2017). However, according to DealCo’s advisors, a purely exploratory approach would be ineffective as they would “drown” in the data. This assessment could reflect the fact that FDDs are subject to high time pressure. It could also relate to the fact that, usually, buyers enter a transaction with a deal rationale, i.e. a logic of how the acquisition would generate value, for which they seek support through FDD. Nonetheless, given how adamant the advisors were in their view that hypotheses are key when analyzing large, rich datasets, e.g. transaction-level sales data, there is reason to believe that, in accounting for decision-making, problem-driven analyses will long remain the norm.
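The hypothesis-first workflow described above can be made concrete with a short, purely illustrative sketch. The code below is not DealCo’s tooling, and all names and figures are invented; it merely shows the pattern the advisors described: stating a theoretical belief first (e.g. “project margins decline over time”) and then checking it against transaction-level data, rather than exploring the dataset without direction.

```python
def margin(row):
    """Gross margin of a single (hypothetical) transaction record."""
    return (row["revenue"] - row["cost"]) / row["revenue"]

# Invented transaction-level sales data for one project.
transactions = [
    {"year": 2021, "revenue": 100.0, "cost": 60.0},
    {"year": 2022, "revenue": 120.0, "cost": 78.0},
    {"year": 2023, "revenue": 110.0, "cost": 77.0},
]

# Theoretical judgment (hypothesis): "I believe project margins decline
# over time." The hypothesis is checked against the data, not mined from it.
margins_by_year = {t["year"]: round(margin(t), 2) for t in transactions}
years = sorted(margins_by_year)
declining = all(
    margins_by_year[a] > margins_by_year[b] for a, b in zip(years, years[1:])
)

print(margins_by_year)  # {2021: 0.4, 2022: 0.35, 2023: 0.3}
print("Hypothesis supported by data:", declining)  # True
```

The point of the sketch is the ordering: the belief precedes and directs the analysis, which is why the advisors could avoid “drowning” in the data.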

5.2 Getting qualitative explanations

According to Quattrone (2016, pp. 119–120), “data are now [just] ‘given’ to decision makers” and “accounting numbers are no longer […] debated in communicative acts.” Moreover, ideals like data-driven, fact-based decision-making (see Elgendy et al., 2022; Nielsen, 2018) indicate strong, if not blind, faith in quantitative analyses and outputs. Our study paints another picture. Despite being “super into numbers,” DealCo’s advisors stressed the importance of qualitative work, i.e. getting to the bottom of things and explaining why numbers were the way they were (“the data ends somewhere – it doesn’t tell you everything,” see Section 4.2). Moreover, they were well aware that data were often erroneous and that analyses were sometimes flawed. Knowing that, they routinely engaged in discussions with the target’s management to sense-check conclusions and get qualitative explanations for what they were seeing in the data. Hence, in the data-driven FDD, the “facts of the situation” (see Dewey, 1915) were often established in “communicative acts” that involved questioning and debating (see Quattrone, 2016).

In addition, they involved numerous judgments. On the practical end, this included matters such as “How detailed should our questions be?” and, on the theoretical end, matters such as “Is management telling the truth?” These judgments, in turn, shaped the discussions and how the FDD would proceed thereafter. It is worth noting that these considerations also had a bearing on project planning and what analyses (not) to perform. When getting a why from management seemed rather unlikely (theoretical judgment), performing the corresponding analysis was less attractive (practical judgment). This, again, points toward the intertwining of theoretical and practical judgment and the need to appraise means with ends-in-view (Dewey, 1915, 1938).

The evolution of management discussions in FDD, from high-level to increasingly detailed, also gives some insights into how new digital technologies can make accounting-related work more interesting and more demanding. As noted earlier (see Section 4.2), access to transaction-level data enabled more pointed questions and, often, more interesting discussions. However, to have these detailed discussions, the advisors also had to spend more time and effort preparing them. Arguably, this is somewhat representative of the impact of DA on FDD work in general. The use of new DA tools made it possible to spend less time on mundane parts of the job (e.g. building a data book) and more time on more exciting parts, first and foremost, doing analyses and drawing conclusions (see Bhimani and Willcocks, 2014). However, this shift also involved a need to develop new technical skills and acquire broader technology and data-analysis-related knowledge (see Alles and Gray, 2016; Schäffer and Brückner, 2019).

Finally, connecting back to “data-driven,” “fact-based” decision-making (see Elgendy et al., 2022), our study is a timely reminder that it is important to distinguish between data and facts and that transforming data into facts, i.e. into provisionally accepted knowledge, sometimes involves considerable judgment and qualitative work by accounting professionals.

5.3 Recognizing “good enough”

Big data and DA are associated with information optimization (Quattrone, 2016; van der Voort et al., 2019). DA tools enable more exhaustive information processing; thus, one would assume fewer instances of satisficing, i.e. settling for solutions that are “good enough” (Simon, 1947), in data-driven FDDs (see Brown-Liburd et al., 2015). Interestingly, our study does not support this conjecture. As evident in Section 4.1, in data-driven FDDs, opportunities for interesting analyses were abundant. Time, however, was not. Therefore, setting priorities and determining what was truly important for the client – by means of exercising practical judgment – was critical for an effective inquiry. That, however, often only went as far as to give initial direction(s). Knowing when to stop was equally important. In that respect, it was about recognizing “good enough”: Given the objectives and the constraints of the situation, was the conclusion (or finding) good enough? The answer to this question was a matter of practical judgment and striking a balance between the desirability of an end and the feasibility of the means – which is how satisficing is conceptualized today (see Luan and Li, 2017). Like Pittenger et al. (2023, p. 898), who studied managerial decision-making, we thus conclude that, in accounting for decision-making, time bounds and satisficing remain “fully in play, even in data-enriched environments.”
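Satisficing as a stopping rule can be illustrated with a small, hypothetical sketch. The function and numbers below are invented for illustration and are not drawn from the study; they only show the balance described above between the desirability of an end (an aspiration level) and the feasibility of the means (a time budget).

```python
def satisfice(candidates, aspiration, time_budget):
    """Pick the first candidate analysis meeting the aspiration level
    ("good enough"), or the best one seen when the time budget runs out.
    All inputs are illustrative; this is not the firm's actual process."""
    best = None
    time_spent = 0
    for name, value, cost in candidates:
        if time_spent + cost > time_budget:
            break                       # feasibility of means: out of time
        time_spent += cost
        if best is None or value > best[1]:
            best = (name, value)
        if value >= aspiration:         # desirability of end: good enough
            return name, time_spent
    return (best[0] if best else None), time_spent

# Candidate analyses: (name, expected insight value, hours needed).
analyses = [
    ("revenue bridge", 6, 8),
    ("customer churn", 9, 10),          # meets the aspiration level
    ("pricing deep dive", 10, 30),      # marginally better, never reached
]

choice, hours = satisfice(analyses, aspiration=8, time_budget=40)
print(choice, hours)  # customer churn 18
```

Note that the optimal analysis ("pricing deep dive") is never performed: under time bounds, stopping at "good enough" is what makes the inquiry effective at all.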

Moreover, the “good enough question” is telling with regard to the multifaceted influence of feelings and emotions in the formation of judgments in FDD. On the one hand, for DealCo’s advisors, the richness of transaction-level data was exciting and tempting. So much so that they felt the need to exercise restraint and invoke practical judgments in the form of “I should stop here because going deeper might not result in much added value.” At the same time, for some, feelings were an important indicator of “good enough” (“I need to get to that feeling of why things are the way they are,” see Section 4.1). Hence, instead of putting “facts over feelings,” our study suggests viewing facts and feelings (and data, logic and management input) as complementary means of inquiry that can inform each other – and judgment – in an iterative way (see Dewey, 1938). This implies that feelings and emotions can cloud and support judgment and, in effect, hinder and help data-driven FDDs. By making this explicit, our study sheds further light on the ambivalent role of feelings and emotions in accounting work (see Guénin-Paracini et al., 2014; Repenning et al., 2022).

The “good enough question” is also a good reference point for informed speculations about the future of judgment in FDD. Given the recent advancements in artificial intelligence, data-driven FDDs could look quite different in 5–10 years. As part of this, it seems likely that automation tools will be used extensively in data analysis (and not just in data preparation) (see Sutton et al., 2018). This could lead to a displacement of judgment and a situation in which upstream judgments determine many downstream activities and outputs. Notwithstanding this, our study shows that key judgments in FDD (e.g. What is good enough?) are highly situational and require a simultaneous appraisal of means and ends in light of the contextual whole, i.e. the client’s investment decision. Hence, as long as humans possess better situational understanding than machines, aggressive automation may not lead to satisfactory outcomes in all dimensions. More specifically, it may result in high efficiency, i.e. churning out a lengthy report in no time, but low effectiveness, i.e. producing a report that does little to solve the problem and determine the situation. To this end, accounting professionals’ judgment will likely remain integral.

6. Conclusions

Judgment is known to play an important role in accounting work (see West and Buckby, 2023). Moreover, in the wake of new digital technologies and automated decision systems, accounting scholars’ interest in judgment-related questions appears to be stronger than ever (see Arnaboldi et al., 2022; Quattrone, 2016). However, while there is quite a lot of research on professional judgment in auditing, relatively little is known about accounting professionals’ use of judgment in (1) other-than-audit accounting work and (2) digitalized, data-rich environments/contexts. Our field study, which investigates the use of judgment in data-driven FDDs, analyzed through the lens of Dewey’s judgments of practice (1915) and theory of inquiry (1938), sheds light on this and contributes to a better understanding of the role, use and importance of accounting professionals’ judgment in a data-driven world.

By framing data-driven FDD as a Deweyan inquiry, we are able to contribute more broadly to the literature on digitalization in accounting, especially knowledge production for decision-making (Arnaboldi et al., 2017, 2022; Bhimani and Willcocks, 2014; Casas-Arce et al., 2022; Knudsen, 2020; Nielsen, 2018; Quattrone, 2016, 2017; Schneider et al., 2015). We contribute to this literature by developing and demonstrating a pragmatic action perspective on knowledge production that centers around managers’ decision problem and how accounting can best help resolve it. We show that FDD, despite the use of more data and new DA tools, remains a highly practical endeavor in which practical and theoretical judgments are exceedingly important. On a more detailed level, we identify four critical uses of judgment – setting priorities, exercising restraint, forming hypotheses and doing sense checks – and discuss how these are implicated in effective data-driven FDDs. These findings point toward a twofold answer to the initial question of how adding more data and new DA tools impacts the role and use of judgment in FDD. For one thing, it has created a need for new judgments, e.g. in the areas of validating big data and preparing management discussions. For another thing, it has amplified the importance of “old” judgments, e.g. in relation to prioritization and asking the right questions. Our findings challenge several commonly held ideas, including the supposition that big data will lessen the relevance of theory and hypotheses (see Kitchin, 2014), the impression that numbers are no longer debated (see Quattrone, 2016) and the presumption that big data and DA will lead to less satisficing (see Pittenger et al., 2023). In doing so, the paper contributes to a better understanding of how big data and DA actually impact accounting work.
Finally, on a more general level, we contribute to the literature on digitalization in accounting by shedding light on the role of accounting professionals’ judgment in turning big data and DA into actionable insights and business value (see Mikalef et al., 2020; Yasmin et al., 2020).

This study has several implications for accounting practice. As implied above, organizations are eager to leverage big data and DA and realize performance benefits. Lately, this enthusiasm has been dampened by reports about failing data-science projects and organizations’ struggles to capture value from DA (Joshi et al., 2021). In light of this, this study is a timely reminder to accounting practitioners not to do DA just for the sake of doing it. Instead, it suggests viewing DA as a means of inquiry – nothing more and nothing less – and using it to the extent to which it does more for determining the situation than alternative means of inquiry would. Connected to that, the study highlights different pitfalls and perils of incorporating more data and new DA tools and shows the importance of practical and theoretical judgments in performing effective data-driven inquiries. This includes cautioning against both too much and too little automation and drawing attention to judgment-related difficulties in the case of too siloed DA knowledge. These insights, in turn, can inform and guide the augmentation, especially the use of judgment and automated solutions, of knowledge production in accounting practice. Moreover, they may help accounting professionals develop a better understanding of how to (continue to) add value in an increasingly data-driven world.

This study has several limitations. First, we studied only one FDD practice, which makes it difficult to generalize the findings. Moreover, even though the Big Four firms are assumed to leverage DA in similar ways, meaningful variations might nevertheless exist. Second, because of confidentiality concerns, it was not possible to collect real-time observational data. Possibly, such data would have shed light on facets of the phenomenon that the interviews, meetings and walkthrough sessions did not capture. Finally, at the time of data collection, DealCo’s push toward DA was in its second year, i.e. we entered the organization when it was at an early stage in its DA journey. At a later stage, some of the findings may have been different.

Last, we propose two promising avenues for further research. First, as argued in Section 5.3, ready-made reporting packages with a large number of pre-determined, automated analyses will likely play an integral part in future FDDs. This creates several interesting research opportunities. For instance, scholars could study how these reporting packages are developed, how relevant actors determine which analyses to include and how practical and theoretical judgment intertwine in this process. Moreover, scholars could examine how accounting professionals, in live projects, decide between using (only) the standardized package and performing additional, custom-made analyses. Second, to substantiate and extend our findings, future studies could try to incorporate management perspectives and experiences by studying advisor-seller and advisor-buyer pairs. This could produce even richer insights into how big data and DA are changing the nature, role and dynamics of management discussions in the FDD context. Furthermore, it could shed light on potential discrepancies in how advisors and management see “the situation” and assess the feasibility of means and desirability of ends and, ultimately, what is (not) “good enough.”

Figures

Figure 1: FDD as data-driven inquiry

Practical judgment: setting priorities and exercising restraint

Example 1: Data analysis
  Setting priorities
    Issue: More data and new DA tools equate to more ways of doing things. Countless opportunities for interesting analyses
    Need: Determine the focus and level of analysis that is most likely to produce decision- and value-relevant insights (considering time and resource constraints)
  Exercising restraint
    Issue: Easy to get lost and/or carried away in rich/granular datasets and waste time on interesting but unimportant analyses
    Need: Recognize “good enough” solutions. Continuously assess how much means and ends (analyses, findings) contribute to the contextual whole (investment decision)

Example 2: Management discussions
  Setting priorities
    Issue: More granular data does not eliminate the need for questions. Rather, questions tend to arise in different places and/or at a more granular level
    Need: Assess what questions are (not) important enough to be asked, investigated and answered (overall relevance, see Ex. 1)
  Exercising restraint
    Issue: Access to detailed data makes it tempting to ask equally detailed questions. Risk that questions go beyond discussion partner’s, e.g. Group CFO’s, knowledge
    Need: Pose questions that, whilst having a high information value, minimize the risk of non- or false answers from management

Example 3: Reporting
  Setting priorities
    Issue: High-quality data and sufficient time make it comparably easy to produce insights in abundant numbers – only some of which are relevant to clients’ decisions
    Need: Develop reports around the most essential findings; assess different insights’ impact on decision, valuation and pricing
  Exercising restraint
    Issue: Although it is tempting to show everything, i.e. all the findings, presenting too much information can overburden the client (“information overload”)
    Need: Plan and perform presentations in such a way that clients’ attention is directed to the most important insights and findings

Source(s): Table created by the authors

Theoretical judgment: forming hypotheses and doing sense checks

Example 1: Data preparation
  Forming hypotheses
    Issue: With lots of data coming from many different systems, data curation can be poor or inconsistent (especially for data that is not looked at frequently by management)
    Need: Form expectations about strengths and weaknesses of system architecture to determine appropriate controls/measures
  Doing sense checks
    Issue: Unlike many traditional accounting data, big data generally does not permit “manual” (i.e. row-by-row, column-by-column) data quality reviews
    Need: Sense-check input data based on knowledge of accounting, the system architecture, the business and the world

Example 2: Data analysis
  Forming hypotheses
    Issue: With the advent of big data, e.g. transaction-level sales data, “drowning” in the data has become a common experience
    Need: Form a macro-hypothesis (factual assertion), i.e. “I believe this to be true”, and check the hypothesis against the data
  Doing sense checks
    Issue: Larger datasets, especially big data, tend to suffer from more data quality issues. DA outputs may reflect data quality issues
    Need: Be vigilant and ask “are these results plausible given everything we know about the business, the industry, and the world?”

Example 3: Management discussions
  Forming hypotheses
    Issue: Posing questions that discussion partners cannot answer wastes valuable Q&A time with the target’s management
    Need: Based on the available information (e.g. role, tenure), make an assertion as to what discussion partners will likely know
  Doing sense checks
    Issue: Despite not having the relevant knowledge, management may give spontaneous (false) answers to novel questions
    Need: Be wary and ask “given what we know about management and the business, can we trust what management is saying?”

Source(s): Table created by the authors

Interview guide

Theme 1: Analytics

  Subtheme 1.1: DA use throughout FDD (Neumann, 2020)
    Sample questions:
    - What DA tools do you use in FDD? Please give examples from past project(s)
    - How did you use DA tools in the (1) planning/preparation, (2) analysis and (3) reporting stage?

  Subtheme 1.2: Facilitators and inhibitors (Alles and Gray, 2016)
    Sample questions:
    - Were there any factors that favored/supported the use of DA in this project?
    - Were there any factors that discouraged/hindered the use of DA in this project?
    - How does the use of DA in this project compare to other FDD projects?

  Subtheme 1.3: DA cost-benefit trade-off (Neumann, 2020)
    Sample questions:
    - What are the benefits and costs of using DA in FDD? Do you see any trade-offs?
    - Do you see any hidden costs? Do you see any hidden benefits?

  Subtheme 1.4: Unrealized DA opportunities (Richins et al., 2017)
    Sample questions:
    - Where do you see unrealized opportunities for DA use in the FDD?
    - What could be the reasons these opportunities have not been exploited yet?

Theme 2: Data

  Subtheme 2.1: Use of ADS (Alles and Gray, 2016)
    Sample questions:
    - Have you used any alternative (i.e. external, non-financial) data sources (ADS)?
    - Can you think of something that restricts the use and usefulness of ADS in FDD?

  Subtheme 2.2: Data difficulties (Brown-Liburd et al., 2015)
    Sample questions:
    - Have you experienced any difficulties when working with big data/ADS? E.g. (1) information overload, (2) information relevance, (3) pattern recognition, or (4) ambiguity?
    - How did you respond to that?
    - How do you extract useful insight from large datasets? What’s your strategy?

Theme 3: Judgment

  Subtheme 3.1: Judgment → DA (Huerta and Jensen, 2017)
    Sample questions:
    - How do you determine suitable DA applications (in FDD)?
    - How do you choose the right DA techniques/models?
    - How do you identify and evaluate suitable ADS?
    - How do you verify the plausibility and significance of DA outputs?

  Subtheme 3.2: DA → judgment (van der Voort et al., 2019)
    Sample questions:
    - What statement reflects your experience: (1) DA complements judgment or (2) DA substitutes for judgment? How so?
    - How has DA supported your judgment? Please give examples
    - Can you think of instances where DA had adverse effects, i.e. where it made judgments more difficult? How so?

  Subtheme 3.3: Need for judgment (Quattrone, 2016)
    Sample questions:
    - In your experience, does DA decrease or increase the need for judgment in FDD? How so?
    - As DA use becomes more advanced (predictive, prescriptive), how do you think that will affect the need for judgment?

Source(s): Appendix created by the authors

Appendix

Table A1

Table A1 shows the interview guide that was used to structure the interviews. Interviewees were encouraged to give examples from past FDD projects. Questions were sometimes altered to ensure they matched the position and the experience of the respective interviewee. The development and choice of themes, subthemes and questions were, in parts, informed by prior literature (see referenced literature in the subthemes column).

Table 1

Overview of primary data

Data source          Position                                  ID          Minutes
Interview            Partner, Director                         P1          60
                                                               P2          60
                                                               P3          50
                     (Senior) Manager                          M1          60
                                                               M2          60
                                                               M3          60
                                                               M4          60
                     (Senior) Analyst/Associate, other staff   A1          60
                                                               A2          40
                                                               A3          50
                                                               A4          60
Follow-up interview                                            M1          30
                                                               M4          30 + 30
Walkthrough session                                            M4          60
                                                               M4          30
Kick-off meeting                                               P1, M4      30
Close-out meeting                                              P1, M1, M4  80

Note(s): To preserve the anonymity of the participants, we use three broader categories: 1) Partner and Director, 2) (Senior) Manager and 3) (Senior) Analyst/Associate and other staff. In some firms, the junior staff is called “Analysts”, in others “Associates”

Source(s): Table created by the authors

References

Ahrens, T. and Dent, J.F. (1998), “Accounting and organizations: realizing the richness of field research”, Journal of Management Accounting Research, Vol. 10, pp. 1-39.

AICPA (2023), “Professional judgment”, available at: https://us.aicpa.org/interestareas/frc/professional-judgment (accessed 11 November 2023).

Alles, M. and Gray, G.L. (2016), “Incorporating big data in audits: identifying inhibitors and a research agenda to address those inhibitors”, International Journal of Accounting Information Systems, Vol. 22, pp. 44-59, doi: 10.1016/j.accinf.2016.07.004.

Arnaboldi, M., Azzone, G. and Sidorova, Y. (2017), “Governing social media: the emergence of hybridised boundary objects”, Accounting, Auditing & Accountability Journal, Vol. 30 No. 4, pp. 821-849, doi: 10.1108/aaaj-07-2015-2132.

Arnaboldi, M., de Bruijn, H., Steccolini, I. and Van der Voort, H. (2022), “On humans, algorithms and data”, Qualitative Research in Accounting & Management, Vol. 19 No. 3, pp. 241-254, doi: 10.1108/qram-01-2022-0005.

Bhimani, A. and Willcocks, L. (2014), “Digitisation, ‘Big Data’ and the transformation of accounting information”, Accounting and Business Research, Vol. 44 No. 4, pp. 469-490, doi: 10.1080/00014788.2014.910051.

Boyle, D.M. and Carpenter, B.W. (2015), “Demonstrating professional skepticism”, CPA Journal, Vol. 85 No. 3, pp. 31-35.

Brands, K. and Holtzblatt, M. (2015), “Business analytics: transforming the role of management accountants”, Management Accounting Quarterly, Vol. 16 No. 3, pp. 1-12.

Brown, M.J. (2015), “John Dewey's pragmatist alternative to the belief-acceptance dichotomy”, Studies In History and Philosophy of Science Part A, Vol. 53, pp. 62-70, doi: 10.1016/j.shpsa.2015.05.012.

Brown, G.A., Collins, R. and Thornton, D.B. (1993), “Professional judgment and accounting standards”, Accounting, Organizations and Society, Vol. 18 No. 4, pp. 275-289, doi: 10.1016/0361-3682(93)90017-z.

Brown-Liburd, H., Issa, H. and Lombardi, D. (2015), “Behavioral implications of big data's impact on audit judgment and decision making and future research directions”, Accounting Horizons, Vol. 29 No. 2, pp. 451-468, doi: 10.2308/acch-51023.

Casas-Arce, P., Cheng, M.M., Grabner, I. and Modell, S. (2022), “Managerial accounting for decision-making and planning”, Journal of Management Accounting Research, Vol. 34 No. 1, pp. 1-7, doi: 10.2308/jmar-10784.

Daft, R.L. and Lengel, R.H. (1986), “Organizational information requirements, media richness and structural design”, Management Science, Vol. 32 No. 5, pp. 554-571, doi: 10.1287/mnsc.32.5.554.

Davidson, H.J. and Trueblood, R.M. (1961), “Accounting for decision-making”, The Accounting Review, Vol. 36 No. 4, pp. 577-582.

Deloitte (2021a), “M&A technology to turbocharge your transactions”, available at: https://www2.deloitte.com/us/en/pages/mergers-and-acquisitions/articles/m-and-a-technology-helps-speed-up-m-and-a-transactions.html (accessed 17 June 2022).

Deloitte (2021b), “M&A: how data analytics and data visualization enhances the financial due diligence processes”, available at: https://www2.deloitte.com/za/en/pages/finance/solutions/how-data-analytics-and-data-visualisation-enhances-the-financial-due-diligence-process.html (accessed 2 July 2023).

DeMunck, J. and Zimmermann, B. (2015), “Evaluation as practical judgment”, Human Studies, Vol. 38 No. 1, pp. 113-135, doi: 10.1007/s10746-014-9325-1.

Dewey, J. (1910), How We Think, D. C. Heath, Boston, MA.

Dewey, J. (1915), “The logic of judgments of practise”, The Journal of Philosophy, Psychology, and Scientific Methods, Vol. 12 No. 19, pp. 505-523, doi: 10.2307/2013688.

Dewey, J. (1938), Logic: The Theory of Inquiry, Henry Holt, New York, NY.

Downie, R. and Macnaughton, J. (2009), “In defence of professional judgement”, Advances in Psychiatric Treatment, Vol. 15 No. 5, pp. 322-327, doi: 10.1192/apt.bp.108.005926.

Dunne, J. (1999), “Professional judgment and the predicaments of practice”, European Journal of Marketing, Vol. 33 Nos 7/8, pp. 707-719, doi: 10.1108/03090569910274339.

Elgendy, N., Elragal, A. and Päivärinta, T. (2022), “DECAS: a modern data-driven decision theory for big data and analytics”, Journal of Decision Systems, Vol. 31 No. 4, pp. 337-373, doi: 10.1080/12460125.2021.1894674.

EY (2018), “Analytics trends in diligence and beyond”, available at: https://assets.ey.com/content/dam/ey-sites/ey-com/en_gl/topics/tmt/tmt-pdfs/ey-analytics-trends-in-diligence-and-beyond.pdf (accessed 13 November 2023).

EY (2020), “EY platform hosted on IBM cloud and using IBM Watson Discovery to reframe the future of M&A due diligence”, available at: https://www.ey.com/en_hu/news/2020/12/ey-platform-hosted-on-ibm-cloud-and-using-ibm-watson-discovery-to-reframe-the-future-of-m-a-due-diligence (accessed 17 June 2022).

Ferrero, I., Rocchi, M., Pellegrini, M.M. and Reichert, E. (2020), “Practical wisdom: a virtue for leaders. Bringing together Aquinas and authentic leadership”, Business Ethics: A European Review, Vol. 29 No. S1, pp. 84-98, doi: 10.1111/beer.12298.

Flick, U. (2009), An Introduction to Qualitative Research, Sage, Los Angeles, CA.

Gehman, J., Glaser, V.L., Eisenhardt, K.M., Gioia, D., Langley, A. and Corley, K.G. (2018), “Finding theory – method fit: a comparison of three qualitative approaches to theory building”, Journal of Management Inquiry, Vol. 27 No. 3, pp. 284-300, doi: 10.1177/1056492617706029.

Gigerenzer, G. (2021), “Axiomatic rationality and ecological rationality”, Synthese, Vol. 198 No. 4, pp. 3547-3564, doi: 10.1007/s11229-019-02296-5.

Guénin-Paracini, H., Malsch, B. and Paillé, A.M. (2014), “Fear and risk in the audit process”, Accounting, Organizations and Society, Vol. 39 No. 4, pp. 264-288, doi: 10.1016/j.aos.2014.02.001.

Hildebrand, D. (2008), Dewey: A Beginner's Guide, Oneworld Publications, Oxford.

Hopwood, A.G. (1983), “On trying to study accounting in the contexts in which it operates”, Accounting, Organizations and Society, Vol. 8 Nos 2/3, pp. 287-305, doi: 10.1016/0361-3682(83)90035-1.

Huerta, E. and Jensen, S. (2017), “An accounting information systems perspective on data analytics and Big Data”, Journal of Information Systems, Vol. 31 No. 3, pp. 101-114, doi: 10.2308/isys-51799.

Joshi, M.P., Su, N., Austin, R.D. and Sundaram, A.K. (2021), “Why so many data science projects fail to deliver”, MIT Sloan Management Review, Vol. 62 No. 3, pp. 85-89.

Kaplan, R.S. (2011), “Accounting scholarship that advances professional knowledge and practice”, The Accounting Review, Vol. 86 No. 2, pp. 367-383, doi: 10.2308/accr.00000031.

Keynes, J.M. (1936), The General Theory of Employment, Interest and Money, Palgrave Macmillan, London.

Kitchin, R. (2014), “Big data, new epistemologies and paradigm shifts”, Big Data and Society, Vol. 1 No. 1, pp. 1-12, doi: 10.1177/2053951714528481.

Knudsen, D.-R. (2020), “Elusive boundaries, power relations, and knowledge production: a systematic review of the literature on digitalization in accounting”, International Journal of Accounting Information Systems, Vol. 36, 100441, doi: 10.1016/j.accinf.2019.100441.

Korhonen, T., Selos, E., Laine, T. and Suomala, P. (2021), “Exploring the programmability of management accounting work for increasing automation: an interventionist case study”, Accounting, Auditing & Accountability Journal, Vol. 34 No. 2, pp. 253-280, doi: 10.1108/aaaj-12-2016-2809.

KPMG (2018), “Data analytics in M&A”, available at: https://assets.kpmg.com/content/dam/kpmg/au/pdf/2018/data-analytics-in-mergers-acquisitions.pdf (accessed 13 November 2023).

KPMG (2021), “Data analytics in M&A”, available at: https://home.kpmg/ch/en/home/insights/2021/01/data-analytics-in-ma.html (accessed 17 June 2022).

Lindblom, C.E. (1959), “The science of ‘muddling through’”, Public Administration Review, Vol. 19 No. 2, pp. 79-88, doi: 10.2307/973677.

Luan, M. and Li, H. (2017), “Good enough – compromise between desirability and feasibility: an alternative perspective on satisficing”, Journal of Experimental Social Psychology, Vol. 70, pp. 110-116, doi: 10.1016/j.jesp.2017.01.002.

Marton, J., Nilsson, F. and Öhman, P. (Eds) (2024), Auditing Transformation: Regulation, Digitalisation and Sustainability, Routledge, Abingdon and New York.

Melé, D. (2010), “Practical wisdom in managerial decision making”, The Journal of Management Development, Vol. 29 Nos 7/8, pp. 637-645, doi: 10.1108/02621711011059068.

Merchant, K.A. and Van der Stede, W.A. (2006), “Field-based research in accounting: accomplishments and prospects”, Behavioral Research in Accounting, Vol. 18 No. 1, pp. 117-134, doi: 10.2308/bria.2006.18.1.117.

Mikalef, P., Pappas, I.O., Krogstie, J. and Pavlou, P.A. (2020), “Big data and business analytics: a research agenda for realizing business value”, Information and Management, Vol. 57 No. 1, 103237, doi: 10.1016/j.im.2019.103237.

Moll, J. and Yigitbasioglu, O. (2019), “The role of internet-related technologies in shaping the work of accountants: new directions for accounting research”, The British Accounting Review, Vol. 51 No. 6, 100833, doi: 10.1016/j.bar.2019.04.002.

Möller, K., Schäffer, U. and Verbeeten, F. (2020), “Digitalization in management accounting and control: an editorial”, Journal of Management Control, Vol. 31 No. 1, pp. 1-8, doi: 10.1007/s00187-020-00300-5.

Neumann, C.M. (2020), Data Analytics in Financial Due Diligence – A Mixed Methods Approach to Use and Adoption, Ph.D. Thesis, University of St. Gallen, available at: http://ux-tauri.unisg.ch/EDIS/Dis5006.pdf (accessed 30 September 2022).

Nielsen, S. (2018), “Reflections on the applicability of business analytics for management accounting – and future perspectives for the accountant”, Journal of Accounting & Organizational Change, Vol. 14 No. 2, pp. 167-187, doi: 10.1108/jaoc-11-2014-0056.

Pittenger, L.M., Glassman, A.M., Mumbower, S., Merritt, D.M. and Bollenback, D. (2023), “Bounded rationality: managerial decision-making and data”, Journal of Computer Information Systems, Vol. 63 No. 4, pp. 890-903, doi: 10.1080/08874417.2022.2111380.

Pomp, T. (2015), Praxishandbuch Financial Due Diligence, Springer, Wiesbaden.

PwC (2018), “Deal analytics & technology–rapid value identification”, available at: https://www.pwc.de/de/deals/deal-analytics-and-technology-rapid-value-identification.pdf (accessed 13 November 2023).

PwC (2021), “Deal analytics: delivering business-driven, data-fueled human thinking”, available at: https://www.pwc.com/us/en/services/deals/deal-analytics.html (accessed 17 June 2022).

Quattrone, P. (2016), “Management accounting goes digital: will the move make it wiser?”, Management Accounting Research, Vol. 31, pp. 118-122, doi: 10.1016/j.mar.2016.01.003.

Quattrone, P. (2017), “Embracing ambiguity in management controls and decision-making processes: on how to design data visualisations to prompt wise judgement”, Accounting and Business Research, Vol. 47 No. 5, pp. 588-612, doi: 10.1080/00014788.2017.1320842.

Repenning, N., Löhlein, L. and Schäffer, U. (2022), “Emotions in accounting: a review to bridge the paradigmatic divide”, European Accounting Review, Vol. 31 No. 1, pp. 241-267, doi: 10.1080/09638180.2021.1908906.

Richins, G., Stapleton, A., Stratopoulos, T.C. and Wong, C. (2017), “Big data analytics: opportunity or threat for the accounting profession?”, Journal of Information Systems, Vol. 31 No. 3, pp. 63-79, doi: 10.2308/isys-51805.

Rikhardsson, P. and Yigitbasioglu, O. (2018), “Business intelligence & analytics in management accounting research: status and future focus”, International Journal of Accounting Information Systems, Vol. 29, pp. 37-58, doi: 10.1016/j.accinf.2018.03.001.

Ruhnke, K. (2023), “Empirical research frameworks in a changing world: the case of audit data analytics”, Journal of International Accounting, Auditing and Taxation, Vol. 51, 100545, doi: 10.1016/j.intaccaudtax.2023.100545.

Saunders, B., Sim, J., Kingstone, T., Baker, S., Waterfield, J., Bartlam, B., Burroughs, H. and Jinks, C. (2018), “Saturation in qualitative research: exploring its conceptualization and operationalization”, Quality and Quantity, Vol. 52 No. 4, pp. 1893-1907, doi: 10.1007/s11135-017-0574-8.

Schäffer, U. and Brückner, L. (2019), “Rollenspezifische Kompetenzprofile für das Controlling der Zukunft”, Controlling & Management Review, Vol. 63 No. 7, pp. 14-30.

Schneider, G.P., Dai, J., Janvrin, D.J., Ajayi, K. and Raschke, R.L. (2015), “Infer, predict, and assure: accounting opportunities in data analytics”, Accounting Horizons, Vol. 29 No. 3, pp. 719-742, doi: 10.2308/acch-51140.

Scott, S.V. and Orlikowski, W.J. (2012), “Reconfiguring relations of accountability: materialization of social media in the travel sector”, Accounting, Organizations and Society, Vol. 37 No. 1, pp. 26-40, doi: 10.1016/j.aos.2011.11.005.

Simon, H.A. (1947), Administrative Behavior, The Macmillan Company, New York, NY.

Sinclair, R. (2014), “Dewey and White on value, obligation, and practical judgment”, SATS, Vol. 15 No. 1, pp. 39-54, doi: 10.1515/sats-2014-0003.

Smith, C.W. (2011), “Coping with contingencies in equity option markets: the ‘rationality’ of pricing”, in Beckert, J. and Aspers, P. (Eds), The Worth of Goods: Valuation & Pricing in the Economy, Oxford University Press, Oxford, pp. 272-294.

Spraakman, G., Sanchez-Rodriguez, C. and Tuck-Riggs, C.A. (2021), “Data analytics by management accountants”, Qualitative Research in Accounting & Management, Vol. 18 No. 1, pp. 127-147, doi: 10.1108/qram-11-2019-0122.

Sutton, S.G., Arnold, V. and Holt, M. (2018), “How much automation is too much? Keeping the human relevant in knowledge work”, Journal of Emerging Technologies in Accounting, Vol. 15 No. 2, pp. 15-25, doi: 10.2308/jeta-52311.

Svetlova, E. and Dirksen, V. (2014), “Models at work – models in decision making”, Science in Context, Vol. 27 No. 4, pp. 561-577, doi: 10.1017/s0269889714000209.

Tiemann, D. and Hartman, J. (2013), “Data analytical due diligence is driving M&A deals”, Financial Executive, Vol. 29 No. 3, pp. 32-36.

Trotman, K.T., Tan, H.C. and Ang, N. (2011), “Fifty-year overview of judgment and decision-making research in accounting”, Accounting and Finance, Vol. 51 No. 1, pp. 278-360, doi: 10.1111/j.1467-629x.2010.00398.x.

van der Voort, H.G., Klievink, A.J., Arnaboldi, M. and Meijer, A.J. (2019), “Rationality and politics of algorithms. Will the promise of big data survive the dynamics of public decision making?”, Government Information Quarterly, Vol. 36 No. 1, pp. 27-38, doi: 10.1016/j.giq.2018.10.011.

Vasarhelyi, M.A., Kogan, A. and Tuttle, B.M. (2015), “Big data in accounting: an overview”, Accounting Horizons, Vol. 29 No. 2, pp. 381-396, doi: 10.2308/acch-51071.

Wangerin, D. (2019), “M&A due diligence, post-acquisition performance, and financial reporting for business combinations”, Contemporary Accounting Research, Vol. 36 No. 4, pp. 2344-2378, doi: 10.1111/1911-3846.12520.

Welchman, J. (2002), “Logic and judgments of practice”, in Burke, F.T., Hester, D.M. and Talisse, R.B. (Eds), Dewey's Logical Theory: New Studies and Interpretations, Vanderbilt University Press, Nashville, TN, pp. 27-42.

West, A. and Buckby, S. (2023), “Professional judgement in accounting and Aristotelian practical wisdom”, Accounting, Auditing & Accountability Journal, Vol. 36 No. 1, pp. 120-145, doi: 10.1108/aaaj-09-2020-4949.

Yasmin, M., Tatoglu, E., Kilic, H.S., Zaim, S. and Delen, D. (2020), “Big data analytics capabilities and firm performance: an integrated MCDM approach”, Journal of Business Research, Vol. 114, pp. 1-15, doi: 10.1016/j.jbusres.2020.03.028.

Acknowledgements

We wish to express our gratitude to Editor Giuseppe Grossi and the anonymous Reviewers for their invaluable comments and guidance throughout the reviewing process. Furthermore, we wish to thank those – especially Daniela Argento, Ryan Armstrong, Dorota Dobija, Mathias Karlsson, Jan Lindvall, Oliver Lindqvist Parbratt and Anika Wiese – who commented on earlier versions of this paper on various occasions (namely, at The Research School in Accounting’s Spring Conference in Stockholm, April 7–8, 2022, at the AAAJ Special Issue Workshop at Kozminski University (online), May 19–20, 2022, at the Ph.D. workshop of the 26th Nordic Academy of Management Conference in Örebro, August 22–23, 2022, at the Swedish Research School of Management and Information Technology's Autumn Conference in Uppsala, October 4–6, 2022, and at the Higher Seminar at the Department of Business Studies at Uppsala University, October 31, 2022). Finally, we wish to extend our gratitude to DealCo and the interviewees for their participation. A special thanks goes to M4 for providing invaluable support throughout the project.

Funding: The study was partly funded by The Swedish Research School of Management and Information Technology.

Corresponding author

Tim Kastrup can be contacted at: tim.kastrup@fek.uu.se