John L. Abernathy, Michael Barnes and Chad Stefaniak
Abstract
For the past 10 years, the Public Company Accounting Oversight Board (PCAOB) has operated as an independent overseer of public company audits. Over 70 percent of PCAOB studies have been published since 2010, evidencing the increasing relevance of PCAOB-related research in recent years. Our paper reviews the existing literature on the PCAOB’s four primary functions – registration, standard-setting, inspections, and enforcement. In particular, we examine PCAOB registration trends and evaluate the effects of PCAOB registration requirements on the issuer audit market, as well as discuss the relative costs and benefits (e.g., auditor behavior changes, improvements in audit quality, auditor perceptions) of the 16 auditing standards the PCAOB passed in its first 10 years of operation. Further, we summarize the literature’s findings on the effects of the PCAOB inspection process on various facets of audit quality. Finally, we analyze the research concerning the PCAOB’s enforcement actions to determine how markets have responded to sanctions against auditors and audit firms. We contend that understanding and reviewing the effects of the PCAOB’s activities are important to future audit research because of the PCAOB’s authority over and oversight of the issuer audit profession. We also identify PCAOB-related research areas that have not been fully explored and propose several research questions intended to address these research areas.
Michael J. Barnes, Bruce P. Hunn and Regina A. Pomranky
Abstract
The most important advance in system design is the development of modeling and simulation methods to predict complex performance before prototypes are developed. New systems are developed in a spiraling approach: as more is learned about the system, design changes are proposed and evaluated. This approach allows the engineering team to “spin out” early versions of the system for preliminary evaluation, permitting changes to be made to the system design without incurring unacceptable cost. Because of the complexity of human performance, current modeling techniques provide only a first approximation. However, it has been demonstrated that even simple, inexpensive modeling approaches are useful in uncovering workload and performance problems related to developing systems (Barnes & Beevis, 2003). More importantly, these models can serve as the basis for operator simulation experiments that verify and calibrate the original models. Furthermore, early field tests and system-of-systems demonstrations that can validate these results under actual conditions are becoming an increasingly significant part of the early design process. Fig. 1 illustrates this interdependence, indicating a spiraling process throughout the design that starts with simple predictive methods and progresses to more expensive validation methods. These iterations should continue until most of the soldier-performance variance is accounted for, and before any formal soldier testing is conducted. Fig. 1 presents the ideal combination of techniques; not all systems can be evaluated this thoroughly, but more cost-effective modeling and simulation tools, combined with realistic field exercises, should make this approach the norm as future unmanned systems are developed (Barnes & Beevis, 2003). In the remainder of this chapter, several case studies are presented to illustrate how the techniques in Fig. 1 have been applied in UAV programs.
Bing Zhang, Raiyan Seede, Austin Whitt, David Shoukr, Xueqin Huang, Ibrahim Karaman, Raymundo Arroyave and Alaa Elwany
Abstract
Purpose
There is recent emphasis on designing new materials and alloys specifically for metal additive manufacturing (AM) processes, in contrast to AM of existing alloys that were developed for other traditional manufacturing methods involving considerably different physics. Process optimization to determine processing recipes for newly developed materials is expensive and time-consuming. The purpose of the current work is to use a systematic printability assessment framework developed by the co-authors to determine windows of processing parameters to print defect-free parts from a binary nickel-niobium alloy (NiNb5) using laser powder bed fusion (LPBF) metal AM.
Design/methodology/approach
The printability assessment framework integrates analytical thermal modeling, uncertainty quantification and experimental characterization to determine processing windows for NiNb5 in an accelerated fashion. Test coupons and mechanical test samples were fabricated on a ProX 200 commercial LPBF system. A series of density, microstructure and mechanical property characterization was conducted to validate the proposed framework.
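The abstract does not specify which analytical thermal model the framework uses; a common choice in LPBF printability studies is the Rosenthal moving point-source solution, so the following is only a sketch of how such a model can screen processing parameters. All material and process values here are illustrative defaults, not calibrated NiNb5 properties:

```python
import math

def rosenthal_temperature(x, y, z, P=200.0, v=0.8, eta=0.4,
                          k=15.0, alpha=4e-6, T0=298.0):
    """Steady-state Rosenthal point-source temperature [K] at point
    (x, y, z) [m] in a frame moving with the laser along +x.
    P: laser power [W], v: scan speed [m/s], eta: absorptivity,
    k: thermal conductivity [W/m/K], alpha: thermal diffusivity [m^2/s]."""
    R = math.sqrt(x * x + y * y + z * z)
    if R == 0.0:
        return float("inf")  # solution is singular at the source itself
    return T0 + (eta * P) / (2.0 * math.pi * k * R) \
        * math.exp(-v * (R + x) / (2.0 * alpha))

# Crude melt-pool length estimate: march backwards from the source
# (x < 0 trails the laser) until the temperature drops below melting.
Tm = 1700.0   # illustrative melting temperature [K]
dx = 1e-6     # 1 micron step
length, x = 0.0, -dx
while rosenthal_temperature(x, 0.0, 0.0) >= Tm:
    length += dx
    x -= dx
```

Sweeping such an estimate over a grid of (P, v) combinations is one inexpensive way to bound a processing window (e.g., flagging lack-of-fusion or keyholing regimes) before committing to experimental coupons.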
Findings
Near fully-dense parts with more than 99% density were successfully printed using the proposed framework. Furthermore, the mechanical properties of as-printed parts showed low variability, good tensile strength of up to 662 MPa, and tensile ductility 51% higher than previously reported in the literature.
Originality/value
Although many literature studies investigate process optimization for metal AM, there is a lack of a systematic printability assessment framework to determine manufacturing process parameters for newly designed AM materials in an accelerated fashion. Moreover, the majority of existing process optimization approaches involve either time- and cost-intensive experimental campaigns or require the use of proprietary computational materials codes. Through the use of a readily accessible analytical thermal model coupled with statistical calibration and uncertainty quantification techniques, the proposed framework achieves both efficiency and accessibility to the user. Furthermore, this study demonstrates that following this framework results in printed parts with low degrees of variability in their mechanical properties.
Robert L. Engle and Michael L. Barnes
Abstract
A 42‐question survey on usage and beliefs regarding sales force automation (SFA) was administered, and actual sales performance data were collected, for 1,641 sales representatives of a large international pharmaceutical company in Germany, England, and the United States. The relationships between beliefs, usage, and individual sales performance were examined both within and across countries, and a cost‐benefit analysis was completed. Factor analysis identified five usage groupings: planning and territory management; administration and external information exchange; within‐company communication; active sales tool; and passive sales tool. Significant usage, belief, and performance differences between countries were found, with the use of SFA explaining 16.4 per cent of the variance in sales performance across countries. General findings indicated that management and representatives believed SFA to be useful. US$22.2 million in sales increases were found to be attributable to SFA usage. At the same time, non‐discounted cash flow payback periods were found to range from 6.2 to 7.4 years. Potential contributing factors and implications are discussed.
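The non-discounted payback period cited above is simply the up-front investment divided by the annual net cash benefit. A minimal illustration with hypothetical figures (the study's actual investment and benefit breakdown is not given here):

```python
def payback_years(investment, annual_net_benefit):
    """Non-discounted payback period: years of constant annual net
    benefit needed to recover the up-front investment."""
    return investment / annual_net_benefit

# Hypothetical example: a $3.1M SFA rollout returning $0.5M per year
# pays back in 6.2 years, matching the low end of the reported range.
print(payback_years(3_100_000, 500_000))
```

Because this measure ignores discounting, it understates the true economic payback; a discounted cash flow version would lengthen the reported 6.2 to 7.4 year range further.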
Abstract
The lengthy review by the Food Standards Committee of what all public analysts and enforcement officers agree is the most complicated and difficult of the food groups subject to detailed legislative control is at last complete, and the Committee's findings are set out in their Report. When in 1975 they were requested to investigate the workings of the legislation, the problems of control were already apparent and getting worse. The trilogy of Regulations of 1967 seemed comprehensive at the time; perhaps, as we ventured to suggest, a little too comprehensive for a rational system of control, for arguments on the meat contents of different products, on descriptions and on interpretation generally quickly appeared. The system, for all its detail, provided too many loopholes through which manufacturers drove the proverbial “carriage and pair”. As meat products have increased in range and the price of meat, the “major ingredient”, has constantly risen, the number of samples taken for analysis has risen and now usually constitutes about one‐quarter of the total for the year, with sausages, prepared meats (pies, pasties) and, most recently, minced meat predominating. Just as serial sampling and analysis of sausages before the 1967 Regulations were pleaded in courts to establish usage in the matter of meat content, so with minced meat the same methods are being used to establish a maximum fat content usage. What concerns food law enforcement agencies is that, despite the years that the standards imposed by the 1967 Regulations have been in force, the number of infringements shows no sign of reduction. This should not really surprise us; there are even longer periods of failure to comply, e.g., in the use of preservatives, which have been controlled since 1925! What a number of public analysts have christened the “beefburger saga” took its rise post‐1967 and shows every indication of continuing into the distant future.
Manufacturers appear to be trying numerous ploys to reduce the meat content below the 80% required by the Regulations, mainly by giving their products new names. Each year, public analysts report an influx of new names and ingenious defences, e.g., “caterburgers” and similar concocted nomenclature, and the defence that because the name does not incorporate the name of a meat, the product falls outside the statutory standard.
Abstract
The following is an introductory profile of the fastest growing firms over the three-year period of the study listed by corporate reputation ranking order. The business activities in which the firms are engaged are outlined to provide background information for the reader.
Christopher Owen Cox and Hamid Pasaei
Abstract
Purpose
According to the Project Management Institute, 70% of projects fail globally. The causes of project failure in many instances can be identified as non-technical or behavioral in nature, arising from interactions between participants. These intangible risks can emerge in any project setting, but especially in project settings with a diversity of cultures, customs, beliefs and traditions across companies or countries. This paper provides an objective framework to address these intangible risks.
Design/methodology/approach
This paper presents a structured approach to identify, assess and manage intangible risks to enhance a project team’s ability to meet its objectives. The authors propose a user-friendly framework, the Intangible Risk Assessment Methodology for Projects (IRAMP), to address these risks and the factors that cause them. Meta-network (i.e., a network of networks) simulation and established social network analysis (SNA) measures provide a quantitative assessment and ranking of causal events and their influence on the intangible, behavior-centric risks.
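The specific SNA measures used in the IRAMP are not detailed in this abstract; one of the simplest, out-degree centrality, already illustrates how causal conditions can be ranked by how many risks they influence. A minimal stdlib sketch with hypothetical condition and risk names (not taken from the study):

```python
from collections import Counter

# Hypothetical causal map: directed edges from a causal condition
# to the intangible, behavior-centric risk it influences.
edges = [
    ("unclear_roles", "miscommunication"),
    ("unclear_roles", "low_trust"),
    ("cultural_distance", "miscommunication"),
    ("cultural_distance", "conflicting_norms"),
    ("schedule_pressure", "low_trust"),
]

# Out-degree centrality: count how many risk links each condition drives,
# then rank conditions from most to least influential.
influence = Counter(src for src, _ in edges)
ranking = sorted(influence.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)
```

In a meta-network setting the same idea extends across linked networks (people, tasks, risks), and richer measures such as betweenness centrality can surface conditions that sit on many causal paths rather than merely having many direct links.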
Findings
The proposed IRAMP and meta-network approach were utilized to examine the project delivery process of an international energy firm. Data were gathered using structured interviews, surveys and project team workshops. The use of the IRAMP to highlight intangible risk areas underpinned by the SNA measures led to changes in the company’s organizational structure to enhance project delivery effectiveness.
Originality/value
This work extends the existing project risk management literature by providing a novel, objective approach to identify and quantify behavior-centric intangible risks and the conditions that cause them to emerge.