Thomas G. Calderon, James W. Hesford and Michael J. Turner
Abstract
In recent years professional accountancy bodies (e.g., CPA), accreditation institutions (e.g., AACSB) and employers have steadily raised, and continue to raise, expectations regarding the need for accounting graduates to demonstrate skills in data analytics. One obstacle accounting instructors face in seeking to implement data analytics, however, is that they need access to ample teaching materials. Unfortunately, few such resources are available for advanced programming languages such as R. While skills in commonly used applications such as Excel are no doubt needed, employers often take these for granted; incremental value is added only when graduates can demonstrate knowledge of more advanced data analytics tools for decision-making, such as coding in programming languages. This, together with the current dearth of resources available to accounting instructors for teaching advanced programming languages, motivates this chapter. Specifically, we develop an intuitive, two-dimensional framework for incorporating R (a widely used open-source analytics tool with a powerful embedded programming language) into the accounting curriculum. Our model uses complexity as an integrating theme. We incorporate complexity into this framework at the dataset level (simple and complex datasets) and at the analytics task level (simple and complex tasks). We demonstrate the two-dimensional framework by drawing on authentic simple and complex datasets, as well as simple and complex tasks, that could readily be incorporated into the accounting curriculum and ultimately add value to businesses. R script code is provided for all our illustrations.
Huijun Tu and Shitao Jin
Abstract
Purpose
Due to the complexity and diversity of megaprojects, the architectural programming process often involves multiple stakeholders, making decision-making difficult and susceptible to subjective factors. This study aims to propose an architectural programming methodology system (APMS) for megaprojects based on a group decision-making model, to enhance the accuracy and transparency of decision-making and to facilitate participation and integration among stakeholders. This method allows multiple interest groups to participate in decision-making and gathers a range of perspectives and opinions, thereby improving the quality and efficiency of architectural programming and promoting the smooth implementation of projects.
Design/methodology/approach
This study first clarifies the decision-making subjects, decision objects and decision methods of APMS, based on group decision-making theory and value-based architectural programming methods. The entropy weight method and the fuzzy TOPSIS method are then employed as calculation methods to comprehensively evaluate decision alternatives and derive optimal decision conclusions. The workflow of APMS consists of four stages: preparation, information, decision and evaluation, ensuring that the decision-making process is scientific and systematic.
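The abstract names the entropy weight method and the fuzzy TOPSIS method as the calculation core of APMS. As a rough illustration of how these pieces fit together, the following Python sketch derives criterion weights by entropy and ranks alternatives with a crisp TOPSIS procedure; the decision matrix is hypothetical, and the fuzzy extension (e.g. triangular fuzzy numbers and fuzzy distance measures) described by the authors is omitted, so this is not a reproduction of APMS itself.

```python
import numpy as np

# Hypothetical decision matrix: rows = alternatives, columns = criteria.
X = np.array([
    [7.0, 0.65, 120.0],   # alternative A
    [6.2, 0.80,  95.0],   # alternative B
    [8.1, 0.55, 140.0],   # alternative C
])
benefit = np.array([True, True, False])  # False marks a cost criterion

# Entropy weight method: criteria that discriminate more get larger weights.
P = X / X.sum(axis=0)
entropy = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
weights = (1.0 - entropy) / (1.0 - entropy).sum()

# TOPSIS: rank alternatives by relative closeness to the ideal solution.
V = (X / np.linalg.norm(X, axis=0)) * weights         # weighted normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)               # 1 = best, 0 = worst

for name, c in zip("ABC", closeness):
    print(f"Alternative {name}: closeness = {c:.3f}")
```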
Findings
This study conducted field research and empirical analysis on a comprehensive transport hub megaproject to verify the effectiveness of APMS. The results show that, for both short-distance and long-distance transportation modes, the decision-making results of APMS are largely consistent with the project's preliminary programming outcomes. For transfer modes, however, the APMS results revealed certain discrepancies between the project's current status and the preliminary programming.
Originality/value
APMS addresses the shortcomings in decision accuracy and stakeholder participation and integration in the current field of architectural programming. It not only enhances stakeholder participation and interaction but also considers various opinions and interests comprehensively. Additionally, APMS has significant potential in optimizing project performance, accelerating project processes, and reducing resource waste.
Amir Hossein Alavi and Amir Hossein Gandomi
Abstract
Purpose
Analysis of geotechnical behavior is complex because of the multivariable dependencies of soil and rock responses. To cope with this complex behavior, traditional engineering design solutions are considerably simplified. Incorporating simplifying assumptions into the development of such traditional models, however, may lead to very large errors. The purpose of this paper is to illustrate the capabilities of promising variants of genetic programming (GP), namely linear genetic programming (LGP), gene expression programming (GEP) and multi‐expression programming (MEP), by applying them to the formulation of several complex geotechnical engineering problems.
Design/methodology/approach
LGP, GEP, and MEP are new variants of GP that make a clear distinction between the genotype and the phenotype of an individual. Compared with traditional GP, these techniques are more compatible with computer architectures, which results in a significant speedup in their execution. They have a great ability to capture the knowledge contained in experimental data directly, without making assumptions about the underlying rules governing the system, which is one of their major advantages over most traditional constitutive modeling methods.
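The genotype-phenotype separation mentioned above is easiest to see in linear GP, where the genotype is a flat list of register instructions (close to how a CPU executes code) and the phenotype is the function obtained by running them. The Python sketch below is a deliberately minimal, hypothetical illustration of that idea and of a toy evolutionary loop; it is not the LGP, GEP or MEP implementation used in the paper.

```python
import random

# Genotype: a linear program, i.e. a list of register instructions.
# Each instruction is (dest, op, a, b); operands index registers, r0 holds the input.
OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b, "*": lambda a, b: a * b}

def random_instruction(n_regs=4):
    return (random.randrange(n_regs), random.choice(list(OPS)),
            random.randrange(n_regs), random.randrange(n_regs))

def random_program(length=6):
    return [random_instruction() for _ in range(length)]

def execute(program, x, n_regs=4):
    """Phenotype: run the instruction list on input x and return register 0."""
    regs = [x] + [1.0] * (n_regs - 1)
    for dest, op, a, b in program:
        regs[dest] = OPS[op](regs[a], regs[b])
    return regs[0]

# Toy target: y = x**2 + x, fitted by mutation-only evolution.
data = [(float(x), float(x * x + x)) for x in range(-3, 4)]

def error(program):
    return sum((execute(program, x) - y) ** 2 for x, y in data)

population = [random_program() for _ in range(200)]
for _ in range(50):
    population.sort(key=error)
    survivors = population[:50]
    mutants = []
    for parent in survivors:
        child = list(parent)
        child[random.randrange(len(child))] = random_instruction()  # point mutation
        mutants.append(child)
    population = survivors + mutants + [random_program() for _ in range(100)]

best = min(population, key=error)
print("best squared error:", error(best))
```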
Findings
In order to demonstrate the simulation capabilities of LGP, GEP, and MEP, they were applied to the prediction of: relative crest settlement of concrete‐faced rockfill dams; slope stability; settlement around tunnels; and soil liquefaction. The results are compared with those obtained by other models presented in the literature and found to be more accurate. LGP has the best overall behavior for the analysis of the considered problems in comparison with GEP and MEP. The simple and straightforward constitutive models developed using LGP, GEP and MEP provide valuable analysis tools accessible to practicing engineers.
Originality/value
The LGP, GEP, and MEP approaches overcome the shortcomings of different methods previously presented in the literature for the analysis of geotechnical engineering systems. Contrary to artificial neural networks and many other soft computing tools, LGP, GEP, and MEP provide prediction equations that can readily be used for routine design practice. The constitutive models derived using these methods can efficiently be incorporated into the finite element or finite difference analyses as material models. They may also be used as a quick check on solutions developed by more time consuming and in‐depth deterministic analyses.
Giorgio Petroni, Alberto Ivo Dormio, Anna Nosella and Chiara Verbano
Abstract
Total quality management programmes applied to research and development often produce great improvements in productivity because they follow trajectories that are lateral and parallel to the principal thrust, which is aimed at client satisfaction. Such improvements are obtained through a better definition of the objectives of research activity; through organisational overhauls which highlight the contribution to the innovation process made by the marketing, engineering and production functions; and through the adoption of more efficient management systems (programming, control, and the management of human resources in particular). These changes meet strong resistance from researchers and technologists who, during the start‐up phase of such programmes, often refuse to operate in accordance with the internal supplier‐client scheme. Such resistance, confirmed by the two Italian case studies presented, can be overcome through strong leadership and the setting up of an intensive training programme.
Ehsan Shekarian and Alireza Fallahpour
Abstract
Purpose
The housing sector is one of the main sources of economic growth in both developing and developed countries. Although many methods for modeling house prices have been proposed, each has its own limitations. The present paper aims to propose gene expression programming (GEP) as a new approach for the prediction of housing prices.
Design/methodology/approach
This study introduces gene expression programming (GEP) as a new approach for predicting housing prices. This is the first time this metaheuristic method has been used in the housing literature.
Findings
The housing price model based on gene expression programming is compared with a least-squares regression model derived from a stepwise process. The results indicate that the GEP‐based model outperforms the traditional regression model.
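A comparison of this general shape can be prototyped with open-source tools, with two caveats worth stating plainly: the sketch below uses gplearn's tree-based genetic programming as a stand-in for gene expression programming (gplearn is not a GEP implementation), and it runs on synthetic hedonic attributes rather than the HIES data used by the authors; the stepwise selection step is also omitted in favour of plain least squares.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from gplearn.genetic import SymbolicRegressor  # tree-based GP, standing in for GEP

rng = np.random.default_rng(0)

# Synthetic stand-in for hedonic housing attributes: floor area, age, number of rooms.
n = 500
area = rng.uniform(50, 250, n)
age = rng.uniform(0, 40, n)
rooms = rng.integers(1, 6, n).astype(float)
price = 1200 * area - 150 * area * np.log1p(age) + 8000 * rooms + rng.normal(0, 5000, n)

X = np.column_stack([area, age, rooms])
X_train, X_test, y_train, y_test = train_test_split(X, price, random_state=0)

# Baseline: ordinary least squares (stepwise variable selection omitted).
ols = LinearRegression().fit(X_train, y_train)

# Evolved symbolic model.
gp = SymbolicRegressor(population_size=1000, generations=20,
                       function_set=("add", "sub", "mul", "div", "log"),
                       parsimony_coefficient=0.001, random_state=0)
gp.fit(X_train, y_train)

print("OLS MAE:", mean_absolute_error(y_test, ols.predict(X_test)))
print("GP  MAE:", mean_absolute_error(y_test, gp.predict(X_test)))
print("Evolved expression:", gp._program)
```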
Originality/value
The data used in this study are derived from the Household Income and Expenditure Survey (HIES) in Iran, which is conducted by the Statistical Center of Iran (SCI). The housing price model is estimated using the survey questionnaires administered in Hamedan Province. To show the applicability of the model derived with the GEP technique, it is verified on portions of the data, namely test data sets that were not included in the modeling process.
Hao Wang, Guangming Dong and Jin Chen
Abstract
Purpose
The purpose of this paper is to build a regression model related to tool wear and to use that model to identify the state of tool wear.
Design/methodology/approach
In this paper, genetic programming (GP), originally used to solve symbolic regression problems, is used to build a regression model of tool wear, exploiting its strong regression ability. GP is improved in its genetic operations and weighted matrix. The performance of GP is verified on the tool vibration, force and acoustic emission data provided by the 2010 Prognostics and Health Management (PHM) data challenge.
Findings
As a result, the regression model discovered by GP can identify the state of tool wear. Compared with other regression algorithms, e.g. support vector regression and polynomial regression, the identification by GP is more precise.
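For readers unfamiliar with the baselines named above, the hypothetical sketch below compares support vector regression and polynomial regression on synthetic wear data built from simple signal features (RMS, peak and standard deviation of a simulated force channel). It does not use the PHM 2010 dataset and does not reproduce the paper's improved GP or its weighted matrix; it only shows the kind of regression comparison the abstract refers to.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in: each "cut" yields a force signal whose amplitude grows with wear.
n_cuts = 300
wear = np.linspace(20, 180, n_cuts) + rng.normal(0, 5, n_cuts)   # flank wear (micrometres)
signals = [rng.normal(0, 0.5 + 0.01 * w, 2048) for w in wear]    # simulated force channel
features = np.array([[np.sqrt(np.mean(s ** 2)),                  # RMS
                      np.max(np.abs(s)),                         # peak
                      np.std(s)]                                 # standard deviation
                     for s in signals])

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "Polynomial (degree 2)": make_pipeline(PolynomialFeatures(2), LinearRegression()),
}
for name, model in models.items():
    scores = cross_val_score(model, features, wear, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
```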
Research limitations/implications
The regression models built in this paper can only assess the current wear state from the current tool signals; they cannot predict or estimate tool wear beyond the current state. In addition, the generalization of the models is limited: their performance has been demonstrated only on signals from the same type of tool under the same working conditions, and different tools and working conditions may affect model performance.
Originality/value
In this study, the discovered regression model identifies the state of tool wear precisely, and its identification performance when applied to other tools is also excellent. It provides significant information about tool health, so tools can be replaced or repaired in time and losses caused by tool damage can be avoided.
Tim Brady, Andrew Davies and Paul Nightingale
Abstract
Purpose
The purpose of this paper is to review the content and contributions of the article by Klein and Meckling entitled “Application of operations research to development decisions” which was published in the journal Operations Research in May‐June 1958. The paper explores the major concepts and contributions in the article and suggests that these are relevant to today's complex and uncertain development projects.
Design/methodology/approach
The paper outlines the context in which the research on which the article is based took place and presents the main ideas in the article which relate to decision making in the procurement and development of complex systems.
Findings
The paper demonstrates the utility of the concepts in the original article, shows how they have been used in academic research on project management and innovation and that they are still relevant for both practical project management and project‐based research.
Practical implications
The primary implication is to demonstrate the value of revisiting a classic contribution in project management, in this case, one which remained hidden for a long period, but has recently come to the fore again.
Originality/value
The issues raised by the original article – related to decision making under conditions of uncertainty – remain high on the agenda today and revisiting the article may help provide a better appreciation of how to deal with those issues.
Rosalind Taylor and Alan Pearson
Abstract
Quality in research and development (R&D) work has become increasingly important as companies commit themselves to quality improvement programmes in all areas of their activity. Quality improvement forms an important part of their competitive strategy. Quality management systems have been successfully designed and implemented for manufacturing and service functions, but so far quality principles and systems have been difficult to translate to the R&D function. Looks at the challenge of effective implementation of quality management and total quality principles in R&D. Discusses quality concepts, terms, systems and critical factors for successful implementation. Uses brief case histories to highlight particular approaches to implementation. Finally, introduces a new, versatile method for evaluating the capabilities of an R&D organization in terms of total quality management. It is presented in the form of a case study showing its use in a large R&D laboratory of a major multinational corporation.
Abstract
Explores the idea of trajectories of innovation in software development. Patterns of innovation are analysed within social and institutional contexts, and within the context of changes in the ways computer technology is used. Three main trajectories of innovation in software development are discussed: technical change (e.g. languages, techniques, tools, methods); organizational and managerial change; and commodification (the substitution of packaged products for custom development). Sub‐trajectories are also described. Concludes that the scope and heterogeneity of software development activity has supported the formation of a number of different and competing trajectories, which lead to quite different conclusions about the future of software development.