Search results
Huy Minh Vo, Jyh-Bin Yang and Veerakumar Rangasamy
Abstract
Purpose
Construction projects commonly encounter complicated delay problems. Over the past few decades, numerous delay analysis methods (DAMs) have been developed. There is no consensus on whether existing DAMs effectively resolve delays, particularly in the case of complex concurrent delays. Thus, the primary objective of this study is to undertake a comprehensive and systematic literature review on concurrent delays, aiming to answer the following research question: Do existing delay analysis techniques deal with concurrent delays well?
Design/methodology/approach
This study conducts a comprehensive review of concurrent delays through both bibliometric and systematic analysis of research published between 1982 and 2022 in the Web of Science (WoS) and Scopus databases. For the quantitative analysis, the bibliometric mapping tool VOSviewer was employed to analyze 68 selected publications and explore keyword co-occurrence, co-authorship and direct citation. Additionally, we conducted a qualitative analysis to answer the targeted research question, identify academic knowledge gaps and explore potential research directions for solving the theoretical and practical problems of concurrent delays.
Findings
Concurrent delays are a critical aspect of delay claims. Although a limited number of research teams have developed DAMs to tackle issues such as concurrence, float consumption and the critical path in concurrent delay resolution, practitioners continue to face significant challenges. This study identifies knowledge gaps in defining, identifying, analyzing and allocating liability for concurrent delays, while offering promising directions for further research. These findings reveal the incompleteness of available DAMs for resolving concurrent delays.
Practical implications
The outcomes of this study are highly beneficial for practitioners and researchers. For practitioners, the discussions on the resolution process of concurrent delays in terms of identification, analysis and apportionment enable them to proactively address concurrent delays and lay the groundwork for preventing and resolving such issues in their construction projects. For researchers, five research directions, including advanced DAMs capable of solving concurrent delays, are proposed for reference.
Originality/value
Existing research on DAMs lacks comprehensive coverage of concurrent delays. Through a scientometric review, it is evident that current DAMs do not deal with concurrent delays well. This review identifies critical knowledge gaps and offers insights into potential directions for future research.
Yanyan Shi, Hao Su, Meng Wang, Hanxiao Dou, Bin Yang and Feng Fu
Abstract
Purpose
In brain imaging based on electrical impedance tomography, it is sometimes not possible to attach 16 electrodes because of the space restriction caused by craniotomy. As a result, the number of boundary measurements decreases and the spatial resolution of the reconstructed conductivity distribution is reduced. The purpose of this study is to enhance reconstruction quality in cases of limited measurement.
Design/methodology/approach
A new data expansion method based on a shallow convolutional neural network is proposed. An eight-electrode model is built, from which fewer boundary measurements can be obtained. To improve imaging quality, a shallow convolutional neural network is constructed that maps the limited voltage data of the eight-electrode model to expanded voltage data of a quasi-16-electrode model. The predicted data are compared with the quasi-16-electrode data. In addition, image reconstruction based on the L1 regularization method is conducted.
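As an illustration of this kind of data expansion, the sketch below shows a shallow 1-D convolutional network that regresses an expanded voltage frame from a shorter one. It is a minimal sketch only: the measurement counts, layer sizes and loss are assumptions for demonstration, not the architecture or data dimensions used by the authors.

```python
# Minimal sketch (not the authors' network): a shallow 1-D CNN that maps a short
# boundary-voltage vector from an 8-electrode frame to a longer quasi-16-electrode frame.
import torch
import torch.nn as nn

N_IN, N_OUT = 40, 208      # illustrative measurement counts, not taken from the paper

class ShallowExpansionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(           # two conv layers keep the network "shallow"
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(32 * N_IN, N_OUT)  # regress the expanded voltage frame

    def forward(self, v8):                       # v8: (batch, 1, N_IN)
        return self.head(self.features(v8).flatten(1))

model = ShallowExpansionCNN()
pred16 = model(torch.randn(4, 1, N_IN))             # (4, N_OUT) expanded frames
loss = nn.MSELoss()(pred16, torch.randn(4, N_OUT))  # train against quasi-16-electrode data
```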
Findings
The results show that the predicted data generally coincide with the quasi-16-electrode data. Images reconstructed with the data of the eight-electrode model are the poorest. Nevertheless, the imaging results improve markedly when the limited data are expanded by the proposed method, differing only slightly from the images recovered with the quasi-16-electrode data. The impact of noise is also studied, showing that the proposed method is robust to noise.
Originality/value
To enhance reconstruction quality in the case of limited measurement, a new data expansion method based on the shallow convolutional neural network is proposed. Both simulation work and phantom experiments have demonstrated that high-quality images of cerebral hemorrhage and cerebral ischemia can be obtained when the limited measurement is expanded by the proposed method.
Abstract
Purpose
This study aims to review earned value management (EVM)-related methods, including the original EVM, the earned schedule method (ESM) and earned duration management (EDM(t)). It then proposes a general implementation procedure and some basic principles for selecting among these methods.
Design/methodology/approach
After completing an intensive literature review, this study conducts a case study to examine the forecasting performance of project duration using the EVM, ESM and EDM(t) methods.
Findings
When the project is expected to finish on time, ESM with a performance factor equal to 1 is the recommended method. EDM(t) would be the most reliable method over a project's entire lifetime if the project is expected to be delayed based on past experience.
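For readers unfamiliar with these forecasts, the sketch below shows the standard earned-schedule calculation behind the "performance factor equal to 1" recommendation: the earned schedule ES is interpolated from the cumulative planned-value curve, and the duration forecast is AT + (PD − ES)/PF. The planned-value series and progress figures are illustrative assumptions, not data from the case study.

```python
# Minimal sketch of the standard earned-schedule (ES) duration forecast; the
# planned-value series and progress figures below are illustrative only.
def earned_schedule(pv, ev):
    """ES = C + (EV - PV_C) / (PV_{C+1} - PV_C), where C is the last whole
    period whose cumulative planned value does not exceed the earned value."""
    c = max(i for i, v in enumerate(pv) if v <= ev)
    if c == len(pv) - 1:                       # earned value has reached the end of the plan
        return float(c)
    return c + (ev - pv[c]) / (pv[c + 1] - pv[c])

pv = [0, 100, 250, 450, 700, 1000]             # cumulative planned value per period (assumed)
ev, actual_time, planned_duration = 400, 4, 5  # earned value, AT, PD (assumed)

es = earned_schedule(pv, ev)
forecast_pf1 = actual_time + (planned_duration - es)   # ESM forecast with PF = 1
forecast_spi = planned_duration / (es / actual_time)   # ESM forecast with PF = SPI(t) = ES/AT
print(round(es, 2), round(forecast_pf1, 2), round(forecast_spi, 2))  # 2.75 6.25 7.27
```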
Research limitations/implications
As this research conducts a case study with only one building construction project, the results might not hold true for all types of construction projects.
Practical implications
EVM, ESM and EDM(t) are simple methods with readily accessible data. With the help of a general implementation procedure, applying all three methods would be better: using them together is more powerful than choosing only one for complex construction projects.
Originality/value
Previous studies have discussed the advantages and disadvantages of EVM, ESM and EDM(t). This study amends the available outcomes. Thus, for schedulers or researchers interested in implementing EVM, ESM and EDM(t), this study can provide more constructive instructions.
Abstract
Purpose
An S-curve is an essential project-management tool. However, it is difficult to adjust an S-curve to deal with a force majeure event. The present study develops four adjustment approaches designed to achieve a compromise between the views of the client and contractor. These can be used to control projects after a force majeure event.
Design/methodology/approach
The present study develops four adjustment approaches, which can be used to achieve a compromise between the views of the client and those of the contractor when controlling projects after a force majeure event. To determine the S-curve during a force majeure event, one of two approaches can be selected: a BCWS (budgeted cost of work scheduled)-based approach or a BCWP (budgeted cost of work performed)-based approach. To determine the rest of the S-curve after a force majeure event, two further approaches can be considered: maintaining the original curve of the remaining BCWS, or allocating the original curve of the remaining BCWS. Based on the validation of three empirical cases drawn from a professional project-management website, this study confirms the feasibility of the four proposed approaches and of a selection procedure for S-curve adjustment.
Findings
The S-curve-adjustment approaches presented here can be used to deal with cases that are ahead of, on, or behind schedule. Using the proposed approaches and selection procedure, contractors can easily revise S-curves and control projects more effectively. To deal with a force majeure event such as COVID-19, they are strongly advised to adopt the approaches labeled SA-A1 (adjusting the S-curve based on the extension ratio multiplied by the difference in progress during the force majeure) and SA-B1 (maintaining the original curve of the remaining BCWS) for the A/E and E/F curves, respectively.
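To make the SA-B1 idea concrete, the sketch below shows one simple reading of "maintaining the original curve of the remaining BCWS": the cumulative planned value is frozen at its pre-event level for the duration of the event, after which the untouched remainder of the curve resumes, extending the completion date. This is an illustrative interpretation under a full-suspension assumption, not the authors' exact SA-B1 procedure.

```python
# Illustrative sketch only: shift the remaining planned-value (BCWS) curve past a
# force majeure event, one possible reading of the SA-B1 "maintain the original
# curve of the remaining BCWS" approach. Values are made up for demonstration.
def shift_remaining_bcws(bcws, event_start, event_periods):
    """bcws: cumulative planned value per period. The curve is frozen at its
    event_start level for the duration of the event (full suspension assumed),
    then the remaining original curve resumes unchanged, extending the schedule."""
    frozen = [bcws[event_start]] * event_periods          # no planned progress during the event
    return bcws[:event_start + 1] + frozen + bcws[event_start + 1:]

original = [0, 10, 30, 60, 85, 100]                       # cumulative BCWS (% of budget)
adjusted = shift_remaining_bcws(original, event_start=2, event_periods=3)
print(adjusted)  # [0, 10, 30, 30, 30, 30, 60, 85, 100] -> completion pushed out 3 periods
```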
Research limitations/implications
The proposed approaches can be used in cases of continuous construction during force majeure events. If construction work is totally suspended during such an event, it will be necessary to fine-tune the proposed approaches.
Originality/value
Previous studies have used case-oriented or mathematical-simulation approaches to forecast S-curves. The present study proposes simple approaches that allow the client and contractor to adjust the S-curve easily after a force majeure event. These approaches can be used to adjust work and project-completion targets within an extended duration. Selecting the right S-curve adjustment approach can help to control the remainder of the project, reducing the possibility of delay claims.
Yogesh Patil, Milind Akarte, K. P. Karunakaran, Ashik Kumar Patel, Yash G. Mittal, Gopal Dnyanba Gote, Avinash Kumar Mehta, Ronald Ely and Jitendra Shinde
Abstract
Purpose
Integrating additive manufacturing (AM) tools into traditional mold-making provides complex yet affordable sand molds and cores. AM processes such as selective laser sintering (SLS) and binder jetting three-dimensional printing (BJ3DP) are widely used for patternless sand mold and core production. This study aims to perform an in-depth literature review to understand the current status, determine research gaps and propose future research directions. In addition, it provides insights into influential authors, organizations, countries, keywords, documents, sources and cited references.
Design/methodology/approach
This study followed a systematic literature review (SLR) process to gather relevant rapid sand casting (RSC) documents via the Scopus, Web of Science and EBSCO databases. Furthermore, bibliometric analysis was performed with the Visualization of Similarities (VOSviewer) software.
Findings
An evaluation of 116 documents focused primarily on commercial AM setups and process optimization of SLS. The process-optimization studies examine the effects of AM processes, their input parameters, scanning approaches, sand types and the integration of computer-aided design in AM on the properties of the samples. The authors performed detailed bibliometric analysis of 80 of the 120 documents via the VOSviewer software.
Research limitations/implications
This review focuses primarily on the SLS AM process.
Originality/value
An SLR and bibliometric analysis, using VOSviewer software, of patternless sand mold and core production via AM processes.
Bowen Miao, Xiaoting Shang, Kai Yang, Bin Jia and Guoqing Zhang
Abstract
Purpose
This paper studies the location-inventory problem (LIP) in pallet pooling systems with the aim of improving resource utilization and reducing logistics costs; it is both a new extension of the classical LIP and an application of the LIP to pallet pooling systems.
Design/methodology/approach
A mixed-integer linear programming model is established that considers the location of pallet pooling centers (PPCs) with multi-level capacity, multi-period inventory management and bi-directional logistics. Owing to the computational complexity of the problem, a hybrid genetic algorithm (GA) is then proposed, in which three local-search strategies are designed to improve problem-solving efficiency. Lastly, numerical experiments are carried out to validate the feasibility of the established model and the efficiency of the proposed algorithm.
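For orientation, the sketch below shows a generic hybrid GA loop of this kind: a binary open/close vector over candidate PPCs, one-point crossover and mutation, and a bit-flip local search applied to offspring. The cost function, local-search move and parameters are placeholders for illustration, not the paper's model or its three search strategies.

```python
# Generic hybrid GA skeleton for a facility-location style decision vector
# (1 = open a pallet pooling center). The fitness and local search below are
# placeholders, not the paper's mixed-integer model or its three search strategies.
import random

N_SITES, POP, GENS = 10, 30, 50
fixed_cost = [random.uniform(50, 150) for _ in range(N_SITES)]   # assumed opening costs

def fitness(x):              # toy cost: opening cost + penalty if too few PPCs are open
    return sum(f for f, b in zip(fixed_cost, x) if b) + 500 * max(0, 3 - sum(x))

def local_search(x):         # flip each bit once, keep the best neighbour (hill climb)
    best = x[:]
    for i in range(N_SITES):
        y = x[:]; y[i] ^= 1
        if fitness(y) < fitness(best):
            best = y
    return best

pop = [[random.randint(0, 1) for _ in range(N_SITES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    parents = pop[:POP // 2]
    children = []
    for _ in range(POP - len(parents)):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N_SITES)
        child = a[:cut] + b[cut:]                 # one-point crossover
        if random.random() < 0.1:                 # mutation
            child[random.randrange(N_SITES)] ^= 1
        children.append(local_search(child))      # hybridise GA with local search
    pop = parents + children

best = min(pop, key=fitness)
print(best, fitness(best))
```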
Findings
The results of numerical experiments show that (1) the proposed model can obtain the integrated optimal solution of the location problem and inventory management, which is better than the two-stage model and the model with single-level capacity; (2) the total cost and network structure are sensitive to the number of PPCs, the unit inventory cost, the proportion of repairable pallets and the fixed transportation cost and (3) the proposed hybrid GA shows good performance in terms of solution quality and computational time.
Originality/value
The established model extends the classical LIP by considering more practical factors, and the proposed algorithm provides support for solving large-scale problems. In addition, this study can also offer valuable decision support for managers in pallet pooling systems.
Abstract
Purpose
The paper aims to clarify the influence of the equivalent particles number (EPN) change on the flow velocity characteristic.
Design/methodology/approach
The paper opted for an exploratory study using PIV technology to obtain the transient flow velocity vector of oil in a square pipeline.
Findings
The paper provides empirical insights into the influence of EPN on the average flow velocity, which is most prominent in the middle of the pipeline; smaller EPN values have a greater impact.
Originality/value
These EPN influence patterns can be used to obtain the dynamic characteristics of the oil, which provides theoretical support for oil pollution control and effective treatment measures and lays a preliminary foundation for the online monitoring of particles in oil.
Bin Zhang, Qizhong Yang and Qi Hao
Abstract
Purpose
Drawing on social information processing theory, this study constructs a multilevel moderated mediation model. This model seeks to delve into the intricate and previously overlooked interplay between supervisor bottom-line mentality (BLM) and knowledge hiding. Within this context, we introduce self-interest as a mediating factor and incorporate performance climate as a team-level moderating variable.
Design/methodology/approach
The time-lagged data involve 336 employees nested in 42 teams from 23 automobile sales companies in five regions of China. The analysis was conducted using hierarchical linear modeling, complemented by bias-corrected bootstrapping.
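As a rough illustration of the multilevel setup (not the authors' full moderated mediation model or its bootstrap test), the sketch below fits a random-intercept model with a cross-level interaction between individual-level supervisor BLM and team-level performance climate; the variable names and data are simulated placeholders.

```python
# Minimal sketch with simulated data and placeholder variable names: a random-intercept
# model with a cross-level interaction of supervisor BLM and team performance climate.
# It omits the mediation analysis and bias-corrected bootstrap reported by the authors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_teams, n_per = 42, 8                               # 42 teams x 8 employees = 336 responses
df = pd.DataFrame({
    "team": np.repeat(np.arange(n_teams), n_per),
    "blm": rng.normal(size=n_teams * n_per),
    "self_interest": rng.normal(size=n_teams * n_per),
})
df["climate"] = np.repeat(rng.normal(size=n_teams), n_per)   # team-level moderator
df["knowledge_hiding"] = 0.3 * df.blm + 0.4 * df.self_interest + rng.normal(size=len(df))

model = smf.mixedlm("knowledge_hiding ~ blm * climate + self_interest",
                    df, groups=df["team"]).fit()
print(model.summary())
```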
Findings
The findings reveal that self-interest acts as a full mediator in the positive link between supervisor BLM and knowledge hiding. Furthermore, the performance climate plays a moderating role in both the relationship between supervisor BLM and self-interest, and the entire mediation process. Notably, these relationships are intensified in environments with a high performance climate compared to those with a low one.
Originality/value
This research stands as one of the pioneering efforts to integrate supervisor BLM into the discourse on knowledge hiding, elucidating the underlying psychological mechanisms and delineating the boundary conditions that shape the “supervisor BLM–knowledge hiding” relationship. Further, our insights provide organizations with critical guidance on strategies to curtail knowledge hiding among their employees.
Pingqing Liu, Yunyun Yuan, Lifeng Yang, Bin Liu and Shuang Xu
Abstract
Purpose
The aim of this study is to examine the relationships between taking charge, bootlegging innovation and innovative job performance, and to explore the moderating roles of felt responsibility for constructive change (FRCC) and creative self-efficacy (CSE).
Design/methodology/approach
Data for this research were collected from 503 employees working in a chain company. Through a longitudinal design, a three-wave survey yielding 397 valid responses provided support for the proposed theoretical model.
Findings
The results support a positive association between taking charge, bootlegging innovation and innovative job performance, indicating the mediating effect of bootlegging innovation. Additionally, both FRCC and CSE facilitate the indirect effect of taking charge on innovative job performance through bootlegging innovation. Furthermore, the integrated moderated mediation analysis suggests that FRCC is more vital in improving employees' innovative job performance.
Originality/value
This research aims to open the black box between taking charge and innovative job performance, which has been relatively unexplored. Drawing on self-determination theory (SDT) and the proactive motivation model, the authors verify the bridging role of bootlegging innovation and the dual facilitating effects of FRCC and CSE when employees take charge. The results provide new insights for managers seeking to foster, encourage and support employees' proactive behavior.
Abstract
Purpose
An efficient e-waste management system is developed, aided by deep learning techniques. A smart bin system using Internet of Things (IoT) sensors is created; the sensors detect the level of waste in each dustbin, and the data collected by the IoT sensors are stored in a blockchain. An adaptive deep Markov random field (ADMRF) method is implemented to determine the weight of the waste, and its performance is boosted by optimizing its parameters with an improved coronavirus herd immunity optimization algorithm (ICVHIOA). The main objective of the ADMRF-based waste-weight prediction is to minimize the root mean square error (RMSE) and mean absolute error (MAE) at testing time. If the waste level in a bin exceeds 80%, an alert message is sent directly to the waste collector. Optimal route selection is then carried out using the developed ICVHIOA for efficient collection of waste from the smart bins, with the objectives of reducing distance and time so as to minimize operational cost and environmental impact. The collected waste is then considered for recycling. The performance of the implemented IoT- and blockchain-based smart dustbin is evaluated by comparing it with other existing smart dustbins for e-waste management.
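The sketch below illustrates two of the simpler pieces of this pipeline: the 80% fill-level alert rule and a collection route through the flagged bins. The route uses a plain nearest-neighbour heuristic as a stand-in for the ICVHIOA optimizer, and the bin coordinates and readings are assumed values; ADMRF weight prediction and blockchain storage are omitted.

```python
# Minimal sketch: the 80% fill-level alert rule plus a simple nearest-neighbour
# collection route. The route heuristic is a placeholder, not the ICVHIOA optimiser,
# and the ADMRF weight prediction / blockchain storage steps are omitted.
import math

bins = {                              # assumed readings: id -> (x, y, fill fraction)
    "B1": (2.0, 3.0, 0.92), "B2": (5.0, 1.0, 0.45),
    "B3": (6.0, 4.0, 0.85), "B4": (1.0, 6.0, 0.30),
}

to_collect = [b for b, (_, _, fill) in bins.items() if fill > 0.80]
print("alert waste collector for:", to_collect)      # bins above the 80% threshold

def nearest_neighbour_route(depot, stops):
    """Greedy route from the depot through every flagged bin (shortest next hop)."""
    route, pos, remaining = [], depot, set(stops)
    while remaining:
        nxt = min(remaining, key=lambda b: math.dist(pos, bins[b][:2]))
        route.append(nxt)
        pos = bins[nxt][:2]
        remaining.remove(nxt)
    return route

print("collection route:", nearest_neighbour_route((0.0, 0.0), to_collect))
```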
Design/methodology/approach
The developed e-waste management system is used to collect waste and to prevent diseases caused by dumped waste. Disposal and recycling of e-waste are necessary to reduce pollution and to manufacture new products from the recovered materials.
Findings
The RMSE of the implemented framework was 33.65% better than that of a convolutional neural network (CNN), 27.12% better than that of a recurrent neural network (RNN), 22.27% better than that of ResNet and 9.99% better than that of long short-term memory (LSTM).
Originality/value
The proposed e-waste management system achieves improved performance in both weight prediction and optimal route selection compared with conventional methods.