Search results

1 – 10 of over 4000
Article
Publication date: 17 September 2024

Mohammad Yaghtin and Youness Javid

Abstract

Purpose

The purpose of this research is to address the complex multiobjective unrelated parallel machine scheduling problem with real-world constraints, including sequence-dependent setup times and periodic machine maintenance. The primary goal is to minimize total tardiness, earliness and total completion times simultaneously. This study aims to provide effective solution methods, including a Mixed-Integer Programming (MIP) model, an Epsilon-constraint method and the Nondominated Sorting Genetic Algorithm (NSGA-II), to offer valuable insights into solving large-sized instances of this challenging problem.

Design/methodology/approach

This study addresses a multiobjective unrelated parallel machine scheduling problem with sequence-dependent setup times and periodic machine maintenance activities. An MIP model is introduced to formulate the problem, and an Epsilon-constraint method is applied to solve it. To handle the NP-hard nature of the problem for larger instances, an NSGA-II is developed. The research involves the creation of 45 problem instances for computational experiments, which evaluate the performance of the algorithms in terms of the proposed measures.
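
The abstract does not reproduce the models themselves. As a rough illustration of one building block of NSGA-II, the sketch below implements fast non-dominated sorting for minimization objectives; the three-objective vectors (total tardiness, earliness and total completion time per candidate schedule) are hypothetical.

```python
# Illustrative sketch only: fast non-dominated sorting, the core ranking step
# of NSGA-II, for minimization objectives. The objective vectors below are
# hypothetical (tardiness, earliness, completion time per schedule).

def dominates(a, b):
    """True if solution a dominates b (all objectives <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_non_dominated_sort(objs):
    """Return a list of fronts, each a list of indices into objs."""
    n = len(objs)
    S = [[] for _ in range(n)]       # solutions dominated by i
    dom_count = [0] * n              # number of solutions dominating i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                S[i].append(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in S[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# Hypothetical schedules scored as (tardiness, earliness, completion time)
schedules = [(12, 4, 90), (10, 6, 95), (15, 3, 88), (11, 5, 92), (13, 6, 96)]
print(fast_non_dominated_sort(schedules))   # last schedule falls into a later front
```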

Findings

The research findings demonstrate the effectiveness of the proposed solution approaches for the multiobjective unrelated parallel machine scheduling problem. Computational experiments on 45 generated problem instances reveal that the NSGA-II algorithm outperforms the Epsilon-constraint method, particularly for larger instances. The algorithms successfully minimize total tardiness, earliness and total completion times, showcasing their practical applicability and efficiency in handling real-world scheduling scenarios.

Originality/value

This study contributes original value by addressing a complex multiobjective unrelated parallel machine scheduling problem with real-world constraints, including sequence-dependent setup times and periodic machine maintenance activities. The introduction of an MIP model, the application of the Epsilon-constraint method and the development of the NSGA-II algorithm offer innovative approaches to solving this NP-hard problem. The research provides valuable insights into efficient scheduling methods applicable in various industries, enhancing decision-making processes and operational efficiency.

Details

Journal of Modelling in Management, vol. 20 no. 2
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 11 March 2025

Linlin Xie, Ziyi Yu and Xianbo Zhao

Abstract

Purpose

To meet an ever-increasing urbanization demand, urban complex projects have evolved to form the development type of HOPSCA (an acronym for Hotel, Office, Park, Shopping mall, Convention and Apartment, representing a new type of urban complex). Its integrated functions, complex structures and superior siting expose HOPSCA’s construction phase to higher and more uncertain safety risks. Despite this, research on construction safety risks of large urban complexes is scarce. This study addresses this gap by introducing the interval ordinal priority approach (Interval-OPA) method to build a safety risk assessment model for HOPSCA, targeting its construction safety risk management.

Design/methodology/approach

This study initially identifies risk factors via literature review, field survey and three Delphi method rounds, forming a construction safety risk list for HOPSCA projects. Then, Interval-OPA is employed to create a safety risk assessment model, and its validity is confirmed through a representative case study of an ongoing project. Lastly, uncertainty and weighting analyses of the model results identify the most probable major construction accidents, safety risk factors and targeted prevention strategies for the construction phase of urban complex projects.
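
The Interval-OPA formulation is not given in the abstract. Purely as a simplified illustration of the underlying ordinal priority idea, the sketch below solves the basic single-expert, crisp OPA linear program to turn an ordinal ranking of risk factors into weights; it is not the paper's interval extension, and the risk names are hypothetical.

```python
# Illustrative sketch only: a crisp, single-expert Ordinal Priority Approach (OPA)
# that converts an ordinal ranking of risk factors into weights via a small LP.
# This is NOT the paper's Interval-OPA model; risk names are hypothetical.
import numpy as np
from scipy.optimize import linprog

risks = ["object strike", "fall from height", "electrocution", "collapse"]  # ranked most to least critical
m = len(risks)

# Variables: w_1..w_m (weights in ranked order) and Z. Objective: maximize Z.
c = np.zeros(m + 1)
c[-1] = -1.0

A_ub, b_ub = [], []
for j in range(1, m):                 # Z <= j * (w_j - w_{j+1})
    row = np.zeros(m + 1)
    row[j - 1] = -j
    row[j] = j
    row[-1] = 1.0
    A_ub.append(row)
    b_ub.append(0.0)
row = np.zeros(m + 1)                 # Z <= m * w_m
row[m - 1] = -m
row[-1] = 1.0
A_ub.append(row)
b_ub.append(0.0)

A_eq = [np.ones(m + 1)]               # sum of weights = 1 (Z excluded)
A_eq[0][-1] = 0.0
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (m + 1))
for name, w in zip(risks, res.x[:m]):
    print(f"{name}: weight {w:.3f}")
```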

Findings

The findings reveal that (1) there are 33 construction safety risks in HOPSCA’s construction phase across 4 aspects: “man-machine-environment-management”; (2) object strikes are the most prominent accidents and need to be prioritized for prevention, especially when managerial risks arise; (3) falls from heights are evaluated with the highest level of uncertainty, which represents an ambiguous area for safety management; and (4) the risk evaluation shows that there are nine critical construction safety risk factors for the HOPSCA project and that most of the management-level risk factors have high uncertainty. This study explores and provides effective measures to combat these factors.

Originality/value

This study innovatively applies the Interval-OPA method to risk assessment, offering a fitting method for evaluating the HOPSCA project’s construction safety risks and accidents. The model aids decision-makers in appropriate risk classification and selection of scientific risk prevention strategies, enhances HOPSCA’s construction safety management system and even benefits all under-construction projects, promoting the construction industry’s sustainable development.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 24 February 2025

Hongtai Cheng and Jiayi Han

Abstract

Purpose

In situ 3D reconstruction is the basis for measuring and monitoring large-scale working environments such as construction sites and coal mining surfaces. The purpose of this paper is to address the difficulties of on-site 3D reconstruction in large-scale dynamic environments. Based on panoramic rotating light detection and ranging (LiDAR) sensors, a dynamic environment 3D reconstruction method combining an occupancy grid and point cloud segmentation is proposed.

Design/methodology/approach

The algorithm combines information across point clouds to process the points in the region of interest, improving the processing speed of the occupancy grid method and the accuracy of dynamic object filtering. Furthermore, this paper also proposes an incremental long-short horizon segmentation and reconstruction workflow, which performs different levels of point cloud differential segmentation for various types of dynamic objects.
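
The workflow itself is not detailed in the abstract. As a generic illustration of occupancy-grid-based dynamic-point filtering (not the authors' algorithm), the sketch below voxelizes two LiDAR scans and flags points whose voxels are unoccupied in the reference scan as dynamic candidates; the scan arrays are synthetic.

```python
# Illustrative sketch only: a generic occupancy-grid test for dynamic points.
# Points whose voxel is occupied in the current scan but free in the reference
# scan are treated as candidates for dynamic-object removal. This is not the
# paper's long-short horizon workflow; the scans are synthetic.
import numpy as np

def voxel_keys(points, voxel=0.2):
    """Map each 3D point to an integer voxel index and return the occupied set."""
    return set(map(tuple, np.floor(points / voxel).astype(int)))

def filter_dynamic(current, reference, voxel=0.2):
    """Split `current` into (static candidates, dynamic candidates)."""
    ref_occ = voxel_keys(reference, voxel)
    keys = np.floor(current / voxel).astype(int)
    mask = np.array([tuple(k) in ref_occ for k in keys])
    return current[mask], current[~mask]

# Synthetic scans: a static region plus a moving object in the current scan
reference = np.random.uniform(0, 5, size=(1000, 3))
moving = np.random.uniform(10, 11, size=(50, 3))
current = np.vstack([reference + np.random.normal(0, 0.01, reference.shape), moving])

static_pts, dynamic_pts = filter_dynamic(current, reference)
print(len(static_pts), "static candidates,", len(dynamic_pts), "dynamic candidates")
```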

Findings

This paper designs detailed physical and simulation experiments in different environments to verify the advantages and reliability of the algorithm. The entire system showed satisfactory performance in these experiments.

Originality/value

The proposed dynamic environment 3D reconstruction method can effectively complete the 3D reconstruction of complex work environments, and the proposed incremental long-short horizon segmentation and reconstruction workflow further improves the accuracy of dynamic object removal.

Details

Sensor Review, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0260-2288

Article
Publication date: 7 March 2025

Roushan Roy, Krishnendu Shaw, Shivam Mishra and Ravi Shankar

Abstract

Purpose

The uncertain supply chain network design (SCND) problem, considering suppliers’ environmental, social and governance (ESG) ratings, has been infrequently addressed in the literature. Looking at the importance of ESG ratings in achieving supply chain sustainability, this study aims to fill the gap by incorporating supplier ESG factors into SCND within an uncertain environment.

Design/methodology/approach

This paper presents a multi-period, multi-product SCND model that integrates ESG factors and accounts for uncertainties in supply and production capacities. The model seeks to minimize total operational costs by determining the optimal selection of plant and warehouse locations across multiple time periods. Uncertainties in supply and production capacities are managed through a chance-constrained programming approach with right-hand-side stochasticity. A Lagrangian relaxation-based heuristic method is applied to address the NP-hard nature of the problem.
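
The model is not reproduced in the abstract. One standard ingredient it names is chance-constrained programming with right-hand-side stochasticity; assuming, for illustration only, a normally distributed capacity, the chance constraint P(usage <= capacity) >= 1 - alpha has the deterministic equivalent usage <= mu + sigma * Phi^-1(alpha), sketched below.

```python
# Illustrative sketch only: deterministic equivalent of a chance constraint with
# a stochastic right-hand side, P(usage <= capacity) >= 1 - alpha, assuming the
# capacity is normally distributed. Numbers are hypothetical, not from the paper.
from scipy.stats import norm

def effective_capacity(mu, sigma, alpha=0.05):
    """Largest deterministic usage level that still satisfies the chance constraint."""
    return mu + sigma * norm.ppf(alpha)   # small alpha => conservative cut below the mean

mu, sigma = 1000.0, 80.0                  # hypothetical mean/std of a supplier's capacity
for alpha in (0.20, 0.10, 0.05, 0.01):
    print(f"alpha={alpha:.2f}: plan usage <= {effective_capacity(mu, sigma, alpha):.1f} units")
```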

Findings

The efficacy of the proposed model is illustrated through a numerical example, demonstrating its capability to optimize material flows across the supply chain under uncertain conditions. The model simultaneously considers economic and ESG factors in procurement decisions. A sensitivity analysis is conducted to examine different operational scenarios and their implications on the model’s outcomes.

Originality/value

To the best of the authors’ knowledge, this study is one of the first to integrate ESG factors into SCND under uncertainty. The proposed model provides a robust framework for decision-makers to optimize supply chain operations while considering both economic and ESG objectives in an uncertain environment.

Details

Journal of Modelling in Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 8 September 2023

Önder Halis Bettemir and M. Talat Birgonul

Abstract

Purpose

An exact solution of the time–cost trade-off problem (TCTP) can be obtained by state-of-the-art meta-heuristic algorithms for small- and medium-scale problems, while satisfactory results cannot be obtained for large construction projects. In this study, a hybrid heuristic meta-heuristic algorithm that adapts the search domain is developed to solve the large-scale discrete TCTP more efficiently.

Design/methodology/approach

A minimum cost slope–based heuristic network analysis algorithm (NAA), which eliminates the infeasible search domain, is embedded into a differential evolution meta-heuristic algorithm. The heuristic NAA narrows the search domain at the initial phase of the optimization. Moreover, activities with float durations higher than a predetermined threshold value are eliminated, and then the meta-heuristic algorithm starts and searches for the global optimum through the narrowed search space. However, narrowing the search space may increase the probability of obtaining a local optimum. Therefore, an adaptive search domain approach is employed to make reintroduction of the eliminated activities to the design variable set possible, which reduces the possibility of converging to local minima.
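
As a worked illustration of the minimum-cost-slope idea only (not the authors' full NAA/differential-evolution hybrid), the sketch below computes each activity's cost slope, (crash cost - normal cost) / (normal duration - crash duration), and picks the cheapest candidate to crash first; the activity data are hypothetical.

```python
# Illustrative sketch only: computing minimum cost slopes for a time-cost
# trade-off problem and picking the cheapest activity to crash first.
# This is not the paper's hybrid algorithm; the activity data are hypothetical.

activities = {
    # name: (normal_duration, normal_cost, crash_duration, crash_cost)
    "A": (10, 4000, 7, 5200),
    "B": (8,  3000, 6, 3900),
    "C": (6,  2500, 5, 2800),
}

def cost_slope(normal_d, normal_c, crash_d, crash_c):
    """Extra cost per unit of time saved when crashing an activity."""
    return (crash_c - normal_c) / (normal_d - crash_d)

slopes = {name: cost_slope(*data) for name, data in activities.items()}
for name, s in sorted(slopes.items(), key=lambda kv: kv[1]):
    print(f"{name}: {s:.1f} cost units per time unit saved")

# Crash the critical activity with the smallest slope first (for illustration
# we assume all three activities lie on the critical path).
print("crash first:", min(slopes, key=slopes.get))
```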

Findings

The developed algorithm is compared with the plain meta-heuristic algorithm in two separate analyses. In the first analysis, both algorithms have the same computational demand, and in the second, the meta-heuristic algorithm has fivefold computational demand. The tests on case study problems reveal that the developed algorithm presents lower total project costs according to the dependent t-test for paired samples with α = 0.0005.

Research limitations/implications

In this study, TCTP is solved without considering quality or restrictions on the resources.

Originality/value

The proposed method makes it possible to adapt the number of parameters, that is, the search domain, and provides the opportunity to obtain significant improvements on meta-heuristic algorithms for other engineering optimization problems, which is the theoretical contribution of this study. The proposed approach reduces the total construction cost of large-scale projects, which is the practical benefit of this study.

Details

Engineering, Construction and Architectural Management, vol. 32 no. 2
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 25 February 2025

Julian Martinez-Moya, Thierry Vanelslander, María Feo-Valero and Ramón Sala-Garrido

Abstract

Purpose

The present research aims to develop a Terminal Competitiveness Index (TCI) applied to the container terminals located in the Hamburg – Le Havre range, an area characterised by its intense container activity. The main components of the TCI are productivity, foreland connectivity and infrastructure.

Design/methodology/approach

To construct the index, the Benefit-of-the-Doubt and the Common Set of Weights methods in Data Envelopment Analysis are used to obtain a common weighting scheme for the evaluation of container terminals.
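
The abstract does not spell out the model. As a minimal illustration of the Benefit-of-the-Doubt idea (a DEA formulation with a dummy unit input, not the authors' common-set-of-weights variant), the sketch below scores each terminal with its own most favourable indicator weights; the indicator values are hypothetical.

```python
# Illustrative sketch only: the Benefit-of-the-Doubt (BoD) composite indicator,
# i.e. an output-oriented DEA model with a dummy unit input. Each terminal gets
# its own most favourable weights; this is not the paper's common-set-of-weights
# extension, and the indicator values below are hypothetical.
import numpy as np
from scipy.optimize import linprog

# Rows: terminals, columns: normalised indicators (productivity, connectivity, infrastructure)
Y = np.array([
    [0.9, 0.7, 0.8],
    [0.6, 0.9, 0.7],
    [0.8, 0.8, 0.9],
    [0.5, 0.6, 0.6],
])

def bod_score(Y, j):
    """max w . y_j  subject to  Y w <= 1 (element-wise), w >= 0."""
    n_ind = Y.shape[1]
    res = linprog(-Y[j], A_ub=Y, b_ub=np.ones(Y.shape[0]),
                  bounds=[(0, None)] * n_ind)
    return -res.fun

for j in range(Y.shape[0]):
    print(f"terminal {j}: BoD score = {bod_score(Y, j):.3f}")
```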

Findings

Results show that connectivity and terminal efficiency are the most important factors for terminal competitiveness. The TCI identifies APM Terminals Maasvlakte II (Rotterdam), ECT Delta (Rotterdam) and MPET (Antwerp) as the terminals with the highest competitiveness scores.

Originality/value

Container terminals play a key role in today’s marketplace since they are the main infrastructure responsible for loading and unloading the containers full of intermediate and final goods. Therefore, the competitiveness of such terminals is crucial for shipping lines and importing and exporting companies, influencing their cost and schedule reliability. However, the literature studying the competitiveness of container terminals is scarce, since the focus to date has been on ports as units of analysis. The terminal-level approach used here allows the analysis of the competitiveness of terminals belonging to different ports, as well as of those located in the same port.

Details

Maritime Business Review, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2397-3757

Article
Publication date: 13 January 2025

Conor Brian Hamill, Raad Khraishi, Simona Gherghel, Jerrard Lawrence, Salvatore Mercuri, Ramin Okhrati and Greig Alan Cowan

Abstract

Purpose

Interest-free promotions are a prevalent strategy employed by credit card lenders to attract new customers, yet the research exploring their effects on both consumers and lenders remains relatively sparse. Selecting an optimal promotion strategy is intricate, involving the determination of an interest-free period duration and promotion-availability window, all within the context of market dynamics and complex consumer behaviour. The purpose of this study is to develop an agent-based model to assist with determining optimal promotion strategies.

Design/methodology/approach

In this paper, we introduce a novel agent-based model that facilitates the exploration of various credit card promotions under diverse market scenarios.
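
The calibrated mechanisms are specific to the paper. Purely as a structural sketch of an agent-based simulation loop with lender and consumer agents stepping through monthly billing cycles, the code below uses hypothetical placeholder rules and parameters, not the authors' model.

```python
# Illustrative sketch only: the skeleton of an agent-based simulation with lender
# and consumer agents stepping through monthly cycles. The decision rules and
# parameters are hypothetical placeholders, not the paper's calibrated model.
import random

class Lender:
    def __init__(self, name, promo_months):
        self.name = name
        self.promo_months = promo_months   # interest-free period offered
        self.customers = []
        self.income = 0.0

class Consumer:
    def __init__(self):
        self.balance = random.uniform(500, 3000)
        self.lender = None

    def choose(self, lenders):
        # Hypothetical rule: longer interest-free promotions are more attractive.
        self.lender = random.choices(lenders, weights=[l.promo_months for l in lenders])[0]
        self.lender.customers.append(self)

def simulate(lenders, consumers, months=36, apr=0.20):
    for c in consumers:
        c.choose(lenders)
    for month in range(months):
        for lender in lenders:
            for c in lender.customers:
                if month >= lender.promo_months and c.balance > 0:
                    lender.income += c.balance * apr / 12          # interest on revolving balance
                c.balance = max(0.0, c.balance - random.uniform(50, 200))  # repayments
    return {l.name: round(l.income, 2) for l in lenders}

random.seed(0)
lenders = [Lender("A", promo_months=12), Lender("B", promo_months=18)]
print(simulate(lenders, [Consumer() for _ in range(1000)]))
```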

Findings

Our experiments reveal that, in the absence of competitor promotions, lender profit is maximised by an interest-free duration of approximately 12 months, while market share is maximised by offering the longest duration possible. In the context of concurrent interest-free promotions, we identify that the optimal lender strategy entails offering a more competitive interest-free period and a rapid response to competing promotional offers. Notably, a delay of three months in responding to a rival promotion corresponds to a 2.4% relative decline in income.

Originality/value

Our model consists of multiple lender and consumer agents that interact through a novel set of mechanisms based on well-studied consumer behaviours. Distinct from previous works, our model adopts a realistic billing cycle with a focus on interest charged to revolving accounts and supports a range of lender promotion strategies. It is calibrated to historical benchmarks and validated against both stylised facts and time-series data, ensuring a realistic reflection of market behaviour, which has been neglected in prior studies.

Details

International Journal of Bank Marketing, vol. 43 no. 4
Type: Research Article
ISSN: 0265-2323

Article
Publication date: 23 January 2024

Chong Wu, Zijiao Zhang, Chang Liu and Yiwen Zhang

Abstract

Purpose

This paper aims to propose a bed and breakfast (B&B) recommendation method that takes into account review timeliness and user preferences to help consumers choose the most satisfactory B&B.

Design/methodology/approach

This paper proposes a B&B ranking method based on improved intuitionistic fuzzy sets. First, text mining and cluster analysis are combined to identify the concerns of consumers and construct an attribute set. Second, an attribute-level text sentiment analysis is established. The authors propose an improved intuitionistic fuzzy set, which is more in line with the actual situation of sentiment analysis of online reviews. Next, a combined subjective-objective weight assignment is applied, considering the consumers’ preferences. Finally, the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) algorithm, based on the improved score function, is used to evaluate B&Bs.
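
The improved intuitionistic fuzzy machinery is not reproduced in the abstract. As a reference point only, the sketch below implements plain crisp VIKOR on a small hypothetical decision matrix with benefit criteria; it omits the paper's intuitionistic fuzzy sets and improved score function.

```python
# Illustrative sketch only: plain (crisp) VIKOR ranking on a hypothetical B&B
# decision matrix with benefit criteria. The paper additionally uses improved
# intuitionistic fuzzy sets and an improved score function, not reproduced here.
import numpy as np

def vikor(F, w, v=0.5):
    """F: alternatives x criteria (larger is better), w: criterion weights."""
    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    D = w * (f_best - F) / (f_best - f_worst)      # normalised weighted regret per criterion
    S, R = D.sum(axis=1), D.max(axis=1)            # group utility and individual regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return np.argsort(Q), S, R, Q                  # lower Q = better compromise

# Hypothetical B&Bs scored on (location, cleanliness, service, price-worthiness)
F = np.array([
    [0.8, 0.7, 0.9, 0.6],
    [0.6, 0.9, 0.7, 0.8],
    [0.9, 0.8, 0.8, 0.7],
])
w = np.array([0.3, 0.3, 0.2, 0.2])
ranking, S, R, Q = vikor(F, w)
print("ranking (best first):", ranking, "Q:", np.round(Q, 3))
```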

Findings

A case study is presented to illustrate the use of the proposed method. Comparative analysis with other multi-attribute decision-making (MADM) methods proves the effectiveness and superiority of the VIKOR algorithm based on the improved intuitionistic fuzzy sets proposed in this paper.

Originality/value

Proposing a B&B recommendation method that takes into account review timeliness and user customization is the innovation of this paper. In this approach, the authors propose improved intuitionistic fuzzy sets. Compared with the traditional intuitionistic fuzzy set, the improved intuitionistic fuzzy set increases the abstention membership, which is more in line with the actual situation of attribute-level sentiment analysis of online reviews.

Details

Kybernetes, vol. 54 no. 4
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 3 March 2025

Yawen Liu, Bin Sun, Tong Guo and Zhaoxia Li

Abstract

Purpose

Damage of engineering structures is a nonlinear evolutionary process that spans across both material and structural levels, from mesoscale to macroscale. This paper aims to provide a comprehensive review of damage analysis methods at both the material and structural levels.

Design/methodology/approach

This study provides an overview of multiscale damage analysis of engineering structures, including its definition and significance. The current status of damage analysis at both material and structural levels is investigated by reviewing damage models and prediction methods from single-scale to multiscale perspectives. The discussion of prediction methods includes both model-based simulation approaches and data-driven techniques, emphasizing their roles and applications. Finally, the main findings are summarized and potential future research directions in this field are discussed.

Findings

At the material level, damage research primarily focuses on the degradation of material properties at the macroscale using continuum damage mechanics (CDM). In contrast, at the mesoscale, damage research involves analyzing material behavior in the meso-structural domain, focusing on defects like microcracks and void growth. In structural-level damage analysis, the macroscale is typically divided into component and structural scales. The component scale examines damage progression in individual structural elements, such as beams and columns, often using detailed finite element or mesoscale models. The structural scale evaluates the global behavior of the entire structure, typically using simplified models like beam or shell elements.
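
As a minimal concrete example of the continuum damage mechanics idea mentioned above (a textbook one-dimensional isotropic damage law, not any specific model from the reviewed literature), consider the sketch below with hypothetical material parameters.

```python
# Illustrative sketch only: a textbook 1D isotropic continuum damage mechanics
# (CDM) law, sigma = (1 - D) * E * eps, with linear damage evolution between a
# damage-initiation strain and a failure strain. Parameters are hypothetical.
import numpy as np

E = 30e9                   # Young's modulus, Pa (hypothetical concrete-like value)
eps0, eps_f = 1e-4, 1e-3   # damage-initiation and failure strains

def damage(eps):
    """Scalar damage variable D in [0, 1], growing linearly with strain."""
    return np.clip((eps - eps0) / (eps_f - eps0), 0.0, 1.0)

def stress(eps):
    """Stress carried by the damaged material."""
    return (1.0 - damage(eps)) * E * eps

for eps in (5e-5, 2e-4, 5e-4, 1e-3):
    print(f"eps={eps:.0e}: D={damage(eps):.2f}, sigma={stress(eps)/1e6:.2f} MPa")
```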

Originality/value

To achieve realistic simulations, it is essential to include as many mesoscale details as possible. However, this results in significant computational demands. To balance accuracy and efficiency, multiscale methods are employed. These methods are categorized into hierarchical approaches, where different scales are processed sequentially, and concurrent approaches, where multiple scales are solved simultaneously to capture complex interactions across scales.

Details

International Journal of Structural Integrity, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1757-9864

Article
Publication date: 26 February 2025

Sung-Inn Pyo, Soohyun Ahn and Soon-Sun Kwon

Abstract

Purpose

Visualizing relations in textual data requires dimension reduction to increase the interpretability of the output. However, traditional dimension reduction methods have some limitations, such as the loss of feature information during extraction or projection, and uncertain results due to the mixture of word labels. In this study, we develop a textual data visualization algorithm that uses statistical methods to present statistical inferences on the data. We also construct the algorithm so that the user can analyze textual data easily.

Design/methodology/approach

Unstructured data, such as textual data, is sensitive to the choice of analysis method. In addition, textual data is generally large and sparse. Considering such characteristics, we applied latent Dirichlet allocation to separate the data and minimize the loss of information, and false discovery rate (FDR) control to reduce dimension in a statistical way.
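
The pipeline itself is not given in the abstract. As a rough sketch of the two ingredients it names, topic separation with latent Dirichlet allocation and Benjamini-Hochberg FDR control, the code below uses scikit-learn and statsmodels on a toy English corpus; the corpus and p-values are assumptions for illustration, not the paper's Korean-language pipeline.

```python
# Illustrative sketch only: latent Dirichlet allocation for topic separation and
# Benjamini-Hochberg FDR control for statistical dimension reduction, applied to
# a toy English corpus. This is not the paper's Korean-language pipeline;
# the corpus and p-values are hypothetical.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from statsmodels.stats.multitest import multipletests

docs = [
    "machine learning improves text classification",
    "deep learning models classify text documents",
    "the hotel room was clean and the staff friendly",
    "friendly staff and a clean comfortable room",
]

# Topic separation with LDA
X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)
print(np.round(doc_topics, 2))     # per-document topic proportions

# Statistical dimension reduction: keep only features whose (hypothetical)
# association p-values survive Benjamini-Hochberg FDR control.
p_values = np.array([0.001, 0.04, 0.20, 0.03, 0.65, 0.002])
reject, p_adj, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
print(reject, np.round(p_adj, 3))
```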

Findings

Relations in textual data can be derived in a one-click way, and the output, with separated topics, can be interpreted without background information.

Originality/value

The algorithm is constructed for the Korean language; however, any language can be used without additional linguistic information. This study can serve as an example of usage and workflow in which less well-known dimension reduction methods replace traditional ones.

Details

Data Technologies and Applications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-9288
