Kamlesh Kumar Pandey and Diwakar Shukla
Abstract
Purpose
The K-means (KM) clustering algorithm is extremely sensitive to the selection of initial centroids, since the initial cluster centroids determine computational effectiveness, efficiency and local optima issues. Numerous initialization strategies have been proposed to overcome these problems through random or deterministic selection of initial centroids. Random initialization suffers from local optimization issues and the worst clustering performance, while deterministic initialization incurs a high computational cost. Big data clustering aims to reduce computation costs and improve cluster efficiency. The objective of this study is to achieve better initial centroids for big data clustering on business management data without using random or deterministic initialization, thereby avoiding local optima and improving clustering efficiency and effectiveness in terms of cluster quality, computation cost, data comparisons and iterations on a single machine.
Design/methodology/approach
This study presents the Normal Distribution Probability Density (NDPD)-based K-means (NDPDKM) algorithm for big data clustering on a single machine to solve business management-related clustering issues. The NDPDKM algorithm resolves the KM clustering problem by using the probability density of each data point. The NDPDKM algorithm first identifies the most probable density data points by using the mean and standard deviation of the dataset through the normal probability density function. Thereafter, the NDPDKM determines the K initial centroids by using sorting and linear systematic sampling heuristics.
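For readers who want to experiment with the idea, the sketch below illustrates the seeding scheme described above in Python: each point is scored by a normal probability density built from the dataset mean and standard deviation, the points are sorted by that score, and K seeds are drawn by linear systematic sampling. It is a minimal approximation based only on this abstract; the function name and the per-feature independence assumption are ours, not the authors'.

```python
import numpy as np
from scipy.stats import norm

def ndpd_initial_centroids(X, k):
    """Hypothetical sketch of the seeding idea: score points by a normal
    probability density, sort by that score, then pick K seeds by linear
    systematic sampling over the sorted order."""
    X = np.asarray(X, dtype=float)
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    # Per-feature normal density, combined into one score per point
    # (independence across features is our simplifying assumption).
    density = norm.pdf(X, loc=mu, scale=sigma).prod(axis=1)
    order = np.argsort(density)               # points sorted by density
    step = max(1, len(X) // k)                # systematic sampling interval
    return X[order[step // 2::step][:k]]      # one seed per stratum

# Usage sketch: seeds = ndpd_initial_centroids(data, k)
# then e.g. sklearn.cluster.KMeans(n_clusters=k, init=seeds, n_init=1).fit(data)
```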
Findings
The performance of the proposed algorithm is compared with the KM, KM++, Var-Part, Murat-KM, Mean-KM and Sort-KM algorithms through the Davies-Bouldin score, Silhouette coefficient, SD validity, S_Dbw validity, number of iterations and CPU time validation indices on eight real business datasets. The experimental evaluation demonstrates that the NDPDKM algorithm reduces iterations, local optima and computing costs, and improves cluster performance, effectiveness and efficiency with stable convergence compared to the other algorithms. The NDPDKM algorithm reduces the average computing time by up to 34.83%, 90.28%, 71.83%, 92.67%, 69.53% and 76.03%, and the average number of iterations by up to 40.32%, 44.06%, 32.02%, 62.78%, 19.07% and 36.74%, with reference to the KM, KM++, Var-Part, Murat-KM, Mean-KM and Sort-KM algorithms, respectively.
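As a pointer for replication, two of these indices are available directly in scikit-learn; the hedged snippet below shows how they are typically computed on a clustering result. The data here are random placeholders, not the eight business datasets used in the study.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score, silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))          # placeholder data, not the business datasets

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print("Davies-Bouldin:", davies_bouldin_score(X, labels))  # lower is better
print("Silhouette:    ", silhouette_score(X, labels))      # higher is better
```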
Originality/value
The KM algorithm is the most widely used partitional clustering approach in data mining, extracting hidden knowledge, patterns and trends for decision-making strategies in business data. Business analytics is one of the applications of big data clustering in which KM clustering is useful for various subcategories of business analytics such as customer segmentation analysis, employee salary and performance analysis, document searching, delivery optimization, discount and offer analysis, complaint management, manufacturing analysis, productivity analysis, specialized employee and investor searching and other decision-making strategies in business.
Arun A. Elias, Matthew Pepper, Anand Gurumurthy and Avanish K. Shukla
Abstract
Resilient and sustainable supply chain management is emerging as a focused area of research in the field of supply chain management. This article aims to introduce this edition of Advances in Environmental Accounting and Management and explore opportunities for research related to resilient and sustainable supply chain management. A critical analysis of the literature found a need to develop the theory underpinning resilient and sustainable supply chains and a need for rich empirical studies. The six articles included in this edition cover a variety of contexts, including the food supply chain, environmental accounting models, the energy sector, human resources, modern slavery, horticultural worker exploitation and sustainable transport, in jurisdictions such as Australia, the European Union, Fiji and India. Overall, this edition contributes to both the theoretical and empirical literature on resilient and sustainable supply chain management and presents a repository of research that explores this area from an accounting and environmental management perspective.
Uttam Kumar Khedlekar and Priyanka Singh
Abstract
Purpose
For the smooth running of business affairs, there needs to be coordination among the manufacturer, collector and retailer in the forward and reverse supply chain. This paper handles the problem of making pricing, collecting and percentage-sharing decisions in a closed-loop supply chain. The purpose of this paper is to examine the effect of the responsibility-sharing percentage on the profits of a manufacturer, a retailer and a collector. The paper further aims to understand the mutual interactions among decision variables and profit functions. It also determines the optimal selling price, optimal time, wholesale price, sharing percentage and optimal return rate in such a manner that the profit function is maximized.
Design/methodology/approach
The authors presented a three-echelon model consisting of a manufacturer, a retailer and a collector in the closed-loop supply chain and optimized the profits of each supply chain member. The authors introduced SRR models for remanufacturing in which the manufacturer provides a percentage of physical and financial support to the collector. Optimization techniques have been applied to obtain optimal solutions. Numerical examples and graphical representations of the optimal solutions are provided to illustrate the model.
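To make the optimization step concrete, here is a deliberately simplified, hypothetical sketch: it assumes a selling-price-sensitive, exponentially declining demand of the form D(p, t) = (a - b*p)*exp(-lam*t), consistent in spirit with the demand described in the Originality/value section, and maximizes only the retailer's profit over the selling price. All parameter values and the profit function itself are illustrative assumptions, not the authors' three-echelon model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical parameters (illustrative only, not from the paper)
a, b = 100.0, 2.0      # demand scale and price sensitivity
lam, T = 0.1, 12.0     # demand decline rate and planning horizon
w = 20.0               # wholesale price paid to the manufacturer

def retailer_profit(p):
    # Total demand over [0, T] for D(p, t) = (a - b*p) * exp(-lam*t)
    total_demand = (a - b * p) * (1.0 - np.exp(-lam * T)) / lam
    return (p - w) * total_demand

# Maximize profit over the selling price (kept where demand stays positive)
res = minimize_scalar(lambda p: -retailer_profit(p), bounds=(w, a / b), method="bounded")
print(f"optimal selling price ~ {res.x:.2f}, retailer profit ~ {-res.fun:.2f}")
```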
Findings
This study stresses profitable value retrieval from returned products and discusses how responsibility sharing can improve profitability and reduce the workload of an individual member. In total, three main results are found. First, sharing and coordination among chain members can improve the collector’s profit. Second, supply chain performance may also improve over time. Third, the profit of each member of the supply chain increases with an increase in the sharing percentage up to a certain limit. So, the manufacturer can share the responsibility of the collector up to a fixed limit.
Research limitations/implications
The main limitation of this model is that there is no difference between manufactured and remanufactured products. There are many correlated issues that need to be investigated further. Future studies in this direction may include multiple retailers and stochastic demand patterns.
Practical implications
The model can be directly utilized by supply chain industries in which coordination among chain members is needed to maximize profits. This information enables the manufacturer to assist the collector financially or physically for the proper management of the three-layer supply chain. The present work will form a guideline for choosing the appropriate parameter(s) and mathematical technique(s) in different situations for remanufacturable products.
Social implications
From a management point of view, this study delivers strong results for remanufacturing companies, for whom effective and efficient coordination among chain members is vital to the overall performance of the supply chain.
Originality/value
There are very few studies that consider the remanufacturing of used products under a fixed time period. The authors considered selling price-sensitive and time-dependent exponentially declining demand. This model is developed by considering all possible help to a collector from manufacturer to collect used products from consumers. This research complements past research by showing coordination among supply chain members within a fixed time horizon.
Guoqing Li, Yunhai Geng and Wenzheng Zhang
Abstract
Purpose
This paper aims to introduce an efficient active simultaneous localization and mapping (SLAM) approach for rover navigation, as future planetary rover exploration missions require the rover to localize itself automatically with high accuracy.
Design/methodology/approach
A three-dimensional (3D) feature detection method is first proposed to extract salient features from the observed point cloud. The salient features are then employed as candidate destinations for re-visiting within the SLAM structure. Finally, a path planning algorithm is integrated with SLAM, wherein the path length and map utility are leveraged to reduce the growth rate of the state estimation uncertainty.
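The abstract does not specify the saliency criterion, so the sketch below uses a generic stand-in: local surface variation computed from the PCA eigenvalues of each point's neighbourhood, with the highest-scoring points kept as candidate re-visit landmarks. This is an illustration of the general idea only, not the authors' 3D feature detector.

```python
import numpy as np
from scipy.spatial import cKDTree

def saliency_scores(points, k=20):
    """Generic local-PCA 'surface variation' saliency (illustrative stand-in,
    not the authors' detector): neighbourhoods that deviate most from a
    planar patch score highest."""
    tree = cKDTree(points)
    _, nbr_idx = tree.query(points, k=k)                # k nearest neighbours
    scores = np.empty(len(points))
    for i, idx in enumerate(nbr_idx):
        nbrs = points[idx] - points[idx].mean(axis=0)
        evals = np.linalg.eigvalsh(nbrs.T @ nbrs / k)   # ascending eigenvalues
        scores[i] = evals[0] / (evals.sum() + 1e-12)    # surface variation
    return scores

pts = np.random.rand(2000, 3)                           # placeholder point cloud
landmarks = pts[np.argsort(saliency_scores(pts))[-50:]] # candidate re-visit features
```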
Findings
The proposed approach is able to extract distinguishable 3D landmarks for feature re-visiting and can be naturally and efficiently integrated with any SLAM algorithm to improve navigation accuracy.
Originality/value
This paper proposes a novel active-SLAM structure for planetary rover exploration missions. The salient feature extraction method and the active re-visit path planning method are validated to improve the accuracy of pose estimation.
Mayur Pratap Singh, Dinesh Kumar Shukla, Rajneesh Kumar and Kanwer Singh Arora
Abstract
Purpose
The key purpose of conducting this review is to identify the issues that affect the structural integrity of pipeline structures. The heat-affected zone (HAZ) has been identified as the weak zone in pipeline welds, which is prone to premature failures.
Design/methodology/approach
In the present work, a literature review is conducted on key issues related to the structural integrity of pipeline steel welds. The mechanical and microstructural transformations that take place during welding are systematically reviewed.
Findings
Key findings of the present review underline that brittle microstructural phases and hard secondary particles present in the matrix are responsible for intergranular and intragranular cracks.
Research limitations/implications
The research limitations of the present review stem from new material characterization techniques that are not available in developing countries.
Practical implications
The practical limitations are the new test methodologies and their associated costs.
Social implications
The fracture of pipelines significantly affects the surrounding ecology, as continuous oil spillage pollutes the surrounding land and water.
Originality/value
The present review contains recent and past studies conducted on welded pipeline steel structures. The systematic analysis of studies conducted so far highlights various bottlenecks of the welding methods.
Mohammad Arif, Saurabh Kango and Dinesh Kumar Shukla
Abstract
Purpose
This study aims to propose suitable locations of the slip boundary condition and microscale surface textures to enhance the tribological performance of hydrodynamic journal bearings.
Design/methodology/approach
A mass-conserving Elrod cavitation algorithm incorporating the slip boundary condition has been used to predict the static performance characteristics (load-carrying capacity, coefficient of friction and volumetric inflow rate) of finite cylindrical textured journal bearings.
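A full mass-conserving Elrod implementation with slip and textures is beyond the scope of a short example, but the sketch below shows the general shape of such a static-performance calculation: a finite-difference Reynolds solver for a plain, untextured, no-slip journal bearing with a crude non-negative-pressure (Christopherson-type) cavitation treatment, from which the load-carrying capacity is integrated. All geometry and lubricant values are made up; treat it as orientation only, not as the paper's algorithm.

```python
import numpy as np

# Simplified illustration only: finite-difference Reynolds solver for a plain
# (untextured, no-slip) journal bearing with a crude p >= 0 cavitation clamp.
# This is NOT the mass-conserving Elrod algorithm with slip used in the paper.
R, L, c = 0.025, 0.025, 25e-6        # journal radius, length, radial clearance (m)
eps, mu, omega = 0.6, 0.02, 300.0    # eccentricity ratio, viscosity (Pa.s), speed (rad/s)

ntheta, nz = 72, 20
theta = np.linspace(0.0, 2.0 * np.pi, ntheta, endpoint=False)
z = np.linspace(-L / 2, L / 2, nz)
dth, dz = theta[1] - theta[0], z[1] - z[0]

h = c * (1.0 + eps * np.cos(theta))                            # film thickness h(theta)
h3 = h ** 3
rhs = 6.0 * mu * omega * R ** 2 * (-c * eps * np.sin(theta))   # 6*mu*w*R^2 * dh/dtheta

p = np.zeros((ntheta, nz))
hE = 0.5 * (h3 + np.roll(h3, -1))                    # h^3 at i+1/2
hW = 0.5 * (h3 + np.roll(h3, +1))                    # h^3 at i-1/2
for _ in range(4000):                                # fixed-point (Jacobi) sweeps
    pE, pW = np.roll(p, -1, axis=0), np.roll(p, +1, axis=0)
    pN = np.pad(p, ((0, 0), (0, 1)))[:, 1:]          # p = 0 beyond axial ends
    pS = np.pad(p, ((0, 0), (1, 0)))[:, :-1]
    num = (hE[:, None] * pE + hW[:, None] * pW) / dth ** 2 \
        + R ** 2 * h3[:, None] * (pN + pS) / dz ** 2 - rhs[:, None]
    den = (hE + hW)[:, None] / dth ** 2 + 2.0 * R ** 2 * h3[:, None] / dz ** 2
    p = np.maximum(num / den, 0.0)                   # discard sub-cavitation pressure
    p[:, 0] = p[:, -1] = 0.0                         # ambient pressure at bearing ends

# Load-carrying capacity by integrating the pressure field over the surface
Wx = -np.sum(np.trapz(p * np.cos(theta)[:, None], z, axis=1)) * dth * R
Wy = -np.sum(np.trapz(p * np.sin(theta)[:, None], z, axis=1)) * dth * R
print(f"load-carrying capacity ~ {np.hypot(Wx, Wy):.1f} N")
```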
Findings
It has been observed that a fully textured bearing with the slip boundary condition in the 0°–180° circumferential region gives a significant reduction in the lubricant rupture zone. However, introducing textures up to the interface of the slip and no-slip regions increases the load-carrying capacity and reduces the shear stress. This reduction in shear stress with combined slip and surface textures is effective in increasing the volumetric inflow rate of the lubricant.
Practical implications
The combined effect of the slip boundary condition and surface texturing increases the scope of liquid lubricants in hydrodynamic journal bearings and further contributes toward the development of small-scale rotating machines.
Originality/value
Studies using the mass-conserving Elrod cavitation algorithm to find the optimum location of slip and surface texture zones are rare in the literature. Previous studies show that the mass-conserving Elrod cavitation algorithm gives realistic results for textured bearings, and its findings show good agreement with experimental observations.
This work can be used as a building block in other settings such as GPU, Map-Reduce, Spark or any other. Also, DDPML can be deployed on other distributed systems such as P2P…
Abstract
Purpose
This work can be used as a building block in other settings such as GPU, Map-Reduce, Spark or any other framework. Also, DDPML can be deployed on other distributed systems such as P2P networks, clusters, cloud computing or other technologies.
Design/methodology/approach
In the age of Big Data, all companies want to benefit from large amounts of data. These data can help them understand their internal and external environment and anticipate associated phenomena, as the data turn into knowledge that can be used for later prediction. This knowledge thus becomes a great asset in companies' hands, which is precisely the objective of data mining. With data and knowledge now produced in large volumes and at a faster pace, the field has moved toward Big Data mining. For this reason, the proposed work mainly aims at solving the problems of volume, veracity, validity and velocity when classifying Big Data using distributed and parallel processing techniques. The problem raised in this work is therefore how to make machine learning algorithms work in a distributed and parallel way at the same time without losing the accuracy of the classification results.

To solve this problem, the authors propose a system called Dynamic Distributed and Parallel Machine Learning (DDPML). The work is divided into two parts. In the first, the authors propose a distributed architecture controlled by a Map-Reduce algorithm, which in turn depends on a random sampling technique. This distributed architecture is specially designed to handle big data processing in a coherent and efficient manner with the sampling strategy proposed in this work; it also helps the authors verify the classification results obtained using the representative learning base (RLB). In the second part, the authors extract the representative learning base by sampling at two levels using the stratified random sampling method. This sampling method is also applied to extract the shared learning base (SLB) and the partial learning bases for the first level (PLBL1) and the second level (PLBL2).

The experimental results show the efficiency of the proposed solution without significant loss in the classification results. In practical terms, the DDPML system is generally dedicated to big data mining processing and works effectively in distributed systems with a simple structure, such as client-server networks.
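As an illustration of the two-level stratified sampling used to build the learning bases, the hedged sketch below draws a class-proportional sample twice with scikit-learn. The base names follow the abstract (PLBL1, RLB), but the sampling fractions and the data are placeholder assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 10))        # placeholder "big" dataset
y = rng.integers(0, 3, size=100_000)      # placeholder class labels

# Level 1: stratified sample of the full data -> partial learning base (PLBL1)
_, X_plbl1, _, y_plbl1 = train_test_split(X, y, test_size=0.10,
                                          stratify=y, random_state=0)

# Level 2: stratified sample of PLBL1 -> representative learning base (RLB)
_, X_rlb, _, y_rlb = train_test_split(X_plbl1, y_plbl1, test_size=0.20,
                                      stratify=y_plbl1, random_state=0)

# Class proportions are (approximately) preserved at both levels
print(np.bincount(y) / y.size, np.bincount(y_rlb) / y_rlb.size)
```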
Findings
The authors obtained very satisfactory classification results.
Originality/value
The DDPML system is specially designed to handle big data mining classification smoothly.
Ravinder Pal Singh, R. K. Garg and D. K. Shukla
Abstract
Purpose
Optimization of response parameters is essential in any process. The purpose of this paper is to achieve optimized parameters for submerged arc welding that furnish quality welds at both direct current electrode positive (DCEP) and direct current electrode negative (DCEN) polarities.
Design/methodology/approach
The parameters are established after extensive trial runs and are then optimized using response surface methodology to obtain cost-effective, quality welds in submerged arc welding.
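For orientation, response surface methodology boils down to fitting a second-order polynomial to the measured responses and then searching that surface for the best factor setting. The sketch below does exactly that with placeholder welding data; the factors, responses and numbers are invented for illustration and are not the paper's experimental design.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import minimize

# Placeholder design: [current (A), voltage (V), travel speed (mm/s)] -> response
X = np.array([[400, 26, 5], [450, 28, 6], [500, 30, 7], [450, 26, 7],
              [400, 30, 6], [500, 26, 6], [400, 28, 7], [500, 28, 5],
              [450, 30, 5], [450, 28, 5], [400, 28, 6], [500, 30, 6]], float)
y = np.array([7.2, 8.1, 8.9, 7.8, 7.9, 8.4, 7.0, 9.1, 8.6, 8.0, 7.4, 8.8])  # invented

# Second-order (quadratic) response surface fitted to the runs
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

# Search within the tested factor ranges for the setting maximizing the response
bounds = list(zip(X.min(axis=0), X.max(axis=0)))
res = minimize(lambda x: -rsm.predict(x.reshape(1, -1))[0], X.mean(axis=0), bounds=bounds)
print("suggested setting:", np.round(res.x, 2), "predicted response:", round(-res.fun, 2))
```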
Findings
The effect of parameters on weld bead geometry has been identified, and optimized parameters have also been obtained for the submerged arc welding process at DCEP and DCEN polarities.
Practical implications
As this study is related to practical work, it may be useful for any relevant application.
Social implications
The process parameters used in this experimental work will form a basis for job work and industrial applications of submerged arc welding.
Originality/value
This paper identifies the effect of polarity in submerged arc welding.