Abhishek Gupta, Dwijendra Nath Dwivedi, Jigar Shah and Ashish Jain
Abstract
Purpose
Good quality input data is critical to developing a robust machine learning model for identifying possible money laundering transactions. McKinsey, during one of the ACAMS conferences, attributed the struggles of artificial intelligence use cases in compliance to data quality. Concerns were often raised about the data quality of predictors, such as wrong transaction codes and industry classifications. However, there has not been much discussion on the most critical variable of machine learning, the definition of an event, i.e. the date on which the suspicious activity report (SAR) is filed.
Design/methodology/approach
The team analyzed the transaction behavior of four major banks spread across Asia and Europe. Based on the findings, the team created a synthetic database comprising 2,000 SAR customers mimicking the time of investigation and case closure. In this paper, the authors focused on one very specific area of data quality, the definition of an event, i.e. the SAR/suspicious transaction report.
Findings
The analysis of a few banks in Asia and Europe suggests that this alone can improve the effectiveness of the model and reduce the prediction span, i.e. the time lag between a money laundering transaction being carried out and its prediction as an alert for investigation.
Research limitations/implications
The analysis was done with the existing experience of situations where the time between alert and case closure is long (anywhere between 15 days and 10 months). The team could not quantify the impact of this finding due to the lack of actual cases observed so far.
Originality/value
The key finding from the paper suggests that money launderers typically either increase their level of activity or reduce it in the recent quarter. This does not reflect their real behavior: they typically show a spike in activity through various means during money laundering. This in turn impacts the quality of insights that the model is trained on. The authors believe that once financial institutions start speeding up investigations of high-risk cases, the scatter plot of SAR behavior will change significantly, leading to better capture of money laundering behavior and a faster and more precise “catch” rate.
Bhawana Rathore, Rohit Gupta, Baidyanath Biswas, Abhishek Srivastava and Shubhi Gupta
Abstract
Purpose
Recently, disruptive technologies (DTs) have proposed several innovative applications in managing logistics and promise to transform the entire logistics sector drastically. Often, this transformation is not successful due to the existence of adoption barriers to DTs. This study aims to identify the significant barriers that impede the successful adoption of DTs in the logistics sector and examine the interrelationships amongst them.
Design/methodology/approach
Initially, 12 critical barriers were identified through an extensive literature review on disruptive logistics management, and these were screened down to ten relevant barriers with the help of the Fuzzy Delphi Method (FDM). An Interpretive Structural Modelling (ISM) approach was then built with inputs from logistics experts working in the various departments of warehousing, inventory control, transportation, freight management and customer service management, and was used to generate and examine the interrelationships amongst the critical barriers. A Matrice d'Impacts Croisés-Multiplication Appliquée à un Classement (MICMAC) analysis classified the barriers based on their driving and dependence power.
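The MICMAC step described above reduces to simple row and column sums of the ISM reachability matrix. The sketch below illustrates the classification on a small hypothetical four-barrier matrix (the study's actual ten-barrier matrix is not reproduced here); the quadrant names follow standard MICMAC usage.

```python
# reach[i][j] = 1 means barrier i influences barrier j (directly or transitively)
reach = [
    [1, 1, 1, 1],  # B1: legal and regulatory framework (hypothetical values)
    [0, 1, 0, 1],  # B2: resistance to change
    [0, 1, 1, 1],  # B3: infrastructure
    [0, 0, 0, 1],  # B4: lack of trust
]

def micmac(reach):
    """Classify barriers by driving power (row sums) and dependence power (column sums)."""
    n = len(reach)
    driving = [sum(row) for row in reach]
    dependence = [sum(reach[i][j] for i in range(n)) for j in range(n)]
    mid = n / 2  # simple midpoint split into the four MICMAC quadrants
    quadrant = []
    for d, p in zip(driving, dependence):
        if d > mid and p > mid:
            quadrant.append("linkage")
        elif d > mid:
            quadrant.append("independent (driver)")
        elif p > mid:
            quadrant.append("dependent")
        else:
            quadrant.append("autonomous")
    return driving, dependence, quadrant

driving, dependence, quadrant = micmac(reach)
print(driving)     # [4, 2, 3, 1]
print(dependence)  # [1, 3, 2, 4]
print(quadrant)
```

With this illustrative matrix, B1 emerges as an independent driver and "lack of trust" as a dependent barrier, mirroring the pattern the study reports.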
Findings
Results from the ISM-based technique reveal that the lack of top management support (B6) was a critical barrier that can influence the adoption of DTs. Other significant barriers, such as legal and regulatory frameworks (B1), infrastructure (B3) and resistance to change (B2), were identified as the driving barriers, and industries need to pay more attention to them for the successful adoption of DTs in logistics. The MICMAC analysis shows that the legal and regulatory framework and lack of top management support have the highest driving powers. In contrast, lack of trust, reliability and privacy/security emerge as barriers with high dependence powers.
Research limitations/implications
The authors' study has several implications in the light of DT substitution. First, this study successfully analyses the seven DTs using Adner and Kapoor's framework (2016a, b) and the Theory of Disruptive Innovation (Christensen, 1997; Christensen et al., 2011) based on the two parameters as follows: emergence challenge of new technology and extension opportunity of old technology. Second, this study categorises these seven DTs into four quadrants from the framework. Third, this study proposes the recommended paths that DTs might want to follow to be adopted quickly.
Practical implications
The authors' study has several managerial implications in light of the adoption of DTs. First, the authors' study identified no autonomous barriers to adopting DTs. Second, other barriers belonging to any lower level of the ISM model can influence the dependent barriers. Third, the linkage barriers are unstable, and any preventive action involving linkage barriers would subsequently affect linkage barriers and other barriers. Fourth, the independent barriers have high influencing powers over other barriers.
Originality/value
The contributions of this study are four-fold. First, the study identifies the different DTs in the logistics sector. Second, the study applies the theory of disruptive innovations and the ecosystems framework to rationalise the choice of these seven DTs. Third, the study identifies and critically assesses the barriers to the successful adoption of these DTs through a strategic evaluation procedure with the help of a framework built with inputs from logistics experts. Fourth, the study recognises DTs adoption barriers in logistics management and provides a foundation for future research to eliminate those barriers.
Mohit S. Sarode, Anil Kumar, Abhijit Prasad and Abhishek Shetty
Abstract
Purpose
This research explores the application of machine learning to optimize pricing strategies in the aftermarket sector, particularly focusing on parts with no assigned values and the detection of outliers. The study emphasizes the need to incorporate technical features to improve pricing accuracy and decision-making.
Design/methodology/approach
The methodology involves data collection from web scraping and backend sources, followed by data preprocessing, feature engineering and model selection to capture the technical attributes of parts. A Random Forest Regressor model is chosen and trained to predict prices, achieving a 76.14% accuracy rate.
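The paper does not specify its outlier-detection procedure, so the following is only a hedged sketch of one common approach, the interquartile-range (IQR) rule, applied to hypothetical part prices:

```python
def iqr_outliers(prices, k=1.5):
    """Return prices lying outside [Q1 - k*IQR, Q3 + k*IQR]."""
    s = sorted(prices)

    def quantile(q):
        # linear interpolation between order statistics
        pos = q * (len(s) - 1)
        lo, hi = int(pos), min(int(pos) + 1, len(s) - 1)
        return s[lo] + (pos - lo) * (s[hi] - s[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return [p for p in prices if p < lower or p > upper]

prices = [12.0, 14.5, 13.2, 15.0, 14.1, 13.8, 95.0]  # 95.0 is an extreme price
print(iqr_outliers(prices))  # [95.0]
```

Flagged parts could then be routed to the regression model for a predicted price within the acceptable range, as the Findings describe.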
Findings
The model demonstrates accurate price prediction for parts with no assigned values while remaining within an acceptable price range. Additionally, outliers representing extreme pricing scenarios are successfully identified and predicted within the acceptable range.
Originality/value
This research bridges the gap between industry practice and academic research by demonstrating the effectiveness of machine learning for aftermarket pricing optimization. It offers an approach to address the challenges of pricing parts without assigned values and identifying outliers, potentially leading to increased revenue, sharper pricing tactics and a competitive advantage for aftermarket companies.
Abhishek N., Sonal Devesh, Ashoka M.L., Neethu Suraj, Parameshwara Acharya and Divyashree M.S.
Abstract
Purpose
This study aimed to identify factors influencing AI/chatbot usage in education and research, and to evaluate the extent of the impact of these factors.
Design/methodology/approach
This study used a mixed approach of qualitative and quantitative methods. It is based on both primary and secondary data. The primary data were collected through an online survey. In total, 177 responses from teachers were included in this study. The collected data were analyzed using the Statistical Package for the Social Sciences (SPSS).
Findings
The study revealed that the significant factors influencing the perception of the academic and research community toward the adoption of AI/interactive tools, such as chatbots/ChatGPT, for education and research are challenges, benefits, awareness, opportunities, risks, sustainability and ethical considerations.
Practical implications
This study highlighted the importance of resolving challenges and enhancing awareness and benefits while carefully mitigating risks and ethical concerns in the integration of technology within the educational and research environment. These insights can assist policymakers in making decisions and developing strategies for the efficient adoption of AI/interactive tools in academia and research to enhance the overall quality of learning experiences.
Originality/value
The present study adds value to the existing literature on AI/interactive tool adoption in academia and research by offering a quantitative analysis of the factors impacting teachers' perception of the usage of such tools. Furthermore, it also indirectly helps achieve various UN Sustainable Development Goals (SDGs), such as SDGs 4, 9, 10 and 17.
Arani Rodrigo and Trevor Mendis
Abstract
Purpose
The purpose of this paper is to provide theoretical insights into the green purchasing intention–behavior gap and the role played by social media influencers in abating this gap. This paper takes a wider view of the antecedents of behavioral intention, through personal and social identities, in place of the antecedents presented in the theory of planned behavior and social identity theory. Furthermore, as these theories lack an explanation of how to reduce the intention–behavior gap, this paper also draws on the source credibility model (SCM) to explain the impact that social media influencers can have on the gap.
Design/methodology/approach
The hypothetico-deductive method is proposed for this concept paper under the positivism research paradigm.
Findings
Not applicable as this is a concept paper. However, the paper discusses the theoretical and managerial implications.
Research limitations/implications
This is a concept paper; it discusses the theoretical, managerial and social/ecological implications.
Practical implications
This paper highlights the relevance of consumers' personal and social identities when they make purchasing decisions regarding green products. It also discusses how managers can build marketing strategies based on the source credibility model, involving social media influencers as product endorsers and ambassadors, and how policymakers can design products, anticipate consumer behavior and conduct marketing campaigns in time to come.
Social implications
The paper considers how policies can be designed and adopted for bio-based economies in which sustainability and circularity are given priority, and how to increase businesses' attention to moving toward sustainable practices.
Originality/value
Original thought developed based on research, theoretical and market gaps.
Adeyemi Adebayo and Barry Ackers
Abstract
Purpose
Within the context of public sector accountability, the purpose of this paper is to examine South African state-owned enterprises' (SOEs) auditing practices and how they have contributed to mitigating prevalent corporate governance issues in South African SOEs.
Design/methodology/approach
This paper utilised a thematic content analysis of archival documents relating to South African SOEs. Firstly, it assesses the extent to which the auditing dimension of the corporate governance codes applicable to South African SOEs conforms with best practices. Secondly, it determines the extent to which the audit practices of all 21 South African SOEs listed in Schedule 2 of the Public Finance Management Act have implemented the identified best audit practices.
Findings
The findings suggest that South African SOEs appear to have adopted and implemented best audit practices to enhance the quality of their accountability in relation to their corporate governance practices, as contained in their applicable corporate governance frameworks. However, despite the high levels of conformance, the observation that most South African SOEs continue to fail and require government bailouts appears to suggest that auditing has no bearing on poor SOE performance and that other corporate governance factors may be at play.
Practical implications
The discussion and findings in this paper suggest that the auditing practices of South African SOEs are adequate. However, the fact that SOEs in South Africa continue to be loss-making may imply that auditing has contributed little to mitigating their corporate governance problems. Thus, policymakers and standard setters, including the Institute of Directors South Africa and relevant oversight bodies, should pay attention to developing better means of curtailing fruitless and wasteful expenditure by South African SOEs through improved corporate governance practices.
Social implications
Most SOEs’ mission statements encourage SOEs to be socially responsible and utilise taxpayers’ monies efficiently and effectively without engaging in fruitless and wasteful expenditure. This study is conceived in this light.
Originality/value
To the best of the authors’ knowledge, while acknowledging previous studies, this paper is the first to explore this topic in the context of SOEs and in the context of Africa.
Abhishek Das and Mihir Narayan Mohanty
Abstract
Purpose
Timely and accurate detection of cancer can save the life of the person affected. According to the World Health Organization (WHO), breast cancer has the most frequent incidence among all cancers, while it ranks fifth in mortality. Among the many image processing techniques, certain works have focused on convolutional neural networks (CNNs) for processing these images. However, deep learning models remain to be explored more thoroughly.
Design/methodology/approach
In this work, multivariate statistics-based kernel principal component analysis (KPCA) is used to extract essential features; KPCA is simultaneously helpful for denoising the data. These features are processed through a heterogeneous ensemble model that consists of three base models: a recurrent neural network (RNN), long short-term memory (LSTM) and a gated recurrent unit (GRU). The outcomes of these base learners are fed to a fuzzy adaptive resonance theory mapping (ARTMAP) model for decision-making; nodes are added to the F_2^a layer when the winning criterion is fulfilled, which makes the ARTMAP model more robust.
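The KPCA step above can be sketched in a few lines of numpy: build an RBF kernel matrix, centre it in feature space, and project onto the leading eigenvectors. This is a minimal illustration on a small synthetic matrix standing in for the histopathology image features; the kernel choice and `gamma` value are assumptions, not the paper's settings.

```python
import numpy as np

def rbf_kpca(X, n_components=2, gamma=0.5):
    """Project rows of X onto the leading kernel principal components (RBF kernel)."""
    n = X.shape[0]
    # pairwise squared distances -> RBF kernel matrix
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # centre the kernel matrix in feature space
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigendecomposition; keep the largest n_components eigenpairs
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # projected features: eigenvectors scaled by sqrt(eigenvalue)
    return vecs * np.sqrt(np.maximum(vals, 0))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))   # synthetic stand-in for image feature vectors
Z = rbf_kpca(X, n_components=3)
print(Z.shape)  # (20, 3)
```

The reduced features `Z` would then be fed to the RNN/LSTM/GRU base learners in the ensemble.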
Findings
The proposed model is verified using the breast histopathology image dataset publicly available on Kaggle. The model provides 99.36% training accuracy and 98.72% validation accuracy. The proposed model utilizes data processing in all aspects: image denoising to reduce data redundancy and training by ensemble learning to provide higher accuracy than single models. The final classification by a fuzzy ARTMAP model that controls the number of nodes depending on performance makes the classification robust and accurate.
Research limitations/implications
Research in the field of medical applications is ongoing, and more advanced algorithms are being developed for better classification. Still, there is scope to design models with better performance, practicability and cost efficiency in the future. The ensemble models may also be chosen with different combinations and characteristics, and signals instead of images may be verified with the proposed model. Experimental analysis shows the improved performance of the proposed model, but the method still needs to be verified using practical models; practical implementation will also be carried out to assess its real-time performance and cost efficiency.
Originality/value
The proposed model utilizes KPCA for denoising and for reducing data redundancy, so that feature selection is performed by KPCA. Training and classification are performed using a heterogeneous ensemble model designed with RNN, LSTM and GRU as base classifiers, providing higher accuracy than single models. The use of the adaptive fuzzy ARTMAP model makes the final classification accurate. The effectiveness of combining these methods into a single model is analyzed in this work.
Rubel Amin, Bijay Prasad Kushwaha and Md Helal Miah
Abstract
Purpose
This paper examines the process optimization method of the online sales model of information product demand with respect to the spillover effect (SE). It illustrates the SE of online product demand compared with traditional market demand and optimizes the SE for ethical and ordinary consumers.
Design/methodology/approach
This article primarily focuses on two types of online marketing models: wholesale and agency. Firstly, the wholesale and agency models without SE and with SE are constructed, respectively, to capture the SE in different sales models. Secondly, the online channel participants' optimal price, demand and profit under variant conditions are compared and analyzed. Finally, efficient supply chain theory is optimized for the decision-making of online marketing consumers using an equation-based comparative analysis method.
Findings
The study found that when SEs are not considered, stronger piracy regulation benefits online channel participants more. When the positive SE is strong, it is detrimental to manufacturers. When SEs are not considered, online channel participants reach Pareto optimality only in agency mode. When SEs are considered, Pareto optimality can be achieved in both wholesale and agency modes.
Originality/value
The research has practical implications for an effective supply chain model for online marketing. This is the first algorithm-based comparative study concerning theoretical spillover effect analysis in supply chain management.
Patrick Dallasega, Manuel Woschank, Joseph Sarkis and Korrakot Yaibuathet Tippayawong
Abstract
Purpose
This study aims to provide a measurement model, and the underlying constructs and items, for Logistics 4.0 in manufacturing companies. Industry 4.0 technology for logistics processes has been termed Logistics 4.0. Logistics 4.0 and its elements have seen varied conceptualizations in the literature. The literature has mainly focused on conceptual and theoretical studies, which supports the notion that Logistics 4.0 is a relatively young area of research. Refinement of constructs and building consensus perspectives and definitions is necessary for practical and theoretical advances in this area.
Design/methodology/approach
Based on a detailed literature review and practitioner focus group interviews, items of Logistics 4.0 for manufacturing enterprises were further validated by using a large-scale survey with practicing experts from organizations located in Central Europe, the Northeastern United States of America and Northern Thailand. Exploratory and confirmatory factor analyses were used to define a measurement model for Logistics 4.0.
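One elementary step of the exploratory factor analysis mentioned above is deciding how many factors to retain. A common heuristic is the Kaiser criterion: keep factors whose eigenvalues of the item correlation matrix exceed 1. The sketch below uses synthetic survey data with a planted three-factor structure as a stand-in for the study's nine Logistics 4.0 items; it illustrates the criterion only, not the paper's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_resp, n_items = 239, 9                  # mirrors the study's sample and item counts
F = rng.normal(size=(n_resp, 3))          # three latent factors
noise = rng.normal(size=(n_resp, n_items))
X = np.empty((n_resp, n_items))
for k in range(3):                        # each factor drives three items
    X[:, 3 * k:3 * k + 3] = 0.8 * F[:, [k]] + 0.4 * noise[:, 3 * k:3 * k + 3]

R = np.corrcoef(X, rowvar=False)          # 9 x 9 item correlation matrix
eigvals = np.linalg.eigvalsh(R)[::-1]     # eigenvalues in descending order
n_factors = int(np.sum(eigvals > 1))      # Kaiser criterion
print(n_factors)  # 3
```

On this synthetic data the criterion recovers the planted three-factor structure, matching the three-factor solution the study reports.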
Findings
Based on 239 responses, the exploratory and confirmatory factor analyses resulted in nine items and three factors for the final Logistics 4.0 measurement model. It combines “the leveraging of increased organizational capabilities” (factor 1) with “the rise of interconnection and material flow transparency” (factor 2) and “the setting up of autonomization in logistics processes” (factor 3).
Practical implications
Practitioners can use the proposed measurement model to assess their current level of maturity regarding the implementation of Logistics 4.0 practices. They can map the current state and derive appropriate implementation plans as well as benchmark against best practices across or between industries based on these metrics.
Originality/value
Logistics 4.0 is a relatively young research area, which necessitates greater development through empirical validation. To the best of the authors’ knowledge, an empirically validated multidimensional construct to measure Logistics 4.0 in manufacturing companies does not exist.
Reza Marvi, Pantea Foroudi and Maria Teresa Cuomo
Abstract
Purpose
This paper aims to explore the intersection of artificial intelligence (AI) and marketing within the context of knowledge management (KM). It investigates how AI technologies facilitate data-driven decision-making, enhance business communication, improve customer personalization, optimize marketing campaigns and boost overall marketing effectiveness.
Design/methodology/approach
This study uses a quantitative and systematic approach, integrating citation analysis, text mining and co-citation analysis to examine foundational research areas and the evolution of AI in marketing. This comprehensive analysis addresses the current gap in empirical investigations of AI’s influence on marketing and its future developments.
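The co-citation analysis mentioned above rests on a simple count: two references are "co-cited" whenever they appear together in the same citing paper's reference list. A minimal sketch, with hypothetical paper identifiers:

```python
from itertools import combinations
from collections import Counter

def cocitation_counts(reference_lists):
    """Count how often each pair of references appears in the same reference list."""
    counts = Counter()
    for refs in reference_lists:
        # sorted() gives each pair a canonical order; set() drops duplicate refs
        for a, b in combinations(sorted(set(refs)), 2):
            counts[(a, b)] += 1
    return counts

# hypothetical reference lists of three citing papers
citing_papers = [
    ["kotler2016", "davenport2020", "huang2021"],
    ["davenport2020", "huang2021"],
    ["kotler2016", "huang2021"],
]
counts = cocitation_counts(citing_papers)
print(counts[("davenport2020", "huang2021")])  # 2
```

The resulting pair counts form the co-citation matrix that clustering then partitions into the foundational research areas the study describes.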
Findings
This study identifies three main perspectives that have shaped the foundation of AI in marketing: proxy, tool and ensemble views. It develops a managerially relevant conceptual framework that outlines future research directions and expands the boundaries of AI and marketing literature within the KM landscape.
Originality/value
This research proposes a conceptual model that integrates AI and marketing within the KM context, offering new research trajectories. This study provides a holistic view of how AI can enhance knowledge sharing, strategic planning and decision-making in marketing.