Chandra Sekhar Kolli and Uma Devi Tatavarthi
Abstract
Purpose
Fraud transaction detection has become a significant factor in communication technologies and electronic commerce systems, as it affects the usage of electronic payment. Even though various fraud detection methods have been developed, enhancing the performance of electronic payment by detecting fraudsters remains a great challenge in bank transactions.
Design/methodology/approach
This paper aims to design a fraud detection mechanism using the proposed Harris water optimization-based deep recurrent neural network (HWO-based deep RNN). The proposed fraud detection strategy includes three phases, namely, pre-processing, feature selection and fraud detection. Initially, the input transactional data is subjected to the pre-processing phase, where the data is pre-processed using the Box-Cox transformation to remove redundant and noisy values. The pre-processed data is passed to the feature selection phase, where the essential and suitable features are selected using the wrapper model. The selected features enable the classifier to achieve better detection performance. Finally, the selected features are fed to the detection phase, where a deep recurrent neural network classifier carries out the fraud detection, with the training of the classifier performed by the proposed Harris water optimization algorithm, an integration of water wave optimization and Harris hawks optimization.
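To make the three-phase pipeline concrete, the following is a minimal, hedged sketch in Python. The data, hyper-parameters and the gradient-trained stand-in classifier are assumptions for illustration only; the paper's actual detector is a deep RNN whose weights are tuned by the proposed HWO algorithm, which is not reproduced here.

```python
# Hypothetical sketch of the three-phase pipeline described above (not the
# authors' code): Box-Cox pre-processing, wrapper-based feature selection,
# and a classifier standing in for the HWO-trained deep RNN.
import numpy as np
from scipy.stats import boxcox
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score

def boxcox_preprocess(X):
    """Apply a Box-Cox transform column-wise; shift to keep values positive."""
    X = np.asarray(X, dtype=float)
    out = np.empty_like(X)
    for j in range(X.shape[1]):
        col = X[:, j] - X[:, j].min() + 1e-6  # Box-Cox needs strictly positive input
        out[:, j], _ = boxcox(col)
    return out

# Toy data as a placeholder for the transactional records used in the paper.
rng = np.random.default_rng(0)
X = rng.random((500, 20))
y = rng.integers(0, 2, 500)

X = boxcox_preprocess(X)

# Wrapper feature selection: forward selection wrapped around a small
# neural classifier, as a stand-in for the paper's wrapper model.
base = MLPClassifier(hidden_layer_sizes=(32,), max_iter=200, random_state=0)
selector = SequentialFeatureSelector(base, n_features_to_select=5, direction="forward")
X_sel = selector.fit_transform(X, y)

# In the paper the detector is a deep RNN trained by Harris water
# optimization; a gradient-trained network is used here purely to keep
# the sketch runnable.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=0)
clf = base.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred), "sensitivity:", recall_score(y_te, pred))
```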
Findings
The proposed HWO-based deep RNN obtained better performance in terms of metrics such as accuracy, sensitivity and specificity, with values of 0.9192, 0.7642 and 0.9943, respectively.
Originality/value
An effective fraud detection method named HWO-based deep RNN is designed to detect frauds in bank transactions. The optimal features selected using the wrapper model enable the classifier to find fraudulent activities more efficiently. In addition, the detection result is evaluated through the optimization model based on the fitness measure, such that the solution with the minimal error value is declared the best, as it yields better detection results.
Yakub Kayode Saheed, Usman Ahmad Baba and Mustafa Ayobami Raji
Abstract
Purpose: This chapter aims to examine machine learning (ML) models for predicting credit card fraud (CCF).
Need for the study: With the advance of technology, the world is increasingly relying on credit cards rather than cash in daily life. This creates a slew of new opportunities for fraudulent individuals to abuse these cards. As of December 2020, global card losses reached $28.65 billion, up 2.9% from $27.85 billion in 2018, according to the Nilson 2019 research. To safeguard the safety of credit card users, the credit card issuer should include a service that protects customers from potential risks. CCF has become a severe threat as internet buying has grown. To this end, various studies in the field of automatic and real-time fraud detection are required. Due to their advantageous properties, the most recent ones employ a variety of ML algorithms and techniques to construct well-fitting models to detect fraudulent transactions. When it comes to recognising credit card risk in huge and high-dimensional data, feature selection (FS) is critical for improving classification accuracy and fraud detection.
Methodology/design/approach: The objectives of this chapter are to construct a new model for credit card fraud detection (CCFD) based on principal component analysis (PCA) for FS and using supervised ML techniques such as K-nearest neighbour (KNN), ridge classifier, gradient boosting, quadratic discriminant analysis, AdaBoost, and random forest for classification of fraudulent and legitimate transactions. When compared to earlier experiments, the suggested approach demonstrates a high capacity for detecting fraudulent transactions. To be more precise, our model’s resilience is constructed by integrating the power of PCA for determining the most useful predictive features. The experimental analysis was performed on German credit card and Taiwan credit card data sets.
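As a rough illustration of this approach (not the chapter's actual code), the sketch below chains PCA-based feature reduction with KNN and ridge classifiers using scikit-learn; the placeholder data set, component count and neighbour count are assumptions, standing in for the German and Taiwan credit card data.

```python
# Illustrative PCA-plus-classifier pipelines for credit card fraud detection;
# all data and hyper-parameters here are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Synthetic stand-in for the German / Taiwan credit data sets.
rng = np.random.default_rng(1)
X = rng.random((1000, 23))
y = rng.integers(0, 2, 1000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

models = {
    "KNN": make_pipeline(StandardScaler(), PCA(n_components=10),
                         KNeighborsClassifier(n_neighbors=5)),
    "Ridge": make_pipeline(StandardScaler(), PCA(n_components=10),
                           RidgeClassifier()),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(name,
          "acc=%.3f" % accuracy_score(y_te, pred),
          "recall=%.3f" % recall_score(y_te, pred),
          "precision=%.3f" % precision_score(y_te, pred, zero_division=0))
```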
Findings: The experimental findings revealed that KNN achieved an accuracy of 96.29%, recall of 100% and precision of 96.29%, making it the best performing model on the German data set, while the ridge classifier was the best performing model on the Taiwan credit data, with an accuracy of 81.75%, recall of 34.89% and precision of 66.61%.
Practical implications: The poor performance of the models on the Taiwan data revealed that it is an imbalanced credit card data set. The comparison of our proposed models with state-of-the-art credit card ML models showed that our results were competitive.
Aimin Yan, Biyun Jiang and Zhimei Zang
Abstract
Purpose
Drawing upon the conservation of resources theory, this study aims to investigate whether, how and when salespeople’s substantive attribution of the organization’s corporate social responsibility (CSR) affects value-based selling (VBS). The authors argue that salespeople’s substantive CSR attribution increases VBS through two mechanisms (i.e. by lowering emotional exhaustion and increasing empathy), and that treatment by customers can increase or decrease the strength of these relationships.
Design/methodology/approach
B2B salespeople working in various industries in China were recruited through snowball sampling to participate in the study. There were 462 volunteers (57.58% women; aged 30–55; tenure ranging from six months to 15 years) who provided valid self-report questionnaires.
Findings
Hierarchical multiple regression supported the association between salespeople’s substantive CSR attribution and VBS. The results showed that salespeople’s emotional state (i.e. emotional exhaustion and empathy) mediated the association between substantive CSR attribution and VBS. As expected, salespeople’s experiences of customer incivility weakened the mediating effect of emotional exhaustion; contrary to expectations, customer-initiated interpersonal justice weakened the mediation effect of empathy.
Originality/value
This study makes a unique contribution to the existing marketing literature by first investigating the role of salespeople’s attribution of CSR motives in facilitating their VBS, which answers the call to identify factors that predict VBS. In addition, to the best of the authors’ knowledge, the authors are the first to test salespeople’s emotions as a mechanism of the link between their CSR attributions and selling behaviors.
Chenyang Sun and Mohammad Khishe
Abstract
Purpose
The purpose of the study is to address concerns regarding the subjectivity and imprecision of decision-making in table tennis refereeing by developing and enhancing a sensor node system. This system is designed to accurately detect the points on the table tennis table where balls collide. The study introduces the twined-reinforcement chimp optimization (TRCO) framework, which combines two novel approaches to optimize the distribution of sensor nodes. The main goal is to reduce the number of sensor units required while maintaining high accuracy in determining the locations of ball collisions, with error margins significantly below the critical 3.5 mm cutoff. Through complex optimization procedures, the study aims to improve the efficiency and reliability of decision-making in table tennis refereeing by leveraging sensor technology.
Design/methodology/approach
The study employs a design methodology focused on developing a sensor array system to enhance decision-making in table tennis refereeing. It introduces the twined-reinforcement chimp optimization (TRCO) framework, which combines dual adaptive weighting strategies with a stochastic approach for optimization. By meticulously engineering the sensor array and employing complex optimization procedures, the study seeks to improve the accuracy of detecting ball collisions on the table tennis table while reducing the number of sensor units required, ultimately enhancing the reliability of decision-making in the sport.
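For illustration only, the toy sketch below shows how a candidate sensor layout can be scored by its sensor count plus a penalty whenever the estimated localisation error exceeds the 3.5 mm cutoff. It uses plain random search and a made-up error model, not the TRCO framework; the table dimensions are standard, everything else is a placeholder assumption.

```python
# Toy layout-scoring sketch for sensor placement on a table tennis table.
# Not the TRCO algorithm; the error model is a made-up stand-in.
import numpy as np

TABLE = np.array([2740.0, 1525.0])   # table dimensions in mm
CUTOFF = 3.5                          # maximum acceptable localisation error in mm
rng = np.random.default_rng(7)

def estimate_error(sensors, impacts):
    """Toy error model: error grows with distance to the nearest sensor
    (a real system would use arrival-time differences across sensors)."""
    d = np.linalg.norm(impacts[:, None, :] - sensors[None, :, :], axis=2)
    return 0.02 * d.min(axis=1)       # 2% of nearest-sensor distance, in mm

impacts = rng.random((200, 2)) * TABLE  # sampled collision points

def fitness(sensors):
    err = estimate_error(sensors, impacts)
    penalty = 1e3 * np.mean(np.maximum(err - CUTOFF, 0.0))
    return len(sensors) + penalty     # prefer fewer sensors within the cutoff

best, best_f = None, np.inf
for n in range(30, 61):               # try layouts of 30..60 sensors
    for _ in range(50):               # random candidates per layout size
        cand = rng.random((n, 2)) * TABLE
        f = fitness(cand)
        if f < best_f:
            best, best_f = cand, f
print("best layout:", len(best), "sensors, fitness", round(best_f, 2))
```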
Findings
The optimization research study yielded promising outcomes, showcasing a substantial reduction in the number of sensor units required from the initial count of 60 to a more practical 49. The sensor array system demonstrated excellent accuracy in identifying the locations of ball collisions, with error margins significantly below the critical 3.5 mm cutoff. Through the implementation of the twined-reinforcement chimp optimization (TRCO) framework, which integrates dual adaptive weighting strategies and a stochastic approach, the study achieved its goal of enhancing the efficiency and reliability of decision-making in table tennis refereeing.
Originality/value
This study introduces novel contributions to the field of table tennis refereeing by pioneering the development and optimization of a sensor array system. The innovative twined-reinforcement chimp optimization (TRCO) framework, integrating dual adaptive weighting strategies and a stochastic approach, sets a new standard for sensor node distribution in sports technology. By substantially reducing the number of sensor units required while maintaining high accuracy in detecting ball collisions, this research offers practical solutions to address the inherent subjectivity and imprecision in decision-making processes. The study’s originality lies in its meticulous design methodology and complex optimization procedures, offering significant value to the field of sports technology and officiating.
Lerzan Aksoy, Sabine Benoit, Shreekant G. Joag, Jay Kandampully, Timothy Lee Keiningham and An L. Yan
Abstract
Purpose
The need of CMOs to utilize a firm's data productively in order to support decision-making, combined with the reported benefits of enterprise feedback management (EFM) solutions, has resulted in a rapid rise in the usage and valuation of EFM providers. The explicit promise of EFM providers is improved financial performance, yet there is no scientific research investigating this link. Investigating the link between EFM usage and financial performance is the core of this research.
Design/methodology/approach
To gain insight into this link, survey data from 127 US-based firms on their usage of EFM platforms were linked to their stock market performance over several years.
Findings
This research did not find any significant positive relationships between the different aspects of EFM usage investigated and stock returns. It is important to note that these results should not be taken as validation that EFM systems do not result in positive financial outcomes for firms. It may be that superior market performance as measured through stock returns is difficult to observe through a cross-sectional analysis. Instead, these results indicate that superior market performance as measured through stock returns is not an obvious, generalizable outcome for firms that have adopted EFM systems.
Originality/value
EFM has rapidly grown across many consumer-facing industries, with EFM platform providers receiving very high market valuations on relatively small revenue streams. This is one of the first scientific papers to study the usage and impact of these EFM systems.
Abstract
Purpose
The purpose of this paper is to examine the literature on team boundary activity, trace how team boundary activity has evolved as a construct and examine the dimensions of team boundary activity and their relationships. It highlights the need for a deeper examination of the dimensions of buffering and reinforcement, and of why buffering and reinforcement are required. It presents the case for why it is important to study this topic and maps out areas for future research.
Design/methodology/approach
The paper reviews conceptual and empirical papers published on team boundary activity in reputed journals between the years 1984 and 2016.
Findings
The focus of research on team boundary activity has been on the external interactions of the team (boundary spanning), and very few papers have studied the activities through which the team defines and defends its borders (boundary strengthening). These boundary-strengthening activities can be equally important for innovation and learning in externally dependent teams. Further, there is a need to clearly distinguish these constructs from other variables such as team identification. Another area that has not yet been researched is the relationships between the dimensions of team boundary activity. Finally, there is a need to consider a wider range of antecedents, outcomes and moderators of team boundary activity.
Research limitations/implications
This paper is based on past empirical and conceptual papers, identified using search terms such as team boundary activity, team boundary spanning and external communication. Other related areas can also be explored for identifying variables of interest.
Originality/value
As opposed to previous reviews which focused mainly on team boundary spanning, this paper considers all dimensions of team boundary activity, with special focus on buffering and reinforcement. It proposes a 2 × 2 framework to explain the effect of boundary-spanning and boundary-strengthening activities on the achievement of team objectives. It examines the cyclical nature of relationship between team boundary activity and team performance. It highlights measurement issues in the area of team boundary activity.
Lan Li, Tan Pan, Xinchang Zhang, Yitao Chen, Wenyuan Cui, Lei Yan and Frank Liou
Abstract
Purpose
During the powder bed fusion process, thermal distortion is a major problem owing to the thermal stress caused by the high cooling rate and temperature gradient. To avoid distortion caused by internal residual stresses, support structures are used in most selective laser melting (SLM) processes, especially for cantilever beams, because they assist heat dissipation. Support structures also help to hold the workpiece in place and reduce the volume of printing material. The mitigation of high thermal gradients during the manufacturing process helps to reduce thermal distortion and thus alleviate cracking, curling, delamination and shrinkage. Therefore, this paper aims to study the displacement and residual stress evolution of SLMed parts.
Design/methodology/approach
The objective of this study was to examine and compare the distortion and residual stress properties of two cantilever structures, using both numerical and experimental methods. A part-scale finite element analysis modeling technique, built up layer by layer, was applied to numerically analyze the overhang distortions. The two validation samples were built on an SLM platform. The average displacement of the four tip corners and the residual stress on the top surface of the cantilever beams were then measured to validate the model.
Findings
The measured average displacement of the four tip corners and the residual stress on the top surface of the cantilever beams matched the numerical predictions well. From a displacement and residual stress standpoint, the same cantilever beam could be successfully printed in both samples by introducing two different support structures. In terms of reducing wasted support material and print time while maintaining high surface quality, the sample with less support requires less post-processing and wastes less energy.
Originality/value
Numerical modeling in this work can be a very useful tool to parametrically study the feasibility of support structures of SLM parts in terms of residual stresses and deformations. It has the capability for fast prediction of SLMed parts.
Ian Palmer and Richard Dunford
Abstract
A burgeoning literature refers to the effect of hypercompetitive conditions on organizations. The new orthodoxy involves reference to the disintegration of vertical, rational bureaucracies and the corresponding emergence of widespread innovation in new organizational practices such as delayering, outsourcing, and reducing organizational boundaries. Differing assumptions occur regarding the compatibility of new organizational practices with more traditional practices such as centralization and formalization. We present systematic, survey‐based data in order to assist in assessing these differing assumptions about compatibility. Our results confirm greater use of new organizational practices by organizations operating in dynamic environments. They also show that greater use of new organizational practices is not associated with less use of either centralization or formalization—indeed it is associated with an increased use of formalization. We argue the need to move beyond a compatibility/incompatibility dichotomy and propose a research agenda for achieving this. The implications for management include the need to view with caution evangelical calls for radical restructuring that ignore the subtleties of the relationship between traditional and new organizational practices.
Tiago Oliveira, Wilber Vélez and Artur Portela
Abstract
Purpose
This paper is concerned with new formulations of local meshfree and finite element numerical methods, for the solution of two-dimensional problems in linear elasticity.
Design/methodology/approach
In the local domain, assigned to each node of a discretization, the work theorem establishes an energy relationship between a statically admissible stress field and an independent kinematically admissible strain field. This relationship, derived as a weighted residual weak form, is expressed as an integral local form. Based on the independence of the stress and strain fields, this local form of the work theorem is kinematically formulated with a simple rigid-body displacement to be applied by local meshfree and finite element numerical methods. The main feature of this paper is the use of a linearly integrated local form that implements a quite simple algorithm with no further integration required.
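As a hedged illustration of this kinematic choice (with notation assumed here rather than taken from the paper), the local work theorem and its rigid-body specialization can be written as follows.

```latex
% Sketch of the local work theorem on a local domain \Omega_L with boundary
% \Gamma_L: \sigma is the statically admissible stress field, t the
% corresponding boundary tractions, b the body forces, and (u*, \varepsilon*)
% the independent kinematically admissible displacement and strain fields.
\begin{equation}
  \int_{\Gamma_L} \mathbf{t}^{T}\mathbf{u}^{*}\, d\Gamma
  + \int_{\Omega_L} \mathbf{b}^{T}\mathbf{u}^{*}\, d\Omega
  = \int_{\Omega_L} \boldsymbol{\sigma}^{T}\boldsymbol{\varepsilon}^{*}\, d\Omega .
\end{equation}
% For a rigid-body displacement u* = c, the strains \varepsilon* vanish, the
% domain term on the right disappears, and only boundary and body-force
% integrals remain:
\begin{equation}
  \int_{\Gamma_L} \mathbf{t}\, d\Gamma
  + \int_{\Omega_L} \mathbf{b}\, d\Omega = \mathbf{0},
\end{equation}
% which, in the absence of body forces, reduces to a line integral of the
% tractions along the local boundary -- the linearly integrated local form.
```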
Findings
The reduced integration, performed by this linearly integrated formulation, plays a key role in the behavior of local numerical methods, since it implies a reduction of the nodal stiffness which, in turn, leads to an increase of the solution accuracy and, which is most important, presents no instabilities, unlike nodal integration methods without stabilization. As a consequence of using such a convenient linearly integrated local form, the derived meshfree and finite element numerical methods become fast and accurate, which is a feature of paramount importance, as far as computational efficiency of numerical methods is concerned. Three benchmark problems were analyzed with these techniques, in order to assess the accuracy and efficiency of the new integrated local formulations of meshfree and finite element numerical methods. The results obtained in this work are in perfect agreement with those of the available analytical solutions and, furthermore, outperform the computational efficiency of other methods. Thus, the accuracy and efficiency of the local numerical methods presented in this paper make this a very reliable and robust formulation.
Originality/value
Presentation of a new local meshfree numerical method. The method, linearly integrated along the boundary of the local domain, implements an algorithm with no further integration required. The method is absolutely reliable, with remarkably accurate results. The method is quite robust, with extremely fast computations.