M. Ghahramani and A. Thavaneswaran
Abstract
Purpose
Financial returns are often modeled as stationary time series with innovations having heteroscedastic conditional variances. This paper seeks to derive the kurtosis of stationary processes with GARCH errors. The problem of hypothesis testing for stationary ARMA(p, q) processes with GARCH errors is studied. Forecasting of ARMA(p, q) processes with GARCH errors is also discussed in some detail.
Design/methodology/approach
Estimating‐function methodology was the principal method used for the research. The results were also illustrated using examples and simulation studies. Volatility modeling is the subject of the paper.
Findings
The kurtosis of stationary processes with GARCH errors is derived in terms of the model parameters (ψ), Ψ‐weights, and the kurtosis of the innovation process. Hypothesis testing for stationary ARMA(p, q) processes with GARCH errors based on the estimating‐function approach is shown to be superior to the least‐squares approach. The fourth moment of the l‐steps‐ahead forecast error is related to the model parameters and the kurtosis of the innovation process.
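As a numerical companion to this kind of result, the classical identity for a causal linear process y_t = Σ_j ψ_j a_{t-j} with i.i.d. innovations of known kurtosis can be sketched in code. The paper's result for GARCH errors generalizes this identity; the i.i.d. case and the AR(1) ψ-weights below are only an assumed illustration, not the paper's formula:

```python
def linear_process_kurtosis(psi, k_a):
    """Kurtosis of y_t = sum_j psi_j * a_{t-j} for i.i.d. innovations
    with kurtosis k_a, via the standard linear-process identity:
    K(y) = 3 + (K(a) - 3) * sum(psi^4) / (sum(psi^2))^2."""
    s2 = sum(p ** 2 for p in psi)
    s4 = sum(p ** 4 for p in psi)
    return 3.0 + (k_a - 3.0) * s4 / s2 ** 2

# Assumed example: AR(1) with phi = 0.5, so psi_j = phi**j (truncated).
psi = [0.5 ** j for j in range(200)]
print(linear_process_kurtosis(psi, 3.0))  # Gaussian innovations: kurtosis stays 3
print(linear_process_kurtosis(psi, 9.0))  # heavy-tailed innovations inflate the kurtosis
```

With Gaussian innovations (kurtosis 3) the process kurtosis remains 3, while heavy-tailed innovations, as produced by GARCH errors, propagate excess kurtosis through the ψ-weights.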
Originality/value
This paper will be of value to econometricians and to anyone with an interest in the statistical properties of volatility modeling.
João Pedro C. de Souza, António M. Amorim, Luís F. Rocha, Vítor H. Pinto and António Paulo Moreira
Abstract
Purpose
The purpose of this paper is to present a programming by demonstration (PbD) system based on 3D stereoscopic vision and inertial sensing that provides a cost-effective pose tracking system, even during error-prone situations, such as camera occlusions.
Design/methodology/approach
The proposed PbD system is based on the 6D Mimic innovative solution, whose six-degrees-of-freedom marker hardware had to be revised and restructured to accommodate an IMU sensor. Additionally, a new software pipeline was designed to include this new sensing device and improve the overall system’s robustness in stereoscopic-vision occlusion situations.
Findings
The IMU component and the new software pipeline allow the 6D Mimic system to successfully maintain pose tracking when the main tracking tool, i.e. the stereoscopic vision, fails. The system therefore improves in terms of reliability, robustness and accuracy, as verified by real experiments.
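The occlusion-handling behavior reported here can be caricatured in a drastically simplified one-dimensional sketch: the vision pose is trusted whenever it is available, and during occlusions the pose is dead-reckoned by integrating IMU increments. The real 6D Mimic pipeline fuses full six-degrees-of-freedom poses offline; all values below are made up:

```python
def fuse_pose(vision_pose, last_pose, imu_delta):
    """Prefer the stereoscopic-vision pose; during occlusion (None),
    dead-reckon by adding the IMU increment to the last known pose."""
    if vision_pose is not None:
        return vision_pose
    return last_pose + imu_delta

# 1-D toy trajectory: vision drops out for two frames (None = occlusion).
vision = [0.0, 1.0, None, None, 4.1]
imu_deltas = [0.0, 1.0, 1.0, 1.0, 1.0]
pose, track = 0.0, []
for v, d in zip(vision, imu_deltas):
    pose = fuse_pose(v, pose, d)
    track.append(pose)
print(track)  # [0.0, 1.0, 2.0, 3.0, 4.1]
```

When vision returns, the sketch simply snaps back to the vision pose; a real pipeline would also filter the small discrepancy accumulated by the IMU drift.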
Practical implications
Based on this proposal, the 6D Mimic system reaches a reliable and low-cost PbD methodology. Therefore, the robot can accurately replicate, on an industrial scale, the artisan level performance of highly skilled shop-floor operators.
Originality/value
To the best of the authors’ knowledge, sensor fusion between stereoscopic images and an IMU applied to robot PbD is a novel approach. The system is designed throughout to reduce costs, taking advantage of an offline processing step for data analysis, filtering and fusion that enhances the reliability of the PbD system.
Dr Nitish Ojha and Dr Nikhil VP
Abstract
A hassle-free life and a better human development index are possible only in smart cities with appropriate and efficient deployment of artificial intelligence (AI)-based technologies that make full use of data analysis. Technology becomes more productive under a circular economy (CE) when all dimensions of AI are employed and the results of data analysis from different segments, i.e., traffic management, public safety and movement, security and surveillance, waste management systems, and energy management, are integrated. This chapter focuses on the areas where AI faces challenges in the implementation and administration of smart cities, covering the intrinsic challenges in specialized domains such as public sanitation, virtual parking management, traffic congestion and security surveillance, among others, presented as case studies relating to the functioning of the circular economy. Finally, it summarizes the impact of AI on the CE and its future scope, where AI can improve productivity, efficiency and safety and deliver economic benefits for long-term economic stability, development and inclusive growth.
Abstract
Purpose
This chapter describes how the anticipation of connected content relegates cognitive spacing, which opens the possibility for schema acquisition. Information organization does not simply involve putting new data into folders, but instead cognitively preparing for knowledge development.
Design/methodology/approach
Understanding information input and output is central to providing meaningful instructional opportunities. This chapter describes the three phases of cognitive spacing: ready, set, and go.
Findings
Information organization does not simply involve putting new data into folders, but instead cognitively preparing for knowledge development. This is accomplished by ongoing reorganizations in which new information, known information, and assumed information are evaluated against current stimuli. The subsequent shifts in understanding are fundamental to instilling lifelong learning in students.
Relevancy
Spacing theory is significant to literacy development, skill development and content acquisition.
Fangqi Hong, Pengfei Wei and Michael Beer
Abstract
Purpose
Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the Posterior Variance Contribution (PVC) function, have been developed for adaptive experimental design of the integration points. However, those sequential design strategies prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve the computational efficiency.
Design/methodology/approach
By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all make important contributions to the integration accuracy and can be selected as design points, providing a practical way to parallelize adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are then introduced for finding multiple local maxima of the PVC function in one run, and hence for the parallel implementation of adaptive BC.
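The batch-selection idea, taking all prominent local maxima of the acquisition function as the next design points, can be sketched on a one-dimensional grid. The acquisition function below is a made-up multimodal stand-in, not the PVC function itself, and the grid search is a cheap substitute for the paper's multimodal optimization algorithms:

```python
import math

def local_maxima_on_grid(f, lo, hi, n=2001):
    """Return grid points that are strict local maxima of f, a cheap
    stand-in for multimodal optimization of an acquisition function."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    return [xs[i] for i in range(1, n - 1) if ys[i - 1] < ys[i] > ys[i + 1]]

# Toy multimodal "acquisition" with several peaks of varying height.
acq = lambda x: math.sin(3 * x) ** 2 * math.exp(-0.1 * x * x)
batch = local_maxima_on_grid(acq, -3.0, 3.0)
print(batch)  # all local maxima; evaluate the integrand at them in parallel
```

The point of the finding above is that every element of `batch`, not just the global maximizer, is worth an integrand evaluation, which is what makes a parallel batch natural.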
Findings
The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are then demonstrated and compared against the k-means clustering method on two numerical benchmarks and two engineering examples.
Originality/value
Multimodal behavior of acquisition function for BC is comprehensively investigated. All the local maxima of the acquisition function contribute to adaptive BC accuracy. Parallelization of adaptive BC is realized with four multimodal optimization methods.
Xiaoping Zhang, Yanhui Li, Meixiu Li, Qiuju Du, Hong Li, Yuqi Wang, Dechang Wang, Cuiping Wang, Kunyan Sui, Hongliang Li, Yanzhi Xia and Yuanhai Yu
Abstract
Purpose
To discover a new adsorbent for purifying dye wastewater in the textile and apparel industry, a novel graphene oxide/gluten composite material that can remove methylene blue from aqueous environments was synthesized using an improved acid-bath coagulation method.
Design/methodology/approach
After experimentally compounding different ratios of graphene oxide and gluten, the composite with 20% graphene oxide content, which showed the best adsorption performance, was chosen. The synthesized material was characterized by techniques such as FT-IR and SEM, revealing its microstructure and confirming the success of the compositing. The influences of temperature, dosage, pH and contact time were considered, and the isotherms, kinetics and thermodynamic parameters were discussed in turn.
Findings
The qmax value of 214.29 mg/g was higher than that of general sorbents; thus, the graphene oxide/gluten composite material is a suitable sorbent for methylene blue removal. Overall, it can be considered an effective and promising adsorbent for removing methylene blue from textile and apparel industrial effluent.
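A qmax value of this kind is typically obtained from a Langmuir isotherm fit. As an illustration under that assumption (the equilibrium data below are synthetic, generated from a Langmuir curve, not the paper's measurements), the classical linearized fit can be sketched as:

```python
def langmuir_fit(ce, qe):
    """Least-squares fit of the linearized Langmuir isotherm
    Ce/qe = Ce/qmax + 1/(b*qmax); returns (qmax, b)."""
    ys = [c / q for c, q in zip(ce, qe)]       # Ce/qe values
    n = len(ce)
    mx, my = sum(ce) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(ce, ys))
             / sum((x - mx) ** 2 for x in ce))
    intercept = my - slope * mx
    return 1.0 / slope, slope / intercept       # (qmax, b)

# Synthetic data from a Langmuir curve with qmax = 214.29 mg/g, b = 0.05 L/mg.
ce = [5, 10, 20, 50, 100, 200]                          # equilibrium conc., mg/L
qe = [214.29 * 0.05 * c / (1 + 0.05 * c) for c in ce]   # uptake, mg/g
qmax, b = langmuir_fit(ce, qe)
print(qmax, b)  # recovers qmax ≈ 214.29 and b ≈ 0.05
```

On real data the points scatter around the line, and the intercept and slope of the Ce/qe-versus-Ce regression still yield b and qmax.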
Originality/value
Graphene oxide (GO) is a potentially excellent sorbent. However, its high dispersibility is detrimental to adsorption: it disperses rapidly in aqueous solution, making separation and recovery difficult. The high load capacity and recyclability of gluten (GT) as a colloid make it a suitable carrier for fixing GO. Studies combining GO and GT into a composite adsorption material for the removal of dyes from dyeing wastewater have not been reported. Research on GO/GT composites can provide new ideas for these kinds of materials and contribute to their wider and more convenient application in wastewater treatment.
A. Thavaneswaran, J. Singh and S.S. Appadoo
Abstract
Purpose
To study stochastic volatility in the pricing of options.
Design/methodology/approach
Random‐coefficient autoregressive and generalized autoregressive conditional heteroscedastic models are studied. The option‐pricing formula is viewed as a moment of a truncated normal distribution.
Findings
Kurtosis for RCA and GARCH processes is derived. The application of random-coefficient GARCH kurtosis in the analytical approximation of option pricing is discussed.
Originality/value
Findings are useful in financial modeling.
K. Thiagarajah and A. Thavaneswaran
Abstract
Purpose
The purpose of this research is to introduce a class of FRC (fuzzy random coefficient) volatility models and to study their moment properties. Fuzzy option values and the superiority of fuzzy forecasts over minimum mean‐square forecasts are also discussed in some detail.
Design/methodology/approach
Fuzzy components are assumed to be triangular fuzzy numbers. Buckley's data‐driven method is used to determine the spread of the triangular fuzzy numbers by using standard errors of the estimated parameters.
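The construction described above can be sketched directly: a triangular fuzzy number is centered at the point estimate, with spreads taken from its standard error. The z-multiplier of 1.96 and the numerical values below are assumptions for illustration, not taken from the paper:

```python
def triangular_from_estimate(theta_hat, se, z=1.96):
    """Triangular fuzzy number (left, peak, right) centered at the
    estimate, with a spread of z standard errors on each side."""
    return (theta_hat - z * se, theta_hat, theta_hat + z * se)

def alpha_cut(tfn, alpha):
    """Closed interval of the triangular fuzzy number at membership
    level alpha in [0, 1]; alpha = 1 collapses to the peak."""
    left, peak, right = tfn
    return (left + alpha * (peak - left), right - alpha * (right - peak))

# Hypothetical estimated coefficient 0.40 with standard error 0.05.
phi = triangular_from_estimate(0.40, 0.05)
print(phi)                 # spread of ±1.96 * SE around the estimate
print(alpha_cut(phi, 0.5)) # interval at membership level 0.5
```

Fuzzy forecasts are then propagated by applying the model's forecast recursions to the alpha-cut intervals level by level.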
Findings
The fuzzy kurtosis of various volatility models is obtained in terms of fuzzy coefficients. Fuzzy option values and fuzzy forecasts are illustrated with examples. Fuzzy forecast intervals are narrower than the corresponding MMSE forecast intervals.
Originality/value
This paper will be of value to econometricians and to anyone with an interest in financial volatility models.
Rodolphe Durand and Paul Gouvard
Abstract
Extant research presents firms’ purpose as a consensual and positive attribute. This paper introduces an alternative perspective, which sees firms’ purposefulness as defined in relation to specific audiences. A firm’s purposefulness to a focal audience can be either positive or negative. Audiences find firms with which they share a common prioritization of issues more purposeful in absolute terms. Audiences find firms with which they share a common understanding of issues positively purposeful. Conversely, audiences find firms with an opposite understanding of issues negatively purposeful. Audiences harness specific resources to support firms they find positively purposeful and to oppose firms they find negatively purposeful. This paper introduces topic modeling and word embeddings as two techniques to operationalize this audience-based approach to purposefulness.
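As a hypothetical formalization of this audience-based view (not the authors' measure), absolute purposefulness could be scored by the alignment of firm and audience issue-priority vectors, e.g. topic-model proportions, and its sign by the agreement of per-issue stances, e.g. derived from word embeddings. All vectors below are invented:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

# Hypothetical issue-priority vectors (e.g. topic proportions): strong
# overlap -> the audience finds the firm purposeful in absolute terms.
firm_priorities = [0.6, 0.3, 0.1]
audience_priorities = [0.5, 0.4, 0.1]
priority_alignment = cosine(firm_priorities, audience_priorities)

# Hypothetical per-issue stance scores: the sign of their agreement makes
# purposefulness positive (shared understanding) or negative (opposed).
firm_stance = [1.0, -1.0, 1.0]
audience_stance = [1.0, 1.0, 1.0]
understanding = cosine(firm_stance, audience_stance)
print(priority_alignment, understanding)
```

A high `priority_alignment` with positive `understanding` would mark a firm as positively purposeful to that audience; the same alignment with negative `understanding` would mark it as negatively purposeful.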
Guanxiong Wang, Xiaojian Hu and Ting Wang
Abstract
Purpose
By introducing the mass customization service mode into the cloud logistics environment, this paper studies the joint optimization of service provider selection and customer order decoupling point (CODP) positioning based on the mass customization service mode, to provide customers with more diversified and personalized service content at a lower total logistics service cost.
Design/methodology/approach
This paper addresses the general process of service composition optimization based on the mass customization mode in a cloud logistics service environment and constructs a joint decision model for service provider selection and CODP positioning. The model considers two objective functions, minimum service cost and most satisfactory delivery time, and its Pareto optimal solutions are obtained via the NSGA-II algorithm. A numerical case is then used to verify the superiority of the mass-customization-based service composition scheme over the general scheme and the significant impact of the scale effect coefficient on the optimal CODP location.
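The core of the NSGA-II step used here is non-dominated sorting. A minimal Pareto-front filter for the bi-objective case, with cost and delivery-time dissatisfaction both minimized and hypothetical candidate compositions, can be sketched as:

```python
def pareto_front(points):
    """Return the non-dominated subset for minimization in every objective."""
    def dominates(a, b):
        # a dominates b: no worse in all objectives, strictly better in one.
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (cost, delivery-time dissatisfaction) for made-up service compositions.
candidates = [(100, 9), (120, 5), (150, 2), (130, 6), (110, 9)]
print(pareto_front(candidates))  # [(100, 9), (120, 5), (150, 2)]
```

NSGA-II layers this filter repeatedly to rank a whole population and adds crowding-distance selection; the sketch only shows the first front, i.e. the trade-off curve presented to the decision maker.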
Findings
(1) Under the cloud logistics mode, implementing a logistics service mode based on mass customization can not only reduce the total cost of logistics services through the scale effect of the massive orders on the cloud platform but also make more efficient use of the many logistics service providers gathered on the platform to provide customers with more customized and diversified service content. (2) The scale effect coefficient directly affects the total cost of logistics services and significantly affects the location of the CODP. Therefore, before implementing the mass customization logistics service mode, clustering the orders on the cloud logistics platform as reasonably as possible is very important for the subsequent service composition.
Originality/value
The originality of this paper includes two aspects. First, it introduces the mass customization mode into the cloud logistics service environment for the first time and summarizes the operation process of implementing this mode in the cloud logistics environment. Second, to solve the joint decision optimization model of provider selection and CODP positioning, it designs a method for solving a mixed-integer nonlinear programming model using a multi-layer coding genetic algorithm.