Hua Feng, Ahsan Habib and Gaoliang Tian
The purpose of this paper is to investigate the association between aggressive tax planning and stock price synchronicity.
Abstract
Purpose
The purpose of this paper is to investigate the association between aggressive tax planning and stock price synchronicity.
Design/methodology/approach
Employing the special institutional background of China, this study constructs tax aggressiveness and stock price synchronicity measures for a large sample of Chinese stocks spanning the period 2003–2015. The authors employ OLS regression as the baseline methodology, and a fixed-effects model, the Fama–MacBeth method and GMM as sensitivity checks. Matched samples and difference-in-differences analyses are used to control for endogeneity.
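For context, the synchronicity measure most commonly used in this literature is a logistic transformation of the market-model R^2; the paper's exact construction (return frequency, inclusion of industry returns) may differ, so the following is a reference formulation only:

$$ r_{i,t} = \alpha_i + \beta_{1,i}\, r_{m,t} + \beta_{2,i}\, r_{ind,t} + \varepsilon_{i,t}, \qquad \mathit{SYNCH}_i = \ln\!\left(\frac{R_i^{2}}{1 - R_i^{2}}\right), $$

where $r_{i,t}$, $r_{m,t}$ and $r_{ind,t}$ are the firm, market and industry returns and $R_i^{2}$ is the coefficient of determination of the regression.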
Findings
The authors find a significant and positive association between aggressive tax planning and stock price synchronicity. Because material information about risky tax transactions tends to be hidden in various tax accrual accounts, aggressive tax strategies make financial statements less transparent, thereby increasing information asymmetry and decreasing stock price informativeness. The authors also find that firms engaging in aggressive tax planning exhibit relatively high corporate opacity. In addition, the authors find that improvements in the tax enforcement regime, ownership status and high-quality auditors all constrain the adverse effects of tax aggressiveness.
Practical implications
This study has important practical implications for China’s regulators, who are striving to reduce the tax burden of enterprises. It also helps investors to consider investment decisions more appropriately from a taxation perspective.
Originality/value
First, this paper contributes to the stock price efficiency literature by identifying the effect of a hitherto unexamined factor, namely, firm-level aggressive tax planning, on the efficiency of stock prices. Second, this study provides further empirical evidence to support the agency view of tax aggressiveness, and the informational interpretation of stock price synchronicity. Third, this study helps us better understand the effects of firm-level tax policy on firm-specific information capitalization in an environment where overall country-level investor protection is relatively weak.
Kai Li, Cheng Zhu, Jianjiang Wang and Junhui Gao
With burgeoning interest in the low-altitude economy, applications of long-endurance unmanned aerial vehicles (LE-UAVs) have increased in remote logistics distribution. Given…
Abstract
Purpose
With burgeoning interest in the low-altitude economy, applications of long-endurance unmanned aerial vehicles (LE-UAVs) have increased in remote logistics distribution. Given LE-UAVs’ advantages of wide coverage, strong versatility and low cost, in addition to logistics distribution, they are widely used in military reconnaissance, communication relay, disaster monitoring and other activities. With limited autonomous intelligence, LE-UAVs require regular periodic and non-periodic control from ground control resources (GCRs) during flights and mission execution. However, the lack of GCRs significantly restricts the parallel application of LE-UAVs.
Design/methodology/approach
We consider the constraints of GCRs, investigating an integrated optimization problem of multi-LE-UAV mission planning and GCR allocation (Multi-U&G IOP). The problem integrates GCR allocation into traditional multi-UAV cooperative mission planning. The coupled decisions of mission planning and GCR allocation enlarge the decision space and add complexity to the problem’s structure. By characterizing the problem, this study establishes a mixed integer linear programming (MILP) model for the integrated optimization problem. To solve the problem, we develop a three-stage iterative optimization algorithm combining a hybrid genetic algorithm with local search-variable neighborhood descent, heuristic conflict elimination and post-optimization of GCR allocation.
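As a rough illustration of the GCR coupling constraint (not the authors' MILP or three-stage algorithm; the interval representation and names are assumptions for illustration), the sketch below checks whether the control windows scheduled across UAVs can be served by a given number of GCRs:

```python
# Hypothetical sketch: check whether scheduled control windows fit within
# the available ground control resources (GCRs). Each window is a
# (start, end) interval during which some UAV needs one GCR.

def gcr_demand_feasible(control_windows, num_gcrs):
    """Return True if, at every instant, the number of overlapping
    control windows never exceeds the number of available GCRs."""
    events = []
    for start, end in control_windows:
        events.append((start, +1))   # a GCR becomes occupied
        events.append((end, -1))     # a GCR is released
    # Process releases before acquisitions at the same time stamp.
    events.sort(key=lambda e: (e[0], e[1]))
    busy = 0
    for _, delta in events:
        busy += delta
        if busy > num_gcrs:
            return False
    return True

# Example: three overlapping windows, two GCRs available.
windows = [(0, 10), (5, 15), (8, 12)]
print(gcr_demand_feasible(windows, num_gcrs=2))  # False: three windows overlap at t=8..10
print(gcr_demand_feasible(windows, num_gcrs=3))  # True
```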
Findings
Numerical experimental results show that our developed algorithm can solve the problem efficiently and exceeds the solution performance of the solver CPLEX. For small-scale instances, our algorithm can obtain optimal solutions in less time than CPLEX. For large-scale instances, our algorithm produces better results in one hour than CPLEX does. Implementing our approach allows efficient coordination of multiple UAVs, enabling faster mission completion with a minimal number of GCRs.
Originality/value
Drawing on the interplay between LE-UAVs and GCRs and considering the practical applications of LE-UAVs, we propose the Multi-U&G IOP problem. We formulate this problem as a MILP model aiming to minimize the maximum task completion time (makespan). Furthermore, we present a relaxation model for this problem. To efficiently address the MILP model, we develop a three-stage iterative optimization algorithm. Subsequently, we verify the efficacy of our algorithm through extensive experimentation across various scenarios.
The purpose of this manuscript is to establish an optimal design of a state-feedback-gain-based fractional-order PID controller for time-delay systems. The established optimal…
Abstract
Purpose
The purpose of this manuscript is to establish an optimal design of a state-feedback-gain-based fractional-order PID controller for time-delay systems. The established optimal design, known as the advanced cuttlefish optimizer and random decision forest, combines the performance of the random decision forest algorithm (RDFA) and the advanced cuttlefish optimizer (ACFO).
Design/methodology/approach
The proposed ACFO uses crossover and mutation operators based on position updating to enhance the search behavior, computational speed and convergence profile of the basic cuttlefish optimizer.
Findings
The fractional-order proportional-integral-derivative (FOPID) controller has, apart from the usual tuning parameters (kp, ki and kd), two extra tuning parameters, λ and µ. In the established technique, the gains of the FOPID controller are adjusted to reach the required responses, which is demonstrated using RDFA theory, and the RDF weight matrices are obtained with the help of the ACFO method. The uniqueness of the established method lies in reducing the failure of the FOPID controller for higher-order time-delay systems with the help of controller optimization constraints. The objective of the established method is defined in terms of the set-point parameters and the achieved parameters of the time-delay system.
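For reference, the standard FOPID (PI^λD^µ) controller named here has the transfer function

$$ C(s) = k_p + \frac{k_i}{s^{\lambda}} + k_d\, s^{\mu}, $$

which reduces to the classical PID controller when $\lambda = \mu = 1$; the design task addressed by the ACFO-RDFA scheme is thus the tuning of the five parameters $(k_p, k_i, k_d, \lambda, \mu)$.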
Originality/value
The established technique is used to avoid large-order delays and to meet performance constraints such as small overshoot, fast settling time and low steady-state error. The method is implemented on the MATLAB/Simulink platform, and the outcomes are compared with various existing methods such as the Ziegler-Nichols fit, curve fit, the Wang method, regression, invasive weed optimization and the linear-quadratic regression method.
Jia-Lang Seng and Hsiao-Fang Yang
The purpose of this study is to develop a dictionary with grammar and multiword structures, to be used in conjunction with sentiment analysis, to investigate the relationship…
Abstract
Purpose
The purpose of this study is to develop a dictionary with grammar and multiword structures, to be used in conjunction with sentiment analysis, to investigate the relationship between financial news and stock market volatility.
Design/methodology/approach
An algorithm has been developed for calculating the sentiment orientation and score of data with added information, and the results of the calculation have been integrated to construct an empirical model for calculating stock market volatility.
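As a rough sketch of how a grammar- and multiword-aware dictionary can feed a sentiment score (the lexicon, negation rule and weights below are invented for illustration and are not the authors' dictionary):

```python
# Hypothetical sketch of lexicon-based sentiment scoring with multiword
# entries and simple negation handling; the lexicon below is illustrative.

LEXICON = {
    "profit warning": -2.0,   # multiword entry scored as a unit
    "record high": 2.0,
    "profit": 1.0,
    "loss": -1.0,
    "growth": 1.0,
}
NEGATORS = {"not", "no", "never"}

def sentiment_score(text):
    """Return an additive sentiment score; multiword entries are matched
    first, and a negator immediately before a term flips its sign."""
    tokens = text.lower().split()
    score, i = 0.0, 0
    while i < len(tokens):
        bigram = " ".join(tokens[i:i + 2])
        if bigram in LEXICON:            # prefer multiword matches
            score += LEXICON[bigram]
            i += 2
            continue
        if tokens[i] in LEXICON:
            value = LEXICON[tokens[i]]
            if i > 0 and tokens[i - 1] in NEGATORS:
                value = -value           # crude negation handling
            score += value
        i += 1
    return score

print(sentiment_score("quarterly profit hits record high"))  # 3.0
print(sentiment_score("no growth and a heavy loss"))          # -2.0
```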
Findings
The experimental results reveal a statistically significant relationship between financial news and stock market volatility. Moreover, positive (negative) news is found to be positively (negatively) correlated with stock returns, and the added-information score of the news is positively correlated with stock returns. The model and its stock market volatility predictions are verified over four time periods (monthly, quarterly, semiannually and annually). Using moving-window evaluation, the prediction accuracy of the models approaches 66%, and stock market volatility shows a particular trend-predicting effect in specific periods.
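A minimal sketch of moving-window evaluation of directional predictions, under the assumption (ours, for illustration) that predictions and realizations are coded as up/down labels:

```python
# Hypothetical sketch of moving-window directional accuracy: in each
# window, compare predicted and realized volatility directions.

def rolling_accuracy(predicted, realized, window):
    """Return the hit rate (share of correct up/down calls) for each
    sliding window over aligned prediction/realization sequences."""
    rates = []
    for start in range(len(predicted) - window + 1):
        hits = sum(
            p == r
            for p, r in zip(predicted[start:start + window],
                            realized[start:start + window])
        )
        rates.append(hits / window)
    return rates

pred = ["up", "up", "down", "up", "down", "down"]
real = ["up", "down", "down", "up", "up", "down"]
print(rolling_accuracy(pred, real, window=3))  # four windows, each with hit rate 2/3
```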
Research limitations/implications
Only one news source is used and the research period is only two years; thus, future studies should incorporate several data sources and use a longer period to conduct a more in-depth analysis.
Practical implications
Understanding trends in stock market volatility can decrease risk and increase profit from investment. Therefore, individuals or businesses can feasibly engage in investment activities for profit by understanding volatility trends in capital markets.
Originality/value
The ability to exploit textual information could potentially increase the quality of the data. Few scholars have applied sentiment analysis to interdisciplinary topics that span information management technology, accounting and finance. Furthermore, few studies have provided support for both structured and unstructured data. This paper demonstrates the efficiency of the proposed algorithm and model and their ability to capture trends in stock market volatility.
Stuti Thapa, Louis Tay and Daphne Hou
Experience sampling methods (ESM) have enabled researchers to capture intensive longitudinal data and track how worker well-being changes over time. The conceptual advances in…
Abstract
Experience sampling methods (ESM) have enabled researchers to capture intensive longitudinal data and track how worker well-being changes over time. The conceptual advances in understanding the variability of well-being are discussed. Emerging constructs in this literature include affective inertia, affective variability, affective reactivity, and density distributions. While most ESM research has relied on the active provision of data by participants (i.e., self-reports), technological advances have enabled different forms of passive sensing that are useful for assessing and tracking well-being and its contextual factors. These include accelerometer data, location data, and physiological data. The strengths and weaknesses of passively sensed data are discussed, along with future ways forward in which both active and passive forms of ESM data are used in the assessment and promotion of worker well-being.
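As a brief, hypothetical illustration of two of these indices (not drawn from the chapter; the data and function names are invented), affective variability is commonly operationalized as the within-person standard deviation of repeated mood ratings and affective inertia as their lag-1 autocorrelation:

```python
# Hypothetical sketch: within-person affect dynamics from ESM mood ratings.
from statistics import mean, pstdev

def affective_variability(ratings):
    """Within-person variability: standard deviation of repeated ratings."""
    return pstdev(ratings)

def affective_inertia(ratings):
    """Within-person inertia: lag-1 autocorrelation of repeated ratings."""
    m = mean(ratings)
    num = sum((ratings[t] - m) * (ratings[t - 1] - m) for t in range(1, len(ratings)))
    den = sum((x - m) ** 2 for x in ratings)
    return num / den

mood = [3, 4, 4, 5, 3, 2, 3, 4]  # e.g., momentary positive affect on a 1-5 scale
print(round(affective_variability(mood), 2))  # 0.87
print(round(affective_inertia(mood), 2))      # 0.21
```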
Sandipan Karmakar and Jhareswar Maiti
The purpose of this paper is to present a state‐of‐the‐art review of dimensional tolerance synthesis and to demonstrate the evolution of tolerance synthesis from product to…
Abstract
Purpose
The purpose of this paper is to present a state‐of‐the‐art review of dimensional tolerance synthesis, to demonstrate the evolution of tolerance synthesis from a product‐ to a process‐oriented strategy, and to compare the two for single‐stage and multistage manufacturing systems (MMS). The main focus is on delineating the different approaches, methods and techniques used, with a critical appraisal of their uses, applicability and limitations, based on which future research directions and a generic methodology are proposed.
Design/methodology/approach
Starting with issues in tolerancing research, the review demonstrates the critical aspects of product and process‐oriented tolerance synthesis. The aspects considered are: construction of tolerance design functions; construction of optimization functions; and use of optimization methods. In describing the issues of process‐oriented tolerance synthesis, a comparative study of single and multistage manufacturing has been provided.
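As a reference point for the formulations compared in this review (notation is generic; individual papers differ in cost models and constraints), a typical product-oriented tolerance synthesis problem minimizes manufacturing cost over component tolerances subject to an assembly stack-up constraint:

$$ \min_{t_1,\dots,t_n} \; \sum_{i=1}^{n} C_i(t_i) \quad \text{s.t.} \quad \sum_{i=1}^{n} t_i \le T_{\mathrm{asm}} \;(\text{worst case}) \quad \text{or} \quad \sqrt{\sum_{i=1}^{n} t_i^{2}} \le T_{\mathrm{asm}} \;(\text{RSS}), $$

where $C_i(t_i)$ is the cost of holding dimension $i$ to tolerance $t_i$ and $T_{\mathrm{asm}}$ is the assembly (functional) tolerance; broadly speaking, process-oriented synthesis replaces the $t_i$ with stage-level process variation variables.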
Findings
This study critically reviews: the relationship between the tolerance variables and the variations created through manufacturing operations; objective functions for tolerance synthesis; and suitable optimization methods based upon the nature of the tolerance variables and the design functions created.
Research limitations/implications
This study is limited to dimensional tolerance synthesis problems and evolution of process‐oriented tolerance synthesis to counteract dimensional variation problems in assembly manufacturing.
Originality/value
The paper provides a comprehensive, step‐by‐step review of dimensional tolerance synthesis.
Shanying Zhu, Vijayalakshmi Saravanan and BalaAnand Muthu
Currently, in the health-care sector, information security and privacy are increasingly important issues. The improvement in information security is highlighted by the adoption of…
Abstract
Purpose
Currently, in the health-care sector, information security and privacy are increasingly important issues. The improvement in information security is highlighted by the adoption of digital patient records, driven by regulation, provider consolidation and the growing need to exchange information among patients, providers and payers.
Design/methodology/approach
Big data on health care are likely to improve patient outcomes, predict epidemic outbreaks, yield valuable insights, prevent diseases, reduce health-care costs and improve analysis of the quality of life.
Findings
In this paper, the big data analytics-based cybersecurity framework has been proposed for security and privacy across health-care applications. It is vital to identify the limitations of existing solutions for future research to ensure a trustworthy big data environment. Furthermore, electronic health records (EHR) could potentially be shared by various users to increase the quality of health-care services. This leads to significant issues of privacy that need to be addressed to implement the EHR.
Originality/value
This framework combines several technical mechanisms and environmental controls and is shown to be sufficient to adequately address common threats to network security.
V. Senthil Kumaran and R. Latha
The purpose of this paper is to provide adaptive access to learning resources in the digital library.
Abstract
Purpose
The purpose of this paper is to provide adaptive access to learning resources in the digital library.
Design/methodology/approach
A novel method using ontology-based multi-attribute collaborative filtering is proposed. Digital libraries are fully automated: all resources are in digital form, and access to the available information is provided electronically to both remote and conventional users. To satisfy users' information needs, a huge amount of newly created information is published electronically in digital libraries. While search applications are improving, it is still difficult for the majority of users to find relevant information. For better service, the framework should also be able to adapt queries to search domains and target learners.
Findings
This paper improves the accuracy and efficiency of predicting and recommending personalized learning resources in digital libraries. To facilitate a personalized digital learning environment, the authors propose a novel method using an ontology-supported collaborative filtering (CF) recommendation system. The objective is to provide adaptive access to learning resources in the digital library. The proposed model is based on user-based CF, which suggests learning resources for students based on their course registration, preferences for topics and digital libraries. Using ontological framework knowledge for semantic similarity and considering multiple attributes apart from learners' preferences for the learning resources improves the accuracy of the proposed model.
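A minimal sketch of user-based CF in which learner similarity blends rating agreement with ontology-derived semantic similarity (the data, the blending weight alpha and the similarity lookup are illustrative assumptions, not the paper's implementation):

```python
# Hypothetical sketch: user-based CF where learner similarity blends
# rating agreement with ontology-derived semantic similarity of interests.
from math import sqrt

RATINGS = {                      # learner -> {resource: rating}
    "alice": {"r1": 5, "r2": 3, "r3": 4},
    "bob":   {"r1": 4, "r2": 2, "r4": 5},
    "carol": {"r2": 5, "r3": 2, "r4": 4},
}
SEMANTIC_SIM = {                 # ontology-based similarity of learner profiles
    ("alice", "bob"): 0.8, ("alice", "carol"): 0.3, ("bob", "carol"): 0.5,
}

def rating_similarity(u, v):
    """Cosine similarity over co-rated resources (0 if none)."""
    common = set(RATINGS[u]) & set(RATINGS[v])
    if not common:
        return 0.0
    dot = sum(RATINGS[u][r] * RATINGS[v][r] for r in common)
    nu = sqrt(sum(RATINGS[u][r] ** 2 for r in common))
    nv = sqrt(sum(RATINGS[v][r] ** 2 for r in common))
    return dot / (nu * nv)

def learner_similarity(u, v, alpha=0.6):
    """Blend rating similarity with ontology-based semantic similarity."""
    sem = SEMANTIC_SIM.get((u, v)) or SEMANTIC_SIM.get((v, u), 0.0)
    return alpha * rating_similarity(u, v) + (1 - alpha) * sem

def predict(u, resource):
    """Similarity-weighted average of neighbours' ratings for a resource."""
    pairs = [(learner_similarity(u, v), r[resource])
             for v, r in RATINGS.items() if v != u and resource in r]
    total = sum(s for s, _ in pairs)
    return sum(s * x for s, x in pairs) / total if total else None

print(round(predict("alice", "r4"), 2))  # ~4.59 for this toy data
```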
Research limitations/implications
The results of this work rely mainly on the developed ontology. More experiments are to be conducted with other domain ontologies.
Practical implications
The proposed approach is integrated into Nucleus, a Learning Management System (https://nucleus.amcspsgtech.in). The results are of interest to learners, academicians, researchers and developers of digital libraries. This work also provides insights into the ontology for e-learning to improve personalized learning environments.
Originality/value
This paper computes learner similarity and learning resource similarity based on ontological knowledge, feedback and ratings on the learning resources. The predictions for the target learner are calculated, and the top-N learning resources are generated by the recommendation engine using CF.
M.V.A. Raju Bahubalendruni, Anil Gulivindala, Manish Kumar, Bibhuti Bhusan Biswal and Lakshumu Naidu Annepu
The purpose of this paper is to develop an efficient hybrid method that can collectively address assembly sequence generation (ASG) and exploded view generation (EVG) problem…
Abstract
Purpose
The purpose of this paper is to develop an efficient hybrid method that can collectively and effectively address the assembly sequence generation (ASG) and exploded view generation (EVG) problems. ASG is the act of finding feasible, collision-free movements of the components of a mechanical product in accordance with the assembly design. Although the execution of ASG is complex and computationally time-consuming, it is highly essential for an efficient manufacturing process. Because of the numerous limitations of ASG algorithms, a definitive method is still unavailable in computer-aided design (CAD) software, and therefore the explosion of the product is not found to be in accordance with any feasible disassembly sequence (the disassembly sequence is the reverse progression of the assembly sequence). The existing EVG algorithms in CAD software visualize all the constituent parts of the product on a single screen without taking into consideration the feasible order of assembly operations; thus, it becomes necessary to formulate an algorithm that effectively solves the ASG and EVG problems in conjunction. This requirement has also been documented as a standard in the “General Information Concerning Patents: 1.84 Standards for drawings” of the United States Patent and Trademark Office (2005), which states that exploded views showing the relationship or order of assembly of various parts are permissible.
Design/methodology/approach
In this paper, a unique ASG method is proposed and is further extended for EVG. The ASG follows a deterministic approach to avoid redundant data collection and calculation. The proposed method is effectively applied to products that require feasible disassembly paths other than the canonical directions.
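As a simplified illustration of collision-free sequence extraction (not the authors' deterministic procedure, which also handles non-canonical directions; the parts and interference data below are hypothetical), one can repeatedly remove a part that can translate along at least one canonical direction without colliding with the parts that remain, then reverse the order to obtain an assembly sequence:

```python
# Hypothetical sketch: derive a feasible disassembly order (the reverse of
# an assembly sequence) from pairwise interference data. BLOCKED[(a, d)]
# lists the parts that obstruct part a when it moves along direction d.

DIRECTIONS = ["+x", "-x", "+y", "-y", "+z", "-z"]
PARTS = ["base", "shaft", "cap"]
BLOCKED = {
    ("cap", "+z"): [],           # cap lifts off freely
    ("shaft", "+z"): ["cap"],    # shaft is blocked until the cap is gone
    ("base", "-z"): ["shaft"],   # base is blocked until the shaft is gone
}

def disassembly_sequence(parts, blocked):
    """Greedily remove any part that has a collision-free escape direction."""
    remaining, order = set(parts), []
    while remaining:
        for part in sorted(remaining):
            movable = any(
                not (set(blocked.get((part, d), parts)) & (remaining - {part}))
                for d in DIRECTIONS   # unlisted directions default to "blocked"
            )
            if movable:
                order.append(part)
                remaining.remove(part)
                break
        else:
            raise ValueError("no collision-free move found")
    return order

seq = disassembly_sequence(PARTS, BLOCKED)
print(seq)                  # ['cap', 'shaft', 'base'] for this toy data
print(list(reversed(seq)))  # corresponding assembly sequence: ['base', 'shaft', 'cap']
```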
Findings
The method is capable of organizing the assembly operations as a linear or parallel progression such that the assembly task is completed in a minimum number of stages. This result is further used for EVG and is shown to be effective.
Originality/value
Assembly sequence planning (ASP) is most often performed considering geometric feasibility along the canonical axes, without considering the possibility of parallel assembly operations. The method proposed in this paper is robust in addressing this issue. Exploded view generation that respects a feasible ASP is also one of the novel approaches illustrated in this paper.
Yu‐Hsin Lin, Chih‐Hung Tsai, Ching‐En Lee and Chung‐Ching Chiu
Constructing an effective production control policy is the most important issue in wafer fabrication factories. Most research focuses on the input regulation of wafer…
Abstract
Constructing an effective production control policy is the most important issue in wafer fabrication factories. Most research focuses on the input regulation of wafer fabrication. Although many of these policies have been proven effective for wafer fabrication manufacturing, in practice there is a need to help operators decide which lots should be pulled at the right time and to develop a systematic way to alleviate the long queues at the bottleneck workstation. The purpose of this study is to construct a photolithography workstation dispatching rule (PADR). This dispatching rule considers several characteristics of wafer fabrication and their influential factors, and then utilizes weights and threshold values to design a hierarchical priority rule. A simulation model is also constructed to demonstrate the effect of the PADR dispatching rule. The PADR performs better in throughput, yield rate and mean cycle time than FIFO (First‐In‐First‐Out) and SPT (Shortest Process Time).
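A minimal sketch of a weighted, threshold-based hierarchical priority rule in the spirit of PADR (the factors, weights and threshold below are illustrative assumptions, not the paper's calibrated values):

```python
# Hypothetical sketch: rank photolithography lots with a two-level rule.
# Level 1: lots whose queue time exceeds a threshold are served first.
# Level 2: within each level, a weighted score of influential factors decides.
# (In practice the factors would be normalized to a common scale.)

WEIGHTS = {"queue_time": 0.5, "downstream_starvation": 0.3, "due_date_urgency": 0.2}
QUEUE_TIME_THRESHOLD = 8.0   # hours; illustrative value only

def priority(lot):
    urgent = lot["queue_time"] > QUEUE_TIME_THRESHOLD
    score = sum(WEIGHTS[k] * lot[k] for k in WEIGHTS)
    return (urgent, score)   # tuples sort by level first, then by score

lots = [
    {"id": "L01", "queue_time": 9.0, "downstream_starvation": 0.2, "due_date_urgency": 0.7},
    {"id": "L02", "queue_time": 3.0, "downstream_starvation": 0.9, "due_date_urgency": 0.4},
    {"id": "L03", "queue_time": 6.0, "downstream_starvation": 0.1, "due_date_urgency": 0.9},
]
dispatch_order = sorted(lots, key=priority, reverse=True)
print([lot["id"] for lot in dispatch_order])  # ['L01', 'L03', 'L02'] for this toy data
```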