Jun Liu, Asad Khattak, Lee Han and Quan Yuan
Abstract
Purpose
Individuals’ driving behavior data are becoming widely available through Global Positioning System devices and on-board diagnostic systems. The incoming data can be sampled at rates ranging from one Hertz (or even lower) to hundreds of Hertz. Failing to capture substantial changes in vehicle movements over time by “undersampling” can cause loss of information and misinterpretation of the data, but “oversampling” can waste storage and processing resources. The purpose of this study is to empirically explore how micro-driving decisions to maintain speed, accelerate or decelerate can best be captured without substantial loss of information.
Design/methodology/approach
This study creates a set of indicators to quantify the magnitude of information loss (MIL). Each indicator is calculated as a percentage to index the extent of information loss in different situations. An overall index, the extent of information loss (EIL), is created to combine the MIL indicators. Data from a driving simulator study collected at 20 Hertz are analyzed (N = 718,481 data points from 35,924 s of driving tests). The study quantifies the relationship between the information loss indicators and sampling rates.
Findings
The results show that marginally more information is lost as data are sampled down from 20 to 0.5 Hz, but the relationship is not linear. With four indicators of MIL, the overall EIL is 3.85 per cent for driving behavior data sampled at 1 Hz. If sampling rates are higher than 2 Hz, all MILs indicate less than 5 per cent information loss.
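As a rough illustration of the downsampling question studied here (not the authors' actual MIL definitions, which the abstract does not spell out), the sketch below thins a 20-Hz speed trace to lower rates and reports, as a percentage, how many acceleration/deceleration episodes detected at 20 Hz are missed at each lower rate. The toy speed trace, the 1 m/s² threshold and the episode definition are illustrative assumptions.

```python
import numpy as np

def downsample(series, orig_hz, target_hz):
    """Keep every (orig_hz / target_hz)-th sample of a uniformly sampled series."""
    step = int(orig_hz // target_hz)
    return series[::step]

def event_count(speed, hz, accel_threshold=1.0):
    """Count acceleration/deceleration episodes whose magnitude exceeds a threshold (m/s^2)."""
    accel = np.diff(speed) * hz                      # finite-difference acceleration
    exceed = np.abs(accel) > accel_threshold
    # an "episode" starts wherever the exceedance flag switches from False to True
    return int(np.sum(np.diff(exceed.astype(int)) == 1) + exceed[:1].sum())

# toy 20-Hz speed trace (m/s): gentle cruising with sensor noise
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / 20)
speed = 15 + 2 * np.sin(0.3 * t) + rng.normal(0, 0.05, t.size)

baseline = event_count(speed, hz=20)
for target_hz in (10, 5, 2, 1, 0.5):
    reduced = downsample(speed, 20, target_hz)
    missed = baseline - event_count(reduced, hz=target_hz)
    loss_pct = 100 * max(missed, 0) / max(baseline, 1)   # one MIL-style indicator, as a percentage
    print(f"{target_hz:>4} Hz: information loss ~ {loss_pct:.1f}%")
```

The study's actual indicators would be computed on real driving data and combined into the overall EIL; this sketch only shows the mechanics of comparing a downsampled trace against the full-rate reference.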
Originality/value
This study contributes by developing a framework for quantifying the relationship between sampling rates and information loss. Depending on the objective of their study, researchers can choose the sampling rate necessary to achieve the required level of accuracy.
Md. Tofael Hossain Majumder and Xiaojing Li
Abstract
Purpose
This study aims to investigate the impacts of bank capital requirements on the performance and risk of the banking sector of an emerging economy, i.e. Bangladesh.
Design/methodology/approach
The study uses an unbalanced panel dataset comprising 30 banks, yielding a total of 413 bank-year observations over the period 2000 to 2015.
Findings
Using the generalized method of moments, the empirical results of this research reveal that bank capital has a positive and significant impact on bank performance, whereas it has a negative and significant impact on risk. The study also finds an inverse relationship between risk and performance in both the performance and risk equations. The results further indicate that both performance and risk persist from one year to the next.
Originality/value
This is a unique investigation of the Bangladeshi banking industry that considers the simultaneous effect of bank capital requirements on risk and performance. The empirical evidence therefore has policy implications for the regulatory authority of the Bangladeshi banking industry in determining relevant policies.
Zhuoxuan Jiang, Chunyan Miao and Xiaoming Li
Abstract
Purpose
Recent years have witnessed the rapid development of massive open online courses (MOOCs). With more and more courses being produced by instructors and taken by learners all over the world, unprecedented massive educational resources are being aggregated. The educational resources include videos, subtitles, lecture notes, quizzes, etc., on the teaching side, and forum contents, Wiki pages, logs of learning behavior, logs of homework, etc., on the learning side. However, the data are both unstructured and diverse. To facilitate knowledge management and mining on MOOCs, extracting keywords from these resources is important. This paper aims to adapt state-of-the-art keyword extraction techniques to MOOC settings and evaluate their effectiveness on real data. In terms of practice, this paper also tries to answer, for the first time, to what extent MOOC resources can support keyword extraction models, and how much human effort is required to make the models work well.
Design/methodology/approach
Based on which side generates the data, i.e. instructors or learners, the data are classified into teaching resources and learning resources, respectively. The approach used on teaching resources is based on machine learning models with labels, while the approach used on learning resources is based on a graph model without labels.
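The label-free, graph-based side of this design can be illustrated with a TextRank-style sketch over a word co-occurrence graph. The abstract does not specify the exact graph model used, so the co-occurrence window, the networkx/PageRank choice and the toy forum text below are assumptions.

```python
import networkx as nx

def textrank_keywords(tokens, window=3, top_k=5):
    """Rank candidate keywords by PageRank over a word co-occurrence graph."""
    graph = nx.Graph()
    for i, word in enumerate(tokens):
        for other in tokens[i + 1:i + window]:   # link words that co-occur within the window
            if word != other:
                graph.add_edge(word, other)
    scores = nx.pagerank(graph)
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# toy forum post, already tokenized and stop-word filtered
tokens = ("gradient descent converges learning rate gradient "
          "descent loss function learning rate schedule").split()
print(textrank_keywords(tokens))
```

On the teaching side, the labeled approach would instead train a supervised classifier over candidate phrases, which is why the amount of labeled data (10 per cent in the findings below) matters.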
Findings
From the teaching resources, the methods used by the authors can accurately extract keywords with only 10 per cent of the data labeled. The authors find that resources of different forms, e.g. subtitles and PPTs, should be considered separately because they support the models to different degrees. From the learning resources, the keywords extracted from MOOC forums are not as domain-specific as those extracted from teaching resources, but they can reflect the topics that are actively discussed in the forums, so instructors can obtain feedback from them. The authors implement two applications with the extracted keywords: generating a concept map and generating a learning path. The visual demos show that these have the potential to improve learning efficiency when integrated into a real MOOC platform.
Research limitations/implications
Conducting keyword extraction on MOOC resources is quite difficult because teaching resources are hard to obtain due to copyright restrictions. In addition, obtaining labeled data is challenging because expertise in the corresponding domain is usually required.
Practical implications
The experimental results show that MOOC resources are sufficient for building keyword extraction models and that an acceptable balance between human effort and model accuracy can be achieved.
Originality/value
This paper presents a pioneering study on keyword extraction from MOOC resources and obtains some new findings.
Chunlan Li, Jun Wang, Min Liu, Desalegn Yayeh Ayal, Qian Gong, Richa Hu, Shan Yin and Yuhai Bao
Abstract
Purpose
Extreme high temperatures are a significant feature of global climate change and have become more frequent and intense in recent years. They pose a significant threat to both human health and economic activity, and thus are receiving increasing research attention. Understanding the hazards posed by extreme high temperatures is important for selecting intervention measures targeted at reducing socioeconomic and environmental damage.
Design/methodology/approach
In this study, detrended fluctuation analysis is used to identify extreme high-temperature events, based on homogenized daily minimum and maximum temperatures from nine meteorological stations in a major grassland region, Hulunbuir, China, over the past 56 years.
Findings
Compared with other commonly used functions, the Weibull distribution was selected to simulate extreme high-temperature scenarios. An increasing trend in extreme high temperatures was found, and the probability of the extreme-temperature indices increased significantly, with regional differences. The extreme high temperatures in four return periods exhibited extremely low hazard in the central region of Hulunbuir and increased from the center to the periphery. As the return period lengthened, the area of high and extremely high hazard increased. Topography and anomalous atmospheric circulation patterns may be the main factors influencing the occurrence of extreme high temperatures.
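The return-period calculation implied here can be illustrated by fitting a Weibull distribution to annual temperature maxima and inverting its cumulative distribution function. The synthetic station data, the maximum-likelihood fit and the chosen return periods below are placeholders, not the study's values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# placeholder: 56 years of annual maximum temperature (deg C) at one station
annual_max = 36 + 2.5 * rng.weibull(2.0, size=56)

# fit a Weibull distribution (scipy's weibull_min) by maximum likelihood
shape, loc, scale = stats.weibull_min.fit(annual_max)

# the T-year return level is the temperature exceeded on average once every T years,
# i.e. the (1 - 1/T) quantile of the fitted distribution
for T in (10, 20, 50, 100):
    level = stats.weibull_min.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.1f} deg C")
```

Mapping such return levels across the nine stations is what produces the spatial hazard pattern described above.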
Originality/value
These results may contribute to better insight into the hazard of extreme high temperatures and facilitate the development of appropriate adaptation and mitigation strategies to cope with their adverse effects.
Jun Lin, Zhiqi Shen, Chunyan Miao and Siyuan Liu
Abstract
Purpose
With the rapid growth of the Internet of Things (IoT) market and its requirements, low power wide area (LPWA) technologies have become popular. Among the various LPWA technologies, Narrow Band IoT (NB-IoT) and long range (LoRa) are the two leading competing technologies. Compared with NB-IoT networks, which are mainly built and managed by mobile network operators, LoRa wide area networks (LoRaWAN) are mainly operated by private companies or organizations, which raises two issues: trust in the private network operators and lack of network coverage. This study aims to propose a conceptual architecture design of a blockchain built-in solution for LoRaWAN network servers to solve these two issues for LoRaWAN IoT solutions.
Design/methodology/approach
The study proceeds through modeling, model analysis and architecture design.
Findings
The proposed solution uses blockchain technology to build an open, trusted, decentralized and tamper-proof system, which provides an indisputable mechanism for verifying that the data of a transaction existed in the network at a specific time.
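The tamper-evidence property described here can be sketched with a toy hash chain rather than any particular blockchain platform; the block fields, device identifiers and payloads below are illustrative assumptions, not part of the proposed architecture.

```python
import hashlib
import json
import time

def make_block(payload, prev_hash):
    """Create a block whose hash commits to the payload, a timestamp and the previous block."""
    block = {"payload": payload, "timestamp": time.time(), "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """A chain verifies only if every block still hashes to its stored value and links correctly."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# append two hypothetical LoRaWAN uplink records; altering the first afterwards breaks verification
chain = [make_block({"dev_eui": "A1B2", "frame": 1, "data": "21.5C"}, prev_hash="0" * 64)]
chain.append(make_block({"dev_eui": "A1B2", "frame": 2, "data": "21.7C"}, chain[-1]["hash"]))
print(verify(chain))            # True
chain[0]["payload"]["data"] = "99.9C"
print(verify(chain))            # False: the recorded hash no longer matches
```

In the proposed architecture, this role is played by the blockchain built into the LoRaWAN network servers, which additionally distributes the ledger across untrusted operators rather than keeping it in one place.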
Originality/value
To the best of the authors' knowledge, this is the first work that integrates blockchain technology and LoRaWAN IoT technology.
Xiaomei Jiang, Shuo Wang, Wenjian Liu and Yun Yang
Abstract
Purpose
Traditional Chinese medicine (TCM) prescriptions have always relied on the experience of TCM doctors, and machine learning (ML) provides a technical means for learning this experience and intelligently assisting with prescribing. However, a TCM prescription contains a main (Jun) herb and auxiliary (Chen, Zuo and Shi) herbs. In a prescription, there are usually more types of auxiliary herbs than main herbs, and the auxiliary herbs often appear in other prescriptions as well. This leads to different herbs occurring with very different frequencies across prescriptions, namely, imbalanced labels (herbs). As a result, existing ML algorithms are biased and perform poorly when predicting the less frequent main herbs. To address this problem, this paper proposes a multi-label traditional Chinese medicine (ML-TCM) framework based on multi-label resampling.
Design/methodology/approach
In this work, a multi-label learning framework is proposed that adopts and compares three multi-label oversampling techniques, namely multi-label random resampling (MLROS), multi-label synthesized resampling (MLSMOTE) and multi-label synthesized resampling based on local label imbalance (MLSOL), to rebalance the TCM data.
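The simplest of the three techniques, MLROS-style random oversampling, can be sketched as follows. The cloning budget, stopping rule and toy prescription matrix are assumptions made for illustration; MLSMOTE and MLSOL go further by synthesizing new feature vectors instead of cloning existing samples.

```python
import numpy as np

def ml_random_oversample(X, Y, max_clone_ratio=1.0, rng=None):
    """Clone samples carrying minority labels until each label's frequency
    approaches the mean label frequency (a simplified MLROS)."""
    rng = rng or np.random.default_rng(0)
    X, Y = list(X), list(Y)
    label_counts = np.sum(Y, axis=0).astype(float)
    mean_count = label_counts.mean()
    budget = int(max_clone_ratio * len(X))          # cap on how many clones may be added
    for _ in range(budget):
        minority = np.argmin(label_counts)          # currently rarest label (herb)
        if label_counts[minority] >= mean_count:
            break
        candidates = [i for i, y in enumerate(Y) if y[minority] == 1]
        if not candidates:
            break
        pick = rng.choice(candidates)
        X.append(X[pick])
        Y.append(Y[pick])
        label_counts += np.asarray(Y[pick], dtype=float)
    return np.asarray(X), np.asarray(Y)

# toy prescriptions: rows = samples, columns = herbs (1 = herb present)
Y = np.array([[1, 0, 1], [0, 0, 1], [0, 1, 1], [0, 0, 1]])
X = np.arange(len(Y)).reshape(-1, 1)               # stand-in feature vectors
Xr, Yr = ml_random_oversample(X, Y)
print(np.sum(Yr, axis=0))                           # label counts after rebalancing
```

Cloning whole samples preserves the feature-label relationship but can overfit duplicated points, which is one reason the synthesis-based MLSOL variant performs better in the findings below.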
Findings
The experimental results show that after resampling, the less frequent but important herbs can be predicted more accurately. The MLSOL method is shown to be the best, with over 10% improvement on average, because it balances the data by considering both features and labels when resampling.
Originality/value
The authors are the first to systematically analyze the label imbalance problem of TCM prescription data under different sampling methods and to provide a solution. The experimental analysis demonstrates the feasibility of the method, which improves performance by 10%-30% compared with state-of-the-art methods.
Liangyin Chen, Jun Huang, Danqi Hu and Xinyuan Chen
Abstract
Purpose
This paper aims to examine the effect of dividend regulation on cost stickiness (i.e. the asymmetric change in firm expense between sales increase and sales decrease) and explore the underlying mechanism.
Design/methodology/approach
Based on the quasi-natural experiment of the Guideline for Dividend Policy of Listed Companies issued by the Shanghai Stock Exchange (SSE) in 2013, the authors employ a difference-in-difference model to investigate the impact of dividend regulation on cost stickiness.
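A schematic version of such a difference-in-difference specification is sketched below. The variable names, the simplified outcome (a single expense-change variable rather than a full cost-stickiness regression) and the synthetic data are placeholders, not the paper's actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),            # 1 = firm subject to the 2013 SSE guideline
    "post": rng.integers(0, 2, n),               # 1 = fiscal year 2013 or later
    "sales_change": rng.normal(0, 1, n),
})
# outcome: change in firm expense; cost stickiness proper would interact this with a
# sales-decrease indicator, simplified here to a single outcome variable
df["expense_change"] = (0.6 * df["sales_change"]
                        - 0.2 * df["treated"] * df["post"]
                        + rng.normal(0, 0.5, n))

# the coefficient on treated:post is the difference-in-difference estimate
model = smf.ols("expense_change ~ sales_change + treated * post", data=df).fit()
print(model.summary().tables[1])
```

In the paper's setting, the interaction term captures how the expense response of SSE-listed treatment firms changed after the 2013 guideline relative to control firms.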
Findings
The authors find that the cost stickiness of treatment group firms decreased significantly compared with control group firms after the dividend regulation. Moreover, this effect is more pronounced among firms in lower-marketization regions, in lower-competition industries and among those with less analyst coverage and lower cash flow levels. Further analyses show that dividend regulation reduces firms' cost stickiness by mitigating agency problems. Finally, the conclusion holds after several robustness tests, including controlling for firm fixed effects, propensity score matching (PSM), a placebo test and reconstruction of the expense variable.
Originality/value
This paper confirms that dividend regulation serves an important role in corporate governance, which reduces firms' agency costs and thereby decreases cost stickiness. The conclusions shed light on the dividend policies of listed companies and capital market regulation in the future.
Tianliang Wang, Ya-Meng He, Zhen Wu and Jun-jun Li
Abstract
Purpose
This paper aims to study the impacts of groundwater seepage on the artificial freezing process in gravel strata, the temperature field characteristics of the strata, and the formation process, closure time and thickness evolution mechanism of the frozen wall.
Design/methodology/approach
In this paper, several laboratory model tests were conducted considering different groundwater seepage rates.
Findings
The results show that there is a significant coupling effect between the cold diffusion of the artificial freezing pipes and groundwater seepage; when there is no seepage, the temperature fields upstream and downstream of the gravel strata are symmetrically distributed, and the thickness of the frozen soil column/frozen wall remains consistent during artificial freezing; groundwater seepage causes significant asymmetry in the temperature fields upstream and downstream of the gravel strata, and the greater the seepage rate, the more obvious the asymmetry; the frozen wall closure time increases linearly with the groundwater seepage rate, and specifically, the closure time under a seepage rate of 5.00 m d−1 is 3.2 times longer than that under no seepage; owing to erosion from groundwater seepage, the thickness of the upstream frozen wall decreases linearly with the seepage velocity, while that of the downstream frozen wall increases linearly, resulting in a saddle-shaped frozen wall.
Originality/value
The research results are beneficial to the optimum design and risk control of artificial freezing process in gravel strata.
Jun Gao, Niall O’Sullivan and Meadhbh Sherman
Abstract
Purpose
The Chinese fund market has witnessed significant developments in recent years. However, although there has been a range of studies assessing fund performance in developed industries, the rapidly developing fund industry in China has received very little attention. This study aims to examine the performance of open-end securities investment funds investing in Chinese domestic equity during the period May 2003 to September 2020. Specifically, applying a non-parametric bootstrap methodology from the literature on fund performance, the authors investigate the role of skill versus luck in this rapidly evolving investment funds industry.
Design/methodology/approach
This study evaluates the performance of Chinese equity securities investment funds from 2003 to 2020 using a bootstrap methodology to distinguish skill from luck in performance. The authors consider unconditional and conditional performance models.
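The core of the bootstrap procedure can be sketched as follows: estimate each fund's alpha, then resample residuals under the null of zero alpha and compare the actual cross-section of alpha t-statistics with the luck-only distribution. The single-factor model, synthetic weekly returns and bootstrap settings below are purely illustrative; the paper's models include many more factors and conditional terms.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_weeks, n_funds, n_boot = 300, 50, 200
market = rng.normal(0.001, 0.02, n_weeks)                       # excess market return
funds = 0.9 * market[:, None] + rng.normal(0, 0.02, (n_weeks, n_funds))

X = sm.add_constant(market)
fits = [sm.OLS(funds[:, j], X).fit() for j in range(n_funds)]
actual_max_t = max(f.tvalues[0] for f in fits)                  # best fund's alpha t-statistic

# bootstrap under the null: rebuild each fund's returns from beta*market + resampled residuals
boot_max_t = np.empty(n_boot)
for b in range(n_boot):
    t_stats = []
    for f in fits:
        resid = rng.choice(f.resid, size=n_weeks, replace=True)
        null_returns = f.params[1] * market + resid             # alpha forced to zero
        t_stats.append(sm.OLS(null_returns, X).fit().tvalues[0])
    boot_max_t[b] = max(t_stats)

p_value = np.mean(boot_max_t >= actual_max_t)
print(f"P(best alpha this large arises from luck alone) = {p_value:.3f}")
```

If the best observed alpha t-statistic sits well inside the simulated luck-only distribution, top-ranked performance cannot be attributed to genuine skill, which is the logic behind the findings reported next.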
Findings
The bootstrap methodology incorporates non-normality in the idiosyncratic risk of fund returns, which is a major drawback in “conventional” performance statistics. The evidence does not support the existence of “genuine” skilled fund managers. In addition, it indicates that poor performance is mainly attributable to bad stock picking skills.
Practical implications
The authors find that the positive abnormal performance of top-ranked funds is attributable to “good luck” rather than “good skill,” while the negative abnormal performance of bottom funds is mainly due to “bad skill.” Therefore, sensible advice for most Chinese equity investors would be not to try to pick winners among Chinese securities investment funds, but to avoid holding losers. At the present time, investors should consider other types of funds, such as index/tracker funds with lower transaction costs. In addition, less risk-averse investors may consider Chinese hedge funds (Zhao, 2012) or exchange-traded funds (Han, 2012).
Originality/value
The paper makes several contributions to the literature. First, the authors examine a wide range (over 50) of risk-adjusted performance models, which account for both unconditional and conditional risk factors. The authors also control for the profitability and investment risk factors of Fama and French (2015). Second, the authors select the “best-fit” model across all risk-adjusted models examined and a single “best-fit” model from each of the three classes; the bootstrap analysis, which is based mainly on the selected best-fit models, is therefore more precise and robust. Third, the authors reduce the possibility that the findings are sample-period specific or subject to survivorship (upward) bias. Fourth, the authors conduct further analysis based on sub-periods and compare fund performance in different market conditions to provide more implications for investors and practitioners. Fifth, the authors carry out extensive robustness checks and show that the findings are robust to different minimum fund histories and to serial correlation and heteroscedasticity adjustments. Sixth, the authors use higher-frequency weekly data to improve statistical estimation.
Abstract
Using Social Network Analysis (SNA), this paper examines the inter-provincial logistics relationships in China. Based on annual data on inter-provincial railway logistics volumes during the period 1998-2009, the degree of interconnection between regions is examined as an indicator of intensifying regional economic integration.
The main results on the logistics relationships in China are as follows. Firstly, regional logistics interconnection, especially between western and eastern China, has increased continuously, implying rising national economic integration. However, the increased centralization index and average Degree Centrality level imply that logistics bottlenecks have intensified in several hub provinces.
Secondly, the logistics center provinces, as evaluated by Degree Centrality, have changed. In 2009, Hebei, Liaoning, Jiangsu, Shandong, Henan and Sichuan provinces showed the highest inward Degree Centrality; Sichuan Province is the region whose centrality increased most strikingly.
Thirdly, the number of logistics hub provinces, evaluated by Betweenness Centrality, has increased. In 2008, Henan province was the only focal hub, but in 2009, Shandong, Hubei and Sichuan provinces also became logistics hubs.
Lastly, the Community Modularity analysis of grouping structures shows that there are three time-consistent communities. This means that even though between-region integration has been enhanced, the interconnection within these regional communities remains more important in explaining the regional logistics relationships.
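The centrality measures used above can be reproduced on toy data with networkx. The provinces and freight volumes below are invented for illustration and bear no relation to the study's railway data.

```python
import networkx as nx

# toy directed logistics network: edges are (origin, destination, freight volume)
flows = [("Hebei", "Shandong", 120), ("Liaoning", "Hebei", 80),
         ("Jiangsu", "Shandong", 95), ("Sichuan", "Henan", 60),
         ("Henan", "Shandong", 110), ("Shandong", "Jiangsu", 70)]

G = nx.DiGraph()
G.add_weighted_edges_from(flows)

# inward Degree Centrality: how many provinces ship into each province
in_centrality = nx.in_degree_centrality(G)
# Betweenness Centrality: how often a province lies on shortest inter-provincial routes
betweenness = nx.betweenness_centrality(G)

for province in sorted(G.nodes):
    print(f"{province:10s} in-degree {in_centrality[province]:.2f}  "
          f"betweenness {betweenness[province]:.2f}")
```

Community detection over the same graph (for example with networkx's modularity-based routines) would correspond to the Community Modularity grouping discussed in the final result.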