Yuling Ran, Wei Bai, Lingwei Kong, Henghui Fan, Xiujuan Yang and Xuemei Li
Abstract
Purpose
The purpose of this paper is to develop an appropriate machine learning model for predicting the degree of soil compaction and to examine the contribution rates of three influential factors (moisture content, electrical conductivity and temperature) to that prediction.
Design/methodology/approach
Taking fine-grained soils A and B as the research objects, this paper used laboratory test data, including a compaction parameter (moisture content), an electrical parameter (electrical conductivity) and temperature, to predict the degree of compaction of each soil with five commonly used families of machine learning models (19 models in total). Based on their prediction results, the models were preliminarily compared and then evaluated further.
Findings
The Gaussian process regression model predicts the degree of compaction of both soils well: the prediction error rates for fine-grained soils A and B are within 6% and 8%, respectively. The contribution rates rank as moisture content > electrical conductivity >> temperature.
Originality/value
By using moisture content, electrical conductivity and temperature to predict the degree of compaction directly, a more accurate predicted value can be obtained and the detection efficiency of compaction testing can be improved.
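The modelling approach this abstract describes can be sketched in plain NumPy. The feature layout (moisture content, electrical conductivity, temperature) follows the abstract, but the synthetic data, the RBF kernel and all hyperparameters below are illustrative assumptions, not the authors' setup:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def gpr_predict(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and pointwise std of a GP regression with an RBF kernel."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_test, X_train)
    Kss = rbf_kernel(X_test, X_test)
    resid = y_train - y_train.mean()
    mean = y_train.mean() + Ks @ np.linalg.solve(K, resid)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Invented, standardized features: moisture content, electrical
# conductivity, temperature; target: degree of compaction (%).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = 90.0 + 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] \
    + rng.normal(0.0, 0.3, size=40)

pred, std = gpr_predict(X, y, X)  # predict back at the training points
```

The GP also returns a pointwise standard deviation, which is one reason such models suit compaction testing, where a confidence band around each prediction is useful.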
Details
Keywords
Raymond A.K. Cox and Robert T. Kleiman
Abstract
Outlines previous research on the security analyst “superstar” phenomenon, including the stochastic model of Yule and Simon. Applies this to data on the 1986‐1997 selections for the Institutional Investor’s All‐British Research First Team (ABRT) and finds that it does not explain the distribution, i.e. that selection does appear to be based on skill rather than luck. Considers consistency with other research and expects future research to concentrate on the ABRT’s ability to forecast earnings per share and share prices.
Lukas Koelbl, Alexander Braumann, Elisabeth Felsenstein and Manfred Deistler
Abstract
This paper is concerned with estimation of the parameters of a high-frequency VAR model using mixed-frequency data, both for the stock and for the flow case. Extended Yule–Walker estimators and (Gaussian) maximum likelihood type estimators based on the EM algorithm are considered. Properties of these estimators are derived, partly analytically and by simulations. Finally, the loss of information due to mixed-frequency data when compared to the high-frequency situation as well as the gain of information when using mixed-frequency data relative to low-frequency data is discussed.
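The extended, mixed-frequency estimators the abstract refers to build on the classical Yule–Walker equations. A minimal single-series sketch (ordinary Yule–Walker for an AR(p), not the extended mixed-frequency version studied in the paper):

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients by solving the Yule-Walker equations
    R a = r, where R is the Toeplitz matrix of sample autocovariances
    gamma(|i - j|) and r = (gamma(1), ..., gamma(p))."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    gamma = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])
    R = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, gamma[1:])

# Simulate a stable AR(2) and recover its coefficients.
rng = np.random.default_rng(1)
a_true = np.array([0.6, -0.3])
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + rng.normal()
a_hat = yule_walker(x, 2)
```

The "extended" variants replace these autocovariances with cross-covariances that remain observable under mixed-frequency sampling; the linear-system structure is the same.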
Hong-Yan Yan and Jin Kwon Hwang
Abstract
Purpose
The purpose of this paper is to improve the online monitoring level of low-frequency oscillation in the power system. A modal identification method of discrete Fourier transform (DFT) curve fitting based on ambient data is proposed in this study.
Design/methodology/approach
An autoregressive moving average model of the ambient data was established, and the parameters of low-frequency oscillation were estimated via DFT curve fitting. The variational modal decomposition method was used to filter direct-current components from the ambient data signals to improve identification accuracy. Simulated phasor measurement unit data and measured power grid data confirmed the correctness of the method.
Findings
Compared with the modified extended Yule-Walker method, the proposed approach demonstrates the advantages of fast calculation speed and high accuracy.
Originality/value
Modal identification method of low-frequency oscillation based on ambient data demonstrated high precision and short running time for small interference patterns. This study provides a new research idea for low-frequency oscillation analysis and early warning of power systems.
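The core step of DFT-based modal identification can be sketched on synthetic data. The 0.5 Hz mode, the 20 Hz sampling rate and the plain mean removal (standing in for variational modal decomposition) are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

# Invented ambient-like signal: a 0.5 Hz inter-area oscillation buried
# in noise, sampled at 20 Hz (a typical PMU reporting rate).
fs = 20.0
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(2)
sig = 0.8 * np.sin(2.0 * np.pi * 0.5 * t) + rng.normal(0.0, 0.5, t.size)

# Remove the DC component. The paper filters it with variational modal
# decomposition; simple mean removal is a crude stand-in here.
sig = sig - sig.mean()

# DFT and coarse modal identification: locate the spectral peak.
spec = np.abs(np.fft.rfft(sig)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak = np.argmax(spec)
f_hat = freqs[peak]       # dominant mode frequency (Hz)
a_hat = 2.0 * spec[peak]  # rough amplitude estimate
```

The paper's curve-fitting step refines such a coarse peak estimate by fitting the known shape of the DFT of a damped sinusoid around the peak, which also yields the damping ratio.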
Abstract
Examines the behaviour of UK employment in manufacturing over the period 1964 to 1986. The use of cointegration techniques allows the separation of a long‐run equilibrium relationship for employment from its short‐run dynamics. The estimated model demonstrates a high degree of parameter stability both within and outwith the sample period used for estimation. Given the noted sensitivity of other employment equations to system shocks, the model's performance pre- and post-1979 is particularly noteworthy.
Albert A. Okunade, Xiaohui You and Kayhan Koleyni
Abstract
The search for more effective policies, choice of optimal implementation strategies for achieving defined policy targets (e.g., cost-containment, improved access, and quality healthcare outcomes), and selection among the metrics relevant for assessing health system policy change performance simultaneously pose continuing healthcare sector challenges for many countries of the world. Meanwhile, research on the core drivers of healthcare costs across the health systems of many countries continues to gain momentum as these countries learn from one another. Consequently, cross-country comparison studies largely focus on the relationships among health expenditures (HCE), GDP, aging demographics, and technology. Using a more recent annual data panel (1980–2014) on 34 OECD countries and the panel ARDL (Autoregressive Distributed Lag) framework, this study investigates the long- and short-run relationships among aggregate healthcare expenditure, income (GDP per capita or per capita GDP_HCE), age dependency ratio, and “international co-operation patents” (capturing the technology effects). Results from the panel ARDL approach and Granger causality tests suggest a long-run relationship between healthcare expenditure and the three major determinants. Findings from the Westerlund test with bootstrapping further corroborate the existence of this long-run relationship. Interestingly, GDP less health expenditure (GDP_HCE) is the only short-run driver of HCE. The income elasticity estimates, falling in the 1.16–1.46 range, suggest that aggregate healthcare in the 34 OECD countries behaves like a luxury good. Finally, through cross-country technology spillover effects, these OECD countries benefit significantly from international investments through technology cooperations resulting in jointly owned patents.
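The long- and short-run relationships an ARDL model separates can be illustrated on a single simulated series. This sketch estimates an ARDL(1,1) by OLS and computes the implied long-run multiplier; the data-generating process is invented for illustration and unrelated to the OECD panel:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000

# Invented DGP: y_t = 0.1 + 0.5 y_{t-1} + 0.3 x_t + 0.2 x_{t-1} + e_t,
# so the true long-run multiplier is (0.3 + 0.2) / (1 - 0.5) = 1.0.
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.1 + 0.5 * y[t - 1] + 0.3 * x[t] + 0.2 * x[t - 1] \
           + rng.normal(0.0, 0.1)

# OLS on the ARDL(1,1) regression  y_t ~ 1 + y_{t-1} + x_t + x_{t-1}.
Z = np.column_stack([np.ones(n - 1), y[:-1], x[1:], x[:-1]])
c, phi, b0, b1 = np.linalg.lstsq(Z, y[1:], rcond=None)[0]

# Long-run multiplier: cumulative effect of a permanent unit change in x.
lrm = (b0 + b1) / (1.0 - phi)
```

The short-run effect is the impact coefficient `b0` alone, which is why a variable can drive HCE in the short run while the long-run relationship is governed by the multiplier.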
Paresh Kumar Narayan and Seema Narayan
Abstract
Purpose
This paper aims to delineate the short‐ and long‐run relationships between savings, real interest rate, income, current account deficits (CADs) and age dependency ratio in Fiji using cointegration and error correction models over the period 1968‐2000.
Design/methodology/approach
The recently developed bounds testing approach to cointegration is used, which is applicable irrespective of whether the underlying variables are integrated of order one or order zero. Given the small sample size in this study, appropriate critical values were extracted from Narayan. To estimate the short‐ and long‐run elasticities, the autoregressive distributed‐lag model is used.
Findings
In the short‐ and long‐run: a 1 per cent increase in growth rate increases savings by over 0.07 and 0.5 per cent, respectively; a 1 per cent increase in the CAD reduces the savings rate by 0.01 and 0.02 per cent, respectively; and, in the long run, the negative coefficient on the real interest rate implies that the income effect dominates the substitution effect, while in the short run the total effect of the real interest rate is positive, implying that the substitution effect dominates the income effect.
Originality/value
This paper makes the first attempt at estimating the savings function for the Fiji Islands. Given that Fiji's capital market is poorly developed, the empirical findings here have direct policy relevance.
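The bounds testing approach computes an F statistic on the lagged levels in a conditional error-correction model and compares it with tabulated critical values (here, Narayan's small-sample tables, which are not reproduced). A minimal sketch with one regressor and fixed short lags, on invented cointegrated data:

```python
import numpy as np

def bounds_f_stat(y, x):
    """F statistic for the joint null delta1 = delta2 = 0 in the
    conditional ECM:
        dy_t = c + delta1*y_{t-1} + delta2*x_{t-1} + a*dy_{t-1} + b*dx_t + e_t
    Large values, relative to the tabulated bounds, point to cointegration."""
    dy, dx = np.diff(y), np.diff(x)
    Y = dy[1:]
    Xu = np.column_stack([np.ones_like(Y), y[1:-1], x[1:-1], dy[:-1], dx[1:]])
    Xr = np.column_stack([np.ones_like(Y), dy[:-1], dx[1:]])
    ssr = lambda M: np.sum((Y - M @ np.linalg.lstsq(M, Y, rcond=None)[0]) ** 2)
    q, n, k = 2, len(Y), Xu.shape[1]
    return ((ssr(Xr) - ssr(Xu)) / q) / (ssr(Xu) / (n - k))

# Invented cointegrated pair: x is a random walk, y tracks it.
rng = np.random.default_rng(4)
x = np.cumsum(rng.normal(size=200))
y = x + rng.normal(size=200)
f_stat = bounds_f_stat(y, x)
```

The procedure works whether the regressors are I(0) or I(1), which is exactly the property the abstract highlights; only the decision bounds, not the statistic, change with the integration order.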
Abstract
Purpose
The purpose of this article is to analyze the historical significance of Donald J. Urquhart, who established the National Lending Library for Science and Technology (NLL) that later was merged into the British Library Lending Division (BLLD), now called the British Library Document Supply Centre (BLDSC).
Design/methodology/approach
The paper presents a short history of the probabilistic revolution, particularly as it developed in the UK in the form of biometric statistics due to Darwin's theory of evolution. It focuses on the overthrow of the normal paradigm, according to which frequency distributions in nature and society conform to the normal law of error. The paper discusses the importance of the Poisson distribution and its utilization in the construction of stochastic models that better describe reality. Here the focus is on the compound Poisson distribution in the form of the negative binomial distribution (NBD). The paper then shows how Urquhart extended the probabilistic revolution to librarianship by using the Poisson as the probabilistic model in his analyses of the 1956 external loans made by the Science Museum Library (SML) as well as in his management of the scientific and technical (sci/tech) journal collection of the NLL. Thanks to this, Urquhart can be considered as playing a pivotal role in the creation of bibliometrics or the statistical bases of modern library and information science. The paper relates how Urquhart's son and daughter‐in‐law, John A. and Norma C. Urquhart, completed Urquhart's probabilistic breakthrough by advancing for the first time the NBD as the model for library use in a study executed at the University of Newcastle upon Tyne, connecting bibliometrics with biometrics. It concludes with a discussion of Urquhart's Law and its probabilistic implications for the use of sci/tech journals in a library system.
Findings
By being the first librarian to apply probability to the analysis of sci/tech journal use, Urquhart was instrumental in the creation of modern library and information science. His findings force a probabilistic re‐conceptualization of sci/tech journal use in a library system that has great implications for the transition of sci/tech journals from locally held paper copies to shared electronic databases.
Originality/value
Urquhart's significance is considered from the perspective of the development of science as a whole as well as library and information science in particular.
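The contrast between the Poisson and the compound (Gamma-mixed) Poisson, i.e. the negative binomial distribution, can be illustrated with simulated loan counts. The journal counts and Gamma parameters below are invented for illustration, not Urquhart's SML data:

```python
import numpy as np

rng = np.random.default_rng(5)
n_journals, mean_rate = 10_000, 2.0

# Poisson model: every journal shares a single borrowing rate.
poisson_loans = rng.poisson(mean_rate, n_journals)

# Gamma-Poisson mixture (negative binomial): each journal draws its own
# rate from a Gamma distribution, concentrating use in a core of titles.
rates = rng.gamma(shape=0.5, scale=mean_rate / 0.5, size=n_journals)
nbd_loans = rng.poisson(rates)

# Dispersion index (variance / mean): about 1 for the Poisson,
# well above 1 for the negative binomial.
d_poisson = poisson_loans.var() / poisson_loans.mean()
d_nbd = nbd_loans.var() / nbd_loans.mean()
```

The overdispersion of real circulation data relative to the Poisson is what motivates the NBD model the Urquharts advanced: a few heavily used titles account for most loans, which is also the pattern behind Urquhart's Law.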