Yellow corn wholesale price forecasts via the neural network

Xiaojie Xu (North Carolina State University at Raleigh, Raleigh, North Carolina, USA)
Yun Zhang (North Carolina State University at Raleigh, Raleigh, North Carolina, USA)

EconomiA

ISSN: 1517-7580

Article publication date: 4 April 2023

Issue publication date: 10 July 2023


Abstract

Purpose

Forecasts of commodity prices are vital to market participants and policy makers. Corn is no exception, given its strategic importance. In the present study, the authors assess the forecast problem for the weekly wholesale price index of yellow corn in China over the January 1, 2010–January 10, 2020 period.

Design/methodology/approach

The authors employ the nonlinear auto-regressive neural network as the forecast tool and evaluate the forecast performance of different model settings over algorithms, delays, hidden neurons and data-splitting ratios in arriving at the final model.

Findings

The final model is relatively simple and leads to accurate and stable results. Particularly, it generates relative root mean square errors of 1.05%, 1.08% and 1.03% for training, validation and testing, respectively.

Originality/value

Through the analysis, the study shows the usefulness of the neural network technique for commodity price forecasts. The results might serve as technical forecasts on a standalone basis or be combined with other fundamental forecasts to form perspectives on price trends and corresponding policy analysis.


Citation

Xu, X. and Zhang, Y. (2023), "Yellow corn wholesale price forecasts via the neural network", EconomiA, Vol. 24 No. 1, pp. 44-67. https://doi.org/10.1108/ECON-05-2022-0026

Publisher


Emerald Publishing Limited

Copyright © 2023, Xiaojie Xu and Yun Zhang

License

Published in EconomiA. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

Forecasting agricultural commodity prices has long been an important task for policy makers and agricultural market participants (Ouyang, Hu, Yang, Yao, & Lin, 2022; Wang, Wang, Li, & Zhou, 2022; Xu, 2017, Xu, 2018). This is particularly the case given that agricultural commodities are generally of natural strategic importance to a country or region (Xu and Zhang, 2022). Forecasting corn prices is no exception in light of corn's strategic importance, which includes its close relationship with the energy sector (Alola, 2022; Forhad & Alam, 2022; Liu & Wang, 2022; Wu, Weersink, & Maynard, 2022), the deep financialization of its trading (Abuselidze, Alekseieva, Kovtun, Kostiuk, & Karpenko, 2022; Penone, Giampietri, & Trestini, 2022; Wang, Zhang, Wang, & Meng, 2022; Xu and Zhang, 2022; Xu, Li, Wang, & Li, 2022) and its role as an important food source across the globe (Li et al., 2022; Lu et al., 2022; Niu et al., 2022; Yu, Yue, & Wang, 2022). Price forecasts are required by many users in agricultural markets: they offer processors useful insights for setting future sales prices, provide trading partners with the information needed to meet contractual requirements, shed light on profit opportunities in spot and futures markets and point policy makers to possible gaps in risk management and policy assessments.
As price volatilities tend to be rather irregular (Marfatia, Ji, & Luo, 2022; Xu, 2017, Xu, 2020; Yang, Du, Lu, & Tejeda, 2022, Yang, Ge, & Li, 2022), and price levels have immense impacts on business and policy decisions (Ricome & Reynaud, 2022; Wang et al., 2022; Warren-Vega, Aguilar-Hernández, Zárate-Guzmán, Campos-Rodríguez, & Romero-Cano, 2022; Xu, 2014; Xu & Thurman, 2015) and ultimately on resource allocation and social welfare (Liu, Fang, Zhang, Zhong, & Chen, 2022; Ma, Zhang, Song, & Yu, 2022; Xu, 2019, Xu, 2019), the significance of price forecasting to the agricultural economic sector needs little further motivation.

One direction that has been pursued in the applied econometrics literature is utilizing time-series models to build accurate and stable forecasts of commodity prices (Awokuse & Yang, 2003; Babula, Bessler, Reeder, & Somwaru, 2004; Bessler, 1982, Bessler, 1990; Bessler & Babula, 1987; Bessler & Brandt, 1981, Bessler & Brandt, 1992; Bessler & Chamberlain, 1988; Bessler & Hopkins, 1986; Bessler & Kling, 1986; Bessler, Yang, & Wongcharupan, 2003; Brandt & Bessler, 1981, Brandt & Bessler, 1982; Brandt & Bessler, 1983, Brandt & Bessler, 1984; Chen & Bessler, 1987, Chen & Bessler, 1990; Kling & Bessler, 1985; McIntosh & Bessler, 1988; Wang & Bessler, 2004; Xu, 2014, Xu, 2015; Xu & Thurman, 2015; Yang & Awokuse, 2003; Yang, Haigh, & Leatham, 2001; Yang & Leatham, 1998; Yang, Li, & Wang, 2021; Yang, Zhang, & Leatham, 2003). Typical models in previous studies include the ARIMA, VAR and VECM. Over the past decade, computational power has become much more affordable, and researchers' interest in building machine learning models that offer good forecasts in economics and finance has been well documented (Ge, Jiang, He, Zhu, & Zhang, 2020; Yang & Wang, 2019), including, of course, forecasts of commodity prices for the agricultural market (Abreham, 2019; Ali, Deo, Downs, & Maraseni, 2018; Antwi, Gyamfi, Kyei, Gill, & Adam, 2022; Ayankoya, Calitz, & Greyling, 2016; Bayona-Oré, Cerna, & Hinojoza, 2021; Degife & Sinamo, 2019; Deina et al., 2021; Dias & Rocha, 2019; Fang, Guan, Wu, & Heravi, 2020; Filippi et al., 2019; Gómez, Salvador, Sanz, & Casanova, 2021; Handoyo & Chen, 2020; Harris, 2017; Huy, Thac, Thu, Nhat, & Ngoc, 2019; Jiang, He, & Zeng, 2019; Khamis & Abdullah, 2014; Kohzadi, Boyd, Kermanshahi, & Kaastra, 1996; Kouadio et al., 2018; Li, Chen, Li, Wang, & Xu, 2020, Li, Li, Liu, Zhu, & Wei, 2020; Lopes, 2018; Mayabi, 2019; de Melo, Júnior, & Milioni, 2004; Melo, Milioni, & Nascimento Júnior, 2007; 
Moreno et al., 2018; Naveena et al., 2017; Rasheed, Younis, Ahmad, Qadir, & Kashif, 2021; dos Reis Filho, Correa, Freire, & Rezende, 2020; Ribeiro & Oliveira, 2011; Ribeiro, Ribeiro, Reynoso-Meza, & dos Santos Coelho, 2019; Ribeiro & dos Santos Coelho, 2020; RL & Mishra, 2021; Shahhosseini, Hu, & Archontoulis, 2020, Shahhosseini, Hu, Huber, & Archontoulis, 2021; Silalahi et al., 2013; Silva, Siqueira, Okida, Stevan, & Siqueira, 2019; Storm, Baylis, & Heckelei, 2020; Surjandari, Naffisah, & Prawiradinata, 2015; Wan & Zhou, 2021; Wen et al., 2021; Xu & Zhang, 2022, Xu & Zhang, 2022; Yoosefzadeh-Najafabadi, Earl, Tulpan, Sulik, & Eskandari, 2021; Yuan, San, & Leong, 2020; Zelingher, Makowski, & Brunelle, 2020, Zelingher, Makowski, & Brunelle, 2021; Zhang, Meng, Wei, Chen, & Qin, 2021; Zhao, 2021; Zou, Xia, Yang, & Wang, 2007), such as corn (Antwi et al., 2022; Ayankoya et al., 2016; Mayabi, 2019; Moreno et al., 2018; dos Reis Filho et al., 2020; Ribeiro et al., 2019; Shahhosseini et al., 2020, 2021; Surjandari et al., 2015; Wan & Zhou, 2021; Xu & Zhang, 2021; Zelingher et al., 2020, 2021), soybean oil (Li et al., 2020; Silalahi et al., 2013; Xu & Zhang, 2022), coffee (Abreham, 2019; Degife & Sinamo, 2019; Deina et al., 2021; Huy et al., 2019; Kouadio et al., 2018; Lopes, 2018; Naveena et al., 2017), peanut oil (Mishra & Singh, 2013; Quan-Yin, Yong-Hu, Yun-Yang, & Tian-Feng, 2014; Singh & Mishra, 2015; Yin & Zhu, 2012; Zhu, Yin, Zhu, & Zhou, 2014; Zong & Zhu, 2012, Zong & Zhu, 2012), palm oil (Kanchymalay, Salim, Sukprasert, Krishnan, & Hashim, 2017), wheat (Dias & Rocha, 2019; Fang et al., 2020; Gómez et al., 2021; Khamis & Abdullah, 2014; Kohzadi et al., 1996; Rasheed et al., 2021; Ribeiro & dos Santos Coelho, 2020; Zou et al., 2007), oats (Harris, 2017), soybeans (Handoyo & Chen, 2020; Jiang et al., 2019; Li et al., 2020; dos Reis Filho et al., 2020; Ribeiro & dos Santos Coelho, 2020; Yoosefzadeh-Najafabadi et al., 2021; Zhao, 2021), canola (Filippi et al., 2019; 
Shahwan & Odening, 2007; Wen et al., 2021), cotton (Ali et al., 2018; Fang et al., 2020) and sugar (de Melo et al., 2004; Melo et al., 2007; Ribeiro & Oliveira, 2011; Silva et al., 2019; Surjandari et al., 2015; Zhang et al., 2021). The machine learning forecasting tools often observed in the literature include deep learning (RL & Mishra, 2021), random forest (Dias & Rocha, 2019; Filippi et al., 2019; Gómez et al., 2021; Kouadio et al., 2018; Li et al., 2020; Lopes, 2018; Ribeiro & dos Santos Coelho, 2020; Shahhosseini et al., 2020, 2021; Wen et al., 2021; Yoosefzadeh-Najafabadi et al., 2021; Zelingher et al., 2020, 2021), K-nearest neighbor (Abreham, 2019; Gómez et al., 2021; Lopes, 2018), genetic programming (Ali et al., 2018), support vector regression (Abreham, 2019; Dias & Rocha, 2019; Fang et al., 2020; Gómez et al., 2021; Harris, 2017; Kanchymalay et al., 2017; Li et al., 2020, 2020; Lopes, 2018; dos Reis Filho et al., 2020; Ribeiro & dos Santos Coelho, 2020; Surjandari et al., 2015; Yoosefzadeh-Najafabadi et al., 2021; Zhang et al., 2021; Zhao, 2021), decision tree (Abreham, 2019; Degife & Sinamo, 2019; Dias & Rocha, 2019; Harris, 2017; Lopes, 2018; Surjandari et al., 2015; Zelingher et al., 2020, 2021), extreme learning (Deina et al., 2021; Jiang et al., 2019; Kouadio et al., 2018; Silva et al., 2019), neural network (Abreham, 2019; Antwi et al., 2022; Ayankoya et al., 2016; Deina et al., 2021; Fang et al., 2020; Harris, 2017; Huy et al., 2019; Khamis & Abdullah, 2014; Kohzadi et al., 1996; Li et al., 2020, 2020; Mayabi, 2019; de Melo et al., 2004; Melo et al., 2007; Mishra & Singh, 2013; Moreno et al., 2018; Naveena et al., 2017; Quan-Yin et al., 2014; Rasheed et al., 2021; Ribeiro & Oliveira, 2011; Ribeiro & dos Santos Coelho, 2020; Shahwan & Odening, 2007; Silalahi et al., 2013; Silva et al., 2019; Singh & Mishra, 2015; Wan & Zhou, 2021; Xu & Zhang, 2021, 2022; Yin & Zhu, 2012; Yoosefzadeh-Najafabadi et al., 2021; Yuan et al., 2020; Zhang et al., 2021; 
Zhu et al., 2014; Zong & Zhu, 2012, Zong & Zhu, 2012; Zou et al., 2007), boosting (Gómez et al., 2021; Lopes, 2018; Ribeiro & dos Santos Coelho, 2020; Shahhosseini et al., 2020, 2021; Zelingher et al., 2020, 2021), multivariate adaptive regression splines (Dias & Rocha, 2019) and ensemble (Fang et al., 2020; Ribeiro et al., 2019; Ribeiro & dos Santos Coelho, 2020; Shahhosseini et al., 2020, 2021). Although these reviews are not exhaustive, it appears that the neural network model is one of the most useful techniques for constructing price forecasts for agricultural commodities (Bayona-Oré, Cerna, & Tirado Hinojoza, 2021). More specifically, a wide variety of chaotic and noisy time-series variables can be forecasted well through the neural network model (Karasu, Altan, Bekiros, & Ahmad, 2020; Wang & Yang, 2010; Wegener, von Spreckelsen, Basse, & von Mettenheim, 2016; Xu, 2015, Xu, 2018, Xu, 2018, Xu, 2018; Yang, Cabrera, & Wang, 2010, Yang, Su, & Kolari, 2008), including many different types of economic and financial time series (Xu & Zhang, 2022). This could stem from the neural network model's good capability for self-learning (Karasu, Altan, Saraç, & Hacioğlu, 2017, Karasu, Altan, Saraç, & Hacioğlu, 2017) and for characterizing nonlinear features (Altan, Karasu, & Zio, 2021; Karasu et al., 2020; Xu & Zhang, 2022, Xu & Zhang, 2022) in various time series (Xu, 2018; Xu & Zhang, 2021, Xu & Zhang, 2021). Here, we adopt the neural network to forecast the price of yellow corn.

To conduct our analysis, we examine the forecast problem in a data set of weekly wholesale price indices of yellow corn in China from January 1, 2010 to January 10, 2020 via the nonlinear auto-regressive neural network technique. We assess the performance of forecasts from different model settings, covering training algorithms, hidden neurons, delays and how the data are segmented. Through this analysis, a relatively simple model is constructed that produces rather accurate and stable performance. The present work is the first to address the price forecast problem for wholesale yellow corn in the Chinese market. The forecast results could be utilized as part of technical analysis and/or combined with other fundamental forecasts as part of policy analysis.

2. Literature review

For price forecasting tasks in the agricultural sector, the literature contains a great number of studies that explore econometric methods with the goal of producing stable and accurate forecasts. For example, the ARIMA model has seen great success in this field. It is univariate and generally relies on past values of the variable to be forecasted. Previous work has found it helpful for forecasting the prices of wheat (Bessler & Babula, 1987) and cattle and hogs (Bessler, 1990; Bessler & Brandt, 1981; Brandt & Bessler, 1981, 1982, 1983, 1984; Kling & Bessler, 1985). Instead of utilizing a single source of information, the VAR, another popular econometric forecasting tool, is built upon the relations among the economic variables investigated (Awokuse & Yang, 2003; Bessler & Brandt, 1992; Bessler & Chamberlain, 1988; Bessler & Hopkins, 1986; Chen & Bessler, 1987; McIntosh & Bessler, 1988; Rezitis, 2015). Previous studies have demonstrated its good potential for forecasting the prices of cotton (Chen & Bessler, 1990), wheat (Yang, Zhang, & Leatham, 2003) and soybeans (Babula et al., 2004). Compared with the VAR, the VECM is built upon the concept of cointegration, which further incorporates long-run relationships among the variables investigated (Xu, 2019, Xu, 2019; Xu & Zhang, 2023; Yang & Awokuse, 2003; Yang & Leatham, 1998; Yang et al., 2021). The VECM is usually found to be particularly useful for long-term price forecasting (Bessler et al., 2003; Wang & Bessler, 2004).

The good potential of the econometric techniques mentioned above has also been found in various forecasting research on corn prices. For example, Zhou (2021) used the ARIMA to model monthly corn prices in China during April 2019–February 2021 and forecast the price in March 2021, obtaining good accuracy. Crespo Cuaresma, Hlouskova and Obersteiner (2021) studied auto-regressive models, VARs, VECMs and their variations and combinations for forecasting different agricultural commodity prices, including those of corn, and found that market fundamentals and macroeconomic developments contribute systematic predictive information. Albuquerquemello, Medeiros, Jesus and Oliveira (2021) assessed ARIMAs, VARs and their variations, particularly transition regime models, for monthly U.S. corn price forecasts and pointed out the importance of incorporating nonlinear patterns in the model. Wan and Zhou (2021) examined corn futures price forecasts based on the ARIMA with data from the China Dalian Commodity Exchange during 2018–2021 and concluded that deeper consideration of parameter selection might improve model performance. Antwi, Gyamfi, Kyei, Gill and Adam (2022) investigated the ARIMA for corn futures price forecasts from Bloomberg during 2016–2021 and found that data decomposition techniques could help improve model accuracy. Jaiswal, Jha, Kumar and Choudhary (2021) researched the ARIMA for forecasts of monthly corn prices from World Bank Commodity Price Data during 1980–2020 and found that it achieved decent, though not optimal, accuracy compared with some machine learning models they considered. Silva, Barreira and Cugnasca (2021) evaluated the ARIMA for corn price forecasts in Brazil and found that it consistently underperforms machine learning models.

Advances in machine learning techniques have been documented in a diverse variety of forecasting work, and the corn prices investigated here are no exception. For example, Wan and Zhou (2021) compared the long short-term memory neural network and the ARIMA for corn futures price forecasts from the China Dalian Commodity Exchange during 2018–2021 and found that the former consistently outperforms the latter. Antwi et al. (2022) investigated the back propagation neural network for corn futures price forecasts from Bloomberg during 2016–2021 and determined that data decomposition techniques contribute to improved accuracy. Jaiswal et al. (2021) developed a deep long short-term memory neural network for forecasts of monthly corn prices from World Bank Commodity Price Data during 1980–2020 and concluded that it beats both the ARIMA and a conventional time-delay neural network. Silva et al. (2021) studied the corn price forecast problem in Brazil with different machine learning models and found the following performance ranking, from best to worst: the support vector regression, the ensemble of the support vector regression and long short-term memory neural network, the ensemble of AdaBoost and the support vector regression, and the ensemble of AdaBoost and the long short-term memory neural network.

3. Data

We analyze weekly wholesale price indices of corn in the Chinese market from January 1, 2010, to January 10, 2020. Figure 1 plots the price series (top left panel), the first differences of prices (top right panel), a forty-bin histogram with the corresponding kernel estimate of prices (bottom left panel) and a forty-bin histogram with the corresponding kernel estimate of the first differences (bottom right panel). The average weekly price of June 1994 serves as the base-period price, with its value set to 100, and the index tracks the price of fifty kilograms of wholesale yellow corn. Table 1 presents the usual summary statistics of the prices, which show that, like most financial time series, the prices do not follow a normal distribution (Xu, 2017, Xu, 2019; Xu & Zhang, 2022, Xu & Zhang, 2022). Finally, the price index is missing for February 19, 2010; we apply cubic spline interpolation to obtain an approximated value of 122.839, which is rather close to the 122.85 of February 12, 2010, and the 122.53 of February 26, 2010.
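The gap-filling step can be sketched with SciPy's CubicSpline. Only the 122.85 (February 12) and 122.53 (February 26) neighbors appear in the text; the two outer points below are illustrative placeholders, so the interpolated value approximates rather than reproduces the 122.839 reported above.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Weekly observations around the missing 2010-02-19 value; week 2 is missing.
# The outer points (123.10 and 122.40) are hypothetical placeholders.
weeks = np.array([0.0, 1.0, 3.0, 4.0])
prices = np.array([123.10, 122.85, 122.53, 122.40])

spline = CubicSpline(weeks, prices)
filled = float(spline(2.0))  # estimate for the missing week
```

With these placeholder neighbors the spline returns a value between the two adjacent observations, illustrating why the interpolated figure sits so close to them.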

4. Method

The nonlinear auto-regressive neural network model is adopted here for weekly price forecasts of wholesale yellow corn. It can be represented as y_t = f(y_{t−1}, …, y_{t−d}), where y is the corn price series to be forecasted, t indexes time, d denotes the number of delays and f is the function to be estimated as y_t = α_0 + Σ_{j=1}^{k} α_j ϕ(Σ_{i=1}^{d} β_{ij} y_{t−i} + β_{0j}) + ε_t, where k denotes the number of hidden neurons whose transfer function is ϕ, β_{ij} denotes the weight of the connection between the i-th input unit and the j-th hidden unit, α_j denotes the weight of the connection between the j-th hidden unit and the output unit, β_{0j} denotes the constant corresponding to the j-th hidden unit, α_0 denotes the constant corresponding to the output unit and ε_t denotes the error. The current work concentrates on one-week-ahead forecasts.

The model has a two-layer feed-forward structure, with a sigmoid transfer function in the hidden layer and a linear transfer function in the output layer. More specifically, the logistic function ϕ(z) = 1/(1 + e^{−z}) serves as the sigmoid transfer function. The output y_t is fed back through the delays to the network's input; for efficiency, model training adopts an open loop, in which the actual output is employed instead of the estimated output. The open loop ensures that the network's inputs are more accurate and, as a result, the network is purely feed-forward during training.
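A minimal open-loop sketch of such a network, using scikit-learn's MLPRegressor as a stand-in for the authors' setup: scikit-learn offers no Levenberg-Marquardt solver, so lbfgs is substituted, and the lag-matrix helper and synthetic series are illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(y, d):
    """Open-loop design: the row for time t holds the actual lags y_{t-d}..y_{t-1}."""
    X = np.array([y[t - d:t] for t in range(d, len(y))])
    return X, y[d:]

rng = np.random.default_rng(0)
y = np.sin(np.linspace(0.0, 20.0, 300)) + 0.05 * rng.standard_normal(300)

d, k = 5, 5  # delays and hidden neurons, as in the paper's setting #67
X, target = make_lagged(y, d)

# Logistic (sigmoid) hidden layer, linear output; lbfgs is a second-order
# solver used here because scikit-learn does not provide Levenberg-Marquardt.
net = MLPRegressor(hidden_layer_sizes=(k,), activation="logistic",
                   solver="lbfgs", max_iter=2000, random_state=0)
net.fit(X, target)

one_step_ahead = net.predict(y[-d:].reshape(1, -1))  # forecast for t = len(y)
```

Because training is open-loop, each row of X contains actual past prices rather than the network's own earlier predictions, which is what makes the network purely feed-forward during estimation.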

For the model training algorithm, we explore two options: the LM (Levenberg–Marquardt) algorithm (Levenberg, 1944; Marquardt, 1963) and the SCG (scaled conjugate gradient) algorithm (Møller, 1993). These two algorithms have seen wide and successful application for forecasting purposes in different research areas (Doan & Liong, 2004; Kayri, 2016; Khan, Alam, Shahid, & Mazliham, 2019; Selvamuthu, Kumar, & Mishra, 2019; Xu & Zhang, 2021, Xu & Zhang, 2021, Xu & Zhang, 2022, Xu & Zhang, 2022, Xu & Zhang, 2022, Xu & Zhang, 2022). Their comparison has been illustrated in previous research (Al Bataineh & Kaur, 2018; Baghirli, 2015; Xu & Zhang, 2022, Xu & Zhang, 2022, Xu & Zhang, 2022). In essence, the LM algorithm robustly handles slow convergence (Hagan & Menhaj, 1994) by approximating the Hessian matrix (Paluszek & Thomas, 2020), while the SCG algorithm generally executes even faster because it involves no line searches. Figure 2 shows the architecture of the final neural network model built in this work.

LM algorithm. Taking a system with weights w_1 and w_2 as an example, the Hessian matrix H is approximated as H ≈ J^T J, where J = [∂E/∂w_1, ∂E/∂w_2] for a nonlinear function E(⋅) containing the sum of squared errors, whose exact Hessian is H = [∂²E/∂w_1², ∂²E/∂w_1∂w_2; ∂²E/∂w_2∂w_1, ∂²E/∂w_2²]. The gradient can be expressed as g = J^T e, where e denotes the error vector. Weights and biases are updated by the rule w_{k+1} = w_k − (J^T J + μI)^{−1} J^T e, where w denotes the weight vector, k indexes the training iteration, I denotes the identity matrix and μ denotes the combination coefficient, which is always positive. When μ = 0, the LM algorithm is similar to Newton's method. When μ is large, it approaches gradient descent with small step sizes. μ is decreased after successful steps, as there is then less need for gradient descent.
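The update rule can be illustrated on a toy least-squares problem; the exponential model, data and starting values below are hypothetical and stand in for the network's weight estimation.

```python
import numpy as np

def lm_step(w, x, y, model, jac, mu):
    """One Levenberg-Marquardt update: w <- w - (J^T J + mu I)^{-1} J^T e."""
    e = model(w, x) - y                  # error vector
    J = jac(w, x)                        # Jacobian of errors w.r.t. weights
    H_approx = J.T @ J                   # Gauss-Newton approximation of the Hessian
    return w - np.linalg.solve(H_approx + mu * np.eye(len(w)), J.T @ e)

# Toy problem: fit y = w0 * exp(w1 * x) to noiseless data with true w = (2, -1.5).
model = lambda w, x: w[0] * np.exp(w[1] * x)
jac = lambda w, x: np.column_stack([np.exp(w[1] * x), w[0] * x * np.exp(w[1] * x)])

x = np.linspace(0.0, 1.0, 50)
y = model(np.array([2.0, -1.5]), x)

w, mu = np.array([1.0, -1.0]), 0.001
for _ in range(50):
    w_new = lm_step(w, x, y, model, jac, mu)
    if np.sum((model(w_new, x) - y) ** 2) < np.sum((model(w, x) - y) ** 2):
        w, mu = w_new, mu * 0.1          # successful step: accept and shrink mu
    else:
        mu *= 10.0                       # failed step: grow mu toward gradient descent
```

The accept/shrink and reject/grow logic mirrors the text: small μ behaves like Newton's method with the approximated Hessian, while large μ behaves like cautious gradient descent.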

SCG algorithm. Weight adjustments in backpropagation algorithms follow the direction of steepest descent, in which the performance function decreases most rapidly. However, this does not guarantee the fastest convergence. In conjugate gradient algorithms, by contrast, searches are conducted along conjugate directions to determine step sizes that reduce the performance function across iterations, and convergence is generally faster. Further, to avoid the potentially time-consuming line searches of conjugate gradient algorithms, the SCG algorithm is adopted here as a fully automated supervised algorithm.

In arriving at our final model, different settings of delays, hidden neurons and data-splitting ratios, in addition to algorithms, are tested. Specifically, delays of 2, 3, 4, 5 and 6, hidden neurons of 2, 3, 5 and 10, and data-splitting ratios of 60%–20%–20%, 70%–15%–15% and 80%–10%–10% for training–validation–testing are evaluated. Only the training and validation parts of the data are involved in selecting model parameters; put another way, only these parts have been "seen" by a model. The testing part is not involved in parameter selection and serves only to test a model constructed from the training and validation data. For terminating model training, we consider two criteria: the gradient's magnitude and the validation check number. When training has reached a performance minimum, the gradient becomes very small; training is terminated if the gradient's magnitude falls below 10^{−5}. The validation check number refers to the number of successive iterations over which performance on the validation data no longer decreases; we set it to six, so training terminates once it reaches six validation checks. Further, the maximal number of training iterations is one thousand, and training terminates once this number is reached. Other settings for the LM algorithm are as follows: μ's initial value is set to 0.001, its decreasing factor to 0.1, its increasing factor to 10 and its maximal value to 10^{10}. Other settings for the SCG algorithm are as follows: the Marquardt adjustment parameter is set to 0.005, the weight change for approximating second derivatives to 5 × 10^{−5} and the parameter regulating the Hessian's indefiniteness to 5 × 10^{−7}. 
Table 2 contains all evaluated model settings, among which setting #67 is used to build our final model for the price index of yellow corn. It uses 5 delays and 5 hidden neurons and is trained with the LM algorithm and the training–validation–testing ratio of 60%–20%–20%.
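The full grid of evaluated settings can be enumerated as follows; the ID numbering here is illustrative and need not match Table 2's ordering.

```python
from itertools import product

# Grids from the text: 2 algorithms x 5 delays x 4 hidden-neuron counts
# x 3 splits = 120 candidate settings.
algorithms = ["LM", "SCG"]
delays = [2, 3, 4, 5, 6]
hidden_neurons = [2, 3, 5, 10]
splits = [(0.60, 0.20, 0.20), (0.70, 0.15, 0.15), (0.80, 0.10, 0.10)]

settings = [
    {"id": i + 1, "split": s, "neurons": k, "delays": d, "algorithm": a}
    for i, (s, k, d, a) in enumerate(product(splits, hidden_neurons, delays, algorithms))
]
```

Each dictionary would then parameterize one training run, with RRMSEs recorded per phase for the comparison in Figure 3.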

5. Result

We evaluate each model setting in Table 2 for weekly prices of wholesale yellow corn, adopting the relative root mean square error (RRMSE) to measure forecast performance and calculating the RRMSEs of each setting across the training, validation and testing phases. Figure 3 reports all RRMSEs. In determining the final model setting for the price series, we balance forecast accuracy against forecast stability across the three phases and select setting #67 (5 delays and 5 hidden neurons). This setting applies the LM algorithm and the 60%–20%–20% training–validation–testing segmentation, thus reserving the largest share of the data for model testing among the three segmentation ratios examined. In Figure 3, where setting #67 is indicated by a dark arrow, the diamond (training), square (validation) and triangle (testing) for the selected setting lie rather close to one another. Compared with the selected setting, there exist others that generate a lower RRMSE for one subsample but higher RRMSEs for the remaining subsamples, suggesting lower stability; for example, setting #71 generates a lower RRMSE than setting #67 for the training phase but higher RRMSEs for the validation and testing phases. By selecting the setting with relatively stable performance across the three phases, we aim to avoid potential model overfitting or underfitting.
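A sketch of the RRMSE metric; the exact normalization is not spelled out in the text, so dividing the RMSE by the mean of the actual series, a common convention, is assumed here.

```python
import numpy as np

def rrmse(actual, forecast):
    """Root mean square error relative to the mean actual price, in percent.
    Normalizing by the mean of the actuals is an assumption, not the paper's
    stated definition."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    rmse = np.sqrt(np.mean((forecast - actual) ** 2))
    return 100.0 * rmse / np.mean(actual)
```

For instance, forecasts [101, 101, 102] against actuals [100, 102, 101] give an RMSE of 1 index point against a mean of 101, i.e. an RRMSE just under 1%, the same order as the paper's reported 1.03%–1.08%.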

With the selected setting determined for prices of yellow corn, we assess the sensitivity of performance to different settings by switching one model setting at a time. Figure 4 shows the results, reporting RRMSEs for the training, validation and testing phases. The comparison between setting #67 and setting #68 evaluates sensitivity to the training algorithm, as the former is based upon the LM algorithm and the latter upon the SCG algorithm. Comparisons between setting #67 and settings #61, #63, #65 and #69 evaluate sensitivity to delays, as the former is based upon 5 delays while the latter four are based upon 2, 3, 4 and 6 delays, respectively. Comparisons between setting #67 and settings #47, #57 and #77 evaluate sensitivity to hidden neurons, as the former is based upon 5 hidden neurons while the latter three are based upon 2, 3 and 10 hidden neurons, respectively. Comparisons between setting #67 and settings #27 and #107 evaluate sensitivity to how the price series is segmented into the three phases, as the former is based upon the 60%–20%–20% ratio while the latter two are based upon 70%–15%–15% and 80%–10%–10%, respectively. These comparisons support the selection of setting #67 for the yellow corn price series. Under setting #67, the RRMSEs are 1.05%, 1.08% and 1.03% for the training, validation and testing phases, respectively, and the overall RRMSE is 1.05%. Figure 4 also shows, through the comparison between settings #67 and #68, that the LM algorithm leads to lower RRMSEs than the SCG algorithm. 
The higher accuracy achieved by the LM algorithm relative to the SCG algorithm for neural networks based upon the multilayer perceptron structure and two hidden layers is consistent with findings in previous work (Batra, 2014; Xu & Zhang, 2022). Overall performance is slightly better under settings #27 and #107 than under setting #67, because setting #67 reserves less data for the training and validation phases. The minor performance differences among these settings, however, suggest that the results are generally robust to the data segmentation ratio.

We present detailed forecast results from the selected model setting in the top panel of Figure 5 and the corresponding forecast errors in the bottom panel, across the training, validation and testing phases. Overall, the selected setting generates good forecast performance for the yellow corn price series that is also stable across the phases. In addition, as can be seen from Figure 5, the selected setting does not consistently overpredict or underpredict across the phases. To assess the adequacy of the selected setting, auto-correlations of the errors were examined for up to 20 lags (results are omitted here for brevity but are available upon request); they generally do not breach the 95% confidence limits, with slight breaches only at the 4th and 18th lags, which would have been avoided under 99% confidence limits. The auto-correlation analysis thus confirms the adequacy of the selected model.
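The error auto-correlation check can be sketched as follows, using the usual approximate ±1.96/√T confidence limits; the white-noise series in the usage example is illustrative.

```python
import numpy as np

def residual_acf_check(errors, nlags=20, level=1.96):
    """Sample ACF of forecast errors with approximate +/- level/sqrt(T)
    confidence limits; returns the ACF values and the lags that breach
    the band."""
    e = np.asarray(errors, dtype=float)
    e = e - e.mean()
    T = len(e)
    denom = np.sum(e ** 2)
    acf = np.array([np.sum(e[k:] * e[:T - k]) / denom for k in range(1, nlags + 1)])
    bound = level / np.sqrt(T)
    breaches = [k + 1 for k, r in enumerate(acf) if abs(r) > bound]
    return acf, breaches

# For adequate forecast errors (approximately white noise), only occasional
# slight breaches of the 95% band are expected, as in the text.
rng = np.random.default_rng(1)
acf, breaches = residual_acf_check(rng.standard_normal(500))
```

By construction, roughly one lag in twenty is expected to breach the 95% band even for white-noise errors, which is why the paper's two slight breaches at 20 lags do not contradict model adequacy.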

The presence of potential nonlinearities in the higher moments of financial and economic time series has been widely reported in the literature (Karasu et al., 2020; Wang & Yang, 2010; Yang et al., 2010, 2008). Here, we apply the BDS test (Brock, Scheinkman, Dechert, & LeBaron, 1996; Dergiades, Martinopoulos, & Tsoulfidis, 2013; Fujihara & Mougoué, 1997) to weekly prices of yellow corn and find that the corresponding p-values are all nearly zero across different testing scenarios. Given this, neural network models are suitable for modeling nonlinear features in the price series (Altan et al., 2021; Karasu et al., 2020). Other machine learning approaches could also be considered for modeling nonlinearities, but one advantage of neural network models is their use of combinations of different nonlinear functions, rather than one particular nonlinear function, to approximate the underlying price series (Wang & Yang, 2010; Yang et al., 2010, 2008). With the rather accurate and stable forecast results achieved here, our analysis demonstrates the potential of neural network models for forecasting wholesale yellow corn prices.

6. Robustness analysis

Determining the number of layers needed for a particular task has been an interesting topic in both theoretical and empirical research on neural networks, and the theoretical literature has not yet provided explicit guidelines in this regard (Gershenson, 2003; Jain, Mao, & Mohiuddin, 1996). From a practical standpoint, implementations generally do not require many layers, because training time grows exponentially with the number of layers used (i.e. much more computation is needed) and the tendency toward overfitting is also elevated (Gershenson, 2003). A two-layer network can already form rather complex decision boundaries (Jain et al., 1996), and a seminal study pointed out that a neural network generally needs no more than two hidden layers to solve most problems (Lapedes & Farber, 1987). For our particular case, which involves few predictors, a two-layer network seems sufficient. Thus, to assess the sensitivity of model performance to the number of layers, we consider a neural network with one more hidden layer than our selected setting #67 and compare the resulting RRMSEs. We call this alternative model "NN#67–MoreLayers"; it likewise uses 5 delays, 5 hidden neurons, the LM algorithm and the 60%–20%–20% data-splitting ratio for training, validation and testing.

When comparing the performance of different models, we also adopt a modified Diebold–Mariano (Diebold & Mariano, 2002) test (Harvey, Leybourne, & Newbold, 1997). The test of performance differences between two models is based upon the loss differential $d_t = (\mathrm{error}_t^{M_1})^2 - (\mathrm{error}_t^{M_2})^2$, where $\mathrm{error}_t^{M_1}$ and $\mathrm{error}_t^{M_2}$ denote the errors at time $t$ from models $M_1$ and $M_2$, respectively. For our case, we let $M_1$ denote NN#67–MoreLayers and $M_2$ denote our selected setting #67. The test statistic, denoted MDM, can be expressed as $\mathrm{MDM} = \left[\frac{T + 1 - 2h + T^{-1}h(h-1)}{T}\right]^{1/2} \left[T^{-1}\left(\gamma_0 + 2\sum_{k=1}^{h-1}\gamma_k\right)\right]^{-1/2} \bar{d}$, where $T$ denotes the length of the period over which performance is compared, $h$ denotes the forecast horizon (for our case, $h = 1$), $\bar{d}$ denotes the sample mean of $d_t$, $\gamma_0 = T^{-1}\sum_{t=1}^{T}(d_t - \bar{d})^2$ denotes the variance of $d_t$ and $\gamma_k = T^{-1}\sum_{t=k+1}^{T}(d_t - \bar{d})(d_{t-k} - \bar{d})$ denotes the $k$th auto-covariance of $d_t$ for $k = 1, \ldots, h - 1$ when $h \geq 2$. Under the null that the two models being compared produce equal MSEs (mean squared errors), the MDM statistic follows the $t$-distribution with $T - 1$ degrees of freedom. Figure 6 compares the setting #67 and NN#67–MoreLayers in terms of RRMSEs, and the two models lead to close performance. Specifically, NN#67–MoreLayers performs slightly better in the training phase, while the setting #67 performs slightly better in the validation and testing phases. The p-value of the MDM test for the testing phase is 0.263, indicating that the performance difference between the two models is not statistically significant. As NN#67–MoreLayers is more complicated than the setting #67 but does not deliver significantly better performance, the setting #67 appears to be the better choice for our case.
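The modified Diebold–Mariano statistic can be sketched in pure Python as follows; this is a minimal implementation assuming squared-error loss, with the resulting statistic to be compared against the t-distribution with T − 1 degrees of freedom (the function name is ours).

```python
import math

def mdm_test(err1, err2, h=1):
    """Modified Diebold-Mariano statistic (Harvey-Leybourne-Newbold) for
    testing equal MSEs of two forecast-error series at horizon h."""
    d = [e1 ** 2 - e2 ** 2 for e1, e2 in zip(err1, err2)]  # loss differential
    T = len(d)
    dbar = sum(d) / T
    gamma0 = sum((v - dbar) ** 2 for v in d) / T
    gammas = [
        sum((d[t] - dbar) * (d[t - k] - dbar) for t in range(k, T)) / T
        for k in range(1, h)
    ]
    var = (gamma0 + 2 * sum(gammas)) / T
    # Harvey-Leybourne-Newbold small-sample correction factor
    correction = math.sqrt((T + 1 - 2 * h + h * (h - 1) / T) / T)
    return correction * dbar / math.sqrt(var)
```

A negative statistic favors the first model (its squared errors are smaller on average); swapping the two error series flips the sign of the statistic.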

7. Benchmark analysis

The analysis so far has focused on the neural network. Here, we consider the following benchmark models against our selected setting #67: the random walk (RW) model, the autoregressive (AR) model, the autoregressive–generalized autoregressive conditional heteroskedasticity (AR-GARCH) model, the support vector regression (SVR) model, the regression tree (RT) model and the long short-term memory (LSTM) neural network model. As in the robustness analysis above, when comparing model performance, we consider both RRMSEs and MDM tests of differences in MSEs.
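The RRMSE used throughout these comparisons can be computed as below; this is a minimal sketch assuming the RRMSE is defined as the RMSE divided by the mean of the observed series and expressed in percent (the function name is ours).

```python
import math

def rrmse(actual, forecast):
    """Relative root mean square error: RMSE scaled by the mean of the
    observed series, in percent (assumed definition)."""
    rmse = math.sqrt(
        sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)
    )
    return 100.0 * rmse / (sum(actual) / len(actual))
```

Scaling by the mean makes errors comparable across phases and across price series with different levels, which is why figures such as 1.05% can be read directly as a share of the average price.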

Details of the six benchmark models are as follows. The RW model uses the price of the previous week as the forecast. The AR model uses the same number of lags as the setting #67, which is 5. The AR-GARCH model also uses the same number of lags as the setting #67 for the AR part and a GARCH(1,1) structure for the GARCH part. The linear ϵ-insensitive SVR model is adopted here, with the box constraint set to the interquartile range of the target variable divided by 1.349 and half the width of the ϵ-insensitive band set to the interquartile range of the target variable divided by 13.49; it uses the lagged one to lagged five price series as predictors. The RT model is based upon the classification and regression tree (CART) algorithm (Breiman, 2017), with the minimum number of branch node observations set to 10 and the minimum number of leaf node observations set to 4; it also uses the lagged one to lagged five price series as predictors. The LSTM model uses a two-layer structure in the open loop form, with the number of time steps set to 5 and the number of LSTM units set to 10, and employs the Adam optimizer for training.
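The two simplest benchmarks can be sketched as follows. The AR example uses a single lag fitted by OLS to keep the sketch short, whereas the study's AR benchmark uses 5 lags; the helper names are ours.

```python
def random_walk_forecast(prices):
    """Random-walk benchmark: each week's forecast is the previous
    week's price. Returns (actuals, forecasts) aligned in time."""
    return prices[1:], prices[:-1]

def ar1_forecast(prices):
    """AR(1) benchmark fitted by ordinary least squares (shown with one
    lag for brevity; the paper's AR benchmark uses five lags)."""
    y, x = prices[1:], prices[:-1]
    xbar = sum(x) / len(x)
    ybar = sum(y) / len(y)
    phi = (sum((a - xbar) * (b - ybar) for a, b in zip(x, y))
           / sum((a - xbar) ** 2 for a in x))
    c = ybar - phi * xbar  # intercept
    return y, [c + phi * a for a in x]
```

In-sample, an AR(1) fitted to a perfectly linear trend reproduces it exactly, while the random walk lags one step behind; on real price data the comparison is settled by the RRMSE and MDM results reported below.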

Figure 7 presents performance comparisons of the setting #67 and the six benchmark models based upon the RRMSE, and Table 3 presents performance comparisons based upon the MDM test for the testing phase. From these results, we observe that the RW model performs relatively close to the setting #67 based upon the RRMSE, but the MDM test suggests that the setting #67 performs significantly better than the RW model for the testing phase at the 5% level. While the AR-GARCH model improves upon the AR model, neither leads to performance as accurate as the setting #67, and the corresponding MDM tests both produce p-values well below 0.001. Although the SVR and RT models do not beat the setting #67, their performance in terms of the RRMSE is not far from that of the setting #67; the MDM tests suggest that the setting #67 performs significantly better than both at the 5% level. The LSTM model improves slightly upon the setting #67, but the magnitude is rather limited in terms of the RRMSE, and the MDM test likewise suggests that the difference is not statistically significant.

8. Conclusion

For diverse varieties of agricultural market participants, constructing price forecasts of different types of agricultural commodities has always been an important task. In the present work, we carry out a forecast exercise focusing on weekly prices of wholesale yellow corn in the Chinese market from January 1, 2010 to January 10, 2020. For this purpose, we adopt the nonlinear auto-regressive neural network model and consider different model settings spanning training algorithms, hidden neurons, delays and how the data are segmented. Through the analysis, a relatively simple model is constructed that produces rather accurate and stable performance. More specifically, the Levenberg–Marquardt algorithm (Levenberg, 1944; Marquardt, 1963) is applied for constructing the model, with the price series segmented into training, validation and testing phases following the ratio of 60%–20%–20%. The model is based upon 5 delays and 5 hidden neurons. It leads to relative root mean square errors of 1.05%, 1.08% and 1.03%, respectively, for the training, validation and testing phases, and a relative root mean square error of 1.05% for the overall data. The forecast results could be utilized as part of technical analysis and/or combined with other fundamental forecasts as part of policy analysis. The forecast framework utilized here is rather straightforward, which represents an essential consideration for policy makers and a significant number of market participants (Brandt & Bessler, 1983). Such a framework could be applied to relevant forecast problems across many other commodity price series from different economic segments.
For future work, one potentially interesting avenue would be forecasting commodity prices by combining graph theory and time series models (Bessler & Wang, 2012; Kano et al., 2003; Shimizu, Hoyer, Hyvärinen, Kerminen, & Jordan, 2006, 2011; Shimizu & Kano, 2008; Xu, 2014; Xu & Zhang, 2022). Another worthwhile path would be examining the economic significance of price forecasts based upon different machine learning models (Wang & Yang, 2010; Yang et al., 2010, 2008). For example, one study (Colino & Irwin, 2010) found that a root mean square error reduction of 1% would translate to $11,500 for a risk-averse hog producer with production of 10,000 head per year who utilizes price forecast information as part of decision-making.

Figures

Figure 1: Top panel: the weekly price index of yellow corn (left) and first differences of prices (right); bottom panel: histograms of forty bins and kernel estimates for the weekly price index and its first differences

Figure 2: The block diagram of the neural network model of the two-layer feedforward structure with a logistic sigmoid transfer function for the hidden layer and a linear transfer function for the output layer based on 5 delays and 5 hidden neurons

Figure 3: RRMSEs across all model settings for the weekly price index of yellow corn

Figure 4: Sensitivities of model performance (the RRMSE) to different model settings for the weekly price index of yellow corn

Figure 5: Top panel: forecasts of the weekly price index of yellow corn; bottom panel: forecast errors calculated as observations minus forecasts

Figure 6: Performance comparisons of the setting #67 and NN#67–MoreLayers

Figure 7: Performance comparisons of the setting #67 and six benchmark models

Table 1: Summary statistics of the weekly price index of yellow corn and its first differences

Commodity   | Series           | Minimum  | Mean     | Median   | Std     | Maximum  | Skewness | Kurtosis | Jarque–Bera (p-value)
Yellow corn | Price            | 108.1800 | 139.2282 | 134.0200 | 16.6369 | 177.4100 | 0.0683   | 1.7032   | <0.001
            | First difference | −8.6300  | −0.0044  | 0.0400   | 1.5583  | 6.2000   | −0.8492  | 6.4195   | <0.001

Source(s): Elaborated by the authors

Table 2: Explored model settings for the weekly price index of yellow corn

Model setting                         | Value             | Setting indices
Algorithm                             | LM                | 1 + 2i (i = 0, 1, …, 59)
                                      | SCG               | 2 + 2i (i = 0, 1, …, 59)
Delay                                 | 2                 | 1 + 10j to 2 + 10j (j = 0, 1, …, 11)
                                      | 3                 | 3 + 10j to 4 + 10j (j = 0, 1, …, 11)
                                      | 4                 | 5 + 10j to 6 + 10j (j = 0, 1, …, 11)
                                      | 5                 | 7 + 10j to 8 + 10j (j = 0, 1, …, 11)
                                      | 6                 | 9 + 10j to 10 + 10j (j = 0, 1, …, 11)
Hidden neuron                         | 2                 | 1 + 40k to 10 + 40k (k = 0, 1, 2)
                                      | 3                 | 11 + 40k to 20 + 40k (k = 0, 1, 2)
                                      | 5                 | 21 + 40k to 30 + 40k (k = 0, 1, 2)
                                      | 10                | 31 + 40k to 40 + 40k (k = 0, 1, 2)
Training vs validation vs testing     | 70% vs 15% vs 15% | 1–40
                                      | 60% vs 20% vs 20% | 41–80
                                      | 80% vs 10% vs 10% | 81–120

Source(s): Elaborated by the authors

Table 3: MDM test results of benchmark analysis

Comparison      | p-value of the MDM test
#67 vs RW       | 0.042
#67 vs AR       | <0.001
#67 vs AR-GARCH | <0.001
#67 vs SVR      | 0.029
#67 vs RT       | 0.018
#67 vs LSTM     | 0.135

Source(s): Elaborated by the authors

Competing interests: The authors did not receive support from any organization for the submitted work. The authors have no relevant financial or nonfinancial interests to disclose.

References

Abreham, Y. (2019). Coffee price prediction using machine-learning techniques. Ph.D. thesis. ASTU.

Abuselidze, G., Alekseieva, K., Kovtun, O., Kostiuk, O., & Karpenko, L. (2022). Application of hedge technologies to minimize price risks by agricultural producers. In XIV International Scientific Conference “INTERAGROMASH 2021” (pp. 906–915), Springer. doi: 10.1007/978-3-030-81619-3_101.

Al Bataineh, A., & Kaur, D. (2018). A comparative study of different curve fitting algorithms in artificial neural network using housing dataset. In NAECON 2018-IEEE National Aerospace and Electronics Conference (pp. 174–178), IEEE. doi: 10.1109/NAECON.2018.8556738.

Albuquerquemello, V. P. D., Medeiros, R. K. D., Jesus, D. P. D., & Oliveira, F. A. D. (2021). The role of transition regime models for corn prices forecasting. Revista de Economia e Sociologia Rural, 60. doi: 10.1590/1806-9479.2021.236922.

Ali, M., Deo, R. C., Downs, N. J., & Maraseni, T. (2018). Cotton yield prediction with Markov chain Monte Carlo-based simulation model integrated with genetic programing algorithm: A new hybrid copula-driven approach. Agricultural and Forest Meteorology, 263, 428–448. doi: 10.1016/j.agrformet.2018.09.002.

Alola, A. A. (2022). The nexus of renewable energy equity and agricultural commodities in the United States: Evidence of regime-switching and price bubbles. Energy, 239, 122377. doi: 10.1016/j.energy.2021.122377.

Altan, A., Karasu, S., & Zio, E. (2021). A new hybrid model for wind speed forecasting combining long short-term memory neural network, decomposition methods and grey wolf optimizer. Applied Soft Computing, 100, 106996. doi: 10.1016/j.asoc.2020.106996.

Antwi, E., Gyamfi, E. N., Kyei, K. A., Gill, R., & Adam, A. M. (2022). Modeling and forecasting commodity futures prices: Decomposition approach. IEEE Access, 10, 27484–27503. doi: 10.1109/ACCESS.2022.3152694.

Awokuse, T. O., & Yang, J. (2003). The informational role of commodity prices in formulating monetary policy: A reexamination. Economics Letters, 79, 219–224. doi: 10.1016/S0165-1765(02)00331-2.

Ayankoya, K., Calitz, A. P., & Greyling, J. H. (2016). Using neural networks for predicting futures contract prices of white maize in South Africa. In Proceedings of the Annual Conference of the South African Institute of Computer Scientists and Information Technologists (pp. 1–10). doi: 10.1145/2987491.2987508.

Babula, R. A., Bessler, D. A., Reeder, J., & Somwaru, A. (2004). Modeling US soy-based markets with directed acyclic graphs and Bernanke structural VAR methods: The impacts of high soy meal and soybean prices. Journal of Food Distribution Research, 35, 29–52. doi: 10.22004/ag.econ.27559.

Baghirli, O. (2015). Comparison of Levenberg-Marquardt, scaled conjugate gradient and Bayesian regularization backpropagation algorithms for multistep ahead wind speed forecasting using multilayer perceptron feedforward neural network. Available from: https://www.diva-portal.org/smash/get/diva2:828170/FULLTEXT01.pdf

Batra, D. (2014). Comparison between Levenberg-Marquardt and scaled conjugate gradient training algorithms for image compression using MLP. International Journal of Image Processing (IJIP), 8, 412–422.

Bayona-Oré, S., Cerna, R., & Hinojoza, E. T. (2021). Machine learning for price prediction for agricultural products. WSEAS Transactions on Business and Economics, 18, 969–977. doi: 10.37394/23207.2021.18.92.


Bessler, D. A. (1982). Adaptive expectations, the exponentially weighted forecast, and optimal statistical predictors: A revisit. Agricultural Economics Research, 34, 16–23. doi: 10.22004/ag.econ.148819.

Bessler, D. A. (1990). Forecasting multiple time series with little prior information. American Journal of Agricultural Economics, 72, 788–792. doi: 10.2307/1243059.

Bessler, D. A., & Babula, R. A. (1987). Forecasting wheat exports: Do exchange rates matter?. Journal of Business & Economic Statistics, 5, 397–406. doi: 10.2307/1391615.

Bessler, D. A., & Brandt, J. A. (1981). Forecasting livestock prices with individual and composite methods. Applied Economics, 13, 513–522. doi: 10.1080/00036848100000016.

Bessler, D. A., & Brandt, J. A. (1992). An analysis of forecasts of livestock prices. Journal of Economic Behavior & Organization, 18, 249–263. doi: 10.1016/0167-2681(92)90030-F.

Bessler, D. A., & Chamberlain, P. J. (1988). Composite forecasting with Dirichlet priors. Decision Sciences, 19, 771–781. doi: 10.1111/j.1540-5915.1988.tb00302.x.

Bessler, D. A., & Hopkins, J. C. (1986). Forecasting an agricultural system with random walk priors. Agricultural Systems, 21, 59–67. doi: 10.1016/0308-521X(86)90029-6.

Bessler, D. A., & Kling, J. L. (1986). Forecasting vector autoregressions with Bayesian priors. American Journal of Agricultural Economics, 68, 144–151. doi: 10.2307/1241659.

Bessler, D. A., & Wang, Z. (2012). D-Separation, forecasting, and economic science: A conjecture. Theory and Decision, 73, 295–314. doi: 10.1007/s11238-012-9305-8.

Bessler, D. A., Yang, J., & Wongcharupan, M. (2003). Price dynamics in the international wheat market: Modeling with error correction and directed acyclic graphs. Journal of Regional Science, 43, 1–33. doi: 10.1111/1467-9787.00287.

Brandt, J. A., & Bessler, D. A. (1981). Composite forecasting: An application with US hog prices. American Journal of Agricultural Economics, 63, 135–140. doi: 10.2307/1239819.

Brandt, J. A., & Bessler, D. A. (1982). Forecasting with a dynamic regression model: A heuristic approach. North Central Journal of Agricultural Economics, 4, 27–33. doi: 10.2307/1349096.

Brandt, J. A., & Bessler, D. A. (1983). Price forecasting and evaluation: An application in agriculture. Journal of Forecasting, 2, 237–248. doi: 10.1002/for.3980020306.

Brandt, J. A., & Bessler, D. A. (1984). Forecasting with vector autoregressions vs a univariate ARIMA process: An empirical example with US hog prices. North Central Journal of Agricultural Economics, 4, 29–36. doi: 10.2307/1349248.

Breiman, L. (2017). Classification and regression trees. New York: Routledge.

Brock, W. A., Scheinkman, J. A., Dechert, W. D., & LeBaron, B. (1996). A test for independence based on the correlation dimension. Econometric Reviews, 15, 197–235. doi: 10.1080/07474939608800353.

Chen, D. T., & Bessler, D. A. (1987). Forecasting the US cotton industry: Structural and time series approaches. In Proceedings of the NCR-134 Conference on Applied Commodity Price Analysis, Forecasting, and Market Risk Management, Chicago Mercantile Exchange, Chicago. doi: 10.22004/ag.econ.285463.

Chen, D. T., & Bessler, D. A. (1990). Forecasting monthly cotton price: Structural and time series approaches. International Journal of Forecasting, 6, 103–113. doi: 10.1016/0169-2070(90)90101-G.

Colino, E. V., & Irwin, S. H. (2010). Outlook vs futures: Three decades of evidence in hog and cattle markets. American Journal of Agricultural Economics, 92, 1–15. doi: 10.1093/ajae/aap013.

Crespo Cuaresma, J., Hlouskova, J., & Obersteiner, M. (2021). Agricultural commodity price dynamics and their determinants: A comprehensive econometric approach. Journal of Forecasting, 40, 1245–1273. doi: 10.1002/for.2768.

de Melo, B., Júnior, C. N., & Milioni, A. Z. (2004). Daily sugar price forecasting using the mixture of local expert models. WIT Transactions on Information and Communication Technologies, 33. doi: 10.2495/DATA040221.

Degife, W. A., & Sinamo, A. (2019). Efficient predictive model for determining critical factors affecting commodity price: The case of coffee in Ethiopian commodity exchange (ECX). International Journal of Information Engineering and Electronic Business, 11, 32–36. doi: 10.5815/ijieeb.2019.06.05.

Deina, C., do Amaral Prates, M. H., Alves, C. H. R., Martins, M. S. R., Trojan, F., Stevan, S. L., Jr., & Siqueira, H. V. (2021). A methodology for coffee price forecasting based on extreme learning machines. Information Processing in Agriculture, 9(4). doi: 10.1016/j.inpa.2021.07.003.

Dergiades, T., Martinopoulos, G., & Tsoulfidis, L. (2013). Energy consumption and economic growth: Parametric and non-parametric causality testing for the case of Greece. Energy Economics, 36, 686–697. doi: 10.1016/j.eneco.2012.11.017.

Dias, J., & Rocha, H. (2019). Forecasting wheat prices based on past behavior: Comparison of different modelling approaches. In International Conference on Computational Science and Its Applications (pp. 167–182). Springer. doi: 10.1007/978-3-030-24302-9_13.

Diebold, F. X., & Mariano, R. S. (2002). Comparing predictive accuracy. Journal of Business & Economic Statistics, 20, 134–144. doi: 10.2307/1392185.

Doan, C. D., & Liong, S. Y. (2004). Generalization for multilayer neural network Bayesian regularization or early stopping. In Proceedings of Asia Pacific Association of Hydrology and Water Resources 2nd Conference (pp. 5–8).

dos Reis Filho, I. J., Correa, G. B., Freire, G. M., & Rezende, S. O. (2020). Forecasting future corn and soybean prices: An analysis of the use of textual information to enrich time-series. In Anais do VIII Symposium on Knowledge Discovery, Mining and Learning, SBC (pp. 113–120).

Fang, Y., Guan, B., Wu, S., & Heravi, S. (2020). Optimal forecast combination based on ensemble empirical mode decomposition for agricultural commodity futures prices. Journal of Forecasting, 39, 877–886. doi: 10.1002/for.2665.

Filippi, P., Jones, E. J., Wimalathunge, N. S., Somarathna, P. D., Pozza, L. E., Ugbaje, S. U., … Bishop, T. F. (2019). An approach to forecast grain crop yield using multi-layered, multi-farm data sets and machine learning. Precision Agriculture, 20, 1015–1029. doi: 10.1007/s11119-018-09628-4.

Forhad, M. A. R., & Alam, M. R. (2022). Impact of oil demand and supply shocks on food-grain prices: A Markov-switching approach. Applied Economics, 1–13. doi: 10.1080/00036846.2021.2009113.

Fujihara, R. A., & Mougoué, M. (1997). An examination of linear and nonlinear causal relationships between price variability and volume in petroleum futures markets. Journal of Futures Markets: Futures, Options, and Other Derivative Products, 17, 385–416. doi: 10.1002/(SICI)1096-9934(199706)17:4<385::AID-FUT2>3.0.CO;2-D.

Ge, Q., Jiang, H., He, M., Zhu, Y., & Zhang, J. (2020). Power load forecast based on fuzzy BP neural networks with dynamical estimation of weights. International Journal of Fuzzy Systems, 22, 956–969. doi: 10.1007/s40815-019-00796-7.

Gershenson, C. (2003). Artificial neural networks for beginners. doi: 10.48550/arXiv.cs/0308031.

Gómez, D., Salvador, P., Sanz, J., & Casanova, J. L. (2021). Modelling wheat yield with antecedent information, satellite and climate data using machine learning methods in Mexico. Agricultural and Forest Meteorology, 300, 108317. doi: 10.1016/j.agrformet.2020.108317.

Hagan, M. T., & Menhaj, M. B. (1994). Training feedforward networks with the Marquardt algorithm. IEEE Transactions on Neural Networks, 5, 989–993. doi: 10.1109/72.329697.

Handoyo, S., & Chen, Y. P. (2020). The developing of fuzzy system for multiple time series forecasting with generated rule bases and optimized consequence part. SSRG International Journal of Engineering Trends and Technology, 68, 118–122. doi: 10.14445/22315381/IJETT-V68I12P220.

Harris, J. J. (2017). A machine learning approach to forecasting consumer food prices.

Harvey, D., Leybourne, S., & Newbold, P. (1997). Testing the equality of prediction mean squared errors. International Journal of Forecasting, 13, 281–291. doi: 10.1016/S0169-2070(96)00719-4.

Huy, H. T., Thac, H. N., Thu, H. N. T., Nhat, A. N., & Ngoc, V. H. (2019). Econometric combined with neural network for coffee price forecasting. Journal of Applied Economic Sciences, 14(64), 378–392.

Jain, A. K., Mao, J., & Mohiuddin, K. M. (1996). Artificial neural networks: A tutorial. Computer, 29, 31–44. doi: 10.1109/2.485891.

Jaiswal, R., Jha, G. K., Kumar, R. R., & Choudhary, K. (2021). Deep long short-term memory based model for agricultural price forecasting. Neural Computing and Applications, 1–16. doi: 10.1007/s00521-021-06621-3.

Jiang, F., He, J., & Zeng, Z. (2019). Pigeon-inspired optimization and extreme learning machine via wavelet packet analysis for predicting bulk commodity futures prices. Science China Information Sciences, 62, 1–19. doi: 10.1007/s11432-018-9714-5.

Kanchymalay, K., Salim, N., Sukprasert, A., Krishnan, R., & Hashim, U. R. (2017). Multivariate time series forecasting of crude palm oil price using machine learning techniques. In IOP Conference Series: Materials Science and Engineering (pp. 012117), IOP Publishing. doi: 10.1088/1757-899X/226/1/012117.

Kano, Y., Shimizu, S., et al. (2003). Causal inference using nonnormality. In Proceedings of the International Symposium on Science of Modeling, the 30th Anniversary of the Information Criterion (pp. 261–270). Available from: http://www.ar.sanken.osaka-u.ac.jp/sshimizu/papers/aic30_web2.pdf

Karasu, S., Altan, A., Saraç, Z., & Hacioğlu, R. (2017). Estimation of fast varied wind speed based on NARX neural network by using curve fitting. International Journal of Energy Applications and Technologies, 4, 137–146. Available from: https://dergipark.org.tr/en/download/article-file/354536

Karasu, S., Altan, A., Saraç, Z., & Hacioğlu, R. (2017). Prediction of wind speed with non-linear autoregressive (NAR) neural networks. In 2017 25th Signal Processing and Communications Applications Conference (SIU) (pp. 1–4), IEEE. doi: 10.1109/SIU.2017.7960507.

Karasu, S., Altan, A., Bekiros, S., & Ahmad, W. (2020). A new forecasting model with wrapper-based feature selection approach using multi-objective optimization technique for chaotic crude oil time series. Energy, 212, 118750. doi: 10.1016/j.energy.2020.118750.

Kayri, M. (2016). Predictive abilities of Bayesian regularization and Levenberg–Marquardt algorithms in artificial neural networks: A comparative empirical study on social data. Mathematical and Computational Applications, 21, 20. doi: 10.3390/mca21020020.

Khamis, A., & Abdullah, S. (2014). Forecasting wheat price using backpropagation and NARX neural network. The International Journal of Engineering and Science, 3, 19–26.

Khan, T. A., Alam, M., Shahid, Z., & Mazliham, M. (2019). Comparative performance analysis of Levenberg-Marquardt, Bayesian regularization and scaled conjugate gradient for the prediction of flash floods. Journal of Information Communication Technologies and Robotic Applications, 10, 52–58. Available from: http://jictra.com.pk/index.php/jictra/article/view/188/112

Kling, J. L., & Bessler, D. A. (1985). A comparison of multivariate forecasting procedures for economic time series. International Journal of Forecasting, 1, 5–24. doi: 10.1016/S0169-2070(85)80067-4.

Kohzadi, N., Boyd, M. S., Kermanshahi, B., & Kaastra, I. (1996). A comparison of artificial neural network and time series models for forecasting commodity prices. Neurocomputing, 10, 169–181. doi: 10.1016/0925-2312(95)00020-8.

Kouadio, L., Deo, R. C., Byrareddy, V., Adamowski, J. F., Mushtaq, S., et al. (2018). Artificial intelligence approach for the prediction of robusta coffee yield using soil fertility properties. Computers and Electronics in Agriculture, 155, 324–338. doi: 10.1016/j.compag.2018.10.014.

Lapedes, A., & Farber, R. (1987). How neural nets work. In Neural Information Processing Systems.

Levenberg, K. (1944). A method for the solution of certain non-linear problems in least squares. Quarterly of Applied Mathematics, 2, 164–168. doi: 10.1090/qam/10666.

Li, G., Chen, W., Li, D., Wang, D., & Xu, S. (2020). Comparative study of short-term forecasting methods for soybean oil futures based on LSTM, SVR, ES and wavelet transformation. In Journal of Physics: Conference Series (pp. 012007), IOP Publishing. doi: 10.1088/1742-6596/1682/1/012007.

Li, J., Li, G., Liu, M., Zhu, X., & Wei, L. (2020). A novel text-based framework for forecasting agricultural futures using massive online news headlines. International Journal of Forecasting. doi: 10.1016/j.ijforecast.2020.02.002.

Li, C., Bremer, P., Harder, M. K., Lee, M. S., Parker, K., Gaugler, E. C., & Mirosa, M. (2022). A systematic review of food loss and waste in China: Quantity, impacts and mediators. Journal of Environmental Management, 303, 114092. doi: 10.1016/j.jenvman.2021.114092.

Liu, X., & Wang, Y. (2022). Influence of oil price on corn price based on multiple linear regression model. In Innovative Computing (pp. 909–916). Springer. doi: 10.1007/978-981-16-4258-6_111.

Liu, B., Fang, H., Zhang, F., Zhong, Z., & Chen, Y. (2022). Spatiotemporal affordability evaluation of water services in China: A functional cost-price model. Advanced Sustainable Systems, 6, 2100284. doi: 10.1002/adsu.202100284.

Lopes, L. P. (2018). Prediction of the Brazilian natural coffee price through statistical machine learning models. SIGMAE, 7, 1–16.

Lu, S., Cheng, G., Li, T., Xue, L., Liu, X., Huang, J., & Liu, G. (2022). Quantifying supply chain food loss in China with primary data: A large-scale, field-survey based analysis for staple food, vegetables, and fruits. Resources, Conservation and Recycling, 177, 106006. doi: 10.1016/j.resconrec.2021.106006.

Ma, Y., Zhang, L., Song, S., & Yu, S. (2022). Impacts of energy price on agricultural production, energy consumption, and carbon emission in China: A price endogenous partial equilibrium model analysis. Sustainability, 14, 3002. doi: 10.3390/su14053002.

Marfatia, H. A., Ji, Q., & Luo, J. (2022). Forecasting the volatility of agricultural commodity futures: The role of co-volatility and oil volatility. Journal of Forecasting, 41, 383–404. doi: 10.1002/for.2811.

Marquardt, D. W. (1963). An algorithm for least-squares estimation of nonlinear parameters. Journal of the Society for Industrial and Applied Mathematics, 11, 431–441. doi: 10.1137/0111030.

Mayabi, T. W. (2019). An artificial neural network model for predicting retail maize prices in Kenya. Ph.D. thesis. University of Nairobi.

McIntosh, C. S., & Bessler, D. A. (1988). Forecasting agricultural prices using a Bayesian composite approach. Journal of Agricultural and Applied Economics, 20, 73–80. doi: 10.1017/S0081305200017611.

Melo, B. D., Milioni, A. Z., & Nascimento Júnior, C. L. (2007). Daily and monthly sugar price forecasting using the mixture of local expert models. Pesquisa Operacional, 27, 235–246. doi: 10.1590/S0101-74382007000200003.

Mishra, G., & Singh, A. (2013). A study on forecasting prices of groundnut oil in Delhi by ARIMA methodology and artificial neural networks. Agris On-Line Papers in Economics and Informatics, 5(1), 83–96. doi: 10.22004/ag.econ.157527.

Møller, M. F. (1993). A scaled conjugate gradient algorithm for fast supervised learning. Neural Networks, 6, 525–533. doi: 10.1016/S0893-6080(05)80056-5.

Moreno, R. S., Salazar, O. Z., et al. (2018). An artificial neural network model to analyze maize price behavior in Mexico. Applied Mathematics, 9, 473. doi: 10.4236/am.2018.95034.

Naveena, K., Subedar, S., et al. (2017). Hybrid time series modelling for forecasting the price of washed coffee (arabica plantation coffee) in India. International Journal of Agriculture Sciences, 09753710.

Niu, Y., Xie, G., Xiao, Y., Liu, J., Zou, H., Qin, K., … Huang, M. (2022). The story of grain self-sufficiency: China’s food security and food for thought. Food and Energy Security, 11, e344. doi: 10.1002/fes3.344.

Ouyang, S., Hu, J., Yang, M., Yao, M., & Lin, J. (2022). Temporal and regional differences and empirical analysis on sensitive factors of the corn production cost in China. Applied Sciences, 12, 1202. doi: 10.3390/app12031202.

Paluszek, M., & Thomas, S. (2020). Practical MATLAB deep learning: A project-based approach, Apress. Available from: https://link.springer.com/content/pdf/10.1007/978-1-4842-5124-9.pdf

Penone, C., Giampietri, E., & Trestini, S. (2022). Futures–spot price transmission in eu corn markets. Agribusiness. doi: 10.1002/agr.21735.

Quan-Yin, Z., Yong-Hu, Y., Yun-Yang, Y., & Tian-Feng, G. (2014). A novel efficient adaptive sliding window model for week-ahead price forecasting. TELKOMNIKA Indonesian Journal of Electrical Engineering, 12, 22192226. doi: 10.11591/telkomnika.v12i3.4490.

Rasheed, A., Younis, M. S., Ahmad, F., Qadir, J., Kashif, M., (2021). District wise price forecasting of wheat in Pakistan using deep learning. arXiv preprint arXiv:2103.04781.

Rezitis, A. N. (2015). The relationship between agricultural commodity prices, crude oil prices and us dollar exchange rates: A panel var approach and causality analysis. International Review of Applied Economics, 29, 403434. doi: 10.1080/02692171.2014.1001325.

Ribeiro, M. H. D. M., & dos Santos Coelho, L. (2020). Ensemble approach based on bagging, boosting and stacking for short-term prediction in agribusiness time series. Applied Soft Computing, 86, 105837. doi: 10.1016/j.asoc.2019.105837.

Ribeiro, C. O., & Oliveira, S. M. (2011). A hybrid commodity price-forecasting model applied to the sugar–alcohol sector. Australian Journal of Agricultural and Resource Economics, 55, 180–198. doi: 10.1111/j.1467-8489.2011.00534.x.

Ribeiro, M. H. D. M., Ribeiro, V. H. A., Reynoso-Meza, G., & dos Santos Coelho, L. (2019). Multi-objective ensemble model for short-term price forecasting in corn price time series. 2019 International Joint Conference on Neural Networks (IJCNN), IEEE (pp. 1–8). doi: 10.1109/IJCNN.2019.8851880.

Ricome, A., & Reynaud, A. (2022). Marketing contract choices in agriculture: The role of price expectation and price risk management. Agricultural Economics, 53, 170–186. doi: 10.1111/agec.12675.

RL, M., & Mishra, A. K. (2021). Forecasting spot prices of agricultural commodities in India: Application of deep-learning models. Intelligent Systems in Accounting, Finance and Management, 28, 72–83. doi: 10.1002/isaf.1487.

Selvamuthu, D., Kumar, V., & Mishra, A. (2019). Indian stock market prediction using artificial neural networks on tick data. Financial Innovation, 5, 16. doi: 10.1186/s40854-019-0131-7.

Shahhosseini, M., Hu, G., & Archontoulis, S. (2020). Forecasting corn yield with machine learning ensembles. Frontiers in Plant Science, 11, 1120. doi: 10.3389/fpls.2020.01120.

Shahhosseini, M., Hu, G., Huber, I., & Archontoulis, S. V. (2021). Coupling machine learning and crop modeling improves crop yield prediction in the US Corn Belt. Scientific Reports, 11, 1–15. doi: 10.1038/s41598-020-80820-1.

Shahwan, T., & Odening, M. (2007). Forecasting agricultural commodity prices using hybrid neural networks. Computational intelligence in economics and finance (pp. 63–74). Springer. doi: 10.1007/978-3-540-72821-4_3.

Shimizu, S., & Kano, Y. (2008). Use of non-normality in structural equation modeling: Application to direction of causation. Journal of Statistical Planning and Inference, 138, 3483–3491. doi: 10.1016/j.jspi.2006.01.017.

Shimizu, S., Hoyer, P. O., Hyvärinen, A., Kerminen, A., & Jordan, M. (2006). A linear non-Gaussian acyclic model for causal discovery. Journal of Machine Learning Research, 7, 2003–2030. Available from: https://www.jmlr.org/papers/volume7/shimizu06a/shimizu06a.pdf

Shimizu, S., Inazumi, T., Sogawa, Y., Hyvärinen, A., Kawahara, Y., Washio, T., … Bollen, K. (2011). DirectLiNGAM: A direct method for learning a linear non-Gaussian structural equation model. The Journal of Machine Learning Research, 12, 1225–1248. Available from: https://www.jmlr.org/papers/volume12/shimizu11a/shimizu11a.pdf.

Silalahi, D. D. (2013). Application of neural network model with genetic algorithm to predict the international price of crude palm oil (CPO) and soybean oil (SBO). In 12th National Convention on Statistics (NCS) (pp. 1–2), Mandaluyong City, Philippines, October.

Silva, N., Siqueira, I., Okida, S., Stevan, S. L., & Siqueira, H. (2019). Neural networks for predicting prices of sugarcane derivatives. Sugar Tech, 21, 514–523. doi: 10.1007/s12355-018-0648-5.

Silva, R. F., Barreira, B. L., & Cugnasca, C. E. (2021). Prediction of corn and sugar prices using machine learning, econometrics, and ensemble models. Engineering Proceedings (Vol. 9, p. 31). doi: 10.3390/engproc2021009031.

Singh, A., & Mishra, G. (2015). Application of Box-Jenkins method and artificial neural network procedure for time series forecasting of prices. Statistics in Transition New Series, 16(1), 83–96.

Storm, H., Baylis, K., & Heckelei, T. (2020). Machine learning in agricultural and applied economics. European Review of Agricultural Economics, 47, 849–892. doi: 10.1093/erae/jbz033.

Surjandari, I., Naffisah, M. S., & Prawiradinata, M. I. (2015). Text mining of Twitter data for public sentiment analysis of staple foods price changes. Journal of Industrial and Intelligent Information, 3. doi: 10.12720/jiii.3.3.253-257.

Wan, H., & Zhou, Y. (2021). Neural network model comparison and analysis of prediction methods using ARIMA and LSTM models. In 2021 IEEE International Conference on Advances in Electrical Engineering and Computer Applications (AEECA), IEEE (pp. 640–643). doi: 10.1109/AEECA52519.2021.9574427.

Wang, Z., & Bessler, D. A. (2004). Forecasting performance of multivariate time series models with full and reduced rank: An empirical examination. International Journal of Forecasting, 20, 683–695. doi: 10.1016/j.ijforecast.2004.01.002.

Wang, T., & Yang, J. (2010). Nonlinearity and intraday efficiency tests on energy futures markets. Energy Economics, 32, 496–503. doi: 10.1016/j.eneco.2009.08.001.

Wang, J., Wang, Z., Li, X., & Zhou, H. (2022). Artificial bee colony-based combination approach to forecasting agricultural commodity prices. International Journal of Forecasting, 38, 21–34. doi: 10.1016/j.ijforecast.2019.08.006.

Wang, S., Zhang, M., Wang, Y., & Meng, H. (2022). Construction of grain price determinants analysis model based on structural vector autoregressive model. Scientific Programming, 2022. doi: 10.1155/2022/5694780.

Wang, X., Gao, S., Guo, Y., Zhou, S., Duan, Y., & Wu, D. (2022). A combined prediction model for hog futures prices based on WOA-LightGBM-CEEMDAN. Complexity, 2022. doi: 10.1155/2022/3216036.

Warren-Vega, W. M., Aguilar-Hernández, D. E., Zárate-Guzmán, A. I., Campos-Rodríguez, A., & Romero-Cano, L. A. (2022). Development of a predictive model for agave prices employing environmental, economic, and social factors: Towards a planned supply chain for agave-tequila industry. Foods, 11, 1138. doi: 10.3390/foods11081138.

Wegener, C., von Spreckelsen, C., Basse, T., & von Mettenheim, H. J. (2016). Forecasting government bond yields with neural networks considering cointegration. Journal of Forecasting, 35, 86–92. doi: 10.1002/for.2385.

Wen, G., Ma, B. L., Vanasse, A., Caldwell, C. D., Earl, H. J., & Smith, D. L. (2021). Machine learning-based canola yield prediction for site-specific nitrogen recommendations. Nutrient Cycling in Agroecosystems, 121, 241–256. doi: 10.1007/s10705-021-10170-5.

Wu, Z., Weersink, A., & Maynard, A. (2022). Fuel-feed-livestock price linkages under structural changes. Applied Economics, 54, 206–223. doi: 10.1080/00036846.2021.1965082.

Xu, X. (2014). Causality and price discovery in US corn markets: An application of error correction modeling and directed acyclic graphs. doi: 10.22004/ag.econ.169806.

Xu, X. (2014). Cointegration and price discovery in US corn markets. In Agricultural and resource economics seminar series. North Carolina State University. doi: 10.13140/RG.2.2.30153.49768.

Xu, X. (2014). Price discovery in US corn cash and futures markets: The role of cash market selection. doi: 10.22004/ag.econ.169809.

Xu, X. (2015). Causality, price discovery, and price forecasts: Evidence from US corn cash and futures markets.

Xu, X. (2015). Cointegration among regional corn cash prices. Economics Bulletin, 35, 2581–2594. Available from: http://www.accessecon.com/Pubs/EB/2015/Volume35/EB-15-V35-I4-P259.pdf

Xu, X. (2017). Contemporaneous causal orderings of US corn cash prices through directed acyclic graphs. Empirical Economics, 52, 731–758. doi: 10.1007/s00181-016-1094-4.

Xu, X. (2017). The rolling causal structure between the Chinese stock index and futures. Financial Markets and Portfolio Management, 31, 491–509. doi: 10.1007/s11408-017-0299-7.

Xu, X. (2017). Short-run price forecast performance of individual and composite models for 496 corn cash markets. Journal of Applied Statistics, 44, 2593–2620. doi: 10.1080/02664763.2016.1259399.

Xu, X. (2018). Causal structure among US corn futures and regional cash prices in the time and frequency domain. Journal of Applied Statistics, 45, 2455–2480. doi: 10.1080/02664763.2017.1423044.

Xu, X. (2018). Cointegration and price discovery in US corn cash and futures markets. Empirical Economics, 55, 1889–1923. doi: 10.1007/s00181-017-1322-6.

Xu, X. (2018). Intraday price information flows between the CSI300 and futures market: An application of wavelet analysis. Empirical Economics, 54, 1267–1295. doi: 10.1007/s00181-017-1245-2.

Xu, X. (2018). Linear and nonlinear causality between corn cash and futures prices. Journal of Agricultural & Food Industrial Organization, 16, 20160006. doi: 10.1515/jafio-2016-0006.

Xu, X. (2018). Using local information to improve short-run corn price forecasts. Journal of Agricultural & Food Industrial Organization, 16. doi: 10.1515/jafio-2017-0018.

Xu, X. (2019). Contemporaneous and Granger causality among US corn cash and futures prices. European Review of Agricultural Economics, 46, 663–695. doi: 10.1093/erae/jby036.

Xu, X. (2019). Contemporaneous causal orderings of CSI300 and futures prices through directed acyclic graphs. Economics Bulletin, 39, 2052–2077. Available from: http://www.accessecon.com/Pubs/EB/2019/Volume39/EB-19-V39-I3-P192.pdf

Xu, X. (2019). Price dynamics in corn cash and futures markets: Cointegration, causality, and forecasting through a rolling window approach. Financial Markets and Portfolio Management, 33, 155–181. doi: 10.1007/s11408-019-00330-7.

Xu, X. (2020). Corn cash price forecasting. American Journal of Agricultural Economics, 102, 1297–1320. doi: 10.1002/ajae.12041.

Xu, X., & Thurman, W. (2015). Forecasting local grain prices: An evaluation of composite models in 500 corn cash markets. doi: 10.22004/ag.econ.205332.

Xu, X., & Thurman, W. N. (2015). Using local information to improve short-run corn cash price forecasts. doi: 10.22004/ag.econ.285845.

Xu, X., & Zhang, Y. (2021). Corn cash price forecasting with neural networks. Computers and Electronics in Agriculture, 184, 106120. doi: 10.1016/j.compag.2021.106120.

Xu, X., & Zhang, Y. (2021). House price forecasting with neural networks. Intelligent Systems with Applications, 12, 200052. doi: 10.1016/j.iswa.2021.200052.

Xu, X., & Zhang, Y. (2021). Individual time series and composite forecasting of the Chinese stock index. Machine Learning with Applications, 5, 100035. doi: 10.1016/j.mlwa.2021.100035.

Xu, X., & Zhang, Y. (2021). Network analysis of corn cash price comovements. Machine Learning with Applications, 6, 100140. doi: 10.1016/j.mlwa.2021.100140.

Xu, X., & Zhang, Y. (2022). Canola and soybean oil price forecasts via neural networks. Advances in Computational Intelligence, 2, 32. doi: 10.1007/s43674-022-00045-9.

Xu, X., & Zhang, Y. (2022). Coking coal futures price index forecasting with the neural network. Mineral Economics. doi: 10.1007/s13563-022-00311-9.

Xu, X., & Zhang, Y. (2022). Commodity price forecasting via neural networks for coffee, corn, cotton, oats, soybeans, soybean oil, sugar, and wheat. Intelligent Systems in Accounting, Finance and Management, 29, 169–181. doi: 10.1002/isaf.1519.

Xu, X., & Zhang, Y. (2022). Contemporaneous causality among one hundred Chinese cities. Empirical Economics, 63, 2315–2329. doi: 10.1007/s00181-021-02190-5.

Xu, X., & Zhang, Y. (2022). Contemporaneous causality among residential housing prices of ten major Chinese cities. International Journal of Housing Markets and Analysis. doi: 10.1108/IJHMA-03-2022-0039.

Xu, X., & Zhang, Y. (2022). Forecasting the total market value of a shares traded in the shenzhen stock exchange via the neural network. Economics Bulletin.

Xu, X., & Zhang, Y. (2022). House price information flows among some major Chinese cities: Linear and nonlinear causality in time and frequency domains. International Journal of Housing Markets and Analysis. doi: 10.1108/IJHMA-07-2022-0098.

Xu, X., & Zhang, Y. (2022). Network analysis of comovements among newly-built residential house price indices of seventy Chinese cities. International Journal of Housing Markets and Analysis. doi: 10.1108/IJHMA-09-2022-0134.

Xu, X., & Zhang, Y. (2022). Network analysis of housing price comovements of a hundred Chinese cities. National Institute Economic Review. doi: 10.1017/nie.2021.34.

Xu, X., & Zhang, Y. (2022). Network analysis of price comovements among corn futures and cash prices. Journal of Agricultural & Food Industrial Organization. doi: 10.1515/jafio-2022-0009.

Xu, X., & Zhang, Y. (2022). Neural network predictions of the high-frequency CSI300 first distant futures trading volume. Financial Markets and Portfolio Management. doi: 10.1007/s11408-022-00421-y.

Xu, X., & Zhang, Y. (2022). Rent index forecasting through neural networks. Journal of Economic Studies, 49, 1321–1339. doi: 10.1108/JES-06-2021-0316.

Xu, X., & Zhang, Y. (2022). Residential housing price index forecasting via neural networks. Neural Computing and Applications, 34, 14763–14776. doi: 10.1007/s00521-022-07309-y.

Xu, X., & Zhang, Y. (2022). Retail property price index forecasting through neural networks. Journal of Real Estate Portfolio Management. doi: 10.1080/10835547.2022.2110668.

Xu, X., & Zhang, Y. (2022). Second-hand house price index forecasting with neural networks. Journal of Property Research, 39, 215–236. doi: 10.1080/09599916.2021.1996446.

Xu, X., & Zhang, Y. (2022). Soybean and soybean oil price forecasting through the nonlinear autoregressive neural network (NARNN) and NARNN with exogenous inputs (NARNN–x). Intelligent Systems with Applications, 13, 200061. doi: 10.1016/j.iswa.2022.200061.

Xu, X., & Zhang, Y. (2022). Steel price index forecasting through neural networks: The composite index, long products, flat products, and rolled products. Mineral Economics. doi: 10.1007/s13563-022-00357-9.

Xu, X., & Zhang, Y. (2022). Thermal coal price forecasting via the neural network. Intelligent Systems with Applications, 14, 200084. doi: 10.1016/j.iswa.2022.200084.

Xu, X., & Zhang, Y. (2023). Cointegration between housing prices: Evidence from one hundred Chinese cities. Journal of Property Research, 40, 53–75. doi: 10.1080/09599916.2022.2114926.

Xu, Y., Li, J., Wang, L., & Li, C. (2022). Liquidity of China’s agricultural futures market: Measurement and cross-market dependence. China Agricultural Economic Review. doi: 10.1108/CAER-05-2021-0099.

Yang, J., & Awokuse, T. O. (2003). Asset storability and hedging effectiveness in commodity futures markets. Applied Economics Letters, 10, 487–491. doi: 10.1080/1350485032000095366.

Yang, J., & Leatham, D. J. (1998). Market efficiency of US grain markets: Application of cointegration tests. Agribusiness: An International Journal, 14, 107–112. doi: 10.1002/(SICI)1520-6297(199803/04)14:2<107::AID-AGR3>3.0.CO;2-6.

Yang, Q., & Wang, Z. (2019). Fuzzy model applied in risk perception and price forecasts. International Journal of Fuzzy Systems, 21, 1906–1918. doi: 10.1007/s40815-019-00651-9.

Yang, J., Haigh, M. S., & Leatham, D. J. (2001). Agricultural liberalization policy and commodity price volatility: A GARCH application. Applied Economics Letters, 8, 593–598. doi: 10.1080/13504850010018734.

Yang, J., Zhang, J., & Leatham, D. J. (2003). Price and volatility transmission in international wheat futures markets. Annals of Economics and Finance, 4, 37–50. Available from: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.295.2182&rep=rep1&type=pdf

Yang, J., Su, X., & Kolari, J. W. (2008). Do euro exchange rates follow a martingale? Some out-of-sample evidence. Journal of Banking & Finance, 32, 729–740. doi: 10.1016/j.jbankfin.2007.05.009.

Yang, J., Cabrera, J., & Wang, T. (2010). Nonlinearity, data-snooping, and stock index ETF return predictability. European Journal of Operational Research, 200, 498–507. doi: 10.1016/j.ejor.2009.01.009.

Yang, J., Li, Z., & Wang, T. (2021). Price discovery in Chinese agricultural futures markets: A comprehensive look. Journal of Futures Markets, 41, 536–555. doi: 10.1002/fut.22179.

Yang, J., Ge, Y. E., & Li, K. X. (2022). Measuring volatility spillover effects in dry bulk shipping market. Transport Policy. doi: 10.1016/j.tranpol.2022.01.018.

Yang, Z., Du, X., Lu, L., & Tejeda, H. (2022). Price and volatility transmissions among natural gas, fertilizer, and corn markets: A revisit. Journal of Risk and Financial Management, 15, 91. doi: 10.3390/jrfm15020091.

Yin, Y., & Zhu, Q. (2012). Effect of magnitude differences in the raw data on price forecasting using RBF neural network. In 2012 11th International Symposium on Distributed Computing and Applications to Business, Engineering & Science, IEEE (pp. 237–240). doi: 10.1109/DCABES.2012.19.

Yoosefzadeh-Najafabadi, M., Earl, H. J., Tulpan, D., Sulik, J., & Eskandari, M. (2021). Application of machine learning algorithms in plant breeding: Predicting yield from hyperspectral reflectance in soybean. Frontiers in Plant Science, 11, 2169. doi: 10.3389/fpls.2020.624273.

Yu, W., Yue, Y., & Wang, F. (2022). The spatial-temporal coupling pattern of grain yield and fertilization in the north China plain. Agricultural Systems, 196, 103330. doi: 10.1016/j.agsy.2021.103330.

Yuan, C. Z., San, W. W., & Leong, T. W. (2020). Determining optimal lag time selection function with novel machine learning strategies for better agricultural commodity prices forecasting in Malaysia. Proceedings of the 2020 2nd International Conference on Information Technology and Computer Communications (pp. 37–42). doi: 10.1145/3417473.3417480.

Zelingher, R., Makowski, D., & Brunelle, T. (2020). Forecasting impacts of agricultural production on global maize price.

Zelingher, R., Makowski, D., & Brunelle, T. (2021). Assessing the sensitivity of global maize price to regional productions using statistical and machine learning methods. Frontiers in Sustainable Food Systems, 5, 171. doi: 10.3389/fsufs.2021.655206.

Zhang, J., Meng, Y., Wei, J., Chen, J., & Qin, J. (2021). A novel hybrid deep learning model for sugar price forecasting based on time series decomposition. Mathematical Problems in Engineering, 2021. doi: 10.1155/2021/6507688.

Zhao, H. (2021). Futures price prediction of agricultural products based on machine learning. Neural Computing and Applications, 33, 837–850. doi: 10.1007/s00521-020-05250-6.

Zhou, L. (2021). Application of ARIMA model on prediction of China’s corn market. In Journal of Physics: Conference Series (p. 012064), IOP Publishing. doi: 10.1088/1742-6596/1941/1/012064.

Zhu, Q. Y., Yin, Y. H., Zhu, H. J., & Zhou, H. (2014). Effect of magnitude differences in the original data on price forecasting. Journal of Algorithms & Computational Technology, 8, 389–420. doi: 10.1260/1748-3018.8.4.389.

Zong, J., & Zhu, Q. (2012). Apply grey prediction in the agriculture production price. In 2012 Fourth International Conference on Multimedia Information Networking and Security, IEEE (pp. 396–399). doi: 10.1109/MINES.2012.78.

Zong, J., & Zhu, Q. (2012). Price forecasting for agricultural products based on BP and RBF neural network. In 2012 IEEE International Conference on Computer Science and Automation Engineering, IEEE (pp. 607–610). doi: 10.1109/ICSESS.2012.6269540.

Zou, H., Xia, G., Yang, F., & Wang, H. (2007). An investigation and comparison of artificial neural network and time series models for Chinese food grain price forecasting. Neurocomputing, 70, 2913–2923. doi: 10.1016/j.neucom.2007.01.009.

Corresponding author

Xiaojie Xu can be contacted at: xxu6@alumni.ncsu.edu