Search results

1 – 10 of over 344000
Open Access
Article
Publication date: 24 January 2025

Liu Tianning, Xuesong Wang, Jinzhi Lu, Yao Tong and Yixiao Liu

Abstract

Purpose

This paper aims to propose a set of metamodels applicable to the architecture modeling of air traffic management systems (ATMS) under the UAF methodology. The metamodel design must also meet the modeling requirements for introducing new supersonic airliners into the ATMS.

Design/methodology/approach

To complete the metamodel design and the architecture modeling work in the case study, this paper uses the GOPPRR method and the M0–M3 modeling framework. The design and modeling work was carried out in the multi-architecture modeling tool Airdraw.

Findings

This paper proposes a set of metamodels applicable to the architecture modeling of ATMS under the UAF methodology, comprising 102 object metamodels, eight point metamodels, 98 property metamodels, 41 relationship metamodels, 36 role metamodels and 65 graph metamodels.

Originality/value

The metamodel design proposed in this paper enables architectural modeling of the ATMS. Compared with the traditional document-based systems engineering approach to system definition, the model-based ATMS architecture can be updated for different ATMS and aircraft types by modifying the parameters of the corresponding views or adding supplementary model views in the architecture model library, which greatly improves the compatibility and modifiability of the system definition.

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2633-6596

Access Restricted
Article
Publication date: 17 March 2016

Arnaud Baraston, Laurent Gerbaud, Vincent Reinbold, Thomas Boussey and Frédéric Wurtz

Abstract

Purpose

Multiphysical models are often useful for the design of electrical devices such as electrical machines. In particular, modeling thermal, magnetic and electrical phenomena with an equivalent circuit approach is often used in sizing problems. The coupling of such models with other models is difficult to take into account, partly because it adds complexity to the process. The paper proposes automatic modelling of thermal and magnetic aspects from an equivalent circuit approach, with computation of gradients, using selectivity on the variables. It then discusses the coupling of various physical models for sizing by optimization algorithms. Sensitivity analyses are discussed and the multiphysical approach is applied to a permanent magnet synchronous machine.

Design/methodology/approach

The paper describes thermal and magnetic models by equivalent circuits: magnetic aspects are represented by reluctance networks and thermal aspects by thermal equivalent circuits. From circuit modelling and analytical equations, models are generated, coupled and translated into computational code (Java, C), including the computation of their Jacobians. To do so, model generators are used: CADES, Reluctool and Thermotool. The paper illustrates the modelling and automatic programming aspects with Thermotool. The generated codes are directly usable by optimization algorithms. The formulation of the coupling with other models is then studied in the case of a multiphysical sizing by optimization of the Toyota PRIUS electrical motor.
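The thermal-equivalent-circuit idea above can be sketched as linear nodal analysis: thermal resistances link temperature nodes, injected losses act as sources, and the model Jacobian is available in closed form. The following is a minimal illustrative sketch with made-up resistance and loss values, not the code actually generated by Thermotool or CADES:

```python
import numpy as np

# Two-node thermal equivalent circuit: winding (node 0) and stator (node 1),
# coupled to each other and to ambient through thermal resistances [K/W].
# Values are hypothetical, for illustration only.
R_wa, R_ws, R_sa = 2.0, 0.5, 1.0   # winding-ambient, winding-stator, stator-ambient
P = np.array([50.0, 20.0])         # losses injected at each node [W]

# Nodal conductance matrix, with ambient as the temperature reference
G = np.array([
    [1/R_wa + 1/R_ws, -1/R_ws],
    [-1/R_ws,          1/R_ws + 1/R_sa],
])

T = np.linalg.solve(G, P)          # temperature rise over ambient at each node

# For this linear model, the Jacobian dT/dP is simply G^{-1}; this is the kind
# of gradient information a model generator can emit alongside the model.
dT_dP = np.linalg.inv(G)
```

With a nonlinear network (temperature-dependent resistances), the same structure would be solved iteratively and the Jacobian obtained by differentiating the residual equations, which is what motivates automatic code generation.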

Findings

A main specificity of the approach is the ability to easily handle the selectivity of the inputs and outputs of the generated model according to the problem specifications, thus drastically reducing the size of the Jacobian matrix and the computational complexity. Another specificity is the coupling of the models using analytical equations, possibly implicit ones.

Research limitations/implications

At the present time, the multiphysical modeling is considered only for static phenomena. However, this limit is not important for numerous sizing applications.

Originality/value

The analytical approach with selectivity gives fast models, well adapted for optimization. The use of model generators allows robust programming of the models and their Jacobians. The automatic calculation of gradients allows the use of deterministic algorithms, such as SQP, which are well adapted to handling numerous constraints.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 35 no. 3
Type: Research Article
ISSN: 0332-1649

Open Access
Article
Publication date: 30 November 2002

Jae Ha Lee and Han Deog Hui

Abstract

This study explores hedging strategies that use KTB futures to hedge the price risk of a KTB spot portfolio. The study establishes the price sensitivity, risk-minimization and bivariate GARCH(1,1) models as hedging models and analyzes their hedging performance. The sample period covers September 29, 1999 to September 18, 2001. Time-matched prices at 11:00 (11:30) of the KTB futures and spot were used in the analysis. The most important findings may be summarized as follows. First, while the average hedge ratio of the price sensitivity model is close to one, both the risk-minimization and GARCH models exhibit hedge ratios substantially lower than one. Hedge ratios tend to be greater for daily data than for weekly data. Second, for the daily in-sample data, hedging effectiveness is highest for the GARCH model with time-varying hedge ratios, but the risk-minimization model with constant hedge ratios is not far behind the GARCH model in its hedging performance. In the case of out-of-sample hedging effectiveness, the GARCH model is best for the KTB spot portfolio, and the risk-minimization model is best for the corporate bond portfolio. Third, for daily data, the in-sample hedge shows a better performance than the out-of-sample hedge, except for the risk-minimization hedge against the corporate bond portfolio. Fourth, for the weekly in-sample hedges, the price sensitivity model is the worst and the risk-minimization model is the best in hedging the KTB spot portfolio. While the GARCH model is the best against the KTB + corporate bond portfolio, the risk-minimization model is generally as good as the GARCH model. The risk-minimization model performs best for the weekly out-of-sample data, and the out-of-sample hedges are better than the in-sample hedges. Fifth, while the hedging performance of the risk-minimization model with a daily moving window seems somewhat superior to the traditional risk-minimization model once trading volume increased one year after the inception of the KTB futures, on average the traditional model is better than the moving-window model. For weekly data, the traditional model exhibits a better performance. Overall, in the Korean bond markets, investors are encouraged to use the simple risk-minimization model to hedge the price risk of KTB spot and corporate bond portfolios.
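For context, the risk-minimization (minimum-variance) hedge ratio discussed above is simply the regression slope of spot price changes on futures price changes, and hedging effectiveness is the resulting variance reduction. A minimal sketch on synthetic data (illustrative values only, not the KTB series used in the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily price changes for futures and spot (hypothetical parameters)
df = rng.normal(0.0, 0.4, 500)               # futures price changes
ds = 0.85 * df + rng.normal(0.0, 0.1, 500)   # spot changes, correlated with futures

# Risk-minimization hedge ratio: h* = Cov(dS, dF) / Var(dF)
h = np.cov(ds, df)[0, 1] / np.var(df, ddof=1)

# Hedging effectiveness: variance reduction of hedged vs. unhedged position
hedged = ds - h * df
effectiveness = 1 - np.var(hedged, ddof=1) / np.var(ds, ddof=1)
```

The GARCH(1,1) variant in the study replaces the constant covariance and variance with conditional, time-varying estimates, giving a hedge ratio that changes each period.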

Details

Journal of Derivatives and Quantitative Studies, vol. 10 no. 2
Type: Research Article
ISSN: 2713-6647

Open Access
Article
Publication date: 31 May 2006

Mi Ae Kim

Abstract

Recently, domestic market participants have shown growing interest in the synthetic Collateralized Debt Obligation (CDO) as a security that reduces credit risk and creates new profit. Therefore, the valuation method and hedging strategy for synthetic CDOs have become an important issue. However, there are no won-denominated credit default swap transactions, which are essential for activating the synthetic CDO market. In addition, there is no transparent market information for the default probability, asset correlation, and recovery rate, which are the critical variables determining the price of a synthetic CDO.

This study first investigates methods of estimating the default probability, asset correlation coefficient, and recovery rate. Next, I examine five synthetic CDO pricing models: the widely used OFGC (One-Factor Gaussian Copula) model; OFNGC (One-Factor Non-Gaussian Copula) models, namely the OFDTC (One-Factor Double T-distribution Copula) model of Hull and White (2004) and the NIGC (Normal Inverse Gaussian Copula) model of Kalemanova et al. (2005); the SC (Stochastic Correlation) model of Burtschell et al. (2005); and the FL (Forward Loss) model of Bennani (2005). I investigate and compare them on three points: 1) appropriateness of the implied portfolio loss distribution, 2) ability to explain standardized tranche spreads, and 3) sensitivity for a delta-neutral hedging strategy. To compare the pricing models, parameters for each model are first estimated using the term structure of iTraxx Europe index spreads and the tranche spreads with different maturities and exercise prices. The most remarkable results of this study are as follows. First, the probability of the loss interval that determines the mezzanine tranche spread is lower in all models except the SC model than in the OFGC model. This shows that all models except the SC model resolve, to some degree, the implied correlation smile phenomenon, whereby the correlation coefficient of the mezzanine tranche must be set lower than that of the other tranches when the OFGC model is used. Second, in explaining standardized tranche spreads, the NIGC model is the best among the models with respect to relative error. When the OFGC model is compared with the OFDTC model, the OFDTC model is better at explaining 5-year tranche spreads; for 7-year or 10-year tranches, the OFDTC model is better with respect to absolute error while the OFGC model is better with respect to relative error. Third, the sensitivity sign of the senior tranche spread with respect to asset correlation is sometimes negative in the NIGC model while it is positive in the other models. This result implies that, as a correlation delta-neutral hedging strategy, issuers of synthetic CDOs may take a long position when the OFGC model is used and a short position when the NIGC model is used.
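For readers unfamiliar with the OFGC baseline discussed above, the one-factor Gaussian copula can be sketched by Monte Carlo: latent variables are correlated through a common factor, a name defaults when its variable falls below a threshold, and the pooled loss is sliced into tranches. A minimal illustrative sketch with assumed parameters (not calibrated to iTraxx data):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
nd = NormalDist()

# Hypothetical portfolio: 100 names, 2% default probability, 30% correlation
n_names, p, rho, n_sims = 100, 0.02, 0.3, 20000

# One-factor Gaussian copula: X_i = sqrt(rho)*M + sqrt(1-rho)*Z_i,
# with M the common market factor and Z_i idiosyncratic shocks.
M = rng.standard_normal((n_sims, 1))
Z = rng.standard_normal((n_sims, n_names))
X = np.sqrt(rho) * M + np.sqrt(1 - rho) * Z

# Name i defaults when X_i falls below the threshold Phi^{-1}(p)
defaults = X < nd.inv_cdf(p)
loss = defaults.mean(axis=1)   # portfolio loss fraction (zero recovery assumed)

# Loss absorbed by a [3%, 6%] mezzanine tranche, as a fraction of its notional
attach, detach = 0.03, 0.06
tranche_loss = np.clip(loss - attach, 0.0, detach - attach) / (detach - attach)
```

The non-Gaussian variants compared in the study (OFDTC, NIGC) keep this structure but draw M and Z from heavier-tailed distributions, which fattens the loss tail and changes tranche spreads.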

Details

Journal of Derivatives and Quantitative Studies, vol. 14 no. 1
Type: Research Article
ISSN: 2713-6647

Access Restricted
Article
Publication date: 1 November 2007

Agnieszka Cichocka, Pascal Bruniaux and Vladan Koncar

Abstract

This paper presents an introduction to the modelling of the virtual garment design process in 3D. Our global project of virtual clothing design, along with the conception of a virtual adaptive mannequin, is devoted to creating and modelling garments in 3D. Starting from the ideas of mass customization, e-commerce and the need for numerical innovation in the garment industry, this article presents a model of a virtual garment and a methodology enabling virtual clothing to be conceived directly on an adaptive mannequin morphotype in 3D. A short description of the overall garment model under constraints is presented. To explain the overall methodology, the basic pattern of trousers is used as an example. The global model of garment creation in 3D is composed of three parts: a human body model, an ease model and a garment model. The most essential part is the ease model, which is necessary for the proposed process of garment modelling. After describing each garment modelling element influencing this process, a detailed presentation of the ease model in relation to the garment model is proposed. The combination of the previously mentioned models may be considered as two interconnected sub-models: the first is linked with the front pattern position on the body and the second with the back pattern position on the trousers with appropriate ease values. To identify the correct ease values and consequently their right positions on the human body, an identification algorithm is proposed. The two sub-models are strongly connected through the feedback caused by the interactions of the trouser front and back patterns. This connection appears during modelling and depends on the structure of the proposed ease model. The relatively large number of parameters requires the use of the identification technique. Finally, virtual and real patterns were superposed in order to visualise the results.

Details

Research Journal of Textile and Apparel, vol. 11 no. 4
Type: Research Article
ISSN: 1560-6074

Open Access
Article
Publication date: 30 November 2004

Joon Haeng Lee

Abstract

This paper estimates and forecasts the yield curve of the Korean bond market using a three-factor term structure model based on the Nelson-Siegel model. The Nelson-Siegel model is interpreted as a model of level, slope and curvature and has the flexibility required to match the changing shape of the yield curve. To estimate this model, we use the two-step estimation procedure of Diebold and Li. Estimation results show our model is quite flexible and gives a very good fit to the data.

To assess the forecasting ability of our model, we compare its RMSEs (root mean square errors) with those of the random walk (RW) model and a principal component model, for the out-of-sample period as well as the in-sample period. We find that our model forecasts better than the principal component model but shows only a slight edge over the RW model, especially for long forecasting horizons. Considering that it is difficult for any model to beat the RW model out of sample, these results suggest that our model is useful for practitioners in forecasting yield curve dynamics.
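The first step of the two-step Diebold-Li procedure mentioned above fixes the decay parameter and estimates level, slope and curvature by cross-sectional OLS at each date (the second step fits time-series models to the estimated factors). A minimal sketch for a single date with hypothetical yields, not the Korean data used in the paper:

```python
import numpy as np

# Maturities in months and hypothetical yields in percent (illustrative only)
tau = np.array([3, 6, 12, 24, 36, 60, 120], dtype=float)
y = np.array([4.2, 4.4, 4.7, 5.1, 5.3, 5.6, 5.9])

lam = 0.0609  # Diebold-Li's fixed decay parameter (per month)

# Nelson-Siegel factor loadings: level, slope, curvature
x1 = np.ones_like(tau)
x2 = (1 - np.exp(-lam * tau)) / (lam * tau)
x3 = x2 - np.exp(-lam * tau)
X = np.column_stack([x1, x2, x3])

# Step 1: cross-sectional OLS gives the three factors for this date
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level, slope, curvature = beta
fitted = X @ beta
```

Repeating this regression date by date yields factor time series, to which the RW and autoregressive forecasting comparisons in the abstract are then applied.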

Details

Journal of Derivatives and Quantitative Studies, vol. 12 no. 2
Type: Research Article
ISSN: 2713-6647

Access Restricted
Book part
Publication date: 17 February 2025

Jean-Louis Ermine, Denise Bedford and Alexeis Garcia-Perez

Abstract

Chapter Summary

This chapter describes the nature and importance of the activity model. The authors explain what we learn from this model and the shift from an external and broader view of knowledge to activities related to the knowledge itself. The challenges and confusion associated with an activity model are explained. The authors suggest a clarification for understanding and designing an activity model. The similarities to business capability models are identified. A step-by-step approach to building an activity model is described and aligned with the build-out of business capability models.

Details

The Mask Methodology and Knowledge Books
Type: Book
ISBN: 978-1-80455-430-2

Open Access
Article
Publication date: 6 February 2025

Arne Walter, Kamrul Ahsan and Shams Rahman

Abstract

Purpose

Demand planning (DP) is a key element of supply chain management (SCM) and is widely regarded as an important catalyst for improving supply chain performance. Owing to the availability of technology to process large amounts of data, artificial intelligence (AI) has received increasing attention in the DP literature in recent years, but there are no reviews of studies on the application of AI in supply chain DP. Given the importance and value of this research area, we aimed to review the current body of knowledge on the application of AI in DP to improve SCM performance.

Design/methodology/approach

Using a systematic literature review approach, we identified 141 peer-reviewed articles and conducted content analysis to examine the body of knowledge on AI in DP in the academic literature published from 2012 to 2023.

Findings

We found that AI in DP is still in its early stages of development. The literature is dominated by modelling studies. We identified three knowledge clusters for AI in DP: AI tools and techniques, AI applications for supply chain functions and the impact of AI on digital SCM. The three knowledge domains are conceptualised in a framework to demonstrate how AI can be deployed in DP to improve SCM performance. However, challenges remain. We identify gaps in the literature and make suggestions for further research in this area.

Originality/value

This study makes a theoretical contribution by identifying the key elements in applying AI in DP for SCM. The proposed conceptual framework can be used to help guide further empirical research and can help companies to implement AI in DP.

Details

The International Journal of Logistics Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0957-4093

Open Access
Article
Publication date: 31 January 2025

Patrick Küpper, Matthias Seel and Matthias Kokorsch

Abstract

Purpose

Gravity models and analogue store approaches are inadequate for predicting purchases in neighbourhood stores, which calls for a new, empirically tested theoretical approach.

Design/methodology/approach

We use the Theory of Planned Behaviour (TPB) to determine which factors predict the choice for a new neighbourhood store. We develop a suitable model using a structural equation model with survey data from two cases in which all households in the catchment areas were surveyed both before and after the store opened.

Findings

We find the TPB to be appropriate for predicting store choice. Beliefs about one-stop shopping, social pressure from family members and car availability are most important in explaining the intention to shop in the planned store. These factors also explain the actual shopping in this store after opening.

Originality/value

Our model predicts store choice before a store opens. Using a two-wave survey, we avoid ex-post rationalisation and show that, at least in our cases, quality, price and assortment do not predict store choice.

Details

International Journal of Retail & Distribution Management, vol. 53 no. 13
Type: Research Article
ISSN: 0959-0552

Open Access
Article
Publication date: 30 January 2025

Biswajit Kar and Mamata Jenamani

Abstract

Purpose

A vaccination strategy to cover the susceptible population is key to containing the spread of any virus during a healthcare emergency. This study quantifies the susceptibility of a region based on initial infection rates to prioritize optimal vaccine distribution strategies. The authors propose a metric, the regional vulnerability index (RVI), that identifies the degree of susceptibility/vulnerability of a region to virus infections for strategically locating hubs for vaccine storage and distribution.

Design/methodology/approach

A two-phase methodology is used to address this problem. Phase 1 uses a modified Susceptible-Infected-Recovered (SIR) model, ModSIR, to estimate the RVI. Phase 2 leverages this index to model a P-Center problem, prioritizing vulnerable regions through a Mixed Integer Quadratically Constrained Programming model, along with three variations that incorporate the RVI.
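The Phase 1 idea of scoring regions from an epidemic model can be sketched with a plain discrete-time SIR simulation. The paper's ModSIR is a modified variant and its actual RVI definition is not reproduced here; the score below (peak infected fraction) and all parameters are hypothetical, for illustration only:

```python
import numpy as np

def sir(beta, gamma, i0, days):
    """Discrete-time SIR on population fractions; returns infected fraction per day."""
    s, i, r = 1.0 - i0, i0, 0.0
    traj = []
    for _ in range(days):
        new_inf = beta * s * i   # new infections this day
        new_rec = gamma * i      # new recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        traj.append(i)
    return np.array(traj)

# Illustrative vulnerability score per region: peak infected fraction over 180 days.
# Each region has its own transmission rate and initial infection level.
regions = {"A": (0.30, 0.001), "B": (0.22, 0.005)}  # (beta, initial infected fraction)
gamma = 0.1                                          # recovery rate [1/day]
rvi = {name: sir(b, gamma, i0, 180).max() for name, (b, i0) in regions.items()}
```

In Phase 2, scores like these would enter the P-Center objective as weights, so that hub placement favours regions with higher predicted vulnerability.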

Findings

Results indicate that a weighting scheme based on the population-to-RVI ratio fosters fair distribution and equitable coverage of vulnerable regions. Comparisons with the public distribution strategy outlined by the Government of India reveal similar zonal segregations. Additionally, the network generated by our model outperforms the actual distribution network, as corroborated by network metrics such as degree centrality, weighted degree centrality and closeness centrality.

Originality/value

This research presents a novel approach to prioritizing vaccine distribution during pandemics by applying epidemiological predictions to an integer-programming framework, optimizing COVID-19 vaccine allocation based on historical infection data. The study highlights the importance of strategic planning in public health response to effectively manage resources in emergencies.

Details

Journal of Humanitarian Logistics and Supply Chain Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2042-6747
