K.J. Craig, Nielen Stander, D.A. Dooge and S. Varadappa
Abstract
Purpose
The purpose of this paper is to provide a methodology with which to perform variable screening and optimization in automotive crashworthiness design.
Design/methodology/approach
The screening method is based on response surface methodology in which linear response surfaces are used to create approximations to the design response. The response surfaces are used to estimate the sensitivities of the responses with respect to the design variables, while the variance is used to estimate the confidence interval of the regression coefficients. The sampling is based on the D‐optimality criterion with over‐sampling to improve noise filtering and find the best estimate of the regression coefficients. The coefficients and their confidence intervals, as determined using analysis of variance (ANOVA), are used to construct bar charts for the purpose of selecting the important variables.
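As an illustration of the screening procedure described above, the following sketch selects a D-optimal, over-sampled design from a candidate pool, fits a linear response surface, and ranks variables by the significance of their coefficients. The candidate pool, the 2x over-sampling, and the toy response (in which only x0 and x3 are active) are illustrative assumptions, not the paper's vehicle models:

```python
import numpy as np

rng = np.random.default_rng(0)

def d_optimal_subset(candidates, n_points, n_iter=500):
    """Greedy point-exchange search for a D-optimal subset: maximize
    log det(X^T X) of the linear model matrix (intercept + main effects)."""
    idx = list(rng.choice(len(candidates), n_points, replace=False))

    def logdet(ix):
        X = np.column_stack([np.ones(len(ix)), candidates[ix]])
        sign, val = np.linalg.slogdet(X.T @ X)
        return val if sign > 0 else -np.inf

    best = logdet(idx)
    for _ in range(n_iter):
        i, j = rng.integers(n_points), int(rng.integers(len(candidates)))
        if j in idx:
            continue
        trial = idx.copy()
        trial[i] = j
        if logdet(trial) > best:
            idx, best = trial, logdet(trial)
    return np.array(idx)

# Toy screening problem: 5 variables, only x0 and x3 truly matter.
candidates = rng.uniform(-1.0, 1.0, size=(500, 5))
pick = d_optimal_subset(candidates, n_points=12)  # 2x over-sampling of 6 coefficients
X = np.column_stack([np.ones(len(pick)), candidates[pick]])
y = 3.0 * candidates[pick, 0] - 2.0 * candidates[pick, 3] + rng.normal(0.0, 0.1, len(pick))

beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
s2 = float(res[0]) / (len(pick) - X.shape[1])       # residual variance (ANOVA)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))  # standard error of coefficients
t_stats = np.abs(beta[1:]) / se[1:]                 # screening statistic per variable
```

Variables whose t-statistics dominate the bar chart would be retained for the subsequent optimization; the rest are fixed at nominal values.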
Findings
A known analytical function is first used to illustrate the effectiveness of screening. Using the finite element method (FEM), a complex vehicle occupant impact problem and a full vehicle multidisciplinary problem featuring frontal impact and torsional modal analysis of the vehicle body are modeled and parameterized. Two optimizations are conducted for each FEM example, one with the full variable set and one with a screened subset. An iterative, successive linear approximation method is used to achieve convergence. It is shown that, although significantly different final designs may be obtained, an appropriately selected subset of variables is effective while significantly reducing computational cost.
Practical implications
The method illustrated provides a practical approach to the screening of variables in simulation‐based design optimization, especially in automotive crashworthiness applications with costly simulations. It is shown that the reduction of variables used in the optimization process significantly reduces the total cost of the optimization.
Originality/value
Although variable screening has been used in other disciplines, the use of response surfaces to determine the variable screening information is novel in the crashworthiness field.
Xiwen Cai, Haobo Qiu, Liang Gao, Xiaoke Li and Xinyu Shao
Abstract
Purpose
This paper aims to propose hybrid global optimization based on multiple metamodels for improving the efficiency of global optimization.
Design/methodology/approach
The method fully utilizes the information provided by different metamodels in the optimization process. It not only extends the expected improvement criterion of kriging to other metamodels but also intelligently selects appropriate metamodeling techniques to guide the search direction, making the search process very efficient. In addition, corresponding local search strategies are put forward to further improve optimization efficiency.
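The expected improvement criterion that the method carries over from kriging fits in a few lines. The sketch below states the generic criterion for minimization; it is not the authors' full hybrid algorithm:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a candidate point for minimization:
    E[max(f_best - Y, 0)] with Y ~ N(mu, sigma^2)."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)  # deterministic prediction: plain improvement
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mu) * cdf + sigma * pdf
```

For kriging, mu and sigma come directly from the predictor and its variance; for metamodels without a native variance estimate, a cross-validation-based error estimate could stand in for sigma (an assumption here, not necessarily the authors' construction).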
Findings
To validate the method, it is tested by several numerical benchmark problems and applied in two engineering design optimization problems. Moreover, an overall comparison between the proposed method and several other typical global optimization methods has been made. Results show that the global optimization efficiency of the proposed method is higher than that of the other methods for most situations.
Originality/value
The proposed method sufficiently utilizes multiple metamodels in the optimizing process. Thus, good optimizing results are obtained, showing great applicability in engineering design optimization problems which involve costly simulations.
Shujuan Hou, Zhidan Zhang, Xujing Yang, Hanfeng Yin and Qing Li
Abstract
Purpose
The purpose of this paper is to optimize new thin-walled cellular configurations against crashworthiness criteria, so as to improve the crashworthiness of vehicle body components.
Design/methodology/approach
ANSYS Parametric Design Language is used to create the parameterized models so that the design variables can be changed conveniently. Moreover, a surrogate technique, namely the response surface method, is adopted to fit the objective and constraint functions. Factorial design and the D-optimal criterion are employed to screen active parameters for constructing the response functions of the specific energy absorption and the peak crushing force. Finally, sequential quadratic programming (NLPQL) is used to solve the design optimization problem for the new cellular configurations filled with multi-cell circular tubes under axial crushing loading.
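The final step of this workflow can be sketched as a constrained optimization over fitted response surfaces. The quadratic surfaces below are invented stand-ins for the fitted SEA and peak-force models, and SciPy's SLSQP is used in place of NLPQL (both are sequential quadratic programming implementations):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical quadratic response surfaces fitted to crush simulations;
# x = (wall thickness, cell size), scaled to [0, 1]. Coefficients are invented.
def sea(x):         # specific energy absorption (to maximize)
    t, d = x
    return 20 + 15 * t + 8 * d - 6 * t * t - 3 * d * d - 2 * t * d

def peak_force(x):  # peak crushing force (to constrain)
    t, d = x
    return 30 + 40 * t + 10 * d

res = minimize(lambda x: -sea(x), x0=[0.5, 0.5],
               method="SLSQP",
               bounds=[(0, 1), (0, 1)],
               constraints=[{"type": "ineq",
                             "fun": lambda x: 60.0 - peak_force(x)}])
```

For these surfaces the peak-force constraint is active at the optimum, which is typical of crashworthiness designs: energy absorption is pushed up until the force limit binds.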
Findings
Two distribution modes of the cellular configurations are first investigated: an orthogonal arrangement and a diamond arrangement. After comparing the optimized configurations of the rectangular distribution with the annular distribution of the multi-cell fillers, it is found that the orthogonal arrangement offers better crashworthiness than the diamond arrangement.
Originality/value
Two new thin-walled cellular configurations are studied and optimized against crashworthiness criteria. The study of these new cellular configurations is valuable for improving the crashworthiness of vehicle body components. In addition, factorial design and factor screening are adopted in the crashworthiness optimization of the new thin-walled cellular configurations.
F.A. DiazDelaO and S. Adhikari
Abstract
Purpose
In the dynamical analysis of engineering systems, running a detailed high‐resolution finite element model can be expensive even for obtaining the dynamic response at few frequency points. To address this problem, this paper aims to investigate the possibility of representing the output of an expensive computer code as a Gaussian stochastic process.
Design/methodology/approach
The Gaussian process emulator method is discussed and then applied to both simulated and experimentally measured data from the frequency response of a cantilever plate excited by a harmonic force. The dynamic response over a frequency range is approximated using only a small number of response values, obtained both by running a finite element model at carefully selected frequency points and from experimental measurements. The results are then validated applying some adequacy diagnostics.
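A minimal version of such an emulator is a Gaussian process interpolator of the frequency response function (FRF). In the sketch below, a single-degree-of-freedom oscillator stands in for the expensive finite element model, and a zero-mean GP with a squared-exponential kernel is an assumption for illustration, not necessarily the emulator specification used in the paper:

```python
import numpy as np

def rbf(a, b, length=0.1, var=1.0):
    """Squared-exponential covariance between frequency points."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Lightly damped single-DOF FRF magnitude, standing in for an expensive FE run.
def frf(w, wn=1.0, zeta=0.15):
    return 1.0 / np.sqrt((wn**2 - w**2) ** 2 + (2.0 * zeta * wn * w) ** 2)

w_train = np.linspace(0.5, 1.5, 12)   # a small set of "expensive" evaluations
y_train = frf(w_train)
w_test = np.linspace(0.5, 1.5, 200)   # the full band to be emulated

K = rbf(w_train, w_train) + 1e-8 * np.eye(len(w_train))  # jitter for stability
alpha = np.linalg.solve(K, y_train)
mean = rbf(w_test, w_train) @ alpha   # emulator posterior mean over the band
```

The emulator reconstructs the response over the whole band from only twelve runs; in practice the predictive variance would also be computed and checked with the adequacy diagnostics the paper applies.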
Findings
It is shown that the Gaussian process emulator method can be an effective predictive tool for medium and high‐frequency vibration problems, whenever the data are expensive to obtain, either from a computer‐intensive code or a resource‐consuming experiment.
Originality/value
Although Gaussian process emulators have been used in other disciplines, the authors are not aware of any implementation for structural dynamic analysis, an area of engineering in which the method has good potential.
Adil Baykasoglu and Cengiz Baykasoglu
Abstract
Purpose
The purpose of this paper is to develop a new multi-objective optimization procedure for crashworthiness optimization of thin-walled structures especially circular tubes with functionally graded thickness.
Design/methodology/approach
The proposed optimization approach is based on finite element analyses for constructing the sample design space and for verification; gene-expression programming (GEP) for generating algebraic equations (meta-models) that compute the objective function values (peak crash force and specific energy absorption) from the design parameters; and multi-objective genetic algorithms for generating design parameter alternatives and determining their optimal combination. The authors also use linear and non-linear least-squares regression meta-models as benchmarks for GEP.
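Once the GEP meta-models supply cheap predictions of peak crash force and specific energy absorption, the multi-objective genetic algorithm retains only non-dominated designs. The dominance filter at the core of that step can be sketched as follows; the five candidate designs and their (peak force, negative SEA) values are hypothetical:

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Indices of the non-dominated points in a list of objective vectors."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Hypothetical (peak force, -SEA) values for five candidate tube designs.
designs = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
front = pareto_front(designs)
```

A production multi-objective GA (e.g. NSGA-II) adds crowding-distance sorting on top of this filter to keep the front well spread; only the dominance test itself is shown here.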
Findings
It is shown that the proposed approach is able to generate Pareto optimal designs which are in a very good agreement with the actual results.
Originality/value
The paper presents the first application in the literature of a genetic programming-based method, namely GEP, to this class of problems. The proposed approach can be applied to all kinds of related crashworthiness problems.
R.H. Khatibi, R. Lincoln, D. Jackson, S. Surendran, C. Whitlow and J. Schellekens
Abstract
With the diversification of modelling activities encouraged by versatile modelling tools, handling their datasets has become a formidable problem. A further impetus stems from the emergence of the real‐time forecasting culture, transforming data embedded in computer programs of one‐off modelling activities of the 1970s‐1980s into dataset assets, an important feature of modelling since the 1990s, where modelling has emerged as a practice with a pivotal role in data transactions. The scope for data is now vast but in legacy data management practices datasets are fragmented, not transparent outside their native software systems, and normally “monolithic”. Emerging initiatives on published interfaces will make datasets transparent outside their native systems but will not solve the fragmentation and monolithic problems. These problems signify a lack of science base in data management and as such it is necessary to unravel inherent generic structures in data. This paper outlines root causes for these problems and presents a tentative solution referred to as “systemic data management”, which is capable of solving the above problems through the assemblage of packaged data. Categorisation is presented as a packaging methodology and the various sources contributing to the generic structure of data are outlined, e.g. modelling techniques, modelling problems, application areas and application problems. The opportunities offered by systemic data management include: promoting transparency among datasets of different software systems; exploiting inherent synergies within data; and treating data as assets with a long‐term view on reuse of these assets in an integrated capability.
Abstract
Purpose
Initial value problems for the one‐dimensional third‐order dispersion equations are investigated using the reliable Adomian decomposition method (ADM).
Design/methodology/approach
The solutions are obtained in the form of rapidly convergent power series with elegantly computable terms.
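The ADM recursion for the linear dispersion equation u_t + u_xxx = 0 can be sketched symbolically: each series term is obtained by differentiating the previous one three times in x and integrating in t. The sin(x) initial condition below is an illustrative example (its exact solution is sin(x + t)), not necessarily one of the paper's test cases:

```python
import sympy as sp

x, t = sp.symbols("x t")

def adm_dispersion(u0, n_terms=8):
    """Partial sum of the Adomian series for u_t + u_xxx = 0, u(x, 0) = u0.
    Recursion: u_{k+1} = -Integral_0^t (d^3 u_k / dx^3) dt."""
    terms = [u0]
    for _ in range(n_terms - 1):
        terms.append(sp.simplify(-sp.integrate(sp.diff(terms[-1], x, 3), (t, 0, t))))
    return sum(terms)

approx = adm_dispersion(sp.sin(x))  # sin(x) + t*cos(x) - t^2/2*sin(x) - ...
```

Each term is elementary to compute, which is the "elegantly computable" property the abstract refers to; the partial sum reproduces the Taylor expansion of the exact solution in t.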
Findings
It was found that the technique is reliable, powerful and promising. It is easier to implement than the separation of variables method. Modifications of the ADM and the noise terms phenomenon are successfully applied for speeding up the convergence of non‐homogeneous equations.
Research limitations/implications
The method is restricted to initial value problems in which the space variable fills the whole real axis. Modifications are required to deal with initial boundary value problems. Further, the input initial condition is required to be an infinitely differentiable function and obviously, the convergence radius of the decomposition series depends on the input data.
Practical implications
The method was mainly illustrated for linear partial differential equations occurring in water resources research, but the natural extension of the ADM to solving nonlinear problems is extremely useful in nonlinear studies and soliton theory.
Originality/value
The study undertaken in this paper provides a reliable approach for solving both linear and nonlinear dispersion equations and new explicit or recursively‐based exact solutions are found.
Abstract
Flexibility (enhanced cooperation) has arisen in the European Union (EU) agenda as a function of recent enlargement rounds and is now one of the key issues in the construction of the EU polity with respect to diversity management. Whether enlargement has provoked normative reform in the EU, taking flexibility as an example, is the focus of this article. The author argues that the flexibility case indicates that pressures of enlargement have not produced radical normative change in the EU. Tracing the evolution of enhanced cooperation from the 2000 Treaty of Nice onwards, the evidence points towards the continued existence of the traditional ‘frame’ of the integration process rather than its rejection in favour of more radical and innovative solutions to the EU's governance problems.
Abstract
With the agreement to achieve an “internal market” by the end of 1992, the European Community has introduced a new timetable for food law harmonisation. This timetable is only the latest attempt to harmonise European food law. Many delays have occurred since the first Directive was agreed in 1962. The first attempts at harmonisation, the different programmes which have been formulated and the development of case law at the European Court are examined and then aspects of the current policies are discussed.
Tran Phong and Rajib Shaw
Abstract
As a consequence of the huge loss and damage caused by natural disasters all over the world, an impressive amount of attention is currently being given to a holistic approach in disaster risk management (McEntire, Fuller, Johnston, & Weber, 2002). The world experiences more and more natural disaster impacts in spite of numerous efforts, advancing science, and more powerful technologies. Indeed, current disasters are more complex, and climate change poses a greater potential for adverse impacts (Aalst & Burton, 2002). Hence, the existing disaster risk reduction approaches need to be reassessed, both because of problems in current risk management practice and because of new risks brought by climate change and environmental degradation.