Majda Kermadi, Saïd Moussaoui, Abdelhalim Taieb Brahimi and Mouloud Feliachi
Abstract
Purpose
This paper aims to present a data-processing methodology combining kernel change detection (KCD) and efficient global optimization algorithms for solving the inverse problem in eddy current non-destructive testing. The main purpose is to reduce the computational cost of eddy current data inversion, which stems mainly from the heavy forward modelling with the finite element method and the non-linearity of the parameter estimation problem.
Design/methodology/approach
The KCD algorithm is adapted and applied to detect damaged parts in an inspected conductive tube using the probe impedance signal. This localization step reduces the number of measurement data to be processed when estimating the flaw characteristics with a global optimization algorithm (efficient global optimization). The minimized objective function is calculated from data related to the defect detection indexes provided by KCD.
Findings
Simulation results show the efficiency of the proposed methodology in terms of defect detection and localization; a significant reduction of computing time is obtained in the step of defect characterization.
Originality/value
This study is the first of its kind to combine a change detection method (KCD) with a global optimization algorithm (efficient global optimization) for defect detection and characterization, and it shows that such an approach reduces the numerical cost of ECT data inversion.
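The expected-improvement criterion at the heart of efficient global optimization can be sketched in a few lines. This is a minimal illustration under standard EGO assumptions, not the authors' implementation; the function names and candidate values are ours:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a candidate whose surrogate prediction is
    Gaussian with mean mu and std sigma, over the current best objective
    value f_best (minimisation convention)."""
    if sigma == 0.0:
        return 0.0
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (f_best - mu) * cdf + sigma * pdf

# EGO evaluates the expensive model (here, the finite element forward
# solver) only at the candidate maximising expected improvement.
candidates = [(0.5, 0.2), (0.1, 0.8), (-0.2, 0.05)]   # (mu, sigma) pairs
best_candidate = max(range(len(candidates)),
                     key=lambda i: expected_improvement(*candidates[i], 0.0))
```

The criterion balances exploitation (low predicted mean) against exploration (high predictive uncertainty), which is what keeps the number of costly forward simulations small.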
Emmanuel Blanchard, Adrian Sandu and Corina Sandu
Abstract
Purpose
The purpose of this paper is to propose a new computational approach for parameter estimation in the Bayesian framework. A posteriori probability density functions are obtained using the polynomial chaos theory for propagating uncertainties through system dynamics. The new method has the advantage of being able to deal with large parametric uncertainties, non‐Gaussian probability densities and nonlinear dynamics.
Design/methodology/approach
The maximum likelihood estimates are obtained by minimizing a cost function derived from the Bayesian theorem. Direct stochastic collocation is used as a less computationally expensive alternative to the traditional Galerkin approach to propagate the uncertainties through the system in the polynomial chaos framework.
Findings
The new approach is explained and applied to very simple mechanical systems in order to illustrate how the Bayesian cost function can be affected by the noise level in the measurements, by undersampling, by non‐identifiability of the system, by non‐observability and by excitation signals that are not rich enough. When the system is non‐identifiable and a priori knowledge of the parameter uncertainties is available, regularization techniques can still yield the most likely values among the possible combinations of uncertain parameters resulting in the same time responses as the ones observed.
Originality/value
The polynomial chaos method has been shown to be considerably more efficient than Monte Carlo in the simulation of systems with a small number of uncertain parameters. This is believed to be the first time the polynomial chaos theory has been applied to Bayesian estimation.
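The Bayesian cost function described above can be sketched for a scalar parameter with Gaussian noise and a Gaussian prior. This is a toy sketch: the data, names and grid search are ours, and the grid stands in for both the optimiser and the polynomial chaos propagation used in the paper:

```python
def neg_log_posterior(theta, model, data, noise_std, prior_mean, prior_std):
    """Bayesian cost: negative log-likelihood plus negative log-prior
    (up to additive constants), for independent Gaussian noise."""
    residual = sum((model(theta, t) - y) ** 2 for t, y in data)
    likelihood_term = residual / (2.0 * noise_std ** 2)
    prior_term = (theta - prior_mean) ** 2 / (2.0 * prior_std ** 2)
    return likelihood_term + prior_term

# Toy system: response y(t) = theta * t, observed with noise.
model = lambda theta, t: theta * t
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]

# Brute-force minimisation over a parameter grid; the paper instead
# evaluates the system response cheaply via polynomial chaos.
grid = [i / 100 for i in range(100, 300)]
theta_map = min(grid, key=lambda th: neg_log_posterior(th, model, data, 0.1, 2.0, 1.0))
```

Minimising this cost yields the maximum a posteriori estimate; with a flat prior it reduces to the maximum likelihood estimate mentioned in the abstract.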
Abstract
A number of problems in engineering and in the sciences may be formulated as a search for the extremal point of some multivariable utility function f. The Bayes method may be used to achieve this aim. It has recently been applied to the planning of "extremal" experiments in chemistry and agrophysics, to the study of random processes, and to other fields.
Abstract
This chapter explores the multifaceted relationship between quantum computing (QC) and sustainability, with a focus on the Quadratic Unconstrained Binary Optimisation (QUBO) framework. The manuscript delves into the theoretical underpinnings of QUBO and its formulation as a quantum annealing problem, identifying the quantum principles that facilitate the resolution of such optimisation challenges. It offers a critical analysis of the suitability of QUBO for unconstrained problems and its efficacy in consistently locating the global minimum – a pivotal concern in optimisation tasks. Further, this study provides a nuanced discussion on the intersection of QC and sustainability. It delineates the types of optimisation problems within sustainability initiatives that are amenable to formulation as QUBO problems, while also highlighting sustainability challenges that elude the QUBO framework. It argues for the integration of quantum solutions into business operations, highlighting the potential for QC to play a transformative role in achieving sustainability objectives. The critique of the current hype surrounding QC provides a balanced viewpoint, ensuring a grounded approach to the adoption of quantum technologies in tackling pressing global issues.
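The QUBO formulation discussed above can be made concrete with a small classical sketch. Exhaustive search is used here purely to illustrate the objective; a quantum annealer instead samples low-energy states of the same energy function. The toy matrix and names are ours:

```python
from itertools import product

def qubo_energy(Q, x):
    """Energy x^T Q x of a binary assignment x, with Q given as a sparse
    dict mapping index pairs (i, j) to coefficients."""
    n = len(x)
    return sum(Q.get((i, j), 0) * x[i] * x[j]
               for i in range(n) for j in range(n))

def solve_qubo_exhaustive(Q, n):
    """Classical exhaustive search over all 2^n binary assignments."""
    return min(product((0, 1), repeat=n), key=lambda x: qubo_energy(Q, x))

# Toy sustainability-flavoured instance: two candidate facilities with
# benefits 3 and 2 (negative diagonal terms) and a penalty of 5 for
# selecting both (positive off-diagonal term).
Q = {(0, 0): -3, (1, 1): -2, (0, 1): 5}
best = solve_qubo_exhaustive(Q, 2)
```

Note how constraints enter only as penalty terms: QUBO is unconstrained by construction, which is exactly the limitation the chapter examines for sustainability problems with hard constraints.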
Abstract
Purpose
Elderberry (Sambucus nigra) fruits are rich in bioactive components, specifically in anthocyanins. In this study, freeze-dried and powdered elderberry fruits were added to milk, yoghurt and kefir samples at ratios of 5, 10 and 15% (w/w) to fortify these dairy products at home scale, and final products were monitored to understand the behaviour of bioactive compounds.
Design/methodology/approach
The action of bioactive compounds was examined before and after in vitro gastrointestinal digestion by the analysis of total anthocyanin content, total phenolic content and antioxidant activity assays as well as the HPLC-PDA system on the first day of preparation. Moreover, the effect of three days of storage on individual phenolic compounds was evaluated.
Findings
Kefir samples exhibited the highest total anthocyanin content levels among the prepared products (255±4-702±65 µmol cyanidin-3-glucoside eq/100 g), followed by yoghurt samples. Individual phenolics in samples prepared at the same concentrations behaved similarly during the gastric digestion phases (p > 0.05), whereas the intestinal digestion phases caused significant differences, and phenolics in yoghurt samples exhibited higher values than the others (p < 0.05). The stability of bioactive compounds in samples tended to decrease during storage; however, as observed during gastrointestinal digestion, yoghurt and kefir samples provided better matrices than milk for maintaining the presence of bioactive compounds (p < 0.05).
Originality/value
Previous studies have indicated that elderberry fruits contain high levels of bioactive compounds, and these fruits have been used to fortify different food matrices. However, this research paper investigates the interaction between three selected dairy products and elderberry powder with each other for the first time.
Arwen H. DeCostanza, Katherine R. Gamble, Armando X. Estrada and Kara L. Orvis
Abstract
Unobtrusive measurement methodologies are critical to implementing intelligent tutoring systems (ITS) for teams. Such methodologies allow for continuous measurement of team states and processes while avoiding disruption of mission or training performance, and do not rely on post hoc feedback (including for the aggregation of data into measures or to develop insights from these real-time metrics). This chapter summarizes advances in unobtrusive measurement developed within Army research programs to illustrate the variety and potential that unobtrusive measurement approaches can provide for building ITS for teams. Challenges regarding the real-time aggregation of data and applications to current and future ITS for teams are also discussed.
Wei He, Yuanming Xu, Yaoming Zhou and Qiuyue Li
Abstract
Purpose
This paper aims to introduce a method based on the optimizer of the particle swarm optimization (PSO) algorithm to improve the efficiency of a Kriging surrogate model.
Design/methodology/approach
PSO was first used to identify the best group of trend functions and to optimize the correlation parameter thereafter.
Findings
The Kriging surrogate model was used to solve the fuselage optimization problem of an unmanned helicopter.
Practical implications
The optimization results indicated that an appropriate PSO scheme can improve the efficiency of the Kriging surrogate model.
Originality/value
Both the STANDARD PSO and the original PSO algorithms were chosen to show the effect of PSO on a Kriging surrogate model.
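The particle swarm optimizer underlying the approach can be sketched as follows. This is a generic textbook PSO, not the authors' STANDARD PSO variant; the hyperparameters are common defaults, and the quadratic test objective stands in for the Kriging likelihood that the paper actually tunes:

```python
import random

def pso_minimise(f, bounds, n_particles=20, n_iters=100, seed=0):
    """Basic 1-D particle swarm optimiser with inertia, cognitive and
    social velocity terms."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                       # each particle's best position
    pbest_val = [f(x) for x in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social weights
    for _ in range(n_iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)   # clamp to bounds
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i], val
    return gbest, gbest_val

# Stand-in objective with known minimum at x = 2.
x_star, f_star = pso_minimise(lambda x: (x - 2.0) ** 2 + 1.0, (-5.0, 5.0))
```

In the paper's setting, `f` would be the (negative) likelihood of the Kriging model as a function of the correlation parameter, so each PSO evaluation is cheap relative to a finite element run.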
MARK STEWART and PETER WILLETT
Abstract
This paper describes the simulation of a nearest neighbour searching algorithm for document retrieval using a pool of microprocessors. The documents in a database are organised in a multi‐dimensional binary search tree, and the algorithm identifies the nearest neighbour for a query by a backtracking search of this tree. Three techniques are described which allow parallel searching of the tree. A PASCAL‐based, general purpose simulation system is used to simulate these techniques, using a pool of Transputer‐like microprocessors with three standard document test collections. The degree of speed‐up and processor utilisation obtained is shown to be strongly dependent upon the characteristics of the documents and queries used. The results support the use of pooled microprocessor systems for searching applications in information retrieval.
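The backtracking nearest-neighbour search over a multi-dimensional binary search tree can be sketched compactly. This is a generic k-d tree sketch, not the paper's PASCAL simulation; 2-D points stand in for the document term vectors:

```python
def build_kdtree(points, depth=0):
    """Recursively build a k-d tree, splitting on alternating axes."""
    if not points:
        return None
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"point": points[mid], "axis": axis,
            "left": build_kdtree(points[:mid], depth + 1),
            "right": build_kdtree(points[mid + 1:], depth + 1)}

def nearest(node, query, best=None):
    """Backtracking search: descend towards the query's leaf, then unwind,
    exploring the far subtree only when the splitting plane is closer than
    the best squared distance found so far."""
    if node is None:
        return best
    dist = sum((a - b) ** 2 for a, b in zip(node["point"], query))
    if best is None or dist < best[1]:
        best = (node["point"], dist)
    diff = query[node["axis"]] - node["point"][node["axis"]]
    near, far = ((node["left"], node["right"]) if diff < 0
                 else (node["right"], node["left"]))
    best = nearest(near, query, best)
    if diff ** 2 < best[1]:        # the far side may still hold a closer point
        best = nearest(far, query, best)
    return best

tree = build_kdtree([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
point, sq_dist = nearest(tree, (6, 3))
```

The pruning test on the splitting plane is what the paper parallelises: when backtracking must visit both subtrees, the branches can be searched by separate processors.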
Asif Qumer Gill and Deborah Bunker
Abstract
Purpose
In distributed adaptive development environments (DADE), a primary concern is that of human communication and knowledge sharing among developers. Developers' task performance will be enhanced when their task needs are aligned with the communication media or technology capabilities of the development environment. What are the actual communication needs of developers, and how do we enable them to self‐assess and select appropriate communication technology for their tasks in the DADE? The purpose of this paper is to investigate and present research based on the developers' needs for communication technologies in the context of DADE.
Design/methodology/approach
The authors applied an exploratory qualitative research method to investigate, analyze and integrate survey information sourced from 40 developers and to identify their communication technology needs. Based on this information, the authors then built a practical tool, the communication technologies assessment tool (CTAT), to assist developers in the self‐assessment and selection of appropriate communication technologies for their DADE, and to share this assessment knowledge with other developers or teams located in various DADEs.
Findings
The results of this research suggest that an effective CTAT should be an integral part of the DADE; and a DADE should have a “single source of information” in order to avoid possible communication inconsistencies and ambiguities.
Originality/value
The study results and the resultant CTAT may help developers to make informed choices about the assessment and selection of appropriate communication tools; they may also help communication tool and technology service providers to develop and improve their tools based on the identified developers' communication needs.