Search results
1 – 10 of 57
P.B.S. Reddy, K. Nishina and A. Subash Babu
Abstract
Reports on a study carried out on an injection moulding process used to produce agitators for washing machines, following customer complaints. The study revealed considerable variation in the dimensions of the product. Demonstrates how robust design methodology helps to reduce variation in product dimensions and to achieve target values. Analyses carried out using ANOVA and ANOM helped in understanding the dynamics of the process. The selection of optimum process conditions in the presence of multiple responses, especially when the responses conflict with one another, is explained in detail. Also reports on the importance of confirmation experiments and the outcome of this detailed exercise.
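As an illustration of the ANOVA/ANOM analyses mentioned in the abstract above, the following sketch applies a one-way ANOVA and an ANOM-style comparison of level means to synthetic dimension data; the factor, its levels, and all measurement values are hypothetical, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical dimension measurements (mm) at three injection-speed levels
low = rng.normal(50.0, 0.30, 10)
mid = rng.normal(50.2, 0.30, 10)
high = rng.normal(50.6, 0.30, 10)

# One-way ANOVA: does the factor level shift the mean dimension?
f_stat, p_val = stats.f_oneway(low, mid, high)
print(f"F = {f_stat:.2f}, p = {p_val:.4f}")

# ANOM-style view: compare each level mean against the grand mean
grand = np.concatenate([low, mid, high]).mean()
for name, grp in [("low", low), ("mid", mid), ("high", high)]:
    print(f"{name}: mean deviation from grand mean = {grp.mean() - grand:+.3f}")
```

ANOM proper adds decision limits around the grand mean; the deviations printed here are the quantities those limits would be applied to.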
Andrew Terranova, Paul Boxer and Amanda Morris
Abstract
Children's responses to peer victimisation are thought to influence the duration of victimisation, yet research has not clearly indicated the best ways for young people to respond. In the current study, students (n = 403, mean age of nine years, 11 months, 55% female, 53% Caucasian) reported on their peer victimisation experiences and responses at the beginning and end of a school year. Teachers also reported on students' victimisation experiences. Cross‐lagged path analysis indicated a reciprocal association between externalising responses and victimisation. Victimisation early in the school year also resulted in increased internalising responses. Findings also suggest that coping responses are more reliably linked to subsequent victimisation rates in young people who are not yet experiencing high levels of victimisation.
Hironobu Kawamura, Ken Nishina, Masanobu Higashide and Tomomichi Suzuki
Abstract
Purpose
This paper aims to clarify adequate control characteristics for using a control chart on the basis of a case study of the low‐pressure chemical vapor deposition (LPCVD) process, which is one of the semiconductor manufacturing processes.
Design/methodology/approach
The paper opted for a simulation study using data generated by an EWMA model and real data obtained from the LPCVD process.
Findings
The paper provides adequate control characteristics for control charts. It suggests that it is desirable to employ both the quality characteristic and the process rate for monitoring when the process is modeled by the EWMA model. Furthermore, if only one control characteristic is employed, then the process rate is the most adequate characteristic.
Originality/value
This paper newly proposes the process rate as a control characteristic for control charts.
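A minimal sketch of EWMA-based monitoring in the spirit of this abstract, assuming an IMA(1,1)-type process (whose optimal one-step forecast is an EWMA) and illustrative chart parameters; none of the values come from the LPCVD case study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a drifting process whose one-step forecast is an EWMA (IMA(1,1) model)
n, lam_model, sigma = 200, 0.2, 1.0
eps = rng.normal(0, sigma, n)
x = np.empty(n)
level = 0.0
for t in range(n):
    x[t] = level + eps[t]
    level += lam_model * eps[t]        # random-walk component of the level

# EWMA control chart statistic on the observations (illustrative lambda)
lam = 0.2
z = np.empty(n)
z[0] = x[0]
for t in range(1, n):
    z[t] = lam * x[t] + (1 - lam) * z[t - 1]

# Asymptotic 3-sigma limits for the EWMA statistic
limit = 3 * sigma * np.sqrt(lam / (2 - lam))
signals = np.where(np.abs(z) > limit)[0]
print(f"first signal at t = {signals[0] if signals.size else 'none'}")
```

For a process of this kind the chart eventually signals on the drift itself, which is why the paper's question of *which* characteristic to monitor (quality characteristic versus process rate) matters.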
O.O. Adejumo and J.O. Ojo
Abstract
The results of trial experiments carried out with a computer simulation model of a total reflection X‐ray fluorescence (TXRF) system, to determine optimum conditions for detecting certain elements of interest under various analytical conditions in a given ten‐element standard sample, are presented in this paper. Results of these trial experiments show that the detectability of elements improved with increasing applied voltage, up to about 43 kV (for a molybdenum‐anode TXRF spectrometer), and with increasing atomic number. Varying the geometry, such as the glancing incidence angle of the excitation beam, produced a slight increase in minimum detection limit (MDL) values as the angle of incidence was reduced from the optimum value of 1.6 mradian to 1.0 mradian. The nature of the sample support was also observed to affect the detectability of the elements, as good detection limits were obtained when gold was used as the sample holder.
S.T.A. Niaki and Majid Khedmati
Abstract
Purpose
The purpose of this paper is to propose two control charts to monitor multi-attribute processes and then a maximum likelihood estimator for the change point of the parameter vector (process fraction non-conforming) of multivariate binomial processes.
Design/methodology/approach
The performance of the proposed estimator is evaluated for both control charts using some simulation experiments. At the end, the applicability of the proposed method is illustrated using a real case.
Findings
The proposed estimator provides accurate and useful estimates of the change point for almost all shift magnitudes, regardless of the process dimension. Moreover, based on the results obtained, the estimator is robust with regard to different correlation values.
Originality/value
To the best of the authors’ knowledge, no work is available in the literature on estimating the change point of multivariate binomial processes.
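The change-point estimation idea can be sketched for the univariate binomial case (the paper itself treats multivariate processes): the maximum likelihood estimate of the last in-control sample maximises the split log-likelihood over all candidate split points. The shift sizes, sample sizes, and in-control fraction below are hypothetical:

```python
import numpy as np

def binom_loglik(k, n, p):
    # Binomial log-likelihood, ignoring the constant combinatorial term
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

def mle_change_point(k, n, p0):
    """MLE of the last in-control sample index tau: samples 0..tau follow the
    known in-control fraction p0; samples tau+1..T-1 follow p1, estimated
    from the data after tau."""
    T = len(k)
    best_tau, best_ll = 0, -np.inf
    for tau in range(T - 1):
        p1_hat = k[tau + 1:].sum() / n[tau + 1:].sum()
        ll = binom_loglik(k[:tau + 1], n[:tau + 1], p0) + \
             binom_loglik(k[tau + 1:], n[tau + 1:], p1_hat)
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

rng = np.random.default_rng(2)
n = np.full(60, 100)                       # 60 samples of size 100
p0, p1, true_tau = 0.05, 0.15, 39          # shift after sample index 39
k = np.concatenate([rng.binomial(100, p0, 40), rng.binomial(100, p1, 20)])
print("estimated change point:", mle_change_point(k, n, p0))
```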
Abstract
Purpose
The aim of this paper is to circumvent the multi‐distribution effects and small sample constraints that may arise in unreplicated‐saturated fractional factorial designs during construction blueprint screening.
Design/methodology/approach
A simple additive ranking scheme is devised based on converting the responses of interest to rank variables regardless of the nature of each response and the optimization direction that may be issued for each of them. Collapsing all ranked responses to a single rank response, appropriately referred to as “Super‐Ranking”, allows simultaneous optimization for all factor settings considered.
Research limitations/implications
The Super‐Rank response is treated by Wilcoxon's rank sum test or Mann‐Whitney's test, aiming to establish possible factor‐setting differences by exploring their statistical significance. An optimal value for each response is predicted.
Practical implications
It is stressed, by example, that the model may handle simultaneously any number of quality characteristics. A case study based on a real geotechnical engineering project is used to illustrate how this method may be applied for optimizing simultaneously three quality characteristics that belong to each of the three possible cases, i.e. “nominal‐is‐best”, “larger‐is‐better”, and “smaller‐is‐better” respectively. For this reason, a screening set of experiments is performed on a professional CAD/CAE software package making use of an L8(2^7) orthogonal array where all seven factor columns are saturated by group excavation controls.
Originality/value
The statistical nature of this method is discussed in comparison with results produced by the desirability method for the case of exhausted degrees of freedom for the error. The case study itself is a unique paradigm from the area of construction operations management.
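The super-ranking scheme described above, collapsing ranked responses into a single rank response that is then tested with the Wilcoxon-Mann-Whitney statistic, can be sketched as follows; the two responses, the eight runs, and the factor column are all hypothetical:

```python
import numpy as np
from scipy.stats import rankdata, mannwhitneyu

# Hypothetical responses from 8 runs of a saturated design
# (smaller-is-better cost, larger-is-better strength)
cost     = np.array([4.1, 3.8, 5.0, 4.6, 3.2, 3.5, 4.9, 4.4])
strength = np.array([210, 230, 190, 205, 245, 240, 195, 200])

# Rank each response so that rank 1 is always the most desirable run
r_cost     = rankdata(cost)          # smaller-is-better: ascending ranks
r_strength = rankdata(-strength)     # larger-is-better: rank the negated values

super_rank = r_cost + r_strength     # collapse to a single "super-rank"

# Factor A at levels -1/+1 across the 8 runs (hypothetical design column)
A = np.array([-1, -1, +1, +1, -1, -1, +1, +1])
u, p = mannwhitneyu(super_rank[A == -1], super_rank[A == +1])
print(f"Mann-Whitney U = {u}, p = {p:.3f}")
```

A small p-value here flags factor A as active on the combined rank response, which is the kind of decision the method automates regardless of the original units of each response.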
Abstract
Purpose
The purpose of this paper is to develop an integrated engineering process control (EPC)–statistical process control (SPC) methodology for simultaneously monitoring and controlling autocorrelated multiple responses, namely, brightness and viscosity of the pulp bleaching process.
Design/methodology/approach
The pulp bleaching is a process of separating cellulose from impurities present in cooked wood chips through chemical treatment. More chemical dosage or process adjustments may result in better brightness but adversely affect viscosity. Hence, the optimum chemical dosage that would simultaneously minimize the deviation of pulp brightness and viscosity from their respective targets needs to be determined. Since the responses are autocorrelated, dynamic regression is used to model the responses. Then, the optimum chemical dosage that would simultaneously optimize the pulp brightness and viscosity is determined by fuzzy optimization methodology.
Findings
The suggested methodology is validated in 12 cases. The validation results show that the optimum dosage simultaneously minimized the variation in brightness and viscosity around their respective targets. Moreover, the suggested solution was found to be superior to the one obtained by optimizing the responses independently.
Practical implications
This study provides valuable information on how to identify the optimum process adjustments to simultaneously ensure autocorrelated multiple responses on or close to their respective targets.
Originality/value
To the best of the authors’ knowledge, this paper is the first to provide application of the integrated EPC–SPC methodology for simultaneously monitoring multiple responses. The study also demonstrates the application of dynamic regression to model autocorrelated responses.
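A minimal sketch of the fuzzy (max-min) optimization step for two conflicting responses; the linear dose-response models, targets, and tolerances below are assumptions for illustration, not the paper's fitted dynamic regression models:

```python
import numpy as np

# Hypothetical dose-response models (assumed, not the paper's fitted models):
# more chemical dosage raises brightness but lowers viscosity
def brightness(d):  return 82.0 + 4.0 * d          # target 88
def viscosity(d):   return 14.0 - 2.5 * d          # target 11

def membership(value, target, tol):
    """Triangular fuzzy membership: 1 at the target, 0 beyond +/- tol."""
    return max(0.0, 1.0 - abs(value - target) / tol)

# Max-min fuzzy aggregation over a dosage grid: pick the dosage whose
# worst-satisfied response is as well satisfied as possible
doses = np.linspace(0.0, 2.0, 201)
scores = [min(membership(brightness(d), 88, 3.0),
              membership(viscosity(d), 11, 3.0)) for d in doses]
best = doses[int(np.argmax(scores))]
print(f"optimum dosage ~ {best:.2f}, satisfaction = {max(scores):.2f}")
```

Because the two targets pull the dosage in opposite directions, the max-min solution lands between the two single-response optima, which is the compromise behaviour the abstract describes.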
Abstract
Purpose
The purpose of this paper is to propose a manufacturing product‐screening methodology that will require minimal resource expenditures as well as succinct improvement tools based on multi‐response prioritisation.
Design/methodology/approach
A six‐step methodology is overviewed that relies on the sampling efficiency of fractional factorial designs introduced and recommended by Dr G. Taguchi. Moreover, the multi‐response optimisation approach based on the super‐ranking concept is expanded to the more pragmatic situation where prioritising of the implicated responses is imperative. Theoretical developments address the on‐going research issue of saturated and unreplicated fractional‐factorial designs. The methodology promotes the “user‐friendly” incorporation of assigned preference weights on the studied responses. Test efficiency is improved by concise rank ordering. This technique is accomplished by adopting the powerful rank‐sum inference method of Wilcoxon‐Mann‐Whitney.
Findings
Two real‐life case studies complement the proposed technique. The first discusses a production problem on manufacturing disposable shavers. Injection moulding data for factors such as handle weight, two associated critical handle dimensions and a single mechanical property undergo preferential multi‐response improvement based on working specification standards. This case shows that regardless of fluctuations incurred by four different sources of response prioritisation, only injection speed endures high‐statistical significance for all four cases out of the seven considered production factors. Similarly, the technique identifies a single active factor in a foil manufacturing optimisation of three traits among seven examined effects.
Originality/value
This investigation suggests a technique that targets the needs of manufacturing managers and engineers for “quick‐and‐robust” decision making in preferential product improvement. This is achieved by conjoining orthogonal arrays with a well‐established non‐parametric comparison test. A version of the super‐ranking concept is adapted for the weighted multi‐response optimisation case.
Abstract
Purpose
The purpose of this paper is to examine the statistical properties of the volatility index of India, India Vix (Ivix), its relationship with the Indian stock market and its predictive power for forecasting future variance. Further, the paper examines the volatility transmission between India and developed markets.
Design/methodology/approach
The study uses quantile regression and VAR techniques to examine the empirical issues.
Findings
The results of the study show that Ivix returns are negatively related to stock market returns and that the leverage effect is only significant around the middle of the joint distribution. The asymmetric response of Ivix is likewise not observed in the left tail and is significant again around the centre of the distribution. Monthly volatility forecasts obtained from Ivix contain important information about future market volatility. Finally, overnight volatility movements from the US market have a significant effect on the Indian market's volatility, while transmission in the opposite direction was not observed.
Practical implications
If Ivix is included in a stock portfolio, then when the market moves up, Ivix may not fall significantly; consequently, the portfolio returns are not negatively affected. But when the market declines sharply, i.e. for large losses, Ivix may not move up significantly in the opposite direction, thereby failing to provide the much‐needed insurance to portfolio returns. For normal/average market declines, however, volatility derivatives on Ivix may be useful as portfolio insurance tools.
Originality/value
The paper is novel in employing quantile regression methodology to examine the empirical relationships of a volatility index. Volatility spillovers between emerging and developed markets are studied using volatility indices that are ex ante.
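Quantile regression of the kind used in this paper can be sketched by minimising the pinball (check) loss directly; the data below are synthetic stand-ins for market and volatility-index returns, and `quantile_fit` is a hypothetical helper (statsmodels' `QuantReg` would be the usual tool):

```python
import numpy as np
from scipy.optimize import minimize

def quantile_fit(x, y, q):
    """Fit y ~ a + b*x by minimising the pinball (check) loss at quantile q."""
    def loss(theta):
        a, b = theta
        r = y - (a + b * x)
        return np.mean(np.where(r >= 0, q * r, (q - 1) * r))
    return minimize(loss, x0=[0.0, 0.0], method="Nelder-Mead").x

rng = np.random.default_rng(3)
# Synthetic stand-in: a negative relation whose noise scale varies with the
# size of the market move, so slopes can differ across quantiles
mkt = rng.normal(0, 1, 1000)
vix = -0.8 * mkt + rng.normal(0, 1, 1000) * (1 + 0.5 * np.abs(mkt))

for q in (0.1, 0.5, 0.9):
    a, b = quantile_fit(mkt, vix, q)
    print(f"q={q}: slope = {b:+.2f}")
```

Comparing slopes across quantiles is exactly how the paper can conclude that the leverage effect is significant around the centre of the distribution but not in the tails.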
Jiju Antony, Raj Bardhan Anand, Maneesh Kumar and M.K. Tiwari
Abstract
Purpose
To provide a good insight into solving a multi‐response optimization problem using neuro‐fuzzy model and Taguchi method of experimental design.
Design/methodology/approach
In recent years, many manufacturing organizations have resolved multiple‐response optimization problems using past experience and engineering judgement, which increases uncertainty in the decision‐making process. In this paper, a four‐step procedure is proposed to resolve the parameter design problem involving multiple responses. This approach employs the advantages of both an artificial intelligence tool (a neuro‐fuzzy model) and the Taguchi method of experimental design to tackle problems involving multiple‐response optimization.
Findings
The proposed methodology is validated by revisiting a case study to optimize three responses for a double‐sided surface mount technology of an electronic assembly. Multiple signal‐to‐noise ratios are mapped into a single performance statistic through a neuro‐fuzzy‐based model to identify the optimal level settings for each parameter. Analysis of variance is finally performed to identify the parameters significant to the process.
Research limitations/implications
The proposed model will be validated in future by conducting a real life case study, where multiple responses need to be optimized simultaneously.
Practical implications
It is believed that the proposed procedure in this study can resolve a complex parameter design problem with multiple responses. It can be applied to those areas where there are large data sets and a number of responses are to be optimized simultaneously. In addition, the proposed procedure is relatively simple and can be implemented easily by using ready‐made neural and statistical software like Neuro Work II professional and Minitab.
Originality/value
This study adds to the literature on multi‐response optimization, where a combination of the neuro‐fuzzy model and the Taguchi method is utilized hand‐in‐hand.
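The Taguchi signal-to-noise ratios that feed such a multi-response procedure have standard forms for the three response types; the replicated measurements below are hypothetical:

```python
import numpy as np

# Taguchi signal-to-noise ratios for the three standard response types
def sn_smaller_is_better(y):
    return -10 * np.log10(np.mean(np.square(y)))

def sn_larger_is_better(y):
    return -10 * np.log10(np.mean(1.0 / np.square(y)))

def sn_nominal_is_best(y):
    return 10 * np.log10(np.mean(y) ** 2 / np.var(y, ddof=1))

# Replicated measurements from one experimental run (hypothetical values)
defects = np.array([3.0, 2.0, 4.0])           # smaller-is-better
bond    = np.array([48.0, 52.0, 50.0])        # larger-is-better
offset  = np.array([10.1, 9.9, 10.0])         # nominal-is-best

for name, sn in [("defects", sn_smaller_is_better(defects)),
                 ("bond", sn_larger_is_better(bond)),
                 ("offset", sn_nominal_is_best(offset))]:
    print(f"{name}: S/N = {sn:.2f} dB")
```

In all three cases a larger S/N ratio is better, which is what lets heterogeneous responses be mapped into a single performance statistic as the abstract describes.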