Search results

Article
Publication date: 2 April 2021

George Besseris and Panagiotis Tsarouhas

Abstract

Purpose

The study aims to provide a quick-and-robust multifactorial screening technique for early detection of statistically significant effects that could influence a product's life-time performance.

Design/methodology/approach

The proposed method takes advantage of saturated fractional factorial designs for organizing the lifetime dataset collection process. Small censored lifetime data are fitted to the Kaplan–Meier model. Low-percentile lifetime behavior that is derived from the fitted model is used to screen for strong effects. A robust surrogate profiler is employed to furnish the predictions.
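
As a minimal sketch of the low-percentile screening step, the Kaplan–Meier product-limit estimator can be computed directly from a small censored sample and queried at the first decile. The lifetimes below are hypothetical, and the paper's robust surrogate profiler is not reproduced:

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate from right-censored data.

    times  : observed times (failure or censoring)
    events : 1 if a failure was observed, 0 if the unit was censored
    """
    t = np.asarray(times, dtype=float)
    e = np.asarray(events, dtype=int)
    surv, curve = 1.0, []
    for u in np.unique(t[e == 1]):        # distinct failure times
        d = np.sum((t == u) & (e == 1))   # failures at time u
        n = np.sum(t >= u)                # units still at risk
        surv *= 1.0 - d / n
        curve.append((u, surv))
    return curve

def first_decile_life(curve):
    """Smallest time at which estimated survival drops to 0.90 or below."""
    for u, s in curve:
        if s <= 0.90:
            return u
    return None  # first decile not reached in the observed range

# Hypothetical use-rate-accelerated lifetimes (hours) for one design run
times = [310, 450, 480, 520, 550, 610, 640, 700]
events = [1, 1, 0, 1, 1, 0, 1, 0]
print(first_decile_life(kaplan_meier(times, events)))
```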

Findings

The methodology is tested on a difficult published case study that involves the eleven-factor screening of an industrial-grade thermostat. The tested thermostat units are use-rate accelerated to expedite the information collection process. The solution provided by this new method suggests as many as two active effects at the first decile of the data, improving on the solution obtained from more classical methods.

Research limitations/implications

To benchmark the predicted solution against other competing approaches, the results showcase the critical first-decile portion of the dataset. Moreover, prediction capability is demonstrated for the use-rate acceleration condition.

Practical implications

The technique might be applicable to projects where early reliability improvement is studied for complex industrial products.

Originality/value

The proposed methodology offers a range of features that aim to make the product reliability profiling process faster and more robust while managing to be less susceptible to assumptions often encountered in classical multi-parameter treatments.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 1
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 3 April 2017

Panagiotis Tsarouhas and George Besseris

Abstract

Purpose

The purpose of this paper is to provide results for a complete maintainability analysis utilizing data sets from a production system in a shaving blades division of a large high-tech razor manufacturer. Through the illustrated case study, the authors demonstrate how to spot improvement points for enhancing availability by carrying out an equipment effectiveness analysis.

Design/methodology/approach

Descriptive statistics of the repair data and the best-fit index parameters were computed. Repair data were collected from departmental logs, and a preliminary screening analysis was conducted to validate their usefulness for the indicated period of 11 months. Next, the maintainability and repair rate modes for all the machines of the production system were calculated. Maintainability and availability estimations for different time periods that took into account the overall system were obtained by trying out several popular distributions and selecting the optimum statistical model.
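
As a sketch of the model-fitting step, a log-logistic model (SciPy's fisk distribution, the form named in the findings) can be fitted to repair times and queried for the mean time to repair and the 90 percent maintainability time. The repair times below are hypothetical, and the paper's full comparison across candidate distributions is not reproduced:

```python
import numpy as np
from scipy import stats

# Hypothetical repair times (minutes) pooled from departmental logs
ttr = np.array([12, 18, 20, 22, 25, 28, 31, 35, 42, 55, 63, 80], dtype=float)

# Fit a log-logistic (Fisk) model with the location pinned at zero
c, loc, scale = stats.fisk.fit(ttr, floc=0)

mttr = stats.fisk.mean(c, loc=loc, scale=scale)      # mean time to repair
t90 = stats.fisk.ppf(0.90, c, loc=loc, scale=scale)  # time for M(t) = 0.90
print(f"MTTR ~ {mttr:.1f} min; 90% maintainability at ~ {t90:.1f} min")
```

Candidate distributions could be compared on the same data with a goodness-of-fit criterion before committing to one model.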

Findings

Out of the five considered machines in the system, two particular units (M2 and M3) received about half of the repairs. The time to repair follows a loglogistic distribution, and the mean time to repair is accordingly estimated at 25 minutes at the machine level. Repair time performance is approximated at 55 minutes if the system is to attain 90 percent maintainability.

Originality/value

This study is anticipated to serve as an illuminating effort in conducting a complete maintainability analysis in the heavily advertised field of shavers, for which, by contrast, little has been published on operational availability and equipment effectiveness. The case study augments the available pool of sources in which maintainability studies are highlighted, usually under the direction of combined total quality management and total productive maintenance programs.

Details

International Journal of Quality & Reliability Management, vol. 34 no. 4
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 6 January 2012

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a methodology that may aid in assessing ecological quality multi‐trait screening through the use of simple and robust tools while exerting minimal effort in conducting trials and interpreting results.

Design/methodology/approach

Response data for two popular site-monitoring environmental indicators, chemical oxygen demand (COD) and biochemical oxygen demand (BOD), are arranged by implementing an 8-run saturated orthogonal array proposed by Taguchi. Unreplicated data consolidation is performed through the Super-Ranking translation. This permits converting the two eco-traits to a single master response, which becomes much easier to manipulate statistically. Distribution-free multi-factor contrasting provides the data-reduction engine that filters out non-active eco-design variables for a waste treatment unit in a large dairy-products plant.
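
A compact sketch of the Super-Ranking consolidation on an L8(2^7) array follows; the COD and BOD readings are invented, and the published cutoff tables are not reproduced, so only the per-factor rank-sum contrasts are printed:

```python
import numpy as np
from scipy.stats import rankdata

# Standard L8(2^7) orthogonal array (levels coded 1 and 2)
L8 = np.array([
    [1, 1, 1, 1, 1, 1, 1], [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2], [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2], [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1], [2, 2, 1, 2, 1, 1, 2],
])

# Hypothetical eco-responses per run (both smaller-the-better, mg/L)
cod = np.array([410, 380, 455, 300, 520, 350, 480, 330], dtype=float)
bod = np.array([150, 140, 170, 120, 190, 135, 175, 128], dtype=float)

# Super-Ranking: rank each response, add the ranks, re-rank the sums
master = rankdata(rankdata(cod) + rankdata(bod))

# Rank-sum contrast for each of the seven saturated columns
for j in range(7):
    rs1 = master[L8[:, j] == 1].sum()
    rs2 = master[L8[:, j] == 2].sum()
    print(f"factor {j + 1}: rank sums {rs1:.1f} (level 1) vs {rs2:.1f} (level 2)")
# The observed rank sums are then gauged against tabulated cutoff points.
```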

Findings

Environmental quality improvement is achieved by accumulating structured eco-data sets through an unreplicated-saturated L8(2^7) Taguchi design. Concurrent minimization of the two selected eco-responses, COD and BOD, identifies, in a statistically significant fashion, the quantity of incoming wastes, set at its minimum load, as the sole active eco-factor.

Practical implications

Brief but robust experimentation is exploited to gain information about the phenomenological behavior of environmental quality indicators, namely COD and BOD, in facilities that manage wastewater treatment. Design for environment is enforced through standard DOE planning schemes. Collected multi-metric eco-quality data are translated non-parametrically in an easy-to-comprehend manner that requires no assistance from software aids, bypassing more statistically intensive techniques that may demand the involvement of more experienced personnel. The methodology is accessible at any level of statistical competence and is seamlessly intertwined with optimization demands for rapid inference.

Originality/value

The method combines three distinctive "design-and-analysis" elements to provide an optimal solution in a design-for-environment project. The sampling capabilities of Taguchi's orthogonal arrays, in concert with the Super-Ranking transformation, fuse multiple eco-characteristics into a single, easy-to-handle, unitless master eco-response. Order statistics tables recently published in terms of true probabilities have been adopted to supply the proper cutoff points for gauging observed rank sums in an 8-run orthogonal array screening. Quality managers and environmental engineers who contribute routinely to continuous eco-improvement projects in their Total Environmental Quality Management (TEQM) program may find this approach attractive and viable en route to a typical industrial pollution-prevention deployment.

Article
Publication date: 1 May 2009

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a manufacturing product‐screening methodology that will require minimal resource expenditures as well as succinct improvement tools based on multi‐response prioritisation.

Design/methodology/approach

A six-step methodology is overviewed that relies on the sampling efficiency of the fractional factorial designs introduced and recommended by Dr G. Taguchi. Moreover, the multi-response optimisation approach based on the super-ranking concept is expanded to the more pragmatic situation where prioritising the implicated responses is imperative. Theoretical developments address the ongoing research issue of saturated and unreplicated fractional factorial designs. The methodology promotes the "user-friendly" incorporation of assigned preference weights on the studied responses. Test efficiency is improved by concise rank ordering, accomplished by adopting the powerful Wilcoxon-Mann-Whitney rank-sum inference method.
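
A sketch of the weighted variant under assumed preference weights, with SciPy's Mann-Whitney U test standing in for the paper's tabulated rank-sum cutoffs; the responses, weights and factor column below are all hypothetical:

```python
import numpy as np
from scipy.stats import rankdata, mannwhitneyu

# Hypothetical two-response data for an 8-run fractional factorial design
y = np.array([
    [3.1, 12.0], [2.8, 11.5], [3.6, 13.2], [2.5, 10.9],
    [3.9, 13.8], [2.7, 11.1], [3.4, 12.9], [2.6, 11.0],
])
w = np.array([0.7, 0.3])   # assigned preference weights, summing to one

# Weighted super-ranking: weight the per-response ranks, then re-rank
ranks = np.column_stack([rankdata(y[:, k]) for k in range(y.shape[1])])
master = rankdata(ranks @ w)

# Contrast one two-level factor column of the orthogonal array
col = np.array([1, 1, 1, 1, 2, 2, 2, 2])
u, p = mannwhitneyu(master[col == 1], master[col == 2])
print(f"U = {u}, p = {p:.3f}")
```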

Findings

Two real-life case studies complement the proposed technique. The first discusses a production problem in manufacturing disposable shavers. Injection moulding data for factors such as handle weight, two associated critical handle dimensions and a single mechanical property undergo preferential multi-response improvement based on working specification standards. This case shows that, regardless of the fluctuations incurred by four different sources of response prioritisation, only injection speed retains high statistical significance in all four cases out of the seven considered production factors. Similarly, the technique identifies a single active factor among seven examined effects in a foil manufacturing optimisation of three traits.

Originality/value

This investigation suggests a technique that targets the needs of manufacturing managers and engineers for “quick‐and‐robust” decision making in preferential product improvement. This is achieved by conjoining orthogonal arrays with a well‐established non‐parametric comparison test. A version of the super‐ranking concept is adapted for the weighted multi‐response optimisation case.

Details

Journal of Manufacturing Technology Management, vol. 20 no. 4
Type: Research Article
ISSN: 1741-038X

Article
Publication date: 5 August 2014

George Besseris

Abstract

Purpose

The purpose of this paper is to propose a set of process capability indices (PCIs) which are based on robust and agile statistics such that they may be applicable irrespective of the process status.

Design/methodology/approach

The four popular PCIs – Cp, Cpk, Cpm and Cpmk – are reconstructed to improve location and dispersion predictions by introducing robust estimators such as the median and the interquartile range. The proposed PCIs are sequentially evaluated in partitioned regions where fluctuations are found to be insignificant. The runs test, acting as a detector, marks those regions between two consecutive appearances of causes that disrupt data randomness. Wilcoxon's one-sample test is utilized to approximate each PCI's central tendency and its confidence interval across all formed partitions.
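
A sketch of the robust substitution idea: the median replaces the mean and a normalized interquartile range replaces the 6-sigma spread (the factor 1.349 equates the IQR to sigma under normality). This is one common construction, not necessarily the paper's exact formulation:

```python
import numpy as np

def robust_pcis(x, lsl, usl):
    """Median/IQR analogues of Cp and Cpk (one common robust construction)."""
    med = np.median(x)
    q1, q3 = np.percentile(x, [25, 75])
    sigma_r = (q3 - q1) / 1.349   # IQR-based sigma estimate under normality
    cp = (usl - lsl) / (6 * sigma_r)
    cpk = min(usl - med, med - lsl) / (3 * sigma_r)
    return cp, cpk

# Controlled random data, e.g. magnesium content in percent by weight
rng = np.random.default_rng(1)
x = rng.normal(4.6, 0.05, 200)
print(robust_pcis(x, lsl=4.4, usl=4.8))
```

In the full procedure, the same quantities would be recomputed within each randomness-preserving partition flagged by the runs test, with Wilcoxon's one-sample test summarizing the PCI across partitions.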

Findings

The Cpmk depicted the most conservative view of the process status when tracking the magnesium content in a showcased aluminum manufacturing paradigm. Cp and Cpk were benchmarked with controlled random data. It was found that the proposed set of robust PCIs is substantially less prone to false alarms when predicting non-conforming units in comparison with the regular PCIs.

Originality/value

The recommended method for estimating PCIs is purely distribution-free and thus deployable at any process maturity level. The approach defends vigorously against the influence of intruding sources of unknown and unknowable variability. The aim here is to protect the monitoring indicators from unforeseen data instability and breakdown, which are notorious for wreaking havoc on managerial decisions.

Article
Publication date: 9 August 2011

George J. Besseris

Abstract

Purpose

The purpose of this paper is to provide a case study on endorsing process improvement in maritime operations by implementing design of experiments on Lean Six Sigma performance responses. It is demonstrated how process efficiency and environmental muda may be dealt with simultaneously in a lean‐and‐green project driven by hardcore Six Sigma tools.

Design/methodology/approach

A 16-run Taguchi-type orthogonal design was employed to gather data for vessel speed (VS), exhaust gas temperature (EGT) and fuel consumption (FC), as modulated synchronously by a total of 15 controlling parameters. Active dependencies were inferred with the desirability analysis method applied to direct process data from a performance log, maintained for long-term monitoring during the sea voyages of a 55,000 DWT double-skin bulk carrier in service.
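
A sketch of the composite-desirability computation in the familiar Derringer-Suich style, which the abstract's desirability analysis follows in spirit; the operating values and limits below are invented for illustration:

```python
import numpy as np

def d_larger_is_better(y, lo, hi, s=1.0):
    """Individual desirability for a larger-the-better response."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0) ** s

def d_smaller_is_better(y, lo, hi, s=1.0):
    """Individual desirability for a smaller-the-better response."""
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0) ** s

# Invented readings and limits for one candidate engine setting
d_vs = d_larger_is_better(14.2, lo=12.0, hi=14.5)        # vessel speed, knots
d_egt = d_smaller_is_better(345.0, lo=320.0, hi=400.0)   # exhaust gas temp, C
d_fc = d_smaller_is_better(27.5, lo=25.0, hi=33.0)       # fuel use, t/day

# Composite desirability: geometric mean of the individual values
D = (d_vs * d_egt * d_fc) ** (1.0 / 3.0)
print(f"composite desirability D = {D:.3f}")
```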

Findings

A high composite desirability value was achieved, eclipsing the 0.90 mark. Values well over the 0.9 level were also obtained for the individual desirabilities of the three examined responses, VS, EGT and FC. The leading controlling parameters were discovered to be compressor pressure, fuel pump index, slip, governor index and MIP.

Practical implications

A Lean Six Sigma project is carried out to improve performance characteristics in ordinary maritime operations. While the company in the case study outlined in this article no longer relies on periodic inspections to determine machinery conditions, improvements in key process characteristics were nevertheless deemed worth pursuing. Information retrieval from computerized continuous monitoring systems assisted in conducting experimental designs in order to obtain optimal performance. Specifically, the tuning of the vessel's main engine running mode was examined, aiming at increasing the quality levels of output power to the shaft along with a reduction of NOx emissions.

Originality/value

This work adds an interesting paradigm to the critical field of maritime activities for processes running at full gear while operating at sea. Maritime operations are an imperative necessity in expediting international trading transactions. It is the first time that such a case study has emanated from a real pilot Lean Six Sigma project that interlaces process efficiency enhancement with concurrent environmental muda reduction.

Article
Publication date: 4 January 2013

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a methodology that may assist quality professionals in assessing process variation with a combination of tools based on simple robust statistics. The technique targets an alternative way of screening for and detecting common and special causes in individuals control charts (ICC).

Design/methodology/approach

The technique uses the classical box plot to detect and filter out outliers attributed to special causes. The runs test is then used to partition the data streak at points where the p-value exceeds an assigned critical value. The transitions between partitions are where the onset of a common cause is suspected.
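
A sketch of the two ingredients on hypothetical readings; the runs test about the median uses the usual normal approximation rather than exact tables:

```python
import numpy as np
from scipy.stats import norm

def boxplot_filter(x, k=1.5):
    """Drop points outside Tukey's fences (suspected special causes)."""
    q1, q3 = np.percentile(x, [25, 75])
    lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return x[(x >= lo) & (x <= hi)]

def runs_test_p(x):
    """Two-sided runs test about the median (normal approximation)."""
    signs = x > np.median(x)
    n1, n2 = signs.sum(), (~signs).sum()
    runs = 1 + np.sum(signs[1:] != signs[:-1])
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    return 2 * norm.sf(abs((runs - mu) / np.sqrt(var)))

# Hypothetical magnesium readings with two injected off-shoot values
rng = np.random.default_rng(7)
x = np.append(rng.normal(4.6, 0.05, 60), [5.2, 3.9])
clean = boxplot_filter(x)
print(len(x) - len(clean), "outliers removed; runs-test p =",
      round(runs_test_p(clean), 3))
```

In the full procedure, the runs test would be re-applied along the streak, splitting it wherever the p-value crosses the assigned critical value.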

Findings

The approach presented is supplemented with a case study from foundry operations in large-scale can making. It is demonstrated, step by step, how the magnesium content of an aluminium alloy is trimmed against special causes and how the locations of the common causes are then identified.

Research limitations/implications

The proposed method is useful when the collected data do not seem to comply with a known reference distribution. Since it is rare that initial monitoring of a process will abide by normality, the technique saves the time otherwise spent recycling control charts that often point to misleading assignable causes, because the outliers are identified through the box plot in a single pass.

Practical implications

The technique quickly identifies and eliminates the off-shoot values that tend to cause major instability in a process. Moreover, the onset of non-assignable data points is detected in an expedient fashion, without the need to remodel the inspected data series each time or to test against a score of multifarious test rules. The ingredients of the method have been well researched in the past; therefore, they may be implemented immediately without further need to prove their worth.

Originality/value

The method combines two distinctive robust tools in a unique manner to fortify quality monitoring against data-model inconsistencies. The technique is suitable for controlling processes that generate numerical readings. As such, the approach is expected to be useful for industrial as well as service operations.

Article
Publication date: 7 September 2010

George J. Besseris

Abstract

Purpose

Screening simultaneously for effects and their curvature may be useful in industrial environments when an economic restriction on experimentation is imposed. Saturated‐unreplicated fractional factorial designs have been a regular outlet for scheduling screening investigations under such circumstances. The purpose of this paper is to devise a practical test that may simultaneously quantify in statistical terms the possible existence of active factors in concert with an associated non‐linearity during screening.

Design/methodology/approach

The three-level, nine-run orthogonal design is utilized to compute a family of parameter-free reference cumulative distributions by permuting ranked observations via a brute-force method. The proposed technique is simple, practical and non-graphical. It is based on the Kruskal-Wallis test and involves a sum of effects through the squared rank-sum inference statistic. This statistic is appropriately extended to fractional factorial composite contrasting while explicitly avoiding the effect-sparsity assumption.
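
A sketch of the reference-distribution construction for a single three-level, nine-run column: the ranks 1-9 are permuted and the squared rank-sum statistic is tallied. The paper enumerates permutations exhaustively; this sketch samples them instead, which is an approximation:

```python
import numpy as np

rng = np.random.default_rng(0)
col = np.repeat([0, 1, 2], 3)   # one L9 column: three runs per level

def sq_rank_sum(ranks, col):
    """Sum of squared per-level rank sums (Kruskal-Wallis flavored)."""
    return sum(ranks[col == lev].sum() ** 2 for lev in (0, 1, 2))

# Monte Carlo stand-in for brute-force permutation of the ranks 1..9
ref = np.array([sq_rank_sum(rng.permutation(np.arange(1, 10)), col)
                for _ in range(20000)])

# Hypothetical observed ranking of the nine responses
observed = sq_rank_sum(np.array([9, 8, 7, 3, 2, 1, 6, 5, 4]), col)
p = np.mean(ref >= observed)    # upper-tail permutation p-value
print(f"statistic = {observed}, p ~ {p:.4f}")
```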

Findings

The method is shown to be a worthy competitor to mainstream comparison methods and aids in averting potential complications arising from the indiscriminate use of analysis of variance in very-low-sampling schemes where subjective variance pooling is otherwise enforced.

Research limitations/implications

The true distributions obtained in this paper are suitable for sieving a fairly small number of potential control factors while keeping the non-linearity question in the search.

Practical implications

The method is objective and is further elucidated by reworking two recent case studies which account for a total of five saturated screenings.

Originality/value

The statistical tables produced are easy to use and uphold the need for estimating mean and variance effects separately, which are rather difficult to pinpoint in the fast-track, low-volume trials this paper is intended for.

Details

International Journal of Quality & Reliability Management, vol. 27 no. 8
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 17 April 2009

George J. Besseris

Abstract

Purpose

The aim of this paper is to examine product formulation screening at the industrial level in terms of multi‐trait improvement by considering several pertinent controlling factors.

Design/methodology/approach

The study adopts Taguchi's orthogonal arrays (OAs) for sufficient and economical sampling in a mixture problem. Robustness of the testing data is instilled by employing a two-stage analysis in which the controlling components are investigated together while the slack variable is tested independently. The multiple responses are collapsed into a single master response according to the Super Ranking concept. Order statistics are employed to establish statistical significance. The influence of the slack variable is tested by regression and nonparametric correlation.
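
A sketch of the two-stage testing on hypothetical data: Kruskal-Wallis contrasts a jointly varied mixture component, while nonparametric (Spearman) correlation probes the slack variable; all values are illustrative:

```python
import numpy as np
from scipy.stats import kruskal, spearmanr

# Hypothetical master-response ranks for a nine-run OA mixture experiment
master = np.array([5, 2, 8, 1, 9, 3, 7, 4, 6], dtype=float)

# Stage 1: one three-level component column tested with Kruskal-Wallis
col = np.repeat([0, 1, 2], 3)
h, p_kw = kruskal(master[col == 0], master[col == 1], master[col == 2])

# Stage 2: slack-variable amounts (e.g. rich cream) against master ranks
slack = np.array([10, 12, 14, 10, 12, 14, 10, 12, 14], dtype=float)
rho, p_sp = spearmanr(slack, master)

print(f"KW H = {h:.2f} (p = {p_kw:.3f}); Spearman rho = {rho:.2f} (p = {p_sp:.3f})")
```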

Findings

The synergy among Taguchi methodology, super ranking and nonparametric testing proved seamless in offering a practical resolution of product-component activeness. The concurrent modulation of two key product traits by five constituents in the industrial production of muffin-cake is examined. The slack variable, rich cream, is strongly active, while the influence of the added amount of water is barely evident.

Research limitations/implications

The method presented is suitable only for situations where industrial mixtures are investigated. The case study demonstrates prediction capabilities up to quadratic effects for five nominated effects. However, the statistical processor selected here may be adapted to any number of factor settings dictated by the OA sampling plan.

Practical implications

Using a case study from food engineering, the industrial production of a muffin-cake is examined, focusing on a total of five controlling mixture components and two responses. The demonstration emphasizes the dramatic savings in time and effort gained by the proposed method through the reduction of experimental runs, while analysis robustness is improved.

Originality/value

This work interconnects Taguchi methodology with the powerful nonparametric Kruskal-Wallis test for the difficult problem of non-linear mixture analysis in saturated, unreplicated fractional factorial designs, searching for multi-factor activeness in multi-response cases with simple and practical tools.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 4
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 2 March 2010

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a methodology that may aid in assessing information technology (IT) quality characteristic optimisation through the use of simple and robust tools with minimal effort.

Design/methodology/approach

Non-linear saturated fractional factorial designs proposed by Taguchi receive robust data processing through the efficient nonparametric test of Jonckheere and Terpstra.
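
SciPy offers no built-in Jonckheere-Terpstra test, so a minimal normal-approximation version (valid in the absence of ties) is sketched here on hypothetical response data grouped by ordered factor levels:

```python
import numpy as np
from scipy.stats import norm

def jonckheere_terpstra(groups):
    """JT test for an ordered alternative (normal approximation, no ties)."""
    jt = sum(np.sum(x < y[:, None])   # count pairs with x < y across groups
             for i, x in enumerate(groups)
             for y in groups[i + 1:])
    n = np.array([len(g) for g in groups])
    N = n.sum()
    mu = (N ** 2 - np.sum(n ** 2)) / 4.0
    var = (N ** 2 * (2 * N + 3) - np.sum(n ** 2 * (2 * n + 3))) / 72.0
    z = (jt - mu) / np.sqrt(var)
    return jt, 2 * norm.sf(abs(z))

# Hypothetical e-mail delivery times (s) at three ordered factor levels
levels = [np.array([4.1, 3.8, 4.4]),
          np.array([3.2, 3.5, 3.0]),
          np.array([2.4, 2.1, 2.6])]
print(jonckheere_terpstra(levels))
```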

Findings

The paper finds that e-mail quality improvement is achieved by collecting data through an unreplicated-saturated L9(3^4) design. Active influences are attributed to the e-mail volume and the receiving hardware type.

Research limitations/implications

The overall efficiency of the method is greatly enhanced by the incorporation of a nonparametric analysis tool that is known to perform effectively when data availability is minimal. The method does not succumb to normality or multi-distribution effects, which may easily handicap decision making when results are derived from other mainstream methods.

Practical implications

This work has obvious professional and pedagogical aspects, aimed at IT quality practitioners and facilitating the implementation of robust techniques while suppressing quality costs. It is noteworthy that nonparametric data processing improves predictive ability over Taguchi's regular design of experiments (DOE) formulation under small-sampling conditions.

Originality/value

This method combines the design efficiency of non-linear orthogonal arrays with multi-level order statistics, providing the weaponry to deal with quality optimisation in complex environments such as the IT area. The value of this work may be appreciated best by quality managers and engineers engaged in routine quality-improvement projects in information systems; it also augments the general database of quality-related testing cases.

Details

The TQM Journal, vol. 22 no. 2
Type: Research Article
ISSN: 1754-2731
