Search results

1 – 8 of 8
Article
Publication date: 1 December 2004

R. Gandhinathan, N. Raviswaran and M. Suthakar


Abstract

Globalization has offered the global manufacturing community excellent opportunities, together with stringent pressure on cost control. Target costing has emerged as one of the main tools helping manufacturers remain globally competitive. This paper analyses the effect of tools such as quality function deployment (QFD) and value engineering (VE) on target costing and explores how these tools assist in achieving the target cost. The target costing model developed by Cooper and Slagmulder (Cooper, R. and Slagmulder, R., Target Costing and Value Engineering, Productivity Press, New York, NY, 1997) has been modified to incorporate QFD and VE. Because of the inherent uncertainty in the costs of the various elements, the model has been further strengthened with fuzzy logic. The resulting model was implemented in an Indian auto component manufacturing company and the results were analysed. Target costing relies significantly on QFD and VE for its effective implementation. Uncertainty in cost estimation plays a significant role in the target costing process, since any variation in cost violates the cardinal rule of target costing: "the target cost should never be exceeded". Fuzzy logic accounts for this uncertainty and offers a different perspective for arriving at the function cost. The case study evidence suggests that a functional approach (VE) combined with QFD and backed by fuzzy logic works effectively for target costing. The model developed appears to work satisfactorily for an industrial product; its validity for fast-moving consumer goods remains to be ascertained.
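
The fuzzy treatment of cost uncertainty described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual model: the triangular-number representation, the centroid defuzzification, and every cost figure below are assumptions chosen for illustration.

```python
# A minimal sketch of treating uncertain component costs as triangular
# fuzzy numbers (low, likely, high) and checking the defuzzified total
# against a target cost. All figures and function names are hypothetical.

def fuzzy_add(a, b):
    """Add two triangular fuzzy numbers component-wise."""
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number: (low + likely + high) / 3."""
    return sum(tfn) / 3.0

# Hypothetical fuzzy cost estimates for three functions of a component
functions = {
    "housing":  (40.0, 45.0, 52.0),
    "actuator": (30.0, 33.0, 39.0),
    "contacts": (12.0, 14.0, 17.0),
}

total = (0.0, 0.0, 0.0)
for tfn in functions.values():
    total = fuzzy_add(total, tfn)

estimated = defuzzify(total)
target_cost = 95.0  # hypothetical: market price minus required margin

print(f"fuzzy total {total}, defuzzified {estimated:.2f} vs target {target_cost}")
```

The cardinal rule quoted above would then be checked against the defuzzified estimate rather than a single point value.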

Details

International Journal of Quality & Reliability Management, vol. 21 no. 9
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 1 January 2008

Harry Zvi Davis, Roger Mesznik and John Y. Lee

Abstract

This article contributes to the fuzzy logic application literature in accounting by examining a key issue in the use of fuzzy logic: how to find the optimum number of classes that minimizes the decision maker's cost. Two costs are assumed: fuzziness is costly and should be minimized, and adding categories is also costly. To address the issue of finding the optimal number of classes, we define the objective function as cost minimization, determine the costs and benefits of increasing the number of classifications, and ask whether an internal optimum is identifiable and achievable. Ceteris paribus, less fuzziness is preferable to more, but fuzziness can only be reduced by creating more categories, which is itself costly. The optimal number of clusters, the one that minimizes total cost, may not coincide with the "natural" number of categories; it is, nonetheless, a useful and practical way of deciding on the number of classifications. The approach employed in this study is not confined to a management accounting information environment: it can be applied to any information environment where measurable classifications exist.
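
The trade-off the chapter describes can be sketched numerically. The two cost functions below are illustrative assumptions, not the chapter's model: fuzziness cost is taken to shrink as 1/k, and each extra class is taken to carry a fixed cost.

```python
# Illustrative cost trade-off: fuzziness cost falls as the number of
# classes k grows, while class-maintenance cost rises linearly; the total
# is minimized over k. Both cost functions are assumed, not sourced.

def total_cost(k, fuzz_weight=100.0, class_cost=8.0):
    fuzziness = fuzz_weight / k   # assumed: fuzziness cost falls with more classes
    categories = class_cost * k   # assumed: each class carries a fixed cost
    return fuzziness + categories

costs = {k: total_cost(k) for k in range(1, 11)}
optimal_k = min(costs, key=costs.get)
print(f"optimal number of classes: {optimal_k} (total cost {costs[optimal_k]:.1f})")
```

As the chapter notes, the cost-minimizing k found this way need not match the "natural" number of categories in the data.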

Details

Advances in Management Accounting
Type: Book
ISBN: 978-1-84855-267-8

Article
Publication date: 2 November 2018

Nadiye Ozlem Erdil and Omid M. Arani


Abstract

Purpose

This paper aims to investigate to what extent quality function deployment (QFD) can be used in quality improvement rather than design activities.

Design/methodology/approach

A framework was developed for implementing QFD as a quality improvement tool. A case study approach was used to test this framework: quality issues in a ceramic tile manufacturing company were analyzed using the framework.

Findings

The results showed considerable improvements in the identified critical quality characteristics and in sales rates, demonstrating the potential of QFD for assessing and prioritizing areas of improvement and converting them into measurable process or product requirements.
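
The prioritization step at the core of any such QFD exercise is a weighted sum over the house of quality: each technical characteristic's score is requirement importance times relationship strength, summed over requirements. The requirements, characteristics, and 9/3/1 strengths below are invented for illustration, not taken from the case study.

```python
# Hypothetical HoQ fragment for a ceramic tile example. Importances and
# relationship strengths (9 strong, 3 moderate, 1 weak, 0 none) are invented.
importance = {"surface finish": 5, "durability": 4, "uniform colour": 3}

relationships = {
    "surface finish": {"glaze thickness": 9, "firing temperature": 1, "pigment ratio": 0},
    "durability":     {"glaze thickness": 3, "firing temperature": 9, "pigment ratio": 0},
    "uniform colour": {"glaze thickness": 1, "firing temperature": 3, "pigment ratio": 9},
}

scores = {}
for req, row in relationships.items():
    for char, strength in row.items():
        scores[char] = scores.get(char, 0) + importance[req] * strength

# Rank technical characteristics by total weighted score
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for char, score in ranked:
    print(f"{char}: {score}")
```

The top-ranked characteristics are the candidates for the measurable improvement targets the abstract describes.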

Research limitations/implications

Only one case study was completed; further studies would be beneficial to support the current findings.

Practical implications

This framework provides a structured approach and guidelines for practitioners adapting QFD for quality improvements in existing products or processes.

Originality/value

This study proposes a new framework for using QFD in quality improvement activities, expanding its application areas. Moreover, the literature study performed provides a valuable collection of practical QFD implementation examples.

Details

International Journal of Quality and Service Sciences, vol. 11 no. 2
Type: Research Article
ISSN: 1756-669X

Article
Publication date: 21 October 2013

Kalluri Vinayak and Rambabu Kodali


Abstract

Purpose

Quality function deployment (QFD) translates customer requirements into the engineering characteristics of a product, while benchmarking was developed to search for the best industry practices, whose implementation leads to exceptional performance. However, no attempt has been made to integrate QFD with benchmarking to identify the best practices of the QFD model. This paper aims to classify the QFD models and then apply the benchmarking process to propose a best-practice QFD model.

Design/methodology/approach

The fundamental benchmarking model developed by Camp was used to benchmark the existing QFD models available in the literature.

Findings

Benchmarking the QFD models revealed about 36 QFD steps in the first phase of the house of quality. The tools used in executing each practice are also reported.

Research limitations/implications

The proposed model is conceptual and requires validation through implementation in an organization to establish its effectiveness.

Originality/value

Utilizing the benchmarking process to develop the best practices of QFD model is an original concept.

Details

Benchmarking: An International Journal, vol. 20 no. 6
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 25 February 2014

P. Rajiv, R. Logesh, Sekar Vinodh and D. Rajanayagam

Abstract

Purpose

The purpose of this paper is to report a case study in which a financial-feasibility-integrated quality function deployment (QFD) approach was implemented.

Design/methodology/approach

Customer complaints were systematically gathered and the house of quality (HoQ) matrix was developed. The technical descriptors were prioritized and subjected to a financial feasibility study; cost calculations were carried out and actions were derived. A set of value engineering (VE) principles was used during the case study.
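
The cost side of such an approach can be sketched as apportioning a target cost across the HoQ's prioritized technical descriptors and flagging descriptors whose estimated cost exceeds their share. The descriptors and every figure below are hypothetical, not the paper's data.

```python
# Hypothetical: a target cost is apportioned across technical descriptors
# in proportion to their HoQ priority scores; descriptors whose estimated
# cost exceeds the share are flagged for value engineering attention.
target_cost = 120.0
priorities = {"contact material": 50, "spring design": 30, "housing": 20}
estimated = {"contact material": 55.0, "spring design": 32.0, "housing": 25.0}

total_priority = sum(priorities.values())
apportioned = {d: target_cost * p / total_priority for d, p in priorities.items()}

for descriptor, share in apportioned.items():
    status = "over target share" if estimated[descriptor] > share else "within share"
    print(f"{descriptor}: share {share:.1f}, estimate {estimated[descriptor]:.1f} ({status})")
```

A descriptor flagged as over its share is where VE principles would be applied first.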

Findings

The study indicated the need to integrate a financial feasibility study with QFD to enhance the effectiveness of the method. The measures taken to prevent customer complaints will be of considerable value to manufacturing organizations.

Research limitations/implications

During the case study, high cost factors restricted the selection of materials that would exhibit higher performance. The case study was carried out in a single electronic switches manufacturing organization.

Practical implications

The manufacturing costs incurred were reduced by changing the part material. The outcomes of the study have been considered for further implementation in the case organisation, which indicates the practicality of the study.

Originality/value

The concept of apportioning the HoQ cost, developed with the idea of integrating it with QFD, is the authors' original contribution.

Details

Journal of Engineering, Design and Technology, vol. 12 no. 1
Type: Research Article
ISSN: 1726-0531

Article
Publication date: 30 October 2018

Anuoluwapo Ajayi, Lukumon Oyedele, Juan Manuel Davila Delgado, Lukman Akanbi, Muhammad Bilal, Olugbenga Akinade and Oladimeji Olawale


Abstract

Purpose

The purpose of this paper is to highlight the use of big data technologies for health and safety risk analytics in the power infrastructure domain, where data sets of health and safety risks are large but usually sparse and noisy.

Design/methodology/approach

The study focuses on using big data frameworks to design a robust architecture for handling and analysing (exploratory and predictive analytics) accidents in power infrastructure. The designed architecture is based on a coherent health risk analytics lifecycle. A prototype of the architecture, interfacing various technology artefacts, was implemented in Java to predict the likelihood of health hazard occurrences. A preliminary evaluation of the proposed architecture was carried out on a subset of objective data obtained from a leading UK power infrastructure company offering a broad range of power infrastructure services.
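
The paper's architecture and Java prototype are not reproduced here; as a purely illustrative stand-in for the exploratory step, a per-task empirical hazard likelihood could be computed from incident records. The task names and records below are invented.

```python
from collections import Counter

# Hypothetical incident records: (task type, hazard occurred?) pairs.
records = [
    ("overhead line work", True), ("overhead line work", False),
    ("overhead line work", True), ("cable jointing", False),
    ("cable jointing", False), ("cable jointing", True),
    ("substation maintenance", False), ("substation maintenance", False),
]

totals = Counter(task for task, _ in records)
hits = Counter(task for task, occurred in records if occurred)

# Empirical hazard likelihood per task type, highest risk first
likelihood = {task: hits[task] / n for task, n in totals.items()}
for task, p in sorted(likelihood.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {p:.2f}")
```

In the paper's setting this step would sit downstream of the big data ingestion layer and feed the predictive models; here it only shows the shape of the computation.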

Findings

The proposed architecture was able to identify relevant variables and improve preliminary prediction accuracy and explanatory capacity. It also enabled conclusions to be drawn regarding the causes of health risks. The results represent a significant improvement in managing information on construction accidents, particularly in the power infrastructure domain.

Originality/value

This study carries out a comprehensive literature review to advance health and safety risk management in construction. It also highlights the inability of conventional technologies to handle unstructured and incomplete data sets in real-time analytics processing. The study proposes a big data technique for finding complex patterns and establishing the statistical cohesion of hidden patterns for optimal future decision making.

Details

World Journal of Science, Technology and Sustainable Development, vol. 16 no. 1
Type: Research Article
ISSN: 2042-5945

Article
Publication date: 9 January 2020

Khurshid Ahmad, Zheng JianMing and Muhammad Rafi


Abstract

Purpose

This study aims to propose a model, based on the philosophical thoughts of Dr S.R. Ranganathan and the lean-startup method, for the execution of big data analytics (BDA) in libraries. The research paves the way to understanding the role and required competencies of Library and Information Science (LIS) professionals in implementing BDA in libraries.

Design/methodology/approach

In the BDA context, a session presenting the proposed model was held to gather librarians' responses about the required competencies and skills. The research instrument was developed from the literature review to establish the role of LIS professionals and the competencies/skills they require for BDA. The questionnaire was distributed in the BDA session to collect the participants' responses on variables focused on the role and core competencies of LIS professionals in BDA. In the analysis, an independent t-test was applied to compare the mean values of the overall responses.
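
The independent t-test mentioned can be computed directly; the sketch below uses Welch's unequal-variance form (the paper does not specify which variant it used), and the two Likert-scale response groups are invented for illustration, not the paper's data.

```python
import math

def welch_t(a, b):
    """Welch's independent-samples t statistic (unequal variances)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Invented 5-point Likert responses from two groups of respondents
group_a = [4, 5, 4, 3, 5, 4]
group_b = [3, 3, 4, 2, 3, 4]

t = welch_t(group_a, group_b)
print(f"t = {t:.3f}")
```

In practice the statistic would be compared against a t distribution with Welch-Satterthwaite degrees of freedom to obtain a p-value, a step omitted here.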

Findings

The findings show that LIS professionals' perceptions of BDA ranked highest on data privacy, data availability, data organization and data literacy. Digital data curation, policy supervision and the provision of data consultancy also showed significant relationships among these variables. Besides, the required skills for BDA (metadata skills, data ethics, data acquisition, data cleaning, data organization, data analysis, digital curation, data clustering, data protection rules and digital visualization) were also positively correlated.

Originality/value

This study also helps in understanding the perspective of LIS professionals on the implementation of BDA in libraries and fills the literature gap in this area.

Details

Digital Library Perspectives, vol. 36 no. 1
Type: Research Article
ISSN: 2059-5816

Article
Publication date: 7 June 2023

Wenjing Li and Zhi Liu

Abstract

Purpose

In 2016, the Chinese central government decentralized the responsibilities of housing market regulation to the municipal level. This paper aims to assess whether the decentralized market regulation is effective.

Design/methodology/approach

This study first investigates the fundamental drivers of urban housing prices in China. Taking these price drivers into consideration, the authors further investigate the effectiveness of decentralized housing market regulation through a pre- and post-policy comparison using a panel data set of 35 major cities for the years 2014 to 2019.
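
The shape of a pre- and post-policy comparison can be shown with a minimal difference-in-means sketch. The city names and growth rates below are invented, and the paper's actual panel estimation (controls, fixed effects, the one-year lag) is far more involved than this.

```python
# Invented annual housing price growth (%) per city, 2014-2019, with the
# regulation decentralized to municipalities in 2016.
growth = {
    "City A": {2014: 9.0, 2015: 10.5, 2016: 11.0, 2017: 7.0, 2018: 5.5, 2019: 5.0},
    "City B": {2014: 2.0, 2015: 1.5, 2016: 2.5, 2017: 3.0, 2018: 3.5, 2019: 2.8},
}
POLICY_YEAR = 2016

def mean(xs):
    return sum(xs) / len(xs)

results = {}
for city, series in growth.items():
    pre = mean([g for y, g in series.items() if y <= POLICY_YEAR])
    post = mean([g for y, g in series.items() if y > POLICY_YEAR])
    results[city] = post - pre
    print(f"{city}: pre {pre:.2f} -> post {post:.2f} (change {post - pre:+.2f})")
```

The opposite signs of the two changes mimic the heterogeneous effects the paper reports: fast-growth cities tighten while slow-growth cities hold steady or loosen.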

Findings

The results reveal heterogeneous policy effects on housing price growth among cities, with a one-year lag in effectiveness. Under decentralized regulation, cities with fast price growth are incentivized to implement tightening measures, while cities with relatively low housing prices and slow price growth are more likely to do nothing or to deregulate their markets. The findings indicate that the shift from centralized to decentralized housing market regulation is more appropriate and effective for individual cities.

Originality/value

Few policy evaluation studies have examined the effects of decentralized housing market regulation on the performance of urban housing markets in China. The authors devise a methodology for a policy evaluation that is important for informing public policy and decisions. This study helps enhance the understanding of the fundamental factors in China's urban housing markets and of the effectiveness of municipal government interventions.

Details

International Journal of Housing Markets and Analysis, vol. 17 no. 5
Type: Research Article
ISSN: 1753-8270
