Abstract
The fast implementation of multivariable discrete linear systems via VLSI array processors is presented. Two direct state‐space realizations, the block controller form and the block observer form, are used as the basis for the proposed implementations, which are comprised of similar processing elements in a linear configuration with nearest neighbor links. High concurrency is achieved by exploiting both parallelism and pipelining. The implementations are characterized by modularity, local communication and high throughput rates.
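For context, here is a minimal sketch of the block controller form the abstract refers to, using the standard block-companion construction for a system with right matrix-fraction description H(z) = (C_1 z^{N-1} + \cdots + C_N)(z^N I + A_1 z^{N-1} + \cdots + A_N)^{-1}; the exact partitioning used in the paper may differ:

x(k+1) = A\,x(k) + B\,u(k), \qquad y(k) = C\,x(k)

A = \begin{bmatrix} 0 & I & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & I \\ -A_N & -A_{N-1} & \cdots & -A_1 \end{bmatrix}, \qquad B = \begin{bmatrix} 0 \\ \vdots \\ 0 \\ I \end{bmatrix}, \qquad C = \begin{bmatrix} C_N & C_{N-1} & \cdots & C_1 \end{bmatrix}

The block observer form is the dual realization, obtained by transposing (A, B, C). The banded block structure of A is what makes a linear array of identical processing elements with nearest-neighbor links a natural match for this computation.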
D.A. Karras, S.A. Karkanis and B.G. Mertzios
Abstract
This paper suggests a novel methodology for building robust information processing systems based on wavelets and artificial neural networks (ANN) to be applied either in decision‐making tasks based on image information or in signal prediction and modeling tasks. The efficiency of such systems is increased when they simultaneously use input information in its original and wavelet transformed form, invoking ANN technology to fuse the two different types of input. A quality control decision‐making system as well as a signal prediction system have been developed to illustrate the validity of our approach. The first one offers a solution to the problem of defect recognition for quality control systems. The second application improves the quality of time series prediction and signal modeling in the domain of NMR. The accuracy obtained shows that the proposed methodology deserves the attention of designers of effective information processing systems.
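As a rough illustration of the fusion idea described above (feeding a network both the raw signal and its wavelet-domain representation), here is a minimal, self-contained Python sketch; the wavelet (db4), the band-energy features and the MLP size are illustrative assumptions, not the configuration used in the paper:

import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic 1-D signals: two classes with different dominant frequencies.
def make_signal(freq, n=256):
    t = np.linspace(0.0, 1.0, n)
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(n)

X_raw = np.array([make_signal(f) for f in [5] * 50 + [20] * 50])
y = np.array([0] * 50 + [1] * 50)

# Wavelet-domain representation: energy of each wavedec coefficient band.
def wavelet_features(sig, wavelet="db4", level=4):
    return np.array([np.sum(c ** 2) for c in pywt.wavedec(sig, wavelet, level=level)])

X_wav = np.array([wavelet_features(s) for s in X_raw])

# Fusion: the network sees the original signal and its wavelet features together.
X_fused = np.hstack([X_raw, X_wav])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X_fused, y)
print("training accuracy:", clf.score(X_fused, y))

The same pattern extends to image inputs by replacing pywt.wavedec with a 2-D decomposition such as pywt.wavedec2.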
Matjaž Kragelj and Mirjana Kljajić Borštnar
Abstract
Purpose
The purpose of this study is to develop a model for automated classification of old digitised texts to the Universal Decimal Classification (UDC), using machine-learning methods.
Design/methodology/approach
The general research approach follows design science research, in which the problem of UDC assignment for old, digitised texts is addressed by developing a machine-learning classification model. A corpus of 70,000 scholarly texts, fully bibliographically processed by librarians, was used to train and test the model, which was then used to classify old texts in a corpus of 200,000 items. Human experts evaluated the performance of the model.
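The abstract does not specify the learning algorithm, so the following Python fragment is only a shape-of-the-pipeline sketch of text-to-UDC classification; the tiny corpus, the two UDC main classes (5: natural sciences, 8: language and literature) and the TF-IDF/logistic-regression pairing are all illustrative assumptions:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus labelled with UDC main classes
# (5: natural sciences, 8: language and literature).
texts = [
    "the chemical reaction of the acid with the metal",
    "measurement of planetary orbits and stellar motion",
    "grammar of the old literary language",
    "analysis of verse forms in nineteenth century poetry",
]
udc = ["5", "5", "8", "8"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(texts, udc)

# Recommend a UDC class for an unseen text; expected: ['5'].
print(model.predict(["measurement of the chemical reaction rate"]))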
Findings
Results suggest that machine-learning models can correctly assign the UDC at some level for almost any scholarly text. Furthermore, the model can be recommended for the UDC assignment of older texts. Ten librarians corroborated this on 150 randomly selected texts.
Research limitations/implications
The main limitations of this study were unavailability of labelled older texts and the limited availability of librarians.
Practical implications
The classification model can provide a recommendation to the librarians during their classification work; furthermore, it can be implemented as an add-on to full-text search in the library databases.
Social implications
The proposed methodology supports librarians by recommending UDC classifiers, thus saving time in their daily work. By automatically classifying older texts, digital libraries can provide a better user experience by enabling structured searches. Both contribute to making knowledge more widely available and usable.
Originality/value
These findings contribute to the field of automated classification of bibliographical information with the usage of full texts, especially in cases in which the texts are old, unstructured and in which archaic language and vocabulary are used.
Xiaoliang Qian, Jing Li, Jianwei Zhang, Wenhao Zhang, Weichao Yue, Qing-E Wu, Huanlong Zhang, Yuanyuan Wu and Wei Wang
Abstract
Purpose
An effective machine vision-based method for micro-crack detection can economically improve the qualified rate of solar cells. However, how to extract features that have both strong generalization and strong data representation ability is still an open problem for machine vision-based methods.
Design/methodology/approach
A micro-crack detection method based on adaptive deep features and visual saliency is proposed in this paper. The proposed method can adaptively extract deep features from the input image without any supervised training. Furthermore, because micro-cracks strongly attract visual attention when people look at a solar cell's surface, visual saliency is also introduced for micro-crack detection.
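The abstract does not detail either component, so the Python sketch below shows only the visual-saliency side, using the classic spectral-residual method (Hou and Zhang, 2007) as a stand-in for whatever saliency model the paper employs; the toy image and crack coordinates are, of course, hypothetical:

import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def spectral_residual_saliency(img):
    # Spectral-residual saliency for a 2-D grayscale image:
    # keep the phase, replace the log amplitude with its deviation
    # from a local average, and transform back.
    f = np.fft.fft2(img)
    log_amp = np.log(np.abs(f) + 1e-8)
    phase = np.angle(f)
    residual = log_amp - uniform_filter(log_amp, size=3)
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(sal, sigma=2.5)

# Toy "solar cell" image: flat bright background with a thin dark line
# standing in for a hypothetical micro-crack.
img = np.full((128, 128), 0.8)
rows = np.arange(30, 100)
img[rows, rows // 2 + 20] = 0.1
saliency = spectral_residual_saliency(img)
print("crack pixel saliency vs. image mean:",
      saliency[60, 50], saliency.mean())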
Findings
Comprehensive evaluations were carried out on two existing data sets. The subjective experimental results show that most micro-cracks can be detected, and the objective experimental results show that the proposed method achieves better detection precision.
Originality/value
First, an adaptive deep features extraction scheme without any supervised training is proposed for micro-crack detection. Second, the visual saliency is introduced for micro-crack detection.
Abstract
This paper presents an overview of three information‐theoretic methods, which have been used extensively in many areas such as signal/image processing, pattern recognition and statistical inference. These are: the maximum entropy (ME), minimum cross‐entropy (MCE) and mutual information (MI) methods. The development history of these techniques is reviewed, their essential philosophy is explained, and typical applications, supported by simulation results, are discussed.
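For reference, the three methods optimize the following standard quantities (for a discrete distribution p, a prior or reference distribution q, and a joint distribution p(x, y)):

Maximum entropy: maximize H(p) = -\sum_i p_i \log p_i subject to \sum_i p_i = 1 and moment constraints \sum_i p_i\, g_k(i) = \mu_k.

Minimum cross-entropy: given a prior q, minimize D(p \,\|\, q) = \sum_i p_i \log (p_i / q_i) subject to the same constraints.

Mutual information: I(X; Y) = \sum_{x,y} p(x,y) \log \dfrac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y).

Minimum cross-entropy with a uniform reference q reduces to maximum entropy, and mutual information is the cross-entropy between the joint distribution and the product of its marginals, which is why the three methods are naturally surveyed together.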
Ahmet Can Kutlu and Cigdem Kadaifci
Abstract
Purpose
Total quality management (TQM) is a process and philosophy for achieving long-term customer satisfaction by improving products, processes and services effectively and efficiently. TQM implementation is becoming a complex undertaking owing to the growing number of influential factors and key elements labelled critical success factors (CSFs). The purpose of this paper is to analyse the relations between the CSFs of TQM and to give decision makers a clear picture of those relations by determining the most affecting factors (those that affect a larger number of CSFs, and to a higher degree, than the other factors) and the most affected factors (those affected by a larger number of CSFs, and to a higher degree) in a successful TQM implementation.
Design/methodology/approach
The paper applies fuzzy cognitive maps (FCMs), which allow dynamic modelling of a system with a complex network structure in which factors affect one another. The method represents the causal relations and interactions between CSFs under uncertainty and performs qualitative simulations to identify the factors with the highest impact on continuous improvement of the quality management process. The evaluations were performed by five academics with expertise in both TQM and FCM.
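For readers unfamiliar with FCM dynamics, the qualitative simulation amounts to iterating a thresholded weighted sum over the concept activations. The Python sketch below uses the common Kosko-style update with a sigmoid threshold; the four CSF concepts and the weight matrix are hypothetical placeholders, not the values elicited from the five experts in this study:

import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# Hypothetical weights: W[i, j] = influence of CSF j on CSF i, in [-1, 1].
# Concepts (illustrative only): 0 leadership, 1 training,
# 2 customer focus, 3 continuous improvement.
W = np.array([
    [0.0, 0.3, 0.2, 0.0],
    [0.6, 0.0, 0.0, 0.1],
    [0.5, 0.4, 0.0, 0.0],
    [0.4, 0.5, 0.6, 0.0],
])

a = np.array([0.8, 0.2, 0.2, 0.2])  # initial push on the leadership concept
for _ in range(30):                  # iterate to (approximate) convergence
    a = sigmoid(W @ a + a)
print("steady-state activations:", np.round(a, 3))

Concepts whose steady-state activation rises most under a given initial push are read off as the most affected factors; repeating the experiment with a push on each concept in turn identifies the most affecting ones.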
Findings
The FCM analysis shows how the most affecting and most affected factors influence the other CSFs, and thus how a successful TQM implementation can be managed.
Originality/value
The critical factors of TQM implementation are the focus of most empirical studies in the literature; however, none of them considers the dynamic interactions between the factors. This study employs FCM to explore the CSFs that influence the TQM implementation process, considering the relations among them, and to observe the most affecting and most affected factors under changes to the determined CSFs.
Benedict M. Uzochukwu, Silvanus J. Udoka and Femi Balogun
Abstract
Purpose
Managing product life cycle data is important for achieving design excellence, continued operational performance, customer satisfaction and sustainment. As a result, it is important to develop a sustainment simulator to transform life cycle data into actionable design metrics. Currently, there is an apparent lack of technologies and tools to synthesize product lifetime data. The purpose of this paper is to describe how a product sustainment simulator was developed using a fuzzy cognitive map (FCM). As a proof of concept, and to demonstrate the utility of the simulator, an implementation example using product lifetime data as input is presented.
Design/methodology/approach
The sustainment simulator was developed in Visual Basic, and the simulation experiment was accomplished using an FCM. The Statistical Analytical Software tool was used to run the structural equation model programs that provided the initial input into the FCM and the simulator. Product life data were used as input to the simulator.
Findings
There is an apparent lack of technologies and tools to synthesize product lifetime data, which constitutes an impediment to designing the next generation of sustainable products. Modern tools, technologies and techniques must be used if the goal of removing product design and sustainment disablers is to be achieved. Product sustainment can, therefore, be achieved using the simulator.
Research limitations/implications
The sustainment simulator is a tool that demonstrates, in a practical way, how data generated over a product's lifetime can be transformed into actionable design parameters. This paper analyses a sample generated using random numbers; the lack of an actual data set is primarily due to the reluctance of organizations to make real product lifetime data public. Nevertheless, the paper provides a good demonstration of how product lifetime data can be transformed to ensure product sustainment.
Practical implications
The technique used in this paper would be very useful to product designers, engineers and research and development teams in developing data manipulation tools to improve products' operational and sustainable life cycle performance. Sustainment-conscious organizations will, no doubt, gain a strong comparative and competitive advantage over rivals.
Originality/value
Utilizing the simulator to transform product lifetime data into actionable design metrics, with the help of an efficient decision support tool like the FCM, constitutes a step towards supporting product life cycle management. The outcome of this paper alerts product designers to parameters that should be taken into account when designing a new generation of a given product.
Jiakun Wang and Yun Li
Abstract
Purpose
In the new media environment, alongside the convenience brought by the propagation of public opinion information (referred to as public opinion), understanding the evolution of public opinion and strengthening the governance of its spread are of great significance for promoting economic development and maintaining social stability, as well as for effectively resisting the negative impact of its propagation.
Design/methodology/approach
Drawing on the results of empirical research and bibliometric analysis, this paper introduces key factors such as information content and social reinforcement effects at both the internal and external levels, and dynamically designs public opinion spreading rules and netizens' state-transition probabilities. Simulation experiments are then conducted to examine the spreading law of public opinion in two types of online social networks and to identify the key factors influencing its evolution. Based on the experimental results, governance strategies for the propagation of negative public opinion are proposed.
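The abstract does not give the exact transition rules, so the following Python sketch is a generic reinforcement-aware spreading model on a scale-free network in the same spirit: the probability that an ignorant netizen becomes a spreader grows with the appeal of the information content and with the number of spreader neighbors (social reinforcement). All parameter values are hypothetical:

import networkx as nx
import numpy as np

rng = np.random.default_rng(1)
G = nx.barabasi_albert_graph(1000, 3, seed=1)  # stand-in for an online social network

content_appeal = 0.15  # hypothetical attractiveness of the information content
reinforce = 0.05       # hypothetical extra push per additional spreader neighbor
recover = 0.10         # hypothetical spreader -> stifler rate

state = {n: "ignorant" for n in G}
state[0] = "spreader"  # seed node

for _ in range(50):
    nxt = dict(state)
    for n in G:
        if state[n] == "ignorant":
            k = sum(state[m] == "spreader" for m in G[n])
            if k and rng.random() < min(1.0, content_appeal + reinforce * (k - 1)):
                nxt[n] = "spreader"
        elif state[n] == "spreader" and rng.random() < recover:
            nxt[n] = "stifler"
    state = nxt

reached = sum(s != "ignorant" for s in state.values())
print(f"informed fraction after 50 steps: {reached / G.number_of_nodes():.2f}")

Because content_appeal enters every transition directly while reinforcement only adds per-neighbor increments, sweeping these two parameters reproduces the kind of comparison the experiments describe.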
Findings
The results show that, compared with other factors, the propagation of public opinion depends more on the attributes of the information content itself. For the propagation of negative public opinion, regulators should, on the one hand, adopt a flexible guidance strategy to establish a public opinion supervision mechanism and an autonomous system with universal participation; on the other hand, they still need to adopt a rigid governance strategy, focusing on the timing of governance and on netizens with higher network status, to forestall the wide diffusion of negative public opinion.
Practical implications
The research conclusions offer insights for the governance of public opinion in management practice and provide a decision-making reference for regulators in responding reasonably to the propagation of public opinion.
Originality/value
This research proposes a framework for analysing the public opinion propagation process and offers practical guidance for the governance of public opinion propagation.