International Journal of Quality Engineering and Technology (8 papers in press)
Determination of Correlation Level and Process Capability of a Manufacturing Process Using Control Charts
by D.R. Prajapati
Abstract: Statistical process control is an industry-based methodology for measuring and controlling quality during the manufacturing process. The case study presented in this paper deals with the determination of the process capability ratio (PCR), the process capability index (Cpk) and the level of correlation for drive shaft run-outs in a manufacturing industry located in India. Data were taken from the manufacturing process, and X-bar and R control charts were plotted to determine whether the processes are within or out of statistical control. Process capability analysis was carried out to assess the effectiveness of the processes. It is found that 36.02% of the drive shaft run-outs are beyond control, which is a high rejection level and a major concern for the industry. The level of correlation among the observations of the drive shaft run-outs has been computed and matched with the suggested optimal schemes of the Shewhart X-bar chart for a sample size (n) of four. The presented case study shows a level of correlation of 0.50.
Keywords: process capability index; capability ratio; Shewhart X-bar and R charts; manufacturing industry; level of correlation.
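The PCR and Cpk measures discussed in the abstract can be sketched as follows. This is an illustrative computation only: the run-out data are synthetic, the specification limits are hypothetical, and the d2 constant is the standard Shewhart table value for subgroups of four.

```python
import numpy as np

# Synthetic stand-in for 25 subgroups of drive shaft run-outs, n = 4 each;
# values and spec limits below are assumptions for illustration.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.05, scale=0.01, size=(25, 4))

xbar = samples.mean(axis=1)                  # subgroup means (X-bar chart)
R = samples.max(axis=1) - samples.min(axis=1)  # subgroup ranges (R chart)

d2 = 2.059                    # control-chart constant for subgroup size n = 4
sigma_hat = R.mean() / d2     # within-subgroup estimate of process sigma

USL, LSL = 0.08, 0.02         # hypothetical specification limits
mu_hat = xbar.mean()

Cp = (USL - LSL) / (6 * sigma_hat)                        # capability ratio
Cpk = min(USL - mu_hat, mu_hat - LSL) / (3 * sigma_hat)   # accounts for centring

print(round(Cp, 3), round(Cpk, 3))
```

Cpk never exceeds Cp; the two coincide only when the process is exactly centred between the specification limits.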
Quality Insight: A Pi Tail Mathematical Postulation of the Limit of Quality
by Adedeji Badiru
Abstract: In the pursuit of better quality, under the spirit of continuous improvement, it is appropriate to inquire about the potential limit of quality. Using quality engineering and technology tools and techniques, can we continue to improve quality indefinitely, inching toward the limit of quality, ever so infinitesimally? This paper presents a postulation of an asymptotic approach to the limit of quality based on the mathematical reasoning of pi. While there is no mathematical correlation of pi to the physical quality of a product, it is interesting and inspirational to conduct research on the move toward the best possible point on the quality curve. In this regard, the motivational question is: what is the limit of quality? The methodology of this paper suggests some interesting analogies.
Keywords: quality limit; quality engineering; quality technology; learning curve; quality improvement; quality expectation; control limit.
Fault diagnosis for a milk pasteurization plant with missing data
by Ouahab Kadri, L.H. Mouss, Adel Abdelhadi
Abstract: This paper addresses the problem of fault diagnosis from observed data containing missing values amongst the inputs. In order to provide good classification accuracy for the decision function, a novel approach based on a support vector machine (SVM) mixture model and an extreme learning machine (ELM) is developed. The SVM mixture model is used to model the data distribution and is adapted to handle missing values, while the extreme learning machine makes it possible to devise a multiple imputation strategy for the final estimation. The effectiveness of the proposed approach is verified on a milk pasteurization system. The results show that the approach can accurately detect dysfunctions, identify the faults, and is robust in unsupervised process monitoring.
Keywords: fault diagnosis; missing data; support vector machine mixture; extreme learning machine; multiple imputation.
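The multiple-imputation idea behind this work can be sketched in miniature. This is not the paper's SVM-mixture/ELM implementation: it uses synthetic two-class data, fills each missing entry several times by sampling from the observed values of its column, fits a simple nearest-centroid classifier per imputed copy, and majority-votes the predictions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class data (assumption): class 1 shifted by +2 in every input.
X = rng.normal(size=(60, 3)) + np.repeat([[0.0], [2.0]], 30, axis=0)
y = np.repeat([0, 1], 30)
mask = rng.random(X.shape) < 0.1          # ~10% of inputs go missing
X_missing = np.where(mask, np.nan, X)

def impute_once(Xm, rng):
    # One imputation draw: fill NaNs by sampling observed values per column.
    Xi = Xm.copy()
    for j in range(Xi.shape[1]):
        col = Xi[:, j]
        obs = col[~np.isnan(col)]
        col[np.isnan(col)] = rng.choice(obs, size=int(np.isnan(col).sum()))
    return Xi

def centroid_predict(Xtr, ytr, Xte):
    # Toy classifier standing in for the paper's decision function.
    c0, c1 = Xtr[ytr == 0].mean(axis=0), Xtr[ytr == 1].mean(axis=0)
    return (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(int)

M = 5                                      # number of imputed datasets
votes = np.zeros(len(y))
for _ in range(M):
    Xi = impute_once(X_missing, rng)
    votes += centroid_predict(Xi, y, Xi)
pred = (votes > M / 2).astype(int)         # majority vote across imputations

accuracy = (pred == y).mean()
print(round(accuracy, 2))
```

The point of the loop is that the final decision pools over several plausible completions of the data rather than committing to a single imputed dataset.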
A Taylor Series Approach to the Robust Parameter Design of Computer Simulations Using Kriging and Radial Basis Function Neural Networks
by Joseph Bellucci, Kenneth Bauer
Abstract: Robust parameter design is used to identify a system's control settings that offer a compromise between obtaining desired mean responses and minimising the variability about those responses. Two popular combined-array strategies, the response surface model (RSM) approach and the emulator approach, are limited when applied to simulations. In the former case, the mean and variance models can be inadequate due to the high level of non-linearity within many simulations. In the latter case, precise mean and variance approximations are developed at the expense of extensive Monte Carlo sampling. This paper extends the RSM approach to include non-linear metamodels, namely Kriging and radial basis function neural networks. The mean and variance of second-order Taylor series approximations of these metamodels are generated via the multivariate delta method, and the subsequent optimisation problems employing these approximations are solved. Results show that improved mean and variance prediction models, relative to the RSM approach, can be attained at a fraction of the emulator approach's cost.
Keywords: robust parameter design; RPD; Taylor series; simulations; Kriging; neural networks; radial basis function neural networks; RBFNN; delta method; dual response optimisation; metamodelling; response surface modelling; emulators.
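The delta-method step the abstract describes can be sketched on a toy nonlinear response standing in for a Kriging/RBFNN predictor (the actual metamodels are not reproduced here): second-order Taylor approximations of the mean and variance of f(X) for X ~ N(mu, Sigma), checked against Monte Carlo sampling.

```python
import numpy as np

def f(x):                       # toy nonlinear "metamodel" (assumption)
    return np.sin(x[0]) + x[1] ** 2

mu = np.array([0.5, 1.0])       # nominal noise-variable means (assumption)
Sigma = np.diag([0.04, 0.09])   # noise-variable covariance (assumption)

h = 1e-4                        # finite-difference step

def grad(f, x):
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hess(f, x):
    n = len(x); H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

g, H = grad(f, mu), hess(f, mu)
# Second-order delta-method mean and variance of f(X).
mean_taylor = f(mu) + 0.5 * np.trace(H @ Sigma)
var_taylor = g @ Sigma @ g + 0.5 * np.trace(H @ Sigma @ H @ Sigma)

# Monte Carlo check (the expensive route the paper seeks to avoid).
rng = np.random.default_rng(2)
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = np.sin(X[:, 0]) + X[:, 1] ** 2
print(round(mean_taylor, 3), round(Y.mean(), 3))
```

For this smooth toy response the Taylor mean and variance agree with the Monte Carlo estimates to a few decimal places, at a tiny fraction of the sampling cost.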
Six Sigma Evaluation Using Process Capability and X Control Chart for Reducing Variance in Oil Density Characteristic: A Study in Yemen
by Faisal Ali, Amran Ahmed
Abstract: Implementing the Six Sigma method in industries contributes to developing high-quality products with minimal defects and variations in processes. Variations in any process indicate a high ratio of defects. Six Sigma is widely employed and accepted by various organizations but has rarely been investigated in case studies. This paper presents a Six Sigma evaluation based on process capability with the X control chart to reduce variations in the oil density characteristic in the refinery process in Aden, Yemen. The evaluation was conducted based on the estimation of the standard deviation and the process capability. Twenty-five oil density samples were randomly collected from Aden refinery oil, and each sample consisted of four items. The data collected were tested statistically for normality, and the results confirmed that the data follow a normal distribution. The analysis indicated that the sigma levels attained in the Aden refinery are less than 3σ. The results of this study suggest that the X chart based on the Six Sigma estimation method is effective in reducing variance in the oil density characteristic. The study also pointed out that process capability is high when the standard deviation is low. In addition, the sigma levels increase when the process capability increases. Thus, the increase in sigma levels leads to improvements in the performance of the refinery process for the oil density characteristic. Finally, this paper provides the necessary fundamentals and knowledge for quality control researchers and engineers to reduce variances by using Six Sigma.
Keywords: process capability; Six Sigma; variation; statistical process control; density characteristic of petroleum; X control chart; upper and lower quality limits.
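The sigma-level estimate referred to in the abstract can be sketched as the distance from the process mean to the nearer specification limit, measured in estimated standard deviations. The density data and specification limits below are synthetic assumptions, not the Aden refinery measurements.

```python
import numpy as np

# Synthetic stand-in: 25 subgroups of 4 oil-density readings (assumption).
rng = np.random.default_rng(3)
samples = rng.normal(loc=0.846, scale=0.004, size=(25, 4))

mu_hat = samples.mean()
sigma_hat = samples.std(ddof=1)            # pooled standard-deviation estimate

USL, LSL = 0.855, 0.835                    # hypothetical density spec limits
Cp = (USL - LSL) / (6 * sigma_hat)         # process capability
sigma_level = min(USL - mu_hat, mu_hat - LSL) / sigma_hat  # Z to nearer limit

print(round(Cp, 2), round(sigma_level, 2))
```

The two quantities move together, illustrating the abstract's observation that sigma levels increase as process capability increases (and both rise as the standard deviation falls).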
Distribution-free synthetic and runs-rules control charts combined with a Mann-Whitney chart
by Jean-Claude Malela-Majika, Eeva Maria Rapoo
Abstract: A control chart is one of the most important tools used in statistical process control and monitoring (SPCM) to detect changes in quality processes. This paper investigates the performance of improved modified distribution-free synthetic and runs-rules charts combined with a Shewhart Mann-Whitney (MW) control chart, in terms of the average run length (ARL), standard deviation of the run length (SDRL) and median run length (MRL), through intensive simulation. It is observed that the new control charts present very attractive run-length properties and outperform the competing charts in many cases. Numerical examples are given to illustrate the design and implementation of the proposed charts.
Keywords: nonparametric statistical process control; MW statistic; conforming run-length chart; runs-rules; synthetic MW chart; Monte Carlo simulation.
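A minimal sketch of the Shewhart-type Mann-Whitney monitoring statistic underlying such charts: for each incoming test sample of size n, count the pairs (x, y) with y > x against an in-control reference sample of size m, and signal when the count leaves the limits. The 3-sigma limits below are illustrative, not the ARL-calibrated limits studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
X_ref = rng.normal(0, 1, size=100)        # Phase I reference data (synthetic)
m, n = len(X_ref), 5

def mw_statistic(x_ref, y):
    # U = number of (x, y) pairs with y > x
    return int((y[:, None] > x_ref[None, :]).sum())

E_U = m * n / 2                            # in-control mean of U
sd_U = np.sqrt(m * n * (m + n + 1) / 12)   # in-control std dev of U
LCL, UCL = E_U - 3 * sd_U, E_U + 3 * sd_U  # illustrative 3-sigma limits

in_control = mw_statistic(X_ref, rng.normal(0.0, 1, n))  # no shift
shifted = mw_statistic(X_ref, rng.normal(3.0, 1, n))     # large upward shift

print(LCL < in_control < UCL, shifted > UCL)
```

Because U depends only on ranks, the chart's in-control behaviour does not require the normality assumed in the synthetic data above, which is the distribution-free property the paper exploits.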
Preventive Maintenance Modeling in Lifetime Warranty
by Farhad Imani, Ki-Hwan Bae
Abstract: Lifetime warranty is a type of long-term assurance that is now ubiquitous. With a long period of coverage, however, the expected number of failures and the associated costs are likely to add up. Hence, maintenance policies over a lifetime warranty can have a substantial impact on warranty servicing costs. Maintenance policies impose additional costs and are worthwhile only if the reduction in total costs due to maintenance is greater than the maintenance costs. The focus of this paper is on modelling preventive maintenance during a lifetime warranty in order to derive the optimal maintenance policy and the optimal level of repair based on the structures of a cost function and a failure rate function. Our investigation demonstrates that the optimal strategy for preventive maintenance can be achieved by considering minor repairs during early ages and marginal repairs near the end of a product's life. Numerical examples are provided to evaluate our developed models and support the findings.
Keywords: Lifetime warranty; increasing failure rate; preventive maintenance policy; failure rate reduction method.
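The cost trade-off behind this kind of PM optimisation can be sketched with a toy model (not the paper's): failures follow a Weibull intensity with increasing failure rate, each PM action perfectly restores the unit (a simplification of the failure-rate-reduction method), and a grid search finds the PM interval that minimises the expected warranty servicing cost. All parameter values are hypothetical.

```python
import numpy as np

beta, eta = 2.5, 4.0          # Weibull shape/scale in years; beta > 1 => IFR
L = 20.0                      # warranty horizon in years (assumption)
c_pm, c_repair = 1.0, 5.0     # cost per PM action / per minimal repair

def expected_cost(tau):
    # PM every tau years; minimal repairs between PMs cost c_repair each.
    n_pm = L / tau                              # PM actions over the horizon
    failures_per_cycle = (tau / eta) ** beta    # cumulative Weibull hazard
    return n_pm * (c_pm + c_repair * failures_per_cycle)

taus = np.linspace(0.25, 10.0, 400)
costs = np.array([expected_cost(t) for t in taus])
tau_opt = taus[costs.argmin()]

print(round(tau_opt, 2), round(costs.min(), 2))
```

The optimum balances frequent PM (many c_pm charges) against infrequent PM (many failures under an increasing failure rate); with these assumed parameters it lands at a moderate interval well inside the warranty horizon.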
The development of target-based posterior process capability indices and confidence intervals
by Anintaya Khamkanya, Byung Rae Cho, Paul Goethals
Abstract: Quality engineering tools and techniques are often sought as platforms for improving system design, enhancing performance, and optimizing process conditions. Perhaps one of the most popular tools is the process capability index (PCI), which relates the allowable spread of a process defined by engineering specifications to the natural spread of a process. The PCI enables an engineer to assess the performance of a process and thus realize where improvements in product quality may be needed. The vast majority of PCI research involves measuring process performance during the manufacturing stage, prior to the final inspection of products and shipping to the customer. After implementing inspections, however, non-conforming products are typically scrapped when they fail to fall within their specification limits; hence, the actual resulting process distribution shipped to the customer after inspection is truncated. Moreover, the traditional PCI does not account for the loss in quality when product characteristics fail to achieve their process target value. This research, in contrast, proposes indices that consider the underlying result of observations after inspection or when non-conforming products are scrapped, referred to as posterior PCIs. Utilizing a truncated distribution as the basis for measurement along with a target-based quality loss function for capability analyses, several posterior indices are developed corresponding to their traditional non-truncated counterparts. A simulation technique is implemented to compare the proposed posterior PCIs with traditional measures across multiple performance scenarios; finally, the confidence interval approximations for the posterior PCIs are derived. Our results suggest using the proposed posterior indices for capability analyses when industrial processes require that non-conforming products be scrapped prior to shipping to the customer.
Keywords: process capability index; truncated normal distribution; process target; quality loss function.
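The truncated-distribution idea behind the posterior indices can be sketched as follows: when non-conforming items are scrapped at inspection, the shipped distribution is a normal truncated to [LSL, USL]. The index below is a plain Cpk recomputed from the standard truncated-normal mean and standard deviation, as a stand-in for the paper's target-based posterior indices, whose exact form is not given here; the process parameters and limits are hypothetical.

```python
import math

def phi(z):   # standard normal pdf
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def Phi(z):   # standard normal cdf
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def truncated_moments(mu, sigma, lo, hi):
    # Mean and std dev of N(mu, sigma^2) truncated to [lo, hi].
    a, b = (lo - mu) / sigma, (hi - mu) / sigma
    Z = Phi(b) - Phi(a)
    mean = mu + sigma * (phi(a) - phi(b)) / Z
    var = sigma**2 * (1 + (a * phi(a) - b * phi(b)) / Z
                      - ((phi(a) - phi(b)) / Z) ** 2)
    return mean, math.sqrt(var)

mu, sigma = 10.2, 1.0            # process slightly off-centre (assumption)
LSL, USL = 8.0, 12.0             # hypothetical specification limits

m_t, s_t = truncated_moments(mu, sigma, LSL, USL)
Cpk_prior = min(USL - mu, mu - LSL) / (3 * sigma)
Cpk_posterior = min(USL - m_t, m_t - LSL) / (3 * s_t)

print(round(Cpk_prior, 3), round(Cpk_posterior, 3))
```

Truncation pulls the shipped mean toward the centre of the specification band and shrinks the standard deviation, so the capability seen by the customer after scrapping exceeds the capability of the untruncated process, which is the distinction the posterior PCIs formalise.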