Forthcoming articles

 


International Journal of Quality Engineering and Technology

 

These articles have been peer-reviewed and accepted for publication in IJQET, but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

 

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

 

Register for our alerting service, which notifies you by email when new issues of IJQET are published online.

 

We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

 

International Journal of Quality Engineering and Technology (11 papers in press)

 

Regular Issues

 

  • Distribution-free synthetic and runs-rules control charts combined with a Mann-Whitney chart
    by Jean-Claude Malela-Majika, Eeva Maria Rapoo 
    Abstract: A control chart is one of the most important tools used in statistical process control and monitoring (SPCM) to detect changes in quality processes. This paper investigates the performance of improved modified distribution-free synthetic and runs-rules charts combined with a Shewhart Mann-Whitney (MW) control chart, in terms of the average run length (ARL), standard deviation of the run length (SDRL) and median run length (MRL), through intensive simulation. It is observed that the new control charts present very attractive run-length properties and outperform competing charts in many cases. Numerical examples illustrate the design and implementation of the proposed charts.
    Keywords: nonparametric statistical process control; MW statistic; conforming run-length chart; runs-rules; synthetic MW chart; Monte Carlo simulation.
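
    As a rough illustration of the run-length estimation described above, here is a minimal Monte Carlo sketch in Python for a basic Shewhart-type Mann-Whitney chart. The sample sizes, control limits and shift are illustrative placeholders, not the designs proposed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      def mw_statistic(x, y):
          # U = number of pairs (x_i, y_j) with x_i < y_j
          return int(np.sum(x[:, None] < y[None, :]))

      def run_length(x_ref, n, lcl, ucl, shift=0.0, max_len=50_000):
          # Sample until the charting statistic falls outside the limits.
          for t in range(1, max_len + 1):
              y = rng.normal(shift, 1.0, n)      # test sample (shifted if out of control)
              u = mw_statistic(x_ref, y)
              if u < lcl or u > ucl:
                  return t
          return max_len

      m, n = 100, 5                              # reference / test sample sizes
      x_ref = rng.normal(0.0, 1.0, m)            # Phase I reference sample
      lcl, ucl = 100, 400                        # placeholder limits for U in [0, m*n]
      rls = np.array([run_length(x_ref, n, lcl, ucl, shift=0.5) for _ in range(5000)])
      print(f"ARL={rls.mean():.1f}  SDRL={rls.std(ddof=1):.1f}  MRL={np.median(rls):.0f}")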

  • Preventive Maintenance Modeling in Lifetime Warranty
    by Farhad Imani, Ki-Hwan Bae 
    Abstract: A lifetime warranty is a type of long-term assurance that is now ubiquitous. With a long period of coverage, however, the expected number of failures and the associated costs are likely to add up. Hence, maintenance policies over a lifetime warranty can have a substantial impact on warranty servicing costs. Maintenance policies impose additional costs and are worthwhile only if the resulting reduction in total costs exceeds the cost of maintenance. The focus of this paper is on modelling preventive maintenance during a lifetime warranty in order to derive the optimal maintenance policy and the optimal level of repair, based on the structures of a cost function and a failure rate function. Our investigation demonstrates that the optimal strategy for preventive maintenance is to apply minor repairs early in a product's life and marginal repairs near the end of its life. Numerical examples are provided to evaluate the developed models and support the corresponding results.
    Keywords: Lifetime warranty; increasing failure rate; preventive maintenance policy; failure rate reduction method.
    DOI: 10.1504/IJQET.2017.10009738
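
    The following Python snippet sketches the kind of cost trade-off the abstract describes: a periodic preventive maintenance policy over a lifetime-warranty horizon, with a Weibull (power-law) failure intensity, minimal repair between PM actions, and each PM reducing the item's virtual age. All parameter values are assumptions, not the paper's model.

      import numpy as np

      beta, eta = 2.5, 3.0            # Weibull shape/scale (increasing failure rate)
      L = 15.0                        # warranty horizon (years)
      c_repair, c_pm = 100.0, 40.0    # cost per minimal repair / per PM action
      delta = 0.6                     # fraction of virtual age removed by each PM

      Lam = lambda t: (t / eta) ** beta        # cumulative intensity: E[failures by age t]

      def warranty_cost(T):
          v, t, cost = 0.0, 0.0, 0.0           # virtual age, calendar time, total cost
          while t + T < L:
              cost += c_repair * (Lam(v + T) - Lam(v)) + c_pm
              v = (1.0 - delta) * (v + T)      # PM rejuvenates the item
              t += T
          cost += c_repair * (Lam(v + (L - t)) - Lam(v))   # tail interval, no final PM
          return cost

      Ts = np.linspace(0.25, 7.5, 300)
      best = min(Ts, key=warranty_cost)
      print(f"optimal PM interval ~ {best:.2f} years, cost ~ {warranty_cost(best):.1f}")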
     
  • The development of target-based posterior process capability indices and confidence intervals
    by Anintaya Khamkanya, Byung Rae Cho, Paul Goethals 
    Abstract: Quality engineering tools and techniques are often sought as platforms for improving system design, enhancing performance, and optimizing process conditions. Perhaps one of the most popular tools is the process capability index (PCI), which relates the allowable spread of a process defined by engineering specifications to the natural spread of a process. The PCI enables an engineer to assess the performance of a process and thus realize where improvements in product quality may be needed. The vast majority of PCI research involves measuring process performance during the manufacturing stage, prior to the final inspection of products and shipping to the customer. After implementing inspections, however, non-conforming products are typically scrapped when they fail to fall within their specification limits; hence, the actual resulting process distribution shipped to the customer after inspection is truncated. Moreover, the traditional PCI does not account for the loss in quality when product characteristics fail to achieve their process target value. This research, in contrast, proposes indices that consider the underlying result of observations after inspection or when non-conforming products are scrapped, referred to as posterior PCIs. Utilizing a truncated distribution as the basis for measurement along with a target-based quality loss function for capability analyses, several posterior indices are developed corresponding to their traditional non-truncated counterparts. A simulation technique is implemented to compare the proposed posterior PCIs with traditional measures across multiple performance scenarios; finally, the confidence interval approximations for the posterior PCIs are derived. Our results suggest using the proposed posterior indices for capability analyses when industrial processes require that non-conforming products be scrapped prior to shipping to the customer.
    Keywords: process capability index; truncated normal distribution; process target; quality loss function.
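
    As an illustration of the truncation idea, the sketch below compares a traditional Cpk with an analogous index computed from the normal distribution truncated at the specification limits, i.e. after non-conforming items are scrapped. It is a simplified stand-in for the paper's posterior indices; all values are assumptions.

      from scipy.stats import truncnorm

      lsl, usl = 9.0, 11.0            # specification limits
      mu, sigma = 10.2, 0.5           # untruncated process mean / standard deviation

      cpk = min(usl - mu, mu - lsl) / (3 * sigma)

      a, b = (lsl - mu) / sigma, (usl - mu) / sigma      # standardized truncation bounds
      mu_t = truncnorm.mean(a, b, loc=mu, scale=sigma)   # mean after scrapping
      sd_t = truncnorm.std(a, b, loc=mu, scale=sigma)    # std dev after scrapping
      cpk_post = min(usl - mu_t, mu_t - lsl) / (3 * sd_t)

      print(f"Cpk (full) = {cpk:.3f}   Cpk-style index (truncated) = {cpk_post:.3f}")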

  • Optimization of Food Processing with Multiple Quality Characteristics Using the Desirability Function
    by Jesús Gabriel Rangel-Peraza, Edith Padilla-Gasca, Yaneth A. Bustos-Terrones, Jaime Rochin-Medina, Abraham Rodriguez-Mata, Antonio J. Sanhouse-García 
    Abstract: The desirability function is a statistical methodology for finding optimal solutions for response variables in multiple-objective optimization. It is widely used to find a global optimum response for many manufacturing processes, including food processes. In this investigation, a 2^4 factorial design with four center points was used to find the best formulation conditions for the cucumber chutney production process. The factors taken into account were osmotic dehydration time, thermal treatment time, treatment temperature and formulation. The response variables measured were water activity, efficiency, total soluble solids, viscosity and pH. The results showed that it was possible to optimize all response variables simultaneously through the desirability function methodology. The main contribution of this study is an optimization strategy that reduces a food development problem with multiple quality characteristics to a simple mathematical model.
    Keywords: Optimization; food processing; quality characteristics; desirability function; design of experiments; response surface methodology; factorial design; cucumber chutney; quality parameters; water activity; efficiency; total soluble solids; viscosity; pH.
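
    A minimal sketch of the Derringer-type desirability calculation the abstract relies on: each response is mapped to [0, 1] and the overall desirability is the geometric mean. The targets and limits below are hypothetical, not the chutney study's actual values.

      import numpy as np

      def d_target(y, low, target, high):
          # "Target is best": 1 at the target, falling to 0 at the limits.
          if y <= low or y >= high:
              return 0.0
          if y <= target:
              return (y - low) / (target - low)
          return (high - y) / (high - target)

      def d_smaller(y, target, high):
          # "Smaller is better": 1 at/below the target, 0 at/above the upper limit.
          return float(np.clip((high - y) / (high - target), 0.0, 1.0))

      # Hypothetical measured responses for one candidate formulation:
      d1 = d_smaller(0.92, target=0.85, high=0.97)          # water activity
      d2 = d_target(4.5, low=3.8, target=4.3, high=4.8)     # pH
      d3 = d_target(42.0, low=35.0, target=45.0, high=55.0) # total soluble solids
      overall = (d1 * d2 * d3) ** (1 / 3)                   # geometric mean
      print(f"overall desirability D = {overall:.3f}")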

  • Statistical Analysis of the Research Carried Out on Lean and Six Sigma Applications in the Healthcare Industry
    by Gaurav Suman, D.R. Prajapati 
    Abstract: Lean and Six Sigma are two complementary methodologies: Lean focuses on reducing waste and increasing speed, whereas Six Sigma focuses on reducing variation and increasing consistency. The purpose of this paper is to provide an overview of Lean and Six Sigma applications in the healthcare sector. The work done by many researchers in the healthcare industry is discussed. The literature survey shows that most of the studies (42%) focus on reducing processing time, and that the stream of such studies has continued without interruption. Pareto chart analysis was performed on the number of studies across countries and departments. It is found that more than 50% of the studies were carried out in the United States of America (USA) alone, and that 22% were performed in emergency departments in various countries. Matrix plots show the number of studies in different countries and departments over the timeline from the year 2000 to date. It is also found that Lean and Six Sigma methodologies were uniformly applied in emergency and surgery departments, whereas among countries only the USA shows continuous application of Lean and Six Sigma techniques.
    Keywords: Lean & Six Sigma; Quality in healthcare; Processing time; Productivity; Length of stay.
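
    The Pareto-style tally behind such an analysis is straightforward; the sketch below uses hypothetical study counts per country, not the paper's data.

      counts = {"USA": 52, "UK": 12, "India": 9, "Netherlands": 6, "Others": 21}
      total = sum(counts.values())
      cum = 0
      # Sort descending and print count, share, and cumulative share per country.
      for country, n in sorted(counts.items(), key=lambda kv: -kv[1]):
          cum += n
          print(f"{country:12s} {n:3d}  {100*n/total:5.1f}%  cumulative {100*cum/total:5.1f}%")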

  • Hotelling's T2 control chart with variable sampling interval and variable dimension
    by Reza Shokrizadeh, Mohammad Dolatabadi, Yaqub Yaqubinejad 
    Abstract: The T2 control chart with variable dimension is useful when there is a set of p1 variables that are easy or cheap to monitor, alongside a set of p2 variables, with p = p1 + p2, that are difficult and/or expensive to monitor but whose information is important for quickly detecting a process quality shift. In such cases, monitoring the full set of p variables at every sample may be difficult or expensive, whereas monitoring only the p1 variables most of the time, and the full set of p variables only when the process appears to have a problem, can be cheaper on average and very efficient. However, this scheme is not effective when the shift size is small. To obtain good performance in detecting such shifts, we propose applying the variable sampling interval technique to the VDT2 control chart, resulting in the VSIVDT2 control chart.
    Keywords: Markov chain; Variable sampling interval; Variable dimension; Average time to signal; Control chart.
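
    A bare-bones sketch of the charting logic (not the paper's optimized design): Hotelling's T2 is computed on the cheap p1-variable subset by default, and the full p-variable vector and a short sampling interval are used when the statistic falls in a warning region. Limits and parameters are illustrative.

      import numpy as np

      def t2(x, mu, sigma_inv):
          # Hotelling's T2 statistic for one observation vector.
          d = x - mu
          return float(d @ sigma_inv @ d)

      def next_state(t2_value, warn, ucl):
          if t2_value > ucl:
              return "signal", None
          if t2_value > warn:                      # warning region:
              return "full_p", "short_interval"    # monitor all p variables, sample soon
          return "p1_only", "long_interval"        # cheap variables, relaxed sampling

      mu = np.zeros(2); sigma_inv = np.eye(2)      # in-control parameters for p1 = 2
      x = np.array([0.8, -1.9])                    # current p1-dimensional observation
      print(next_state(t2(x, mu, sigma_inv), warn=4.0, ucl=10.6))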

  • Parameter Optimization for Gold Electroplating of Gold Jewelry
    by Chanpen Anurattananon 
    Abstract: The purpose of this research is to reduce the over-specification of thickness in the gold jewelry electroplating process, using experimental design to analyze the parameter conditions affecting gold thickness and to find the optimal settings for controlling the specification and reducing the average thickness. The methodology comprises 11 steps: 1) study the gold jewelry electroplating process and identify problems; 2) analyze the causes of the problems and identify the factors; 3) analyze the correlation between gold thickness and gold percentage; 4) test the variation of gold thickness and percentage arising from external variation; 5) analyze the measurement system (MSA); 6) analyze process capability; 7) conduct a factorial experiment; 8) collect the response data; 9) analyze the effects of the factors; 10) run a confirmation experiment; 11) analyze and summarize the experimental results. The factors were electroplating period, gold concentration, electric current and electroplating temperature, each at two levels. Four responses were collected: average gold thickness, minimum gold thickness, standard deviation of gold thickness and average gold percentage on a stainless steel sheet. A 0.05 significance level was used. The results showed that the appropriate parameters were an electroplating period of 17 minutes, a gold concentration of 0.4 grams per liter, an electric current of 0.6 amperes per square decimeter and an electroplating temperature of 60 degrees Celsius. The experiment reduced the electroplating period by 4.20 minutes, the average gold thickness by 0.13 micron, the minimum gold thickness by 0.19 micron and the average gold percentage by 3.91 percent. These results allowed the specification to be tightened from a maximum thickness of 3.75 microns to 3.55 microns and a mid-thickness average of 3.50 microns to 3.45 microns.
    Keywords: Parameter optimization; Gold jewelry; 2^k factorial experiment; Electroplating process.
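
    A short sketch of main-effect estimation in a 2^4 factorial experiment of the kind described above. The coded design matrix is standard; the response values are synthetic stand-ins, not the study's measurements.

      import numpy as np
      from itertools import product

      levels = np.array(list(product([-1, 1], repeat=4)))   # 16 coded runs for A, B, C, D
      rng = np.random.default_rng(0)
      # Synthetic thickness response with small A and B effects plus noise:
      y = 3.5 + 0.1*levels[:, 0] - 0.05*levels[:, 1] + rng.normal(0, 0.02, 16)

      for name, col in zip("ABCD", levels.T):
          # Main effect = mean response at the high level minus at the low level.
          effect = y[col == 1].mean() - y[col == -1].mean()
          print(f"main effect of {name}: {effect:+.3f}")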

  • Uncertainties estimation in identification of digital planar surface parameters using a 3D laser sensor
    by Mohamed Barki, Farid Asma 
    Abstract: In modern metrological applications, 3D laser sensors are used to measure surface parameters. However, the performance of a 3D laser sensor is influenced by its position and orientation angles relative to the target, which causes uncertainties in the measured parameters. In this paper, the influence of the position and orientation of a 3D laser plane sensor is studied by determining the measurement uncertainties on the parameters of a flat surface. The relative position and the two orientation angles between the sensor and the surface are taken as the main characteristics of the sensor and are considered separately. To determine the measurement uncertainties, a set of measurements is performed on a reference part using a Coordinate Measuring Machine (CMM) equipped with a laser plane sensor.
    Keywords: 3D sensor; Uncertainties estimation; CMM.
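
    A simplified sketch of the underlying computation: fit a plane to a laser point cloud by least squares and take the spread of the fitted parameters over repeated scans as an uncertainty estimate. The synthetic scans stand in for CMM measurements; noise levels are assumptions.

      import numpy as np

      rng = np.random.default_rng(2)

      def fit_plane(pts):
          # Solve z = a*x + b*y + c in the least-squares sense.
          A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
          coef, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
          return coef                                  # (a, b, c)

      params = []
      for _ in range(30):                              # 30 repeated scans
          x, y = rng.uniform(0, 50, (2, 500))          # mm, sensor footprint
          z = 0.002*x - 0.001*y + 5.0 + rng.normal(0, 0.01, 500)  # true plane + noise
          params.append(fit_plane(np.c_[x, y, z]))

      std = np.std(params, axis=0, ddof=1)             # scan-to-scan parameter spread
      print(f"u(a)={std[0]:.2e}  u(b)={std[1]:.2e}  u(c)={std[2]:.4f} mm")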

  • Quality Insights: Artificial Neural Network and Taxonomical Analysis of Activity Networks in Quality Engineering
    by Adedeji Badiru 
    Abstract: The resurgence and advancement of artificial intelligence (AI) has spurred new research interest in applying or re-applying proven techniques from the 1990s to challenges in business and industry. The area of quality engineering and technology is particularly amenable to the application of AI techniques. Quality engineering programs are predicated on recurring quality projects that are made up of specific activities, and the proper management of such activities is a precursor to the success of any quality program. This paper explores the application of the proven technique of artificial neural networks to activity networking in the management of quality programs.
    Keywords: Neural Network; Artificial Intelligence; Project Management; Quality Engineering; Quality Technology; Activity Networking; Taxonomical Analysis; Quality Decisions; Kohonen Networks; Activity Scheduling.
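
    Since the keywords point to Kohonen networks, here is a bare-bones Kohonen self-organizing map training loop. The activity "feature vectors" (e.g. duration, cost, criticality, scaled to [0, 1]) are hypothetical stand-ins; the paper's actual network design may differ.

      import numpy as np

      rng = np.random.default_rng(3)
      grid = rng.random((5, 5, 3))                     # 5x5 map, 3 features per node
      acts = rng.random((200, 3))                      # 200 quality-project activities

      for epoch in range(50):
          lr = 0.5 * (1 - epoch / 50)                  # decaying learning rate
          radius = max(2.0 * (1 - epoch / 50), 0.5)    # decaying neighborhood radius
          for v in acts:
              # Best-matching unit: node whose weights are closest to the input.
              d = np.linalg.norm(grid - v, axis=2)
              bi, bj = np.unravel_index(d.argmin(), d.shape)
              ii, jj = np.indices(d.shape)
              h = np.exp(-((ii - bi)**2 + (jj - bj)**2) / (2 * radius**2))
              grid += lr * h[..., None] * (v - grid)   # pull neighborhood toward input

      print("trained map shape:", grid.shape)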

  • Analytical approach to product reliability estimation: a case study of an automotive clutch system
    by Hamed Niknafs, Morteza Faridkhah, Camelia Kazemi 
    Abstract: The study of reliability is an important part of the engineering design process, forming the basis of analysis and judgment on the future performance of a product. Since the future cannot be predicted with absolute certainty, the nature of reliability leads us to probability theory and uncertainty modelling. The quantitative calculation of reliability for mechanical systems within the different steps of production requires an analytical and systematic approach, which is the focus of this paper. The proposed approach is applied to calculating the reliability of a clutch system as a case study. The system reliability is determined using the block diagram method, as a function of individual component reliabilities calculated by statistical analysis of life test results. Using the Weibull model, the reliability of a typical clutch system is formulated based on durability bench tests, and the results are interpreted to estimate field reliability.
    Keywords: Reliability; block diagram; product life; clutch system; Weibull distribution model.
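
    A compact sketch of the series block-diagram calculation the abstract outlines: each component's reliability follows a two-parameter Weibull model, and system reliability is their product. The shape and scale values are hypothetical, not the study's estimates.

      import numpy as np

      def weibull_R(t, beta, eta):
          # Two-parameter Weibull reliability (survival) function.
          return np.exp(-(t / eta) ** beta)

      components = {                     # (beta, eta) per clutch component, in 1e3 cycles
          "disc":    (1.8, 400.0),
          "cover":   (2.4, 650.0),
          "bearing": (1.5, 500.0),
      }

      t = 100.0                          # mission time (1e3 cycles)
      # Series system: all components must survive, so reliabilities multiply.
      r_sys = np.prod([weibull_R(t, b, e) for b, e in components.values()])
      print(f"system reliability at t={t:g}: {r_sys:.4f}")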

  • Determination of sample size to support diagnostic inspection of components
    by Eishiro Higo, Mahesh Pandey 
    Abstract: A complex engineering system like a nuclear power reactor consists of a large variety and number of engineering components. As a part of a component aging management program, the diagnostic inspections of various component populations are performed to detect the onset of any unanticipated degradation. A prudent selection of the inspection sample size is necessary to optimize inspection cost. Sample size selection is typically based on the traditional statistical hypothesis test, which tends to result in a fairly large sample size. This paper presents an alternate approach to the sample size determination (SSD) problem based on the concept of the Value of Information (VoI). The paper provides a comparative analysis of the VoI and hypothesis-testing approaches through illustrative examples. The VoI approach is shown to provide a more meaningful way to minimize the cost of inspection as a function of component-replacement cost and losses arising from a failure. The characteristics and advantages of the VoI approach are analysed.
    Keywords: Maintenance; Inspection; Fitness for service; Sample size determination; Bayesian analysis; Value of information; EVSI; ENGS; Hypothesis testing; Decision analysis.
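
    A compact preposterior sketch of the Value-of-Information approach: with a Beta prior on the fraction of degraded components, ENGS(n) = EVSI(n) minus sampling cost is computed for each n and the maximizer is chosen. All costs and prior parameters are illustrative assumptions, not the paper's numbers.

      import numpy as np
      from scipy.stats import betabinom

      a, b = 2.0, 18.0                 # Beta prior on degradation fraction theta
      C_replace, C_fail = 50.0, 800.0  # replacement cost vs expected failure loss
      c_sample = 0.6                   # cost per inspected component

      # Expected cost of the best action under the prior alone:
      prior_cost = min(C_replace, C_fail * a / (a + b))

      def engs(n):
          k = np.arange(n + 1)
          pk = betabinom.pmf(k, n, a, b)                 # predictive dist. of k defects
          post_mean = (a + k) / (a + b + n)              # posterior mean of theta
          pre_post = np.sum(pk * np.minimum(C_replace, C_fail * post_mean))
          return (prior_cost - pre_post) - c_sample * n  # EVSI minus sampling cost

      n_star = max(range(1, 120), key=engs)
      print(f"optimal sample size n* = {n_star}, ENGS = {engs(n_star):.2f}")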