International Journal of Reliability and Safety (10 papers in press)
Integrated Bayesian probabilistic approach to improve predictive modelling
by Xiaofei Guan, Xiaomo Jiang, Yucheng Tang, Xueyu Cheng, Yong Yuan
Abstract: This paper presents an integrated Bayesian probabilistic methodology and procedure to calibrate the parameters of an analytical predictive model and to quantitatively evaluate its validity and predictive capacity with non-normal data, considering uncertainties in both model and data. A Bayesian network is developed to graphically represent the relationships among all variables in the computational model. Bayesian regression combined with the Markov chain Monte Carlo technique and Gibbs sampling is developed to calibrate the model parameters and improve prediction accuracy. The Bayesian method is compared with traditional maximum likelihood and nonlinear optimisation approaches for parameter calibration. A generic procedure is presented to integrate model calibration and quantitative validation. Hypothesis-testing-based validation requires the validation data to be normally distributed, so the Anderson-Darling goodness-of-fit test and the Box-Cox transformation are employed, respectively, to test the normality of the validation difference data and to convert the data to normality. Both classical and Bayesian hypothesis testing approaches are used to quantitatively assess the calibrated models. The confidence in the calibrated model is quantified via Bayesian inference, which facilitates decision making on model quality under uncertainty. The integrated methodology and procedure are demonstrated with a nonlinear computational model for pressure loss prediction in a gas turbine and five sets of different measurement data.
Keywords: Bayesian statistics; Bayes network; hypothesis testing; model calibration; model validation.
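The MCMC calibration step described above can be sketched minimally, assuming a hypothetical one-parameter linear model y = theta * x with Gaussian noise and a flat prior (not the authors' gas-turbine model), and a random-walk Metropolis sampler in place of the paper's Gibbs scheme:

```python
import math
import random

def log_posterior(theta, xs, ys, sigma=0.5):
    """Gaussian log-likelihood for the toy model y = theta * x (flat prior)."""
    return -sum((y - theta * x) ** 2 for x, y in zip(xs, ys)) / (2.0 * sigma ** 2)

def metropolis(xs, ys, n_steps=5000, step=0.1, theta0=0.0, seed=1):
    """Random-walk Metropolis sampler for the single parameter theta."""
    rng = random.Random(seed)
    theta, lp = theta0, log_posterior(theta0, xs, ys)
    samples = []
    for _ in range(n_steps):
        cand = theta + rng.gauss(0.0, step)
        lp_cand = log_posterior(cand, xs, ys)
        if math.log(rng.random()) < lp_cand - lp:  # Metropolis accept/reject
            theta, lp = cand, lp_cand
        samples.append(theta)
    return samples

# Synthetic calibration data with true theta = 2.0
data_rng = random.Random(0)
xs = [i / 10.0 for i in range(1, 21)]
ys = [2.0 * x + data_rng.gauss(0.0, 0.5) for x in xs]
chain = metropolis(xs, ys)
theta_hat = sum(chain[1000:]) / len(chain[1000:])  # posterior mean after burn-in
```

The posterior mean of theta recovers the data-generating value up to sampling noise; a full calibration would use the actual computational model inside the likelihood.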
A methodology to determine maintenance criticality using Dempster Shafer theory
by Ankur Bahl, Anish Sachdeva, Rajeev Kumar Grag
Abstract: This study develops a methodology for maintenance policy selection that assimilates various factors as an alternative to traditional failure mode and effect analysis. The methodology is based on the Dempster-Shafer theory (DST) of evidence, which helps plant managers and engineers derive an effective priority rating of the various items of equipment and components under conditions of epistemic uncertainty, considering the various parameters. A case study of a distillery plant is undertaken to select suitable maintenance policies for the plant's various items of equipment and components using the proposed methodology.
Keywords: Dempster Shafer theory; epistemic uncertainty; evidence theory; FMEA; maintenance policy selection.
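Dempster's rule of combination, which underlies DST-based criticality rating, can be illustrated with two hypothetical expert mass functions over the frame {high, low}; the masses below are invented for illustration:

```python
def combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements
    are frozensets; conflicting mass (empty intersections) is renormalised."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

# Two sources rating a component's maintenance criticality
A, B = frozenset({"high"}), frozenset({"low"})
theta = A | B                       # the whole frame (ignorance)
m1 = {A: 0.6, theta: 0.4}           # expert 1: mostly "high", some ignorance
m2 = {A: 0.5, B: 0.3, theta: 0.2}   # expert 2: mixed evidence
m = combine(m1, m2)
```

The combined mass concentrates on "high" criticality while keeping a small residual on "low" and on ignorance, and it sums to one after renormalisation.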
Cost-reliability trade-off of path-generating linkages using multiobjective genetic algorithm
by Palaniappan Ramu, Gurunathan Saravana Kumar, Prashanth Neelakantan, Kiran Kumar Bathula
Abstract: The performance of a path-generating linkage is measured by the error in the generated path, and the probability that it produces its intended path is its reliability. Tighter tolerances on link lengths and joint clearances yield higher reliability but incur higher costs, so it is desirable to understand the trade-off between cost and reliability. In the current work, a genetic algorithm is used to construct the Pareto trade-off front between cost and reliability by solving a bi-criterion optimisation problem. The statistical moments required to estimate reliability are computed by combining an approximate cumulative distribution function of the error with a three-point approximation technique, which uses only a fraction of the samples required by crude Monte Carlo simulation. The proposed approach is demonstrated on a four-bar mechanism tracing a straight line and a closed path. The Pareto front generated using the proposed approach with fewer samples compares well with the one generated by crude Monte Carlo simulation with a large sample set, offering enormous gains in computational efficiency.
Keywords: reliability; Pareto front; mechanism; Monte Carlo simulation; bootstrap.
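The cost-reliability trade-off itself can be illustrated by filtering the non-dominated designs from sampled (cost, reliability) pairs; this exhaustive filter is a stand-in for the paper's genetic algorithm, and the design points are invented for illustration:

```python
def pareto_front(points):
    """Return the non-dominated (cost, reliability) pairs when
    minimising cost and maximising reliability."""
    front = []
    for c, r in points:
        dominated = any(c2 <= c and r2 >= r and (c2, r2) != (c, r)
                        for c2, r2 in points)
        if not dominated:
            front.append((c, r))
    return sorted(front)

# Hypothetical candidate designs: (manufacturing cost, path reliability)
designs = [(1.0, 0.90), (2.0, 0.95), (1.5, 0.85), (3.0, 0.99), (2.5, 0.93)]
front = pareto_front(designs)
```

Designs (1.5, 0.85) and (2.5, 0.93) drop out because a cheaper, more reliable alternative exists for each; the survivors trace the trade-off curve.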
A bivariate replacement policy for system in an increasing failure rate model with double repair cost limits
by Min-Tsai Lai
Abstract: This paper proposes a bivariate (T, L2) replacement policy with double repair-cost limits for a system subject to an increasing failure rate model. External shocks are divided into two types: type-I and type-II. Each type-II shock results in a minor system failure, while each type-I shock increases the failure rate of the system by a certain amount and induces a critical system failure. When a minor failure occurs, the repair cost is evaluated and a minimal repair is performed if that cost is below the single repair-cost limit L1 and the accumulated repair cost is below the cumulative repair-cost limit L2; otherwise, the system is replaced by a new one. In addition, the system is replaced at the scheduled time T or at a critical failure. By formulating the optimisation problem, the optimal T* and L2* that minimise the long-run expected cost per unit time are found. We develop the corresponding computational algorithm to obtain the optimal replacement policy and present a numerical example to illustrate the effectiveness of the proposed model.
Keywords: bivariate replacement policy; repair-cost limit; cumulative repair-cost limit; shock model; minimal repair.
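A simplified single-variable relative of the policy above is the classical periodic replacement model with minimal repairs, whose long-run cost rate C(T) = (c_p + c_m H(T)) / T can be minimised numerically; the Weibull hazard and the cost figures below are illustrative assumptions, not values from the paper:

```python
def cost_rate(T, c_p=50.0, c_m=5.0, beta=2.0, eta=10.0):
    """Long-run expected cost per unit time for replacement at age T with
    minimal repairs in between; H(T) = (T/eta)**beta is the Weibull
    cumulative hazard, i.e. the expected number of minimal repairs in [0, T]."""
    H = (T / eta) ** beta
    return (c_p + c_m * H) / T

# Grid search for the optimal replacement interval T*
Ts = [0.1 * i for i in range(1, 501)]
T_star = min(Ts, key=cost_rate)
```

For beta = 2 the optimum has the closed form T* = sqrt(c_p * eta**2 / c_m), so the grid search can be checked analytically; the paper's bivariate policy adds the cumulative repair-cost limit L2 as a second decision variable.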
Dynamic control for safety system multi-agent system with case-based reasoning
by Nassima Aissani, Islam Hadj Mohamed Guetarni, Soraya Zebirate
Abstract: The increasing complexity and size of electronic systems in industry, combined with growing market demand, require industries to implement an efficient safety system to preserve equipment viability, protect the environment and, above all, protect human life. The aim of this paper is to present a dynamic safety system based on a multi-agent paradigm that uses case-based reasoning to identify and react to risks. To build the case base, an exhaustive risk analysis was carried out, giving rise to a risk ontology and its precursors. A model for identifying the similarities between precursors was then developed. Case-based reasoning is highly reactive, and with the presented similarity model the developed safety system responded quickly to the risks the system experienced during the experiments.
Keywords: industrial safety system; dynamic modelling; case-based reasoning; ontology.
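Case retrieval by weighted precursor similarity, the core of such a CBR safety loop, might be sketched as follows; the features, weights and cases are hypothetical stand-ins for the paper's risk ontology:

```python
def similarity(a, b, weights):
    """Weighted mean of per-feature similarities (1 - absolute difference),
    assuming all precursor features are normalised to [0, 1]."""
    total = sum(weights.values())
    s = sum(w * (1.0 - abs(a[f] - b[f])) for f, w in weights.items())
    return s / total

def retrieve(case_base, query, weights):
    """Return the stored case whose precursors best match the query."""
    return max(case_base, key=lambda c: similarity(c["precursors"], query, weights))

# Hypothetical precursor features and stored risk cases
weights = {"temperature": 2.0, "pressure": 1.0, "vibration": 1.0}
case_base = [
    {"precursors": {"temperature": 0.9, "pressure": 0.4, "vibration": 0.2},
     "action": "shut down burner"},
    {"precursors": {"temperature": 0.2, "pressure": 0.9, "vibration": 0.1},
     "action": "open relief valve"},
]
query = {"temperature": 0.85, "pressure": 0.5, "vibration": 0.3}
best = retrieve(case_base, query, weights)
```

The hot, mildly pressurised query situation retrieves the high-temperature case, so its stored action is proposed as the reaction.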
Special Issue on: IJRS REC2016 Computing with Polymorphic Uncertain Data
Uncertainty assessment in the results of inverse problems: application to damage detection in masonry dams
by Long Nguyen-Tuan, Carsten Koenke, Volker Bettzieche, Tom Lahmer
Abstract: In this work, we study the uncertainties in the results of inverse problems, here damage identification in coupled multifield, multiphase models of fluid flow in deforming porous materials under non-isothermal boundary conditions. These analyses are important for the structural health monitoring of masonry dams. The results of the inverse problems scatter owing to different sources of uncertainty in the model parameters, the measurement data, the measurement field, and the algorithms used to solve the inverse problem. To analyse this scatter, the inverse problem is solved repeatedly in a sampling process, and the uncertainty in the inverse solutions is quantified by the probability distributions obtained from the sampling results.
Keywords: damage identification; masonry dams; optimisation; uncertainty quantification; random field.
Numerical simulation of wooden structures with polymorphic uncertainty in material properties
by Ferenc Leichsenring, Christian Jenkel, Wolfgang Graf, Michael Kaliske
Abstract: Uncertainties are inherently present in structural parameters such as loadings, boundary conditions or the resistance of structural materials. The material properties and parameters of wood, in particular, vary strongly as a consequence of growth and environmental conditions. The uncertainties considered can be classified as aleatoric or epistemic. To include this variation in structural analysis, the available data need to be modelled appropriately, e.g. by means of probability-based and, furthermore, fuzzy-probability-based random variables or fuzzy sets. Therefore, a limited empirical data basis for Norway spruce, obtained from experiments according to DIN EN 408, is analysed stochastically, including correlation and sensitivity analyses and statistical tests. To capture the uncertainty induced by estimating the distribution parameters, the stochastic approach is extended by fuzzy distribution parameters to fuzzy-probability-based random variables according to [1, 2]. Epistemic uncertainties in, e.g., the geometric parameters of knotholes are handled with fuzzy sets. The consequences for wooden structures are determined by fuzzy stochastic analysis in combination with a finite element (FE) simulation using a model suited to the characteristics of a timber structure. The uncertain results (e.g. displacements, failure loads) obtained with the proposed holistic approach, which defines the material properties from an empirical data basis and represents the uncertainties in the material parameters and the methods themselves, are discussed.
Keywords: polymorphic uncertainty; fuzzy randomness; stochastic modelling; wood mechanics; structural analysis.
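Fuzzy (epistemic) input variables are commonly propagated through a model by alpha-cuts; the triangular fuzzy modulus and the simple beam response below are illustrative assumptions, not the paper's FE model:

```python
def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b)
    at membership level alpha in [0, 1]."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Hypothetical fuzzy Young's modulus of spruce (GPa): (min, peak, max)
E = (9.0, 11.0, 13.0)

def deflection(E_val, load_term=100.0):
    """Midspan deflection of a simple beam, proportional to 1/E (illustrative)."""
    return load_term / E_val

# Push each alpha-cut through the model; sorting restores the [lo, hi]
# order because the response decreases with E
cuts = {a: tuple(sorted(deflection(x) for x in alpha_cut(E, a)))
        for a in (0.0, 0.5, 1.0)}
```

The output intervals are nested as alpha rises, collapsing to a single crisp value at alpha = 1; a fuzzy stochastic analysis runs this propagation over distribution parameters rather than a deterministic model.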
Using statistical and interval-based approaches to propagate snow measurement uncertainty to structural reliability
by Árpád Rózsás, Miroslav Sýkora
Abstract: Observations are inevitably contaminated with measurement uncertainty, which is a predominant source of uncertainty in some cases. In present practice, probabilistic models are typically fitted to measurements without proper consideration of this uncertainty. Hence, this study explores the effect of this simplification on structural reliability and provides recommendations on its appropriate treatment. Statistical and interval-based approaches are used to quantify and propagate measurement uncertainty in probabilistic reliability analysis. The two approaches are critically compared by analysing ground snow measurements that are often affected by large measurement uncertainty. The results indicate that measurement uncertainty may lead to significant (order of magnitude) underestimation of failure probability and should be taken into account in reliability analysis. Ranges of the key parameters are identified where measurement uncertainty should be considered. For practical applications, the lower interval bound and predictive reliability index are recommended as point estimates using interval and statistical analysis, respectively. The point estimates should be accompanied by uncertainty intervals, which convey valuable information about the credibility of results.
Keywords: measurement uncertainty; snow; structural reliability; interval arithmetic; maximum likelihood; deconvolution; statistics.
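One simple way to see how a measurement-uncertainty interval propagates to a failure probability is to evaluate the fitted model at the interval endpoints of the data; because a common multiplicative shift of the readings moves the exceedance probability monotonically, the endpoints bound the result here. The snow readings, the +/-10% interval and the normal fit below are all illustrative assumptions:

```python
import math

def p_exceed_normal(mu, sigma, x):
    """P(X > x) for a normal variable, via the complementary error function."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

def moments(xs):
    """Sample mean and standard deviation."""
    n = len(xs)
    m = sum(xs) / n
    s = (sum((v - m) ** 2 for v in xs) / (n - 1)) ** 0.5
    return m, s

# Illustrative annual snow maxima (kN/m^2) with a +/-10% multiplicative
# measurement-uncertainty interval on every reading
data = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.4, 1.15]
lo = [0.9 * d for d in data]
hi = [1.1 * d for d in data]

x_design = 2.0
mu_lo, s_lo = moments(lo)
mu_hi, s_hi = moments(hi)
p_lo = p_exceed_normal(mu_lo, s_lo, x_design)  # all readings at lower bound
p_hi = p_exceed_normal(mu_hi, s_hi, x_design)  # all readings at upper bound
```

The spread between p_lo and p_hi is the interval counterpart of the order-of-magnitude effect reported in the abstract; a statistical treatment would instead deconvolve the measurement error from the fitted distribution.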
Extrapolation of extreme traffic load effects on a cable-stayed bridge based on weigh-in-motion measurements
by Naiwei Lu, Yang Liu, Michael Beer
Abstract: Steadily growing traffic loading may become a hazard for bridge safety. Compared with short- and medium-span bridges, long-span bridges suffer from the simultaneous presence of multiple vehicle loads. This study presents an approach for extrapolating probabilistic extreme traffic load effects on long-span bridges based on weigh-in-motion (WIM) measurements. Three types of stochastic traffic load model are simulated from the WIM measurements of a highway in China. The level-crossing rate of each stochastic traffic load is evaluated and integrated to extrapolate extreme traffic load effects. The probability of exceedance for a cable-stayed bridge is evaluated considering a linear traffic growth model. The numerical results show that the superposition of crossing rates is effective and feasible for modelling the probabilistic extreme effects of long-span bridges under actual traffic loads. The extrapolated maximum load effect is sensitive to the growth of dense traffic flows, and dense traffic flow governs the traffic load limit state of long-span bridges.
Keywords: long-span bridge; traffic load; extreme value; level-crossing theory; weigh-in-motion; probability of exceedance.
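The level-crossing extrapolation can be sketched with an empirical up-crossing rate and the Poisson approximation P(max > x) = 1 - exp(-nu(x) T); the synthetic record below stands in for WIM-derived load effects and is not the paper's data:

```python
import math
import random

def upcrossing_rate(series, level):
    """Empirical rate of up-crossings of `level` per time step."""
    n_up = sum(1 for a, b in zip(series, series[1:]) if a <= level < b)
    return n_up / (len(series) - 1)

def p_exceed(series, level, horizon):
    """Poisson approximation: P(max over `horizon` steps exceeds `level`)."""
    nu = upcrossing_rate(series, level)
    return 1.0 - math.exp(-nu * horizon)

# Synthetic daily load-effect record standing in for WIM-based responses
rng = random.Random(42)
record = [rng.gauss(10.0, 2.0) for _ in range(2000)]
# Extrapolate the short record to a longer reference period
p_ext = p_exceed(record, 15.0, 500)
```

The paper's approach additionally superposes the crossing rates of several traffic-flow types and scales them with a linear growth model before extrapolating.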
Solving the power allocation problem using methods with result verification
by Ekaterina Auer, Cesar Benavente-Peces, Andreas Ahrens
Abstract: Characterising how different types of uncertainty in multiple-input multiple-output (MIMO) systems influence their performance is an important research topic. In this paper, we focus on the task of power allocation in fixed-rate MIMO systems with channel separation based on singular value decomposition. Interval analysis is used to develop a verified solution to the problem, taking bounded parameter uncertainty and rounding errors into account. We demonstrate that power allocation improves the bit error rate (BER) using an exemplary 4x4 MIMO channel for two distinct choices of the channel matrix, and we verify the upper bound on the BER under realistic uncertainty conditions. Besides, we show that a combined analytical/numerical procedure produces better results than a purely numerical one, and we identify the parameters to which the mathematical model is most sensitive.
Keywords: interval analysis; result verification; MIMO systems; power allocation.
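Result verification rests on interval arithmetic with outward (directed) rounding, so that the computed interval provably encloses the exact result; the toy enclosure below, for a received power with an uncertain channel gain, illustrates the technique only. A production implementation would switch the FPU rounding mode; widening each bound by one ulp with math.nextafter is a portable approximation, and the channel/power figures are invented:

```python
import math

def add(a, b):
    """Interval sum with outward rounding so the exact result is enclosed."""
    return (math.nextafter(a[0] + b[0], -math.inf),
            math.nextafter(a[1] + b[1], math.inf))

def mul(a, b):
    """Interval product: take extreme endpoint products, round outward."""
    products = [x * y for x in a for y in b]
    return (math.nextafter(min(products), -math.inf),
            math.nextafter(max(products), math.inf))

# Enclosure of a received power p = |h|^2 * s for an uncertain channel
# gain h in [0.9, 1.1] and an allocated transmit power s in [0.45, 0.55]
h = (0.9, 1.1)
s = (0.45, 0.55)
h2 = mul(h, h)   # treats the two factors as independent (safe but wider)
p = mul(h2, s)
```

Every real gain/power pair inside the input boxes yields a power inside p, which is the guarantee a verified BER bound builds on.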