Results from UQ ultimately serve as decision support. It is therefore important to frame the UQ analysis within a formal decision analysis, to ensure an optimal choice of UQ methods and interpretation of results. This minisymposium focuses on such a combination of UQ with formal decision analysis methods. On the one hand, this includes the selection of metrics for UQ analysis based on decision-theoretic considerations; examples include the choice of appropriate objective functions and decision-theoretic sensitivity measures. On the other hand, the minisymposium considers the integration of UQ in artificial intelligence applications, and more specifically in sequential decision making algorithms, which are of increasing relevance in many fields of application.
14:00
Quantifying and communicating uncertainty in probabilistic predictions for effective decision support
Daniel Straub | Technical University of Munich | Germany
Authors:
Daniel Straub | Technical University of Munich | Germany
Iason Papaioannou | Technical University of Munich | Germany
Max Ehre | Technical University of Munich | Germany
Understanding and communicating uncertainty about predictions is essential for effective decision support. Unfortunately, the modelling and representation of this uncertainty is ambiguous, in particular when the quantity of interest is itself a probability. This situation arises in engineering risk assessments, where the probability of failure is a key quantity for deciding on risk mitigation strategies. In most cases, the probability of failure is presented as a scalar, computed by taking the expectation with respect to all uncertain input quantities. As an alternative, the effect of selected uncertainties can be quantified by evaluating the distribution of the probability of failure (and hence the risk) conditional on selected input random variables. This conditional probability is then a random variable itself, whose distribution reflects the prediction uncertainty. In this talk, we review the computational challenges associated with such a representation and critically discuss its interpretation. In particular, we examine how the distribution of the probability of failure provides effective decision support. Finally, we also discuss its relation to sensitivity measures, focusing on measures related to the concept of value of information.
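The conditional failure-probability distribution described in the abstract can be approximated with a double-loop Monte Carlo scheme. The sketch below uses an entirely hypothetical limit state (an imperfectly known capacity mean as the epistemic variable); it illustrates the general idea only, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy limit state: failure when demand exceeds capacity.
# The capacity mean `mu_c` is treated as epistemic (imperfectly known),
# while the scatter of demand and capacity is aleatory.
n_outer, n_inner = 1000, 10_000

# Outer loop: sample the epistemic parameter.
mu_c = rng.normal(loc=5.0, scale=0.3, size=n_outer)

pf_cond = np.empty(n_outer)
for i, m in enumerate(mu_c):
    capacity = rng.normal(m, 0.5, size=n_inner)   # aleatory scatter
    demand = rng.normal(3.0, 0.8, size=n_inner)
    pf_cond[i] = np.mean(demand > capacity)       # conditional P(F | mu_c)

# The scalar (unconditional) probability of failure is the mean of the
# conditional values; their spread reflects epistemic prediction uncertainty.
pf_total = pf_cond.mean()
q05, q95 = np.quantile(pf_cond, [0.05, 0.95])
print(f"E[pf] = {pf_total:.4f}, 90% interval [{q05:.4f}, {q95:.4f}]")
```

Reporting the interval alongside the scalar expectation is one way to communicate the prediction uncertainty the talk discusses.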
14:30
Sequential decision making under epistemic constraints: how to evaluate long-term policies and assess the value of information
Matteo Pozzi | Carnegie Mellon University | United States
Authors:
Matteo Pozzi | Carnegie Mellon University | United States
Shuo Li | Carnegie Mellon University | United States
A broad class of problems in sequential decision making under uncertainty can be modeled by Partially Observable Markov Decision Processes (POMDPs). In the domain of beliefs about the current process state, the optimal value function (or cost-to-go function) of a POMDP has strong mathematical properties: it is piecewise linear and convex. One consequence of these properties concerns the Value of Information (VoI), i.e. the cost reduction achieved by receiving new information: the VoI is always non-negative, and it is small when the impact of the information on the belief is small. However, these properties do not hold when the decision maker must follow external epistemic constraints, for example when the probability of some failure event must be kept below a threshold. Under such constraints, the value function is not necessarily convex, and the VoI is not necessarily non-negative.
This paper illustrates how to evaluate policies following external epistemic constraints in POMDPs, using grid-based and point-wise approximations in the belief domain, and how to assess the VoI based on this evaluation. The approach is applied to the management of engineering systems under public regulations.
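For intuition on the unconstrained case, where the VoI is guaranteed non-negative, consider a minimal two-state inspection problem (all numbers are hypothetical, and the paper's constrained setting, in which this guarantee can fail, is not reproduced here):

```python
import numpy as np

# Hypothetical two-state, two-action example: costs of acting under each
# state, a prior belief, and a noisy binary inspection.
cost = np.array([[0.0, 10.0],   # "do nothing": free if intact, costly if damaged
                 [2.0, 2.0]])   # "repair": fixed cost either way
prior = np.array([0.7, 0.3])    # belief: P(intact), P(damaged)
lik = np.array([[0.9, 0.1],     # P(obs | state): rows = outcome, cols = state
                [0.1, 0.9]])

# Expected cost without new information: best action under the prior belief.
cost_prior = np.min(cost @ prior)

# Expected cost with the inspection: for each observation, update the belief
# by Bayes' rule and act optimally, then average over the observation's
# marginal probability.
cost_post = 0.0
for y in range(2):
    p_y = lik[y] @ prior               # marginal P(obs = y)
    posterior = lik[y] * prior / p_y   # Bayes update
    cost_post += p_y * np.min(cost @ posterior)

voi = cost_prior - cost_post           # here: 2.0 - 0.98 = 1.02
print(f"VoI = {voi:.3f}")              # prints "VoI = 1.020"
```

Because the minimum of linear functions of the belief is concave, averaging over observations can never increase the expected cost, which is exactly the convexity-based non-negativity property the abstract refers to.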
15:00
Optimization-based Decision Support via Uncertainty Quantification
Hailiang Du | Durham University | United Kingdom
Authors:
Hailiang Du | Durham University | United Kingdom
Michael Goldstein | Durham University | United Kingdom
Wei Sun | University of Edinburgh | United Kingdom
Gareth Harrison | University of Edinburgh | United Kingdom
Computer simulators are widely used, in conjunction with historic observations, to make inferences about complex physical systems. In energy systems modelling, for example, optimization methods based on chosen objective function(s) are widely used to provide deterministic solutions to decision makers. For complex high-dimensional systems, however, simplifications are inevitable in traditional optimization, which can render the "optimal" solution suboptimal or non-optimal. Even when the optimization problem is well resolved, it is valuable for both operational and long-term planning purposes to introduce some flexibility into the solution. A novel statistical methodology is introduced, in which statistical emulation and uncertainty quantification are employed to identify candidate solutions and to quantify the uncertainties attached to each candidate solution, due to i) observational error; ii) emulation approximation; and iii) model discrepancy. Candidate solutions subject to the objective function(s) provide useful flexibility, and the attached uncertainty quantification provides valuable extra information for decision support. Furthermore, the proposed methodology is able to identify and invert the Pareto boundary for decision making when there are trade-offs between the objective functions. Applications to energy system planning problems, including wind farms and solar power stations, are presented as experimental demonstrations.
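A minimal sketch of the candidate-solution idea, using an assumed toy objective and uncertainty model rather than the authors' emulation machinery: any decision whose plausible objective range overlaps a tolerance band around the best emulated value is retained as a candidate, rather than reporting a single deterministic optimum.

```python
import numpy as np

# Hypothetical setup: a 1D decision variable, an assumed emulator mean for
# the objective, and a single combined std standing in for observational,
# emulation and discrepancy uncertainty.
x = np.linspace(0.0, 1.0, 201)        # candidate decisions
mean = (x - 0.4) ** 2                 # emulator mean of the objective
std = 0.01 + 0.05 * x                 # assumed combined uncertainty

best = np.min(mean)
tol = 0.02
# Candidate set: decisions that cannot be ruled out as near-optimal at
# roughly two standard deviations.
candidates = x[mean - 2 * std <= best + tol]
print(f"{candidates.size} candidates in "
      f"[{candidates.min():.2f}, {candidates.max():.2f}]")
```

The resulting interval of near-optimal decisions is the "flexibility" the abstract describes: a decision maker can trade a small objective loss for operational convenience anywhere inside it.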
15:30
Methodology for Robust Bayesian Optimal Experimental Design Decisions
Costas Papadimitriou | University of Thessaly | Greece
Authors:
Costas Papadimitriou | University of Thessaly | Greece
Tulay Ercan | University of Thessaly | Greece
Petros Koumoutsakos | ETH Zürich | Switzerland
Optimal experimental design (OED) decisions maximize the amount of useful information extracted from measurements to accomplish the goals for which the experiments are set up. Such goals include the selection of the most suitable physics-based models for the system components, the estimation of model parameters, the improvement of confidence in model predictions, and the reliable identification of damage/faults (location and severity) in systems. We propose a Bayesian framework for optimal experimental design that incorporates information-based measures (e.g. mutual information) in expected utility functions. The OED is rendered robust to uncertainties in model selection/parameters and to modelling errors by formulating a multi-objective optimization problem that accounts for both the expected amount of information in the data and the robustness of this information to modelling uncertainties. Theoretical and computational aspects of the sampling techniques and asymptotic approximations involved are addressed. The framework is demonstrated using applications from structural dynamics. The design variables include the type, location and number of sensors, as well as the type, location and excitation characteristics (e.g. frequency content and amplitude) of actuators. Acknowledgements: This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 764547.
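For intuition on the information-based utilities mentioned above, the expected information gain of a design (the mutual information between parameters and data) can be estimated by nested Monte Carlo. The sketch below uses an assumed linear-Gaussian measurement model, chosen because a closed-form reference value exists; it illustrates the estimator only, not the proposed framework.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical measurement model: y = s * theta + noise, with scalar design
# variable s (e.g. a sensor gain or location factor).
sig_th, sig_n = 1.0, 0.5          # prior std of theta, noise std
n_outer, n_inner = 1000, 2000

def eig_mc(s):
    """Nested Monte Carlo estimate of the expected information gain."""
    th = rng.normal(0.0, sig_th, n_outer)
    y = s * th + rng.normal(0.0, sig_n, n_outer)
    # Log-likelihood kernel for the sampled (theta, y) pairs; the Gaussian
    # normalizing constant cancels against the evidence term below.
    log_lik = -0.5 * ((y - s * th) / sig_n) ** 2
    # Log-evidence, averaged over fresh prior samples.
    th_in = rng.normal(0.0, sig_th, (n_inner, 1))
    log_ev = np.log(np.mean(np.exp(-0.5 * ((y - s * th_in) / sig_n) ** 2),
                            axis=0))
    return np.mean(log_lik - log_ev)

def eig_exact(s):
    """Closed-form mutual information for the linear-Gaussian model."""
    return 0.5 * np.log(1.0 + (s * sig_th / sig_n) ** 2)

for s in (0.5, 1.0, 2.0):
    print(f"s={s}: MC {eig_mc(s):.3f} vs exact {eig_exact(s):.3f}")
```

Maximizing such a utility over sensor type, number and location is the kind of design optimization the abstract addresses; the nested estimator's cost and bias are among the computational aspects such frameworks must confront.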