American Journal of Mathematical and Computational Sciences  
Manuscript Information
 
 
Probabilistic Reduced Order Modeling Using a Bayesian Approach
American Journal of Mathematical and Computational Sciences
Vol. 3, No. 2, Publication Date: May 31, 2018, Pages: 50-61
 
 
Authors
 
[1]    

Indika Udagedara, Department of Mathematics, Clarkson University, New York, United States of America.

[2]    

Brian Todd Helenbrook, Department of Mathematics, Clarkson University, New York, United States of America; Department of Mechanical and Aeronautical Engineering, Clarkson University, New York, United States of America.

[3]    

Aaron Luttman, Signal Processing and Applied Mathematics, Nevada National Security Site, Las Vegas, United States of America.

[4]    

Jared Catenacci, Signal Processing and Applied Mathematics, Nevada National Security Site, Las Vegas, United States of America.

 
Abstract
 

A method for probabilistic reduced order modeling (ROM) of stochastic problems is developed. Probabilistic principal component analysis (PPCA) is modified to generate a basis for the reduced order model from training data in a way that both estimates the noise in the training data and determines the variance of the latent variables. This variance information is then used as a prior in a new probabilistic data projection approach. Together, these techniques give a fully probabilistic method for constructing ROMs that accurately predict noise-free data from data dominated by noise.
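The PPCA step summarized above builds on the closed-form maximum-likelihood formulation of Tipping and Bishop (reference [11] below); the paper's specific modification for ROM basis construction is described in the full text. As a rough illustration only, the sketch below shows how a standard PPCA fit yields a reduced basis, an estimate of the observation noise, and the latent-variable variances that can then serve as prior information in a projection step. The function names and the NumPy-based implementation are illustrative assumptions, not the authors' code.

import numpy as np

def ppca_fit(X, q):
    """Standard maximum-likelihood PPCA (Tipping & Bishop, reference [11]).
    X: (n_samples, d) array of training snapshots; q: number of retained latent dimensions.
    Returns the data mean, the reduced basis W, and the estimated noise variance."""
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    S = Xc.T @ Xc / n                             # sample covariance of the training data
    evals, evecs = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1]               # sort eigenpairs, largest eigenvalue first
    evals, evecs = evals[order], evecs[:, order]
    sigma2 = evals[q:].mean()                     # ML noise variance: mean of the discarded eigenvalues
    # ML reduced basis: leading eigenvectors scaled by the variance in excess of the noise.
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return mu, W, sigma2

def ppca_project(x, mu, W, sigma2):
    """Posterior mean and covariance of the latent variables for one noisy
    observation x under the fitted PPCA model."""
    q = W.shape[1]
    M = W.T @ W + sigma2 * np.eye(q)              # M = W^T W + sigma^2 I (Tipping & Bishop)
    Minv = np.linalg.inv(M)
    z_mean = Minv @ W.T @ (x - mu)                # posterior mean of the latent variables
    z_cov = sigma2 * Minv                         # posterior (latent-variable) covariance
    return z_mean, z_cov

In this sketch, mu + W @ z_mean is the posterior-mean estimate of the noise-free signal underlying a noisy observation x, and z_cov carries the latent-variable variance information that, as the abstract describes, can be reused as a prior in a probabilistic data projection.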


Keywords
 

Bayesian Parameter Estimation, Model Selection, Noise Reduction, Probabilistic Principal Component Analysis, Reduced Order Modeling


References
 
[01]    

Indika Udagedara, Brian T Helenbrook, Aaron Luttman, and Stephen E Mitchell. Reduced order modeling for accelerated Monte Carlo simulations in radiation transport. Applied Mathematics and Computation, 267: 237–251, 2015.

[02]    

K Pearson. On lines and planes of closest fit to systems of points in space. Philosophical Magazine, 2: 559–572, 1901.

[03]    

Svante Wold, Kim Esbensen, and Paul Geladi. Principal component analysis. Chemometrics and Intelligent Laboratory Systems, 2 (1-3): 37–52, 1987.

[04]    

Hervé Abdi and Lynne J Williams. Principal component analysis. Wiley Interdisciplinary Reviews: Computational Statistics, 2 (4): 433–459, 2010.

[05]    

Mark Richardson. Principal component analysis. URL: http://people.maths.ox.ac.uk/richardsonm/SignalProcPCA.pdf (last access: 3.5.2013), 2009.

[06]    

Rasmus Bro and Age K Smilde. Principal component analysis. Analytical Methods, 6 (9): 2812–2831, 2014.

[07]    

Ian Jolliffe. Principal component analysis. Wiley Online Library, 2002.

[08]    

Bruce Moore. Principal component analysis in linear systems: Controllability, observability, and model reduction. IEEE Transactions on Automatic Control, 26 (1): 17–32, 1981.

[09]    

Neil Lawrence. Probabilistic non-linear principal component analysis with Gaussian process latent variable models. Journal of Machine Learning Research, 6 (Nov): 1783–1816, 2005.

[10]    

Michael E Tipping and Christopher M Bishop. Mixtures of probabilistic principal component analyzers. Neural Computation, 11 (2): 443–482, 1999.

[11]    

Michael E Tipping and Christopher M Bishop. Probabilistic principal component analysis. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 61 (3): 611–622, 1999.

[12]    

Christian F Beckmann and Stephen M Smith. Probabilistic independent component analysis for functional magnetic resonance imaging. IEEE Transactions on Medical Imaging, 23 (2): 137–152, 2004.

[13]    

Fang X Wu. Gene regulatory network modelling: a state-space approach. International Journal of Data Mining and Bioinformatics, 2 (1): 1–14, 2008.

[14]    

Alexander Ilin and Tapani Raiko. Practical approaches to principal component analysis in the presence of missing values. Journal of Machine Learning Research, 11 (Jul): 1957–2000, 2010.

[15]    

Jiahua Chen and Zehua Chen. Extended Bayesian information criteria for model selection with large model spaces. Biometrika, 95 (3): 759–771, 2008.

[16]    

Tina Toni, David Welch, Natalja Strelkowa, Andreas Ipsen, and Michael PH Stumpf. Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems. Journal of the Royal Society Interface, 6 (31): 187–202, 2009.

[17]    

Larry Wasserman. Bayesian model selection and model averaging. Journal of Mathematical Psychology, 44 (1): 92–107, 2000.

[18]    

Kenneth P Burnham and David R Anderson. Multimodel inference: understanding AIC and BIC in model selection. Sociological Methods & Research, 33 (2): 261–304, 2004.

[19]    

Walter Zucchini. An introduction to model selection. Journal of Mathematical Psychology, 44 (1): 41–61, 2000.

[20]    

James L Beck and Ka V Yuen. Model selection using response measurements: Bayesian probabilistic approach. Journal of Engineering Mechanics, 130 (2): 192–203, 2004.

[21]    

Adrian E Raftery. Bayesian model selection in structural equation models. Sage Focus Editions, 154: 163–163, 1993.

[22]    

Adrian E Raftery. Bayesian model selection in social research. Sociological Methodology, pages 111–163, 1995.

[23]    

David Posada and Thomas R Buckley. Model selection and model averaging in phylogenetics: advantages of Akaike information criterion and Bayesian approaches over likelihood ratio tests. Systematic Biology, 53 (5): 793–808, 2004.

[24]    

Jerald B Johnson and Kristian S Omland. Model selection in ecology and evolution. Trends in Ecology & Evolution, 19 (2): 101–108, 2004.

[25]    

Kenneth P Burnham and David Anderson. Model selection and multi-model inference. Taylor & Francis, 2003.

[26]    

Sadanori Konishi, Tomohiro Ando, and Seiya Imoto. Bayesian information criteria and smoothing parameter selection in radial basis function networks. Biometrika, 91 (1): 27–43, 2004.

[27]    

Gerda Claeskens, Nils L Hjort, et al. Model selection and model averaging, volume 330. Cambridge University Press, Cambridge, 2008.

[28]    

Ronald A Fisher. On the mathematical foundations of theoretical statistics. Philosophical Transactions of the Royal Society of London. Series A, Containing Papers of a Mathematical or Physical Character, 222: 309–368, 1922.

[29]    

FW Scholz. Maximum likelihood estimation. Wiley Online Library, 1985.

[30]    

Halbert White. Maximum likelihood estimation of misspecified models. Econometrica: Journal of the Econometric Society, pages 1–25, 1982.

[31]    

Bradley Efron and David V Hinkley. Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information. Biometrika, 65 (3): 457–483, 1978.

[32]    

Klaas E Stephan, Will D Penny, Jean Daunizeau, Rosalyn J Moran, and Karl J Friston. Bayesian model selection for group studies. Neuroimage, 46 (4): 1004–1017, 2009.

[33]    

Henry D Acquah. Comparison of Akaike information criterion (AIC) and Bayesian information criterion (BIC) in selection of an asymmetric price relationship. Journal of Development and Agricultural Economics, 2 (1): 001–006, 2010.

[34]    

Joseph E Cavanaugh. Unifying the derivations for the Akaike and corrected Akaike information criteria. Statistics & Probability Letters, 33 (2): 201–208, 1997.

[35]    

Hamparsum Bozdogan. Akaike’s information criterion and recent developments in information complexity. Journal of Mathematical Psychology, 44 (1): 62–91, 2000.





 