International Journal for Uncertainty Quantification

Publishes 6 issues per year

ISSN Print: 2152-5080

ISSN Online: 2152-5099

Impact Factor: 1.7 (2017 Journal Citation Reports, Clarivate Analytics, 2018). The Impact Factor measures the average number of citations received in a particular year by papers published in the journal during the two preceding years.

5-Year Impact Factor: 1.9 (2017 Journal Citation Reports, Clarivate Analytics, 2018). To calculate the five-year Impact Factor, citations received in 2017 to papers published in the previous five years are counted and divided by the number of source items published in those five years.

Immediacy Index: 0.5. The Immediacy Index is the average number of times an article is cited in the year it is published and indicates how quickly articles in a journal are cited.

Eigenfactor: 0.0007. The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, rates the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to contribute more than those from poorly ranked journals.

Journal Citation Indicator (JCI): 0.5. The JCI is a single measurement of the field-normalized citation impact of journals in the Web of Science Core Collection across disciplines; the metric is normalized and cross-disciplinary.

SJR: 0.584

SNIP: 0.676

CiteScore™: 3

H-Index: 25
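For readers unfamiliar with the two-year calculation described above, the following is a schematic restatement; the symbols are placeholders for per-year citation and item counts, not figures reported for this journal.

```latex
% Schematic form of the two-year Impact Factor definition above.
% C_y = citations received in 2017 to items published in year y,
% P_y = citable items published in year y (placeholder symbols,
% not values reported for this journal).
\mathrm{IF}_{2017} = \frac{C_{2015} + C_{2016}}{P_{2015} + P_{2016}}
```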


A FULLY BAYESIAN GRADIENT-FREE SUPERVISED DIMENSION REDUCTION METHOD USING GAUSSIAN PROCESSES

Volume 12, Issue 2, 2022, pp. 19-51
DOI: 10.1615/Int.J.UncertaintyQuantification.2021035621

ABSTRACT

Modern engineering problems are ubiquitously characterized by sophisticated computer codes that map parameters or inputs to an underlying physical process. In other situations, experimental setups are used to model the physical process in a laboratory, ensuring high precision while being costly in materials and logistics. In both scenarios, only a limited amount of data can be generated by querying the expensive information source at a finite number of inputs or designs. This problem is compounded further in the presence of a high-dimensional input space. State-of-the-art parameter space dimension reduction methods, such as active subspaces, aim to identify a subspace of the original input space that is sufficient to explain the output response. These methods are restricted by their reliance on gradient evaluations or copious data, making them inadequate for expensive problems without direct access to gradients. The proposed methodology is gradient-free and fully Bayesian, as it quantifies uncertainty in both the low-dimensional subspace and the surrogate model parameters. This enables a full quantification of epistemic uncertainty and robustness to limited data availability. It is validated on multiple datasets from engineering and science and compared to two other state-of-the-art methods based on four aspects: (a) recovery of the active subspace, (b) deterministic prediction accuracy, (c) probabilistic prediction accuracy, and (d) training time. The comparison shows that the proposed method improves the active subspace recovery and predictive accuracy, in both the deterministic and probabilistic sense, when only a few model observations are available for training, at the cost of increased training time.
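The abstract describes the approach only at a high level. The sketch below is a hypothetical, minimal illustration (not the authors' implementation) of the core idea: place a prior over an orthonormal projection of the inputs, fit a Gaussian-process surrogate on the projected coordinates, and sample the joint posterior without gradients of the expensive model. It assumes a toy data set, a squared-exponential kernel, a QR-based parameterization of the projection, and a simple random-walk Metropolis sampler in place of the manifold-aware samplers a full treatment would use.

```python
# Minimal sketch (not the authors' implementation): gradient-free, Bayesian
# recovery of a 1D active subspace with a Gaussian-process surrogate.
# Assumptions: toy data, RBF kernel, QR map to the Stiefel manifold,
# random-walk Metropolis over projection and kernel hyperparameters.
import numpy as np

rng = np.random.default_rng(0)

D, d, N = 10, 1, 30                       # input dim, subspace dim, sample size
X = rng.uniform(-1.0, 1.0, size=(N, D))   # stand-in for expensive-model inputs
w_true = np.ones(D) / np.sqrt(D)
y = np.sin(3.0 * X @ w_true) + 0.05 * rng.standard_normal(N)   # toy response

def orthonormal(A):
    """Map an unconstrained D x d matrix to an orthonormal basis via QR."""
    Q, _ = np.linalg.qr(A)
    return Q[:, :A.shape[1]]

def log_posterior(theta):
    """GP log marginal likelihood of y given projected inputs, plus weak priors."""
    A = theta[: D * d].reshape(D, d)
    log_ell, log_sf, log_sn = theta[D * d:]          # kernel hyperparameters
    W = orthonormal(A)
    Z = X @ W                                        # low-dimensional coordinates
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    K = np.exp(2 * log_sf) * np.exp(-0.5 * sq / np.exp(2 * log_ell))
    K += (np.exp(2 * log_sn) + 1e-8) * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    loglik = (-0.5 * y @ alpha - np.sum(np.log(np.diag(L)))
              - 0.5 * N * np.log(2 * np.pi))
    logprior = -0.5 * np.sum(theta ** 2)             # standard normal priors
    return loglik + logprior

# Random-walk Metropolis: gradient-free sampling of the joint posterior over
# the (unconstrained) projection parameters and the GP hyperparameters.
theta = np.concatenate([rng.standard_normal(D * d), np.zeros(3)])
lp = log_posterior(theta)
samples = []
for it in range(5000):
    prop = theta + 0.05 * rng.standard_normal(theta.size)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if it % 10 == 0:
        samples.append(orthonormal(theta[: D * d].reshape(D, d)))

# Posterior subspace samples; compare spans to the true direction
# (sign ambiguity from QR is removed by the absolute value).
W_post = np.stack(samples[-100:])
cosines = np.abs(W_post[:, :, 0] @ w_true)
print("mean |cos(principal angle)| to true direction:", cosines.mean())
```

In a setting with a real simulator, X and y would come from the limited expensive-model runs, and the spread of the posterior samples of the projection matrix is what provides the quantification of epistemic uncertainty in the recovered subspace that the abstract refers to.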

