International Journal for Uncertainty Quantification

Published 6 issues per year

ISSN Print: 2152-5080

ISSN Online: 2152-5099

The Impact Factor measures the average number of citations received in a particular year by papers published in the journal during the two preceding years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) IF: 1.7

To calculate the five-year Impact Factor, citations received in 2017 to papers published in the previous five years are counted and divided by the number of source items published in those five years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) 5-Year IF: 1.9

The Immediacy Index is the average number of times an article is cited in the year it is published; it indicates how quickly articles in a journal are cited. Immediacy Index: 0.5

The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, is a rating of the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to contribute more to the Eigenfactor than those from poorly ranked journals. Eigenfactor: 0.0007

The Journal Citation Indicator (JCI) is a single measurement of the field-normalized citation impact of journals in the Web of Science Core Collection across disciplines; the key point is that the metric is both normalized and cross-disciplinary. JCI: 0.5

SJR: 0.584
SNIP: 0.676
CiteScore™: 3
H-Index: 25
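The Impact Factor definition above is a simple ratio. The following minimal sketch illustrates the arithmetic; the citation counts used here are hypothetical, since the actual counts behind the journal's IF of 1.7 are not given on this page:

```python
def impact_factor(citations_to_prev_two_years: int, papers_in_prev_two_years: int) -> float:
    """Impact Factor: citations received in a given year to papers
    published in the two preceding years, divided by the number of
    papers published in those two years."""
    return citations_to_prev_two_years / papers_in_prev_two_years

# Hypothetical counts, chosen only to illustrate the arithmetic:
print(round(impact_factor(170, 100), 1))  # prints 1.7
```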


MANIFOLD LEARNING-BASED POLYNOMIAL CHAOS EXPANSIONS FOR HIGH-DIMENSIONAL SURROGATE MODELS

Volume 12, Issue 4, 2022, pp. 39-64
DOI: 10.1615/Int.J.UncertaintyQuantification.2022039936

Abstract

In this work we introduce a manifold learning-based method for uncertainty quantification (UQ) in systems describing complex spatiotemporal processes. Our first objective is to identify the embedding of a set of high-dimensional data representing quantities of interest of the computational or analytical model. For this purpose, we employ Grassmannian diffusion maps, a two-step nonlinear dimension reduction technique which allows us to reduce the dimensionality of the data and identify meaningful geometric descriptions in a parsimonious and inexpensive manner. Polynomial chaos expansion is then used to construct a mapping between the stochastic input parameters and the diffusion coordinates of the reduced space. An adaptive clustering technique is proposed to identify an optimal number of clusters of points in the latent space. The similarity of points allows us to construct a number of geometric harmonic emulators, which are finally utilized as a set of inexpensive pretrained models to perform an inverse map of realizations of latent features to the ambient space and thus perform accurate out-of-sample predictions. In this way, the proposed method acts as an encoder-decoder system which is able to automatically handle very high-dimensional data while simultaneously operating successfully in the small-data regime. The method is demonstrated on two benchmark problems and on a system of advection-diffusion-reaction equations which models a first-order chemical reaction between two species. In all test cases, the proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
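The encoder/surrogate/decoder pipeline described in the abstract can be sketched in heavily simplified form. In this illustrative stand-in, a truncated SVD plays the role of the Grassmannian diffusion maps embedding, and an ordinary polynomial least-squares fit plays the role of the polynomial chaos expansion and geometric harmonics decoder; all data, dimensions, and names are synthetic and do not reproduce the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n model runs, each producing a high-dimensional field that
# depends smoothly on a scalar stochastic input theta (hidden low-rank
# structure, so a 3-dimensional latent space suffices).
n, dim = 40, 500
theta = rng.uniform(-1.0, 1.0, n)              # stochastic input samples
basis_fields = rng.standard_normal((3, dim))   # hidden spatial modes
features = np.column_stack([theta, theta**2, theta**3])
snapshots = features @ basis_fields            # shape (n, dim)

# --- Encoder: truncated SVD (stand-in for the Grassmannian diffusion
# maps embedding of the paper). ---
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 3
latent = U[:, :r] * s[:r]                      # latent coordinates, (n, r)

# --- Surrogate: polynomial map from input theta to latent coordinates
# (stand-in for the polynomial chaos expansion). ---
degree = 3
design = np.vander(theta, degree + 1)          # polynomial design matrix
coeffs, *_ = np.linalg.lstsq(design, latent, rcond=None)

# --- Decoder: out-of-sample prediction mapped back to ambient space. ---
theta_new = np.array([0.3])
latent_new = np.vander(theta_new, degree + 1) @ coeffs
field_new = latent_new @ Vt[:r]                # predicted field, (1, dim)

truth = np.array([[0.3, 0.3**2, 0.3**3]]) @ basis_fields
rel_err = np.linalg.norm(field_new - truth) / np.linalg.norm(truth)
print(rel_err < 1e-6)  # prints True for this exactly low-rank toy problem
```

Because the toy snapshots are exactly rank 3 and the latent coordinates depend polynomially on the input, the surrogate recovers unseen fields to machine precision; the paper's contribution is making this kind of pipeline work for nonlinear embeddings and genuinely high-dimensional, small-data settings.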

References
  1. Bhosekar, A. and Ierapetritou, M., Advances in Surrogate Based Modeling, Feasibility Analysis, and Optimization: A Review, Comput. Chem. Eng., 108:250-267,2018.

  2. Xiu, D. and Karniadakis, G.E., The Wiener-Askey Polynomial Chaos for Stochastic Differential Equations, SIAM J. Sci. Comput, 24(2):619-644,2002.

  3. Zhou, Y., Lu, Z., Cheng, K., and Shi, Y., An Expanded Sparse Bayesian Learning Method for Polynomial Chaos Expansion, Mech. Syst. Signal Process., 128:153-171, 2019.

  4. Hadigol, M. and Doostan, A., Least Squares Polynomial Chaos Expansion: A Review of Sampling Strategies, Comput. Methods Appl. Mech. Eng., 332:382-407, 2018.

  5. Babuska, I., Nobile, F., and Tempone, R., A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data, SIAM J. Numer. Anal, 45(3):1005-1034,2007.

  6. Sudret, B., Global Sensitivity Analysis Using Polynomial Chaos Expansions, Reliab. Eng. Syst. Saf., 93(7):964-979, 2008.

  7. Shao, Q., Younes, A., Fahs, M., and Mara, T.A., Bayesian Sparse Polynomial Chaos Expansion for Global Sensitivity Analysis, Comput. Methods Appl. Mech. Eng., 318:474-496, 2017.

  8. Crestaux, T., Le Maitre, O., and Martinez, J.M., Polynomial Chaos Expansion for Sensitivity Analysis, Reliab. Eng. Syst. Saf., 94(7):1161-1172, 2009.

  9. Luthen, N., Marelli, S., and Sudret, B., Sparse Polynomial Chaos Expansions: Literature Survey and Benchmark, Math. Numer. Anal., arXiv:2002.01290, 2020.

  10. Blatman, G. and Sudret, B., Adaptive Sparse Polynomial Chaos Expansion Based on Least Angle Regression, J. Comput. Phys, 230(6):2345-2367, 2011.

  11. Alemazkoor, N. and Meidani, H., A Preconditioning Approach for Improved Estimation of Sparse Polynomial Chaos Expansions, Comput. Methods Appl. Mech. Eng., 342:474-489,2018.

  12. Jakeman, J.D., Eldred, M.S., and Sargsyan, K., Enhancing l1-Minimization Estimates of Polynomial Chaos Expansions Using Basis Selection, J. Comput. Phys, 289:18-34, 2015.

  13. Hampton, J. and Doostan, A., Basis Adaptive Sample Efficient Polynomial Chaos (BASE-PC), J. Comput. Phys, 371:20-49, 2018.

  14. Loukrezis, D., Galetzka, A., and De Gersem, H., Robust Adaptive Least Squares Polynomial Chaos Expansions in High-Frequency Applications, Int. J. Numer. Modell. Electron. Networks, Devices Fields, 33(6):e2725, 2020.

  15. Hombal, V. and Mahadevan, S., Surrogate Modeling of 3D Crack Growth, Int. J. Fatigue, 47:90-99, 2013.

  16. Boukouvala, F., Gao, Y., Muzzio, F., and Ierapetritou, M.G., Reduced-Order Discrete Element Method Modeling, Chem. Eng. Sci, 95:12-26, 2013.

  17. Amsallem, D. and Farhat, C., Interpolation Method for Adapting Reduced-Order Models and Application to Aeroelasticity, AIAA J., 46(7):1803-1813, 2008.

  18. Zimmermann, R., Gradient-Enhanced Surrogate Modeling Based on Proper Orthogonal Decomposition, J. Comput. Appl. Math, 237(1):403-418,2013.

  19. Carlberg, K. and Farhat, C., A Low-Cost, Goal-Oriented Compact Proper Orthogonal Decomposition Basis for Model Reduction of Static Systems, Int. J. Numer. Methods Eng., 86(3):381-402,2011.

  20. Nath, P., Hu, Z., and Mahadevan, S., Sensor Placement for Calibration of Spatially Varying Model Parameters, J Comput. Phys, 343:150-169,2017.

  21. Tripathy, R., Bilionis, I., and Gonzalez, M., Gaussian Processes with Built-In Dimensionality Reduction: Applications to High-Dimensional Uncertainty Propagation, J. Comput. Phys., 321:191-223,2016.

  22. Vohra, M., Nath, P., Mahadevan, S., and Lee, Y.T.T., Fast Surrogate Modeling Using Dimensionality Reduction in Model Inputs and Field Output: Application to Additive Manufacturing, Reliab. Eng. Syst. Saf., 201:106986, 2020.

  23. Constantine, P.G., Dow, E., and Wang, Q., Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces, SIAM J. Sci. Comput., 36(4):A1500-A1524, 2014.

  24. Constantine, P.G., Active Subspaces: Emerging Ideas for Dimension Reduction in Parameter Studies, Philadelphia: SIAM, 2015.

  25. Bigoni, D., Marzouk, Y., Prieur, C., and Zahm, O., Nonlinear Dimension Reduction for Surrogate Modeling Using Gradient Information, Math. Numer. Anal., arXiv:2102.10351, 2021.

  26. Ji, W., Wang, J., Zahm, O., Marzouk, Y.M., Yang, B., Ren, Z., and Law, C.K., Shared Low-Dimensional Subspaces for Propagating Kinetic Uncertainty to Multiple Outputs, Combust. Flame, 190:146-157, 2018.

  27. Zahm, O., Constantine, P.G., Prieur, C., and Marzouk, Y.M., Gradient-Based Dimension Reduction of Multivariate Vector-Valued Functions, SIAM J. Sci. Comput., 42(1):A534-A558, 2020.

  28. Giovanis, D.G. and Shields, M.D., Uncertainty Quantification for Complex Systems with Very High Dimensional Response Using Grassmann Manifold Variations, J. Comput. Phys, 364:393-415,2018.

  29. Giovanis, D.G. and Shields, M.D., Data-Driven Surrogates for High Dimensional Models Using Gaussian Process Regression on the Grassmann Manifold, Comput. Methods Appl. Mech. Eng., 370:113269,2020.

  30. Kontolati, K., Alix-Williams, D., Boffi, N.M., Falk, M.L., Rycroft, C.H., and Shields, M.D., Manifold Learning for Coarse-Graining Atomistic Simulations: Application to Amorphous Solids, Acta Mater., 215:117008, 2021.

  31. Coifman, R.R. and Lafon, S., Diffusion Maps, Appl. Comput. Harmonic Anal., 21(1):5-30, 2006.

  32. Soize, C. and Ghanem, R., Data-Driven Probability Concentration and Sampling on Manifold, J. Comput. Phys., 321:242-258, 2016.

  33. Soize, C. and Ghanem, R., Polynomial Chaos Representation of Databases on Manifolds, J. Comput. Phys., 335:201-221, 2017.

  34. Soize, C. and Ghanem, R., Probabilistic Learning on Manifolds Constrained by Nonlinear Partial Differential Equations for Small Datasets, Comput. Methods Appl. Mech. Eng., 380:113777, 2021.

  35. Kalogeris, I. and Papadopoulos, V., Diffusion Maps-Based Surrogate Modeling: An Alternative Machine Learning Approach, Int. J. Numer. Methods Eng, 121(4):602-620,2020.

  36. Koronaki, E., Nikas, A., and Boudouvis, A., A Data-Driven Reduced-Order Model of Nonlinear Processes Based on Diffusion Maps and Artificial Neural Networks, Chem. Eng. J, 397:125475, 2020.

  37. Lataniotis, C., Marelli, S., and Sudret, B., Extending Classical Surrogate Modeling to High Dimensions through Supervised Dimensionality Reduction: A Data-Driven Approach, Int. J. Uncertainty Quantification, 10(1):55-82, 2020.

  38. Schmidhuber, J., Deep Learning in Neural Networks: An Overview, Neural Networks, 61:85-117,2015.

  39. Goodfellow, I., Bengio, Y., Courville, A., and Bengio, Y., Deep Learning, Vol. 1, Cambridge, MA: The MIT Press, 2016.

  40. Wang, Y., Yao, H., and Zhao, S., Auto-Encoder Based Dimensionality Reduction, Neurocomputing, 184:232-242, 2016.

  41. Rawat, W. and Wang, Z., Deep Convolutional Neural Networks for Image Classification: A Comprehensive Review, Neural Comput, 29(9):2352-2449, 2017.

  42. Tripathy, R.K. and Bilionis, I., Deep UQ: Learning Deep Neural Network Surrogate Models for High Dimensional Uncertainty Quantification, J. Comput. Phys, 375:565-588, 2018.

  43. Nikolopoulos, S., Kalogeris, I., and Papadopoulos, V., Non-Intrusive Surrogate Modeling for Parametrized Time-Dependent PDEs Using Convolutional Autoencoders, Math. Numer. Anal., arXiv:2101.05555, 2021.

  44. Zhu, Y. and Zabaras, N., Bayesian Deep Convolutional Encoder-Decoder Networks for Surrogate Modeling and Uncertainty Quantification, J. Comput. Phys, 366:415-447, 2018.

  45. Mo, S., Zhu, Y., Zabaras, N., Shi, X., and Wu, J., Deep Convolutional Encoder-Decoder Networks for Uncertainty Quantification of Dynamic Multiphase Flow in Heterogeneous Media, Water Resour. Res., 55(1):703-728, 2019.

  46. Thuerey, N., Weißenow, K., Prantl, L., and Hu, X., Deep Learning Methods for Reynolds-Averaged Navier-Stokes Simulations of Airfoil Flows, AIAA J., 58(1):25-36, 2020.

  47. Wang, N., Chang, H., and Zhang, D., Efficient Uncertainty Quantification for Dynamic Subsurface Flow with Surrogate by Theory-Guided Neural Network, Comput. Methods Appl. Mech. Eng., 373:113492,2021.

  48. Mo, S., Zabaras, N., Shi, X., and Wu, J., Deep Autoregressive Neural Networks for High-Dimensional Inverse Problems in Groundwater Contaminant Source Identification, Water Resour. Res., 55(5):3856-3881, 2019.

  49. Hesthaven, J.S. and Ubbiali, S., Non-Intrusive Reduced Order Modeling of Nonlinear Problems Using Neural Networks, J. Comput. Phys, 363:55-78, 2018.

  50. dos Santos, K.R., Giovanis, D.G., and Shields, M.D., Grassmannian Diffusion Maps Based Dimension Reduction and Classification for High-Dimensional Data, Comput. Sci. Mach. Learn., arXiv:2009.07547, 2020.

  51. Zhang, J., Zhu, G., Heath, R.W., Jr., and Huang, K., Grassmannian Learning: Embedding Geometry Awareness in Shallow and Deep Learning, Comput. Sci. Mach. Learn., arXiv:1808.02229, 2018.

  52. Edelman, A., Arias, T.A., and Smith, S.T., The Geometry of Algorithms with Orthogonality Constraints, SIAM J. Matrix Anal. Appl., 20(2):303-353, 1998.

  53. Coifman, R.R. and Lafon, S., Geometric Harmonics: A Novel Tool for Multiscale Out-Of-Sample Extension of Empirical Functions, Appl. Comput. Harmonic Anal, 21(1):31-52, 2006.

  54. Olivier, A., Giovanis, D., Aakash, B., Chauhan, M., Vandanapu, L., and Shields, M.D., UQpy: A General Purpose Python Package and Development Environment for Uncertainty Quantification, J. Comput. Sci., 47:101204, 2020.

  55. Dsilva, C.J., Talmon, R., Coifman, R.R., and Kevrekidis, I.G., Parsimonious Representation of Nonlinear Dynamical Systems through Manifold Learning: A Chemotaxis Case Study, Appl. Comput. Harmonic Anal, 44(3):759-773, 2018.

  56. Absil, P.A., Mahony, R., and Sepulchre, R., Riemannian Geometry of Grassmann Manifolds with a View on Algorithmic Computation, Acta Appl. Math, 80(2):199-220, 2004.

  57. Ye, K. and Lim, L.H., Schubert Varieties and Distances between Subspaces of Different Dimensions, SIAM J. Matrix Anal. Appl, 37(3):1176-1197, 2016.

  58. Ye, K., Wong, K.S.W., and Lim, L.H., Optimization on Flag Manifolds, Math. Optim. Control, arXiv:1907.00949,2019.

  59. Begelfor, E. and Werman, M., Affine Invariance Revisited, in 2006 IEEE Computer Society Conf. on Computer Vision and Pattern Recognition (CVPR'06), IEEE, Vol. 2, pp. 2087-2094, 2006.

  60. Harandi, M.T., Salzmann, M., Jayasumana, S., Hartley, R., and Li, H., Expanding the Family of Grassmannian Kernels: An Embedding Perspective, in European Conference on Computer Vision, Berlin: Springer, pp. 408-423, 2014.

  61. Hamm, J. and Lee, D., Extended Grassmann Kernels for Subspace-Based Learning, Adv. Neural Inf. Process. Syst., 21:601-608, 2008.

  62. Thiem, T.N., Kooshkbaghi, M., Bertalan, T., Laing, C.R., and Kevrekidis, I.G., Emergent Spaces for Coupled Oscillators, Front. Comput. Neurosci., 14:36, 2020.

  63. Feinberg, J., Eck, V.G., and Langtangen, H.P., Multivariate Polynomial Chaos Expansions with Dependent Variables, SIAMJ. Sci. Comput., 40(1):A199-A223, 2018.

  64. Jakeman, J.D., Franzelin, F., Narayan, A., Eldred, M., and Pflüger, D., Polynomial Chaos Expansions for Dependent Random Variables, Comput. Methods Appl. Mech. Eng., 351:643-666, 2019.

  65. Rahman, S., A Polynomial Chaos Expansion in Dependent Random Variables, J. Math. Anal. Appl., 464(1):749-775, 2018.

  66. Bobrowski, A., Functional Analysis for Probability and Stochastic Processes: An Introduction, Cambridge: Cambridge University Press, 2005.

  67. Wan, X. and Karniadakis, G.E., Multi-Element Generalized Polynomial Chaos for Arbitrary Probability Measures, SIAM J. Sci. Comput., 28(3):901-928, 2006.

  68. Soize, C. and Ghanem, R., Physical Systems with Random Uncertainties: Chaos Representations with Arbitrary Probability Measure, SIAM J. Sci. Comput., 26(2):395-410, 2004.

  69. He, W., Zeng, Y., and Li, G., An Adaptive Polynomial Chaos Expansion for High-Dimensional Reliability Analysis, Struct. Multidiscip. Optim., 62(4):2051-2067, 2020.

  70. Diaz, P., Doostan, A., and Hampton, J., Sparse Polynomial Chaos Expansions via Compressed Sensing and D-Optimal Design, Comput. Methods Appl. Mech. Eng., 336:640-666, 2018.

  71. Knio, O.M. and Le Maitre, O., Uncertainty Propagation in CFD Using Polynomial Chaos Decomposition, Fluid Dyn. Res., 38(9):616,2006.

  72. Constantine, P.G., Eldred, M.S., and Phipps, E.T., Sparse Pseudospectral Approximation Method, Comput. Methods Appl. Mech. Eng., 229:1-12, 2012.

  73. Conrad, P.R. and Marzouk, Y.M., Adaptive Smolyak Pseudospectral Approximations, SIAM J. Sci. Comput., 35(6):A2643-A2670, 2013.

  74. Winokur, J., Kim, D., Bisetti, F., Le Maitre, O.P., and Knio, O.M., Sparse Pseudo Spectral Projection Methods with Directional Adaptation for Uncertainty Quantification, J. Sci. Comput., 68(2):596-623,2016.

  75. Buzzard, G.T., Efficient Basis Change for Sparse-Grid Interpolating Polynomials with Application to T-Cell Sensitivity Analysis, Comput. Biol. J., 2013:562767, 2013.

  76. Loukrezis, D. and De Gersem, H., Adaptive Sparse Polynomial Chaos Expansions via Leja Interpolation, Math. Numer. Anal., arXiv:1911.08312, 2019.

  77. Doostan, A. and Owhadi, H., A Non-Adapted Sparse Approximation of PDEs with Stochastic Inputs, J. Comput. Phys., 230(8):3015-3034, 2011.

  78. Tsilifis, P., Huan, X., Safta, C., Sargsyan, K., Lacaze, G., Oefelein, J.C., Najm, H.N., and Ghanem, R.G., Compressive Sensing Adaptation for Polynomial Chaos Expansions, J. Comput. Phys, 380:29-47,2019.

  79. Rifkin, R.M. and Lippert, R.A., Notes on Regularized Least Squares, Tech. Rep., MIT-CSAIL-TR-2007-025,2007.

  80. Bottou, L., Large-Scale Machine Learning with Stochastic Gradient Descent, in Proc. of COMPSTAT'2010, Berlin: Springer, pp. 177-186,2010.

  81. Mao, X., Sabanis, S., and Renshaw, E., Asymptotic Behaviour of the Stochastic Lotka-Volterra Model, J. Math. Anal. Appl., 287(1):141-156, 2003.

  82. Hundsdorfer, W. and Verwer, J.G., Numerical Solution of Time-Dependent Advection-Diffusion-Reaction Equations, Vol. 33, Berlin: Springer Science & Business Media, 2013.

  83. Alnæs, M., Blechta, J., Hake, J., Johansson, A., Kehlet, B., Logg, A., Richardson, C., Ring, J., Rognes, M.E., and Wells, G.N., The FEniCS Project Version 1.5, Arch. Numer. Software, 3(100):20553, 2015.

Cited by
  1. Zapata Usandivaras José Felix, Urbano Annafederica, Bauerheim Michael, Cuenot Bénédicte, Data Driven Models for the Design of Rocket Injector Elements, Aerospace, 9, 10, 2022. Crossref
