International Journal for Uncertainty Quantification

Published 6 issues per year

Print ISSN: 2152-5080

Online ISSN: 2152-5099

The Impact Factor measures the average number of citations received in a particular year by papers published in the journal during the two preceding years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) IF: 1.7

To calculate the five-year Impact Factor, citations are counted in 2017 to the previous five years and divided by the source items published in the previous five years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) 5-Year IF: 1.9

The Immediacy Index is the average number of times an article is cited in the year it is published, and indicates how quickly articles in a journal are cited. Immediacy Index: 0.5

The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, is a rating of the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to make a larger contribution to the Eigenfactor than those from poorly ranked journals. Eigenfactor: 0.0007

The Journal Citation Indicator (JCI) is a single measurement of the field-normalized citation impact of journals in the Web of Science Core Collection across disciplines; the key points are that the metric is normalized and cross-disciplinary. JCI: 0.5

SJR: 0.584
SNIP: 0.676
CiteScore™: 3
H-Index: 25
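The two-year Impact Factor described above is a simple ratio, and can be sketched as follows. The citation and paper counts below are hypothetical, chosen only to reproduce the journal's reported 2017 value of 1.7; they are not actual data for this journal.

```python
def impact_factor(citations_this_year, papers_prev_two_years):
    """Two-year Impact Factor: citations received in a given year to
    papers published in the two preceding years, divided by the number
    of those papers."""
    return citations_this_year / papers_prev_two_years

# Hypothetical counts (illustrative only): 85 citations in 2017 to the
# 50 papers published in 2015-2016 gives an IF of 1.7.
print(round(impact_factor(85, 50), 1))  # 1.7
```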


EXTENDING CLASSICAL SURROGATE MODELING TO HIGH DIMENSIONS THROUGH SUPERVISED DIMENSIONALITY REDUCTION: A DATA-DRIVEN APPROACH

Volume 10, Issue 1, 2020, pp. 55-82
DOI: 10.1615/Int.J.UncertaintyQuantification.2020031935

Abstract

Thanks to their versatility, ease of deployment, and high performance, surrogate models have become staple tools in the arsenal of uncertainty quantification (UQ). From local interpolants to global spectral decompositions, surrogates are characterized by their ability to efficiently emulate complex computational models based on a small set of model runs used for training. An inherent limitation of many surrogate models is their susceptibility to the curse of dimensionality, which traditionally limits their applicability to a maximum of O(10^2) input dimensions. We present a novel approach to high-dimensional surrogate modeling that is model-, dimensionality reduction-, and surrogate model-agnostic (black box), and can enable the solution of high-dimensional [i.e., up to O(10^4)] problems. After introducing the general algorithm, we demonstrate its performance by combining Kriging and polynomial chaos expansion surrogates with kernel principal component analysis. In particular, we compare the generalization performance that the resulting surrogates achieve against the classical sequential application of dimensionality reduction followed by surrogate modeling on several benchmark applications, comprising an analytical function and two engineering applications of increasing dimensionality and complexity.
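The classical sequential baseline mentioned in the abstract (dimensionality reduction first, then surrogate modeling in the reduced space) can be sketched as follows. This is a minimal illustration using scikit-learn's KernelPCA and Gaussian process regression, not the authors' joint method; the toy model, dimensions, and kernel hyperparameters are all illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
D, d, N = 100, 2, 200                  # input dim, reduced dim, training size
X = rng.standard_normal((N, D))
y = np.sum(X[:, :5], axis=1) ** 2      # toy model with low intrinsic dimension

# Step 1: unsupervised dimensionality reduction (kernel PCA)
kpca = KernelPCA(n_components=d, kernel="rbf", gamma=1.0 / D)
Z = kpca.fit_transform(X)

# Step 2: surrogate (GP / Kriging) trained in the reduced space
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(Z, y)

# Predict at new points by mapping them through the same reduction
X_new = rng.standard_normal((10, D))
y_hat = gp.predict(kpca.transform(X_new))
print(y_hat.shape)  # (10,)
```

The key limitation of this sequential scheme, and the motivation for the paper's joint approach, is that the reduction in step 1 never sees the model response y, so the retained components need not be the ones most informative for prediction.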


Cited by
  1. Giovanis D.G., Shields M.D., Data-driven surrogates for high dimensional models using Gaussian process regression on the Grassmann manifold, Computer Methods in Applied Mechanics and Engineering, 370, 2020. Crossref

  2. Zhou Tong, Peng Yongbo, Efficient reliability analysis based on deep learning-enhanced surrogate modelling and probability density evolution method, Mechanical Systems and Signal Processing, 162, 2022. Crossref

  3. Puppo L., Pedroni N., Bersano A., Di Maio F., Bertani C., Zio E., Failure identification in a nuclear passive safety system by Monte Carlo simulation with adaptive Kriging, Nuclear Engineering and Design, 380, 2021. Crossref

  4. Zhou Yicheng, Lu Zhenzhou, Shi Yan, Zhou Changcong, Yun Wanying, Variational Bayesian inference-based polynomial chaos expansion: Application to time-variant reliability analysis, Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability, 2021. Crossref

  5. Yin Jianhua, Du Xiaoping, Active learning with generalized sliced inverse regression for high-dimensional reliability analysis, Structural Safety, 94, 2022. Crossref

  6. Zhou Yicheng, Lu Zhenzhou, Cheng Kai, Adaboost-based ensemble of polynomial chaos expansion with adaptive sampling, Computer Methods in Applied Mechanics and Engineering, 388, 2022. Crossref

  7. Sabater Christian, Le Maître Olivier, Congedo Pietro Marco, Görtz Stefan, A Bayesian approach for quantile optimization problems with high-dimensional uncertainty sources, Computer Methods in Applied Mechanics and Engineering, 376, 2021. Crossref

  8. Puppo L., Pedroni N., Maio F. Di, Bersano A., Bertani C., Zio E., A Framework based on Finite Mixture Models and Adaptive Kriging for Characterizing Non-Smooth and Multimodal Failure Regions in a Nuclear Passive Safety System, Reliability Engineering & System Safety, 216, 2021. Crossref

  9. Li Yaohui, Shi Junjun, Yin Zhifeng, Shen Jingfang, Wu Yizhong, Wang Shuting, An Improved High-Dimensional Kriging Surrogate Modeling Method through Principal Component Dimension Reduction, Mathematics, 9, 16, 2021. Crossref

  10. Sterr Benedikt, Mahravan Ehsan, Kim Daegyoum, Uncertainty quantification of heat transfer in a microchannel heat sink with random surface roughness, International Journal of Heat and Mass Transfer, 174, 2021. Crossref

  11. Moustapha Maliki, Marelli Stefano, Sudret Bruno, Active learning for structural reliability: Survey, general framework and benchmark, Structural Safety, 96, 2022. Crossref

  12. Bigoni Daniele, Marzouk Youssef, Prieur Clémentine, Zahm Olivier, Nonlinear dimension reduction for surrogate modeling using gradient information, Information and Inference: A Journal of the IMA, 2022. Crossref

  13. Kontolati Katiana, Loukrezis Dimitrios, Giovanis Dimitrios G., Vandanapu Lohit, Shields Michael D., A survey of unsupervised learning methods for high-dimensional uncertainty quantification in black-box-type problems, Journal of Computational Physics, 464, 2022. Crossref

  14. Rani Ridhima, Khurana Meenu, Kumar Ajay, Kumar Neeraj, Big data dimensionality reduction techniques in IoT: review, applications and open research challenges, Cluster Computing, 2022. Crossref

  15. Jehle Jonas Siegfried, Lange Volker Andreas, Gerdts Matthias, Proposing an Uncertainty Management Framework to Implement the Evidence Theory for Vehicle Crash Applications, ASCE-ASME J Risk and Uncert in Engrg Sys Part B Mech Engrg, 8, 2, 2022. Crossref

  16. El Garroussi Siham, Ricci Sophie, De Lozzo Matthias, Goutal Nicole, Lucor Didier, Tackling random fields non-linearities with unsupervised clustering of polynomial chaos expansion in latent space: application to global sensitivity analysis of river flooding, Stochastic Environmental Research and Risk Assessment, 36, 3, 2022. Crossref

  17. Tran Vinh Ngoc, Kim Jongho, Robust and efficient uncertainty quantification for extreme events that deviate significantly from the training dataset using polynomial chaos-kriging, Journal of Hydrology, 609, 2022. Crossref

  18. Lüthen Nora, Marelli Stefano, Sudret Bruno, Sparse Polynomial Chaos Expansions: Literature Survey and Benchmark, SIAM/ASA Journal on Uncertainty Quantification, 9, 2, 2021. Crossref

  19. dos Santos Ketson R. M., Giovanis Dimitris G., Kontolati Katiana, Loukrezis Dimitrios, Shields Michael D., Grassmannian diffusion maps based surrogate modeling via geometric harmonics, International Journal for Numerical Methods in Engineering, 123, 15, 2022. Crossref

  20. Rossat D., Baroth J., Briffaut M., Dufour F., Monteil A., Masson B., Michel-Ponnelle S., Bayesian updating for predictions of delayed strains of large concrete structures: influence of prior distribution, European Journal of Environmental and Civil Engineering, 2022. Crossref

  21. Peng Yongbo, Zhou Tong, Li Jie, Surrogate modeling immersed probability density evolution method for structural reliability analysis in high dimensions, Mechanical Systems and Signal Processing, 152, 2021. Crossref

  22. Hou Chun Kit Jeffery, Behdinan Kamran, Dimensionality Reduction in Surrogate Modeling: A Review of Combined Methods, Data Science and Engineering, 2022. Crossref

  23. Chen Liming, Qiu Haobo, Gao Liang, Yang Zan, Xu Danyang, Exploiting active subspaces of hyperparameters for efficient high-dimensional Kriging modeling, Mechanical Systems and Signal Processing, 169, 2022. Crossref

  24. Lazzara Michele, Chevalier Max, Colombo Michele, Garay Garcia Jasone, Lapeyre Corentin, Teste Olivier, Surrogate modelling for an aircraft dynamic landing loads simulation using an LSTM AutoEncoder-based dimensionality reduction approach, Aerospace Science and Technology, 126, 2022. Crossref

  25. Zhou Tong, Peng Yongbo, An active-learning reliability method based on support vector regression and cross validation, Computers & Structures, 276, 2023. Crossref
