International Journal for Uncertainty Quantification
Editor-in-Chief: Habib N. Najm
Associate Editors: Dongbin Xiu, Tao Zhou
Founding Editor: Nicholas Zabaras

Publishes 6 issues per year

ISSN Print: 2152-5080

ISSN Online: 2152-5099

Impact Factor: 1.7 (2017 Journal Citation Reports, Clarivate Analytics, 2018)
The Impact Factor measures the average number of citations received in a particular year by papers published in the journal during the two preceding years.

5-Year Impact Factor: 1.9 (2017 Journal Citation Reports, Clarivate Analytics, 2018)
To calculate the five-year Impact Factor, citations received in 2017 by papers published in the previous five years are counted and divided by the number of source items published in those five years.

Immediacy Index: 0.5
The Immediacy Index is the average number of times an article is cited in the year it is published; it indicates how quickly articles in a journal are cited.

Eigenfactor: 0.0007
The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, rates the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to make a larger contribution than those from poorly ranked journals.

Journal Citation Indicator (JCI): 0.5
The JCI is a single measurement of the field-normalized citation impact of journals in the Web of Science Core Collection across disciplines; the key point is that the metric is normalized and cross-disciplinary.

SJR: 0.584
SNIP: 0.676
CiteScore™: 3
H-Index: 25
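Stated as formulas, the two Impact Factor definitions above work out as follows; the notation is added here for clarity and is not part of the Clarivate definitions:

\mathrm{IF}_{2017} = \frac{\text{citations received in 2017 by items published in 2015 and 2016}}{\text{citable items published in 2015 and 2016}} = 1.7

\mathrm{IF}^{(5)}_{2017} = \frac{\text{citations received in 2017 by items published in 2012 through 2016}}{\text{citable items published in 2012 through 2016}} = 1.9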


STABLE LIKELIHOOD COMPUTATION FOR MACHINE LEARNING OF LINEAR DIFFERENTIAL OPERATORS WITH GAUSSIAN PROCESSES

Volume 12, Issue 3, 2022, pp. 75-99
DOI: 10.1615/Int.J.UncertaintyQuantification.2022038966

Abstract

In many applied sciences, the main aim is to learn the parameters of the operational equations that best fit the observed data. One framework for solving such problems employs Gaussian process (GP) emulators, which are well-known nonparametric Bayesian machine learning techniques. GPs belong to the class of methods known as kernel machines, which can approximate rather complex problems by tuning their hyperparameters. Maximum likelihood estimation (MLE) has been widely used to estimate the parameters of the operators and kernels. However, both MLE-based and Bayesian inference in the standard form involve setting up a covariance matrix that is generally ill-conditioned. As a result, constructing and inverting the covariance matrix in the standard form becomes an unstable way to learn the parameters of the operational equations. In this paper, we propose a novel approach that tackles these computational complexities and resolves the ill-conditioning problem by forming the covariance matrix in alternative bases via the Hilbert−Schmidt SVD (HS-SVD) approach. Applying this approach yields a novel matrix factorization of the block-structured covariance matrix that can be implemented stably by isolating the main source of the ill-conditioning. In contrast to standard matrix decompositions, which start with a matrix and produce the resulting factors, the HS-SVD is constructed from the Hilbert−Schmidt eigenvalues and eigenvectors without ever forming the potentially ill-conditioned matrix. We also provide stable MLE and Bayesian inference to adaptively estimate hyperparameters, and the corresponding operators can then be predicted efficiently at new points using the proposed HS-SVD bases. The efficiency and stability of the proposed HS-SVD method are compared with those of existing methods in several illustrations involving parametric linear equations, including ordinary and partial differential equations and integro-differential and fractional-order operators.
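To make the instability concrete, here is a minimal sketch in Python. It is ours, not the authors' implementation: it contrasts the standard Cholesky-based GP log-likelihood, which breaks down once the Gaussian kernel matrix becomes numerically singular, with a generic eigenbasis variant that floors the vanishing eigenvalues before inverting. The kernel, function names, and floor value are illustrative assumptions; the paper's HS-SVD goes further by building the factorization directly from the Hilbert−Schmidt eigenpairs, so the ill-conditioned matrix is never formed at all.

import numpy as np

def gauss_kernel(x, z, eps):
    # Gaussian kernel K_ij = exp(-(eps * (x_i - z_j))^2); a small shape
    # parameter eps makes the kernel nearly flat and K severely ill-conditioned.
    return np.exp(-(eps * (x[:, None] - z[None, :])) ** 2)

def loglik_cholesky(K, y):
    # Standard form: -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2 pi).
    # np.linalg.cholesky raises LinAlgError once K is numerically not SPD.
    n = y.size
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)

def loglik_eigenbasis(K, y, floor=1e-14):
    # Evaluate the same likelihood in the eigenbasis of K, flooring the
    # tiny eigenvalues that carry the ill-conditioning. This is generic
    # spectral stabilization, analogous only in spirit to the HS-SVD basis change.
    n = y.size
    lam, Q = np.linalg.eigh(K)
    lam = np.maximum(lam, floor)
    w = Q.T @ y
    return -0.5 * (w**2 / lam).sum() - 0.5 * np.log(lam).sum() - 0.5 * n * np.log(2 * np.pi)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + 0.01 * rng.standard_normal(x.size)
K = gauss_kernel(x, x, eps=0.5)
print("cond(K) ~", np.linalg.cond(K))
try:
    print("Cholesky  :", loglik_cholesky(K, y))
except np.linalg.LinAlgError:
    print("Cholesky  : failed, K is numerically not positive definite")
print("eigenbasis:", loglik_eigenbasis(K, y))

Flooring the spectrum only hides the problem and discards whatever information the small eigenvalues carried; the appeal of the HS-SVD is precisely that it isolates the source of the ill-conditioning instead of truncating it away.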

CITED BY
  1. Esmaeilbeigi, M., Chatrabgoun, O., Daneshkhah, A., and Shafa, M., On the Impact of Prior Distributions on Efficiency of Sparse Gaussian Process Regression, Engineering with Computers, 2022.
