International Journal for Uncertainty Quantification

Publishes 6 issues per year

Print ISSN: 2152-5080

Online ISSN: 2152-5099

Journal metrics (2017 Journal Citation Reports, Clarivate Analytics, 2018):

Impact Factor: 1.7 — the average number of citations received in a particular year by papers published in the journal during the two preceding years.

5-Year Impact Factor: 1.9 — citations counted in 2017 to items published in the previous five years, divided by the number of source items published in those five years.

Immediacy Index: 0.5 — the average number of times an article is cited in the year it is published; it indicates how quickly articles in the journal are cited.

Eigenfactor: 0.0007 — a rating of the total importance of a scientific journal, developed by Jevin West and Carl Bergstrom at the University of Washington; journals are rated by incoming citations, with citations from highly ranked journals weighted more heavily than those from poorly ranked journals.

Journal Citation Indicator (JCI): 0.5 — a single measurement of the field-normalized, cross-disciplinary citation impact of journals in the Web of Science Core Collection.

SJR: 0.584

SNIP: 0.676

CiteScore™: 3

H-Index: 25


SCALABLE GAUSSIAN PROCESS ANALYSIS FOR IMPLICIT PHYSICS-BASED COVARIANCE MODELS

Volume 11, Issue 6, 2021, pp. 49-81
DOI: 10.1615/Int.J.UncertaintyQuantification.2021036657

Abstract

The performance of Gaussian process analysis can be significantly affected by the choice of covariance function. Physics-based covariance models provide a systematic way to construct covariance models that are consistent with the underlying physical laws, but the resulting models remain limited by the computational cost of large-scale problems. In this study, we propose a new framework that combines low-rank approximations with physics-based covariance models to perform accurate and efficient Gaussian process analysis for implicit models. The proposed approximations interact with the physical model through a black-box forward solver and achieve quasilinear complexity for Gaussian process regression, maximum likelihood parameter estimation, and approximation of the expected Fisher information matrix for uncertainty quantification. We also propose a way to include higher-order terms in the covariance model to account for nonlinearities. To accomplish this, we choose a specific global low-rank approximation of the covariance matrix and use stochastic trace estimators. Our numerical results demonstrate the effectiveness and scalability of the approach, validate the accuracy of the maximum likelihood approximations and confidence intervals, and show that its performance compares favorably with that of other covariance models.
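The two ingredients named in the abstract — a global low-rank approximation of the covariance matrix and stochastic trace estimators — can be illustrated generically. The sketch below is not the paper's method (which builds the low-rank factor from a physics-based forward solver); it assumes a synthetic covariance K ≈ U Uᵀ + σ²I with a random factor U, applies the Woodbury identity and the matrix determinant lemma to evaluate the Gaussian log-likelihood in O(nr²) instead of O(n³), and uses a Hutchinson (Rademacher-probe) estimator for the trace term tr(K⁻¹) that appears in score-function computations. All variable names (`K_solve`, `s2`, `trace_est`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

n, r = 2000, 20                                 # n observations, rank-r factor
U = rng.standard_normal((n, r)) / np.sqrt(r)    # stand-in low-rank factor: K ~= U U^T + s2*I
s2 = 0.1                                        # noise (nugget) variance
y = U @ rng.standard_normal(r) + np.sqrt(s2) * rng.standard_normal(n)

# Small r x r capacitance matrix used by the Woodbury identity.
C = s2 * np.eye(r) + U.T @ U

def K_solve(v):
    # Woodbury: (U U^T + s2*I)^{-1} v = (v - U C^{-1} U^T v) / s2,
    # costing O(n r^2) per solve instead of the O(n^3) dense factorization.
    return (v - U @ np.linalg.solve(C, U.T @ v)) / s2

# Matrix determinant lemma: log det(U U^T + s2*I) = log det(C) + (n - r) log s2.
logdet = np.linalg.slogdet(C)[1] + (n - r) * np.log(s2)

# Gaussian log-likelihood evaluated without ever forming the n x n matrix.
loglik = -0.5 * (y @ K_solve(y) + logdet + n * np.log(2 * np.pi))

# Hutchinson stochastic trace estimator: E[z^T K^{-1} z] = tr(K^{-1})
# for Rademacher probes z, the kind of trace arising in score functions.
m = 50
Z = rng.choice([-1.0, 1.0], size=(n, m))
trace_est = np.mean(np.sum(Z * K_solve(Z), axis=0))
```

The same pattern extends to the derivative traces tr(K⁻¹ ∂K/∂θ) needed for maximum likelihood estimation and Fisher information approximation: each probe requires only matrix-vector products with the low-rank factor, which is what keeps the overall cost quasilinear in n.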

