Journal of Machine Learning for Modeling and Computing

Published 4 issues per year

Print ISSN: 2689-3967

Online ISSN: 2689-3975


A SURVEY OF CONSTRAINED GAUSSIAN PROCESS REGRESSION: APPROACHES AND IMPLEMENTATION CHALLENGES

Volume 1, Issue 2, 2020, pp. 119-156
DOI: 10.1615/JMachLearnModelComput.2020035155

ABSTRACT

Gaussian process regression is a popular Bayesian framework for surrogate modeling of expensive data sources. As part of a broader effort in scientific machine learning, many recent works have incorporated physical constraints or other a priori information within Gaussian process regression to supplement limited data and regularize the behavior of the model. We provide an overview and survey of several classes of Gaussian process constraints, including positivity or bound constraints, monotonicity and convexity constraints, differential equation constraints provided by linear PDEs, and boundary condition constraints. We compare the strategies behind each approach as well as the differences in implementation, concluding with a discussion of the computational challenges introduced by constraints.
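To make the positivity-constraint idea from the abstract concrete, the sketch below is an illustrative toy example, not any specific method from the surveyed papers: it computes a standard Gaussian process posterior in NumPy, then imposes a discrete positivity constraint by rejection sampling, keeping only joint posterior draws that are nonnegative at every test location. All function names, hyperparameters, and data are invented for illustration.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=0.5, variance=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    """Unconstrained GP posterior mean and covariance at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)          # so that v.T @ v = Ks.T K^{-1} Ks
    cov = Kss - v.T @ v
    return mean, cov

def positive_posterior_samples(mean, cov, n_draws=2000, jitter=1e-6, seed=0):
    """Brute-force positivity constraint: draw joint posterior samples and
    reject any draw that goes negative at some test point."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(cov + jitter * np.eye(len(mean)))
    draws = mean + rng.standard_normal((n_draws, len(mean))) @ L.T
    return draws[(draws >= 0.0).all(axis=1)]

# Toy data sampled from a nonnegative function.
x_train = np.array([0.1, 0.4, 0.7, 1.0])
y_train = np.abs(np.sin(3.0 * x_train)) + 0.05
x_test = np.linspace(0.0, 1.0, 20)

mean, cov = gp_posterior(x_train, y_train, x_test)
samples = positive_posterior_samples(mean, cov)
constrained_mean = samples.mean(axis=0)  # nonnegative at every test point
```

Rejection sampling enforces the constraint only on the finite set of test points and becomes impractical when the unconstrained posterior places little mass on the feasible set; the approaches surveyed in the paper (e.g., truncated-Gaussian and finite-dimensional spline formulations) address exactly these limitations.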

REFERENCES
  1. Agrell, C., Gaussian Processes with Linear Operator Inequality Constraints, 2019. arXiv:1901.03134.

  2. Albert, C.G. and Rath, K., Gaussian Process Regression for Data Fulfilling Linear Differential Equations with Localized Sources, Entropy, vol. 22, no. 152, 2020.

  3. Bachoc, F., Lagnoux, A., and López-Lopera, A.F., Maximum Likelihood Estimation for Gaussian Processes under Inequality Constraints, Electron. J. Stat., vol. 13, no. 2, pp. 2921-2969, 2019.

  4. Baker, N., Alexander, F., Bremer, T., Hagberg, A., Kevrekidis, Y., Najm, H., Parashar, M., Patra, A., Sethian, J., Wild, S., Willcox, K., and Lee, S., Workshop Report on Basic Research Needs for Scientific Machine Learning: Core Technologies for Artificial Intelligence, Tech. Rep., USDOE Office of Science (SC), Washington, D.C., USA, 2019.

  5. Baldassarre, L., Rosasco, L., Barla, A., and Verri, A., Vector Field Learning via Spectral Filtering, Joint European Conf. on Machine Learning and Knowledge Discovery in Databases, Springer, pp. 56-71, 2010.

  6. Berkeley Institute for Data Science, Physics in Machine Learning Workshop, accessed May 26, 2020, from https://bids.berkeley.edu/events/physics-machine-learning-workshop, 2019.

  7. Berlinet, A. and Thomas-Agnan, C., Reproducing Kernel Hilbert Spaces in Probability and Statistics, Berlin: Springer Science & Business Media, 2011.

  8. Botev, Z.I., The Normal Law under Linear Restrictions: Simulation and Estimation via Minimax Tilting, J. R. Stat. Soc.: Ser. B (Stat. Method.), vol. 79, no. 1, pp. 125-148, 2017.

  9. Brezger, A. and Steiner, W.J., Monotonic Regression based on Bayesian P-Splines: An Application to Estimating Price Response Functions from Store-Level Scanner Data, J. Bus. Econ. Stat., vol. 26, no. 1, pp. 90-104, 2008.

  10. Brunton, S.L. and Kutz, J.N., Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control, Cambridge, UK: Cambridge University Press, 2019.

  11. Brunton, S.L., Proctor, J.L., and Kutz, J.N., Discovering Governing Equations from Data by Sparse Identification of Nonlinear Dynamical Systems, Proc. Nat. Acad. Sci., vol. 113, no. 15, pp. 3932-3937, 2016.

  12. Cai, T., Liu, W., and Luo, X., A Constrained l1 Minimization Approach to Sparse Precision Matrix Estimation, J. Am. Stat. Assoc., vol. 106, no. 494, pp. 594-607, 2011.

  13. Chilès, J.P. and Desassis, N., Fifty Years of Kriging, Handbook of Mathematical Geosciences, Berlin: Springer, pp. 589-612, 2018.

  14. Cyr, E.C., Gulian, M.A., Patel, R.G., Perego, M., and Trask, N.A., Robust Training and Initialization of Deep Neural Networks: An Adaptive Basis Viewpoint, 2019. arXiv:1912.04862.

  15. Da Veiga, S. and Marrel, A., Gaussian Process Modeling with Inequality Constraints, Annales Faculté Sci. Toulouse, vol. 21, pp. 529-555, 2012.

  16. Dean, J., Corrado, G., Monga, R., Chen, K., Devin, M., Mao, M., Ranzato, M., Senior, A., Tucker, P., and Yang, K., Large Scale Distributed Deep Networks, Adv. Neural Inf. Proc. Syst., pp. 1223-1231, 2012.

  17. Driscoll, M.F., The Reproducing Kernel Hilbert Space Structure of the Sample Paths of a Gaussian Process, Prob. Theor. Related Fields, vol. 26, no. 4, pp. 309-316, 1973.

  18. Duvenaud, D., The Kernel Cookbook: Advice on Covariance Functions, accessed May 26, 2020, from https://www.cs.toronto.edu/~duvenaud/cookbook/, 2014.

  19. Flaxman, S., Wilson, A., Neill, D., Nickisch, H., and Smola, A., Fast Kronecker Inference in Gaussian Processes with Non-Gaussian Likelihoods, Proc. of the 32nd International Conference on Machine Learning, F. Bach and D. Blei, Eds., Lille, France, pp. 607-616, 2015. http://proceedings.mlr.press/v37/flaxman15.html.

  20. Frankel, A., Jones, R., and Swiler, L., Tensor Basis Gaussian Process Models of Hyperelastic Materials, 2019a. arXiv:1912.10872.

  21. Frankel, A., Tachida, K., and Jones, R., Prediction of the Evolution of the Stress Field of Polycrystals Undergoing Elastic-Plastic Deformation with a Hybrid Neural Network Model, 2019b. arXiv:1910.03172.

  22. Frankel, A.L., Jones, R.E., Alleman, C., and Templeton, J.A., Predicting the Mechanical Response of Oligocrystals with Deep Learning, 2019c. arXiv:1901.10669.

  23. Fuselier Jr., E.J., Refined Error Estimates for Matrix-Valued Radial Basis Functions, PhD, Texas A&M University, 2007.

  24. Genz, A. and Bretz, F., Computation of Multivariate Normal and t Probabilities, Berlin: Springer Science & Business Media, vol. 195, 2009.

  25. Geoga, C.J., Anitescu, M., and Stein, M.L., Scalable Gaussian Process Computations Using Hierarchical Matrices, J. Comput. Graph. Stat., pp. 1-11, 2019.

  26. Graepel, T., Solving Noisy Linear Operator Equations by Gaussian Processes: Application to Ordinary and Partial Differential Equations, ICML, pp. 234-241, 2003.

  27. Gramacy, R., LaGP: Large-Scale Spatial Modeling via Local Approximate Gaussian Processes in R, J. Stat. Software, vol. 72, no. 1, pp. 1-46, 2016. https://www.jstatsoft.org/v072/i01.

  28. Gramacy, R.B., Surrogates: Gaussian Process Modeling, Design, and Optimization for the Applied Sciences, Boca Raton, FL: CRC Press, 2020.

  29. Gramacy, R.B. and Apley, D.W., Local Gaussian Process Approximation for Large Computer Experiments, J. Comput. Graph. Stat., vol. 24, no. 2, pp. 561-578, 2015.

  30. Gulian, M., Raissi, M., Perdikaris, P., and Karniadakis, G.E., Machine Learning of Space-Fractional Differential Equations, SIAM J. Sci. Comput., vol. 41, no. 4, pp. A2485-A2509, 2019.

  31. Hastie, T., Tibshirani, R., and Friedman, J., The Elements of Statistical Learning: Data Mining, Inference and Prediction, 2nd Ed., Berlin: Springer, 2016.

  32. Hensman, J., Durrande, N., and Solin, A., Variational Fourier Features for Gaussian Processes, J. Mach. Learn. Res., vol. 18, no. 1, pp. 5537-5588, 2017.

  33. Jensen, B.S., Nielsen, J.B., and Larsen, J., Bounded Gaussian Process Regression, 2013 IEEE Int. Workshop on Machine Learning for Signal Processing (MLSP), IEEE, pp. 1-6, 2013.

  34. Jidling, C., Strain Field Modelling Using Gaussian Processes, PhD, Uppsala University, 2017.

  35. Jidling, C., Wahlström, N., Wills, A., and Schön, T.B., Linearly Constrained Gaussian Processes, Adv. Neural Inf. Proc. Syst., pp. 1215-1224, 2017.

  36. Jones, R., Templeton, J.A., Sanders, C.M., and Ostien, J.T., Machine Learning Models of Plastic Flow based on Representation Theory, Comput. Model. Eng. Sci., pp. 309-342, 2018.

  37. Kanagawa, M., Hennig, P., Sejdinovic, D., and Sriperumbudur, B.K., Gaussian Processes and Kernel Methods: A Review on Connections and Equivalences, 2018. arXiv:1807.02582.

  38. Karpatne, A., Atluri, G., Faghmous, J.H., Steinbach, M., Banerjee, A., Ganguly, A., Shekhar, S., Samatova, N., and Kumar, V., Theory-Guided Data Science: A New Paradigm for Scientific Discovery from Data, IEEE Transact. Knowl. Data Eng., vol. 29, no. 10, pp. 2318-2331, 2017.

  39. Kaufman, C.G., Schervish, M.J., and Nychka, D.W., Covariance Tapering for Likelihood-Based Estimation in Large Spatial Data Sets, J. Am. Stat. Assoc., vol. 103, no. 484, pp. 1545-1555, 2008.

  40. Kelly, C. and Rice, J., Monotone Smoothing with Application to Dose-Response Curves and the Assessment of Synergism, Biometrics, pp. 1071-1085, 1990.

  41. Lee, K. and Carlberg, K., Model Reduction of Dynamical Systems on Nonlinear Manifolds Using Deep Convolutional Autoencoders, 2018. arXiv:1812.08373.

  42. Ling, J., Jones, R., and Templeton, J., Machine Learning Strategies for Systems with Invariance Properties, J. Comput. Phys., vol. 318, pp. 22-35, 2016.

  43. López-Lopera, A.F., LineqGPR: Gaussian Process Regression Models with Linear Inequality Constraints, accessed May 26, 2020, from https://github.com/anfelopera/lineqGPR, 2018.

  44. López-Lopera, A.F., Bachoc, F., Durrande, N., and Roustant, O., Finite-Dimensional Gaussian Approximation with Linear Inequality Constraints, SIAM/ASA J. Uncertainty Quant., vol. 6, no. 3, pp. 1224-1255, 2018.

  45. Los Alamos Center for Nonlinear Studies, 3rd Physics Informed Machine Learning Conference, accessed May 26, 2020, from https://cnls.lanl.gov/External/workshops.php, 2020.

  46. Lusch, B., Kutz, J.N., and Brunton, S.L., Deep Learning for Universal Linear Embeddings of Nonlinear Dynamics, Nat. Commun., vol. 9, no. 1, p. 4950, 2018. https://www.nature.com/articles/s41467-018-07210-0/.

  47. Maatouk, H., Finite-Dimensional Approximation of Gaussian Processes with Inequality Constraints, 2017. arXiv:1706.02178.

  48. Maatouk, H. and Bay, X., A New Rejection Sampling Method for Truncated Multivariate Gaussian Random Variables Restricted to Convex Sets, Monte Carlo and Quasi-Monte Carlo Methods, Berlin: Springer, pp. 521-530, 2016.

  49. Maatouk, H. and Bay, X., Gaussian Process Emulators for Computer Experiments with Inequality Constraints, Math. Geosci., vol. 49, no. 5, pp. 557-582, 2017.

  50. Macêdo, I. and Castro, R., Learning Divergence-Free and Curl-Free Vector Fields with Matrix-Valued Kernels, Tech. Rep., Instituto Nacional de Matemática Pura e Aplicada, 2008.

  51. Magiera, J., Ray, D., Hesthaven, J.S., and Rohde, C., Constraint-Aware Neural Networks for Riemann Problems, 2019. arXiv:1904.12794.

  52. Mao, Z., Jagtap, A.D., and Karniadakis, G.E., Physics-Informed Neural Networks for High-Speed Flows, Comput. Methods Appl. Mech. Eng., vol. 360, p. 112789, 2020.

  53. Microsoft, Physics ∩ ML: Physics Meets Machine Learning, accessed May 26, 2020, from https://www.microsoft.com/en-us/research/event/physics-ml-workshop/, 2019.

  54. Murphy, K.P., Machine Learning: A Probabilistic Perspective, Cambridge, MA: MIT Press, 2012.

  55. Narcowich, F.J. and Ward, J.D., Generalized Hermite Interpolation via Matrix-Valued Conditionally Positive Definite Functions, Math. Comput., vol. 63, no. 208, pp. 661-687, 1994.

  56. Pan, S. and Duraisamy, K., Data-Driven Discovery of Closure Models, SIAM J. Appl. Dyn. Syst., vol. 17, no. 4, pp. 2381-2413, 2018. https://doi.org/10.1137/18M1177263.

  57. Pensoneault, A., Yang, X., and Zhu, X., Nonnegativity-Enforced Gaussian Process Regression, 2020. arXiv:2004.04632.

  58. Pouransari, H., Coulier, P., and Darve, E., Fast Hierarchical Solvers for Sparse Matrices Using Extended Sparsification and Low-Rank Approximation, SIAM J. Sci. Comput., vol. 39, no. 3, pp. A797-A830, 2017.

  59. Quiñonero-Candela, J. and Rasmussen, C.E., Analysis of Some Methods for Reduced Rank Gaussian Process Regression, Switching and Learning in Feedback Systems, Berlin: Springer, pp. 98-127, 2005a.

  60. Quiñonero-Candela, J. and Rasmussen, C.E., A Unifying View of Sparse Approximate Gaussian Process Regression, J. Mach. Learn. Res., vol. 6, pp. 1939-1959, 2005b.

  61. Raissi, M., Deep Hidden Physics Models: Deep Learning of Nonlinear Partial Differential Equations, J. Mach. Learn. Res., vol. 19, no. 1, pp. 932-955, 2018.

  62. Raissi, M. and Karniadakis, G.E., Hidden Physics Models: Machine Learning of Nonlinear Partial Differential Equations, J. Comput. Phys., vol. 357, pp. 125-141, 2018.

  63. Raissi, M., Perdikaris, P., and Karniadakis, G.E., Machine Learning of Linear Differential Equations Using Gaussian Processes, J. Comput. Phys., vol. 348, pp. 683-693, 2017.

  64. Raissi, M., Perdikaris, P., and Karniadakis, G.E., Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations, SIAM J. Sci. Comput., vol. 40, no. 1, pp. A172-A198, 2018.

  65. Raissi, M., Perdikaris, P., and Karniadakis, G.E., Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations, J. Comput. Phys., vol. 378, pp. 686-707, 2019.

  66. Rasmussen, C.E. and Williams, C.K., Gaussian Processes for Machine Learning, Cambridge, MA: MIT Press, 2006.

  67. Ray, P., Pati, D., and Bhattacharya, A., Efficient Bayesian Shape-Restricted Function Estimation with Constrained Gaussian Process Priors, 2019. arXiv:1902.04701.

  68. Rider, C. and Simmons, J.E., Eds., Chemical Mixtures and Combined Chemical and Non-Chemical Stressors, Ch. 15, Berlin: Springer, 2018.

  69. Riihimäki, J. and Vehtari, A., Gaussian Processes with Monotonicity Information, Proc. of the Thirteenth International Conference on Artificial Intelligence and Statistics, pp. 645-652, 2010.

  70. Rue, H. and Tjelmeland, H., Fitting Gaussian Markov Random Fields to Gaussian Fields, Scand. J. Stat., vol. 29, no. 1, pp. 31-49, 2002.

  71. Sacks, J., Welch, W.J., Mitchell, T.J., and Wynn, H.P., Design and Analysis of Computer Experiments, Stat. Sci., pp. 409-423, 1989.

  72. Salzmann, M. and Urtasun, R., Implicitly Constrained Gaussian Process Regression for Monocular Non-Rigid Pose Estimation, Adv. Neural Inf. Proc. Syst., pp. 2065-2073, 2010.

  73. Santner, T.J., Williams, B.J., and Notz, W.I., The Design and Analysis of Computer Experiments, Berlin: Springer Series in Statistics, 2003.

  74. Särkkä, S., Linear Operators and Stochastic Partial Differential Equations in Gaussian Process Regression, Int. Conf. on Artificial Neural Networks, Springer, pp. 151-158, 2011.

  75. Seeger, M., Gaussian Processes for Machine Learning, Int. J. Neural Syst., vol. 14, no. 2, pp. 69-106, 2004.

  76. Shaby, B. and Ruppert, D., Tapered Covariance: Bayesian Estimation and Asymptotics, J. Comput. Graph. Stat., vol. 21, no. 2, pp. 433-452, 2012.

  77. Snelson, E., Ghahramani, Z., and Rasmussen, C.E., Warped Gaussian Processes, Adv. Neural Inf. Proc. Syst., pp. 337-344, 2004.

  78. Solak, E., Murray-Smith, R., Leithead, W.E., Leith, D.J., and Rasmussen, C.E., Derivative Observations in Gaussian Process Models of Dynamic Systems, Adv. Neural Inf. Proc. Syst., pp. 1057-1064, 2003.

  79. Solin, A. and Kok, M., Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features, Proceedings of Machine Learning Research, K. Chaudhuri and M. Sugiyama, Eds., Vol. 89 of Proc. Mach. Learn. Res., PMLR, pp. 2193-2202, 2019.

  80. Solin, A., Kok, M., Wahlström, N., Schön, T.B., and Särkkä, S., Modeling and Interpolation of the Ambient Magnetic Field by Gaussian Processes, IEEE Transact. Rob., vol. 34, no. 4, pp. 1112-1127, 2018.

  81. Solin, A. and Särkkä, S., Hilbert Space Methods for Reduced-Rank Gaussian Process Regression, Stat. Comput., 2019.

  82. Song, F., Xu, C., and Karniadakis, G.E., Computing Fractional Laplacians on Complex-Geometry Domains: Algorithms and Simulations, SIAM J. Sci. Comput., vol. 39, no. 4, pp. A1320-A1344, 2017.

  83. Sørbye, S.H. and Rue, H., Scaling Intrinsic Gaussian Markov Random Field Priors in Spatial Modelling, Spatial Stat., vol. 8, pp. 39-51, 2014.

  84. Stanford University, Combining Artificial Intelligence and Machine Learning with Physical Sciences, accessed May 26, 2020, from https://sites.google.com/view/aaai-mlps, 2020.

  85. Stevens, R., Taylor, V., Nichols, J., Maccabe, A.B., Yelick, K., and Brown, D., AI for Science, Tech. Rep., Argonne National Lab. (ANL), Argonne, IL, USA, 2020.

  86. University of Washington, Physics Informed Machine Learning Workshop, accessed May 26, 2020, from http://www.databookuw.com/page-5/, 2019.

  87. Vanhatalo, J., Riihimäki, J., Hartikainen, J., Jylänki, P., Tolvanen, V., and Vehtari, A., GPStuff: Bayesian Modeling with Gaussian Processes, J. Mach. Learn. Res., vol. 14, pp. 1175-1179, 2013.

  88. Wahlström, N., Modeling of Magnetic Fields and Extended Objects for Localization Applications, PhD, Linköping University, 2015.

  89. Wahlström, N., Kok, M., Schön, T.B., and Gustafsson, F., Modeling Magnetic Fields Using Gaussian Processes, 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, IEEE, pp. 3522-3526, 2013.

  90. Wang, X. and Berger, J.O., Estimating Shape Constrained Functions Using Gaussian Processes, SIAM/ASA J. Uncertainty Quant., vol. 4, no. 1, pp. 1-25, 2016.

  91. Wilson, A. and Nickisch, H., Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP), Int. Conf. on Machine Learning, pp. 1775-1784, 2015.

  92. Yang, X., Barajas-Solano, D., Tartakovsky, G., and Tartakovsky, A.M., Physics-Informed CoKriging: A Gaussian-Process-Regression-Based Multifidelity Method for Data-Model Convergence, J. Comput. Phys., vol. 395, pp. 410-431, 2019.

  93. Yang, X., Tartakovsky, G., and Tartakovsky, A., Physics-Informed Kriging: A Physics-Informed Gaussian Process Regression Method for Data-Model Convergence, 2018. arXiv:1809.03461.

  94. Zhao, T. and Liu, H., Sparse Precision Matrix Estimation with Calibration, Proc. of the 26th International Conference on Neural Information Processing Systems, pp. 2274-2282, 2013.

CITED BY
  1. Chen Yifan, Hosseini Bamdad, Owhadi Houman, Stuart Andrew M., Solving and learning nonlinear PDEs with Gaussian processes, Journal of Computational Physics, 447, 2021.

  2. Wen Zezhao, Zhang Hongye, Mueller Markus, Sensitivity analysis and machine learning modelling for the output characteristics of rotary HTS flux pumps, Superconductor Science and Technology, 34, 12, 2021.

  3. Zhang Yijie, Zhu Xueyu, Gao Jinghuai, Hidden physics model for parameter estimation of elastic wave equations, Computer Methods in Applied Mechanics and Engineering, 381, 2021.

  4. Tartakovsky Alexandre M., Ma Tong, Barajas-Solano David A., Tipireddy Ramakrishna, Physics-informed Gaussian process regression for states estimation and forecasting in power grids, International Journal of Forecasting, 2022.

  5. Verspeek Simon, De Boi Ivan, Maldague Xavier, Penne Rudi, Steenackers Gunther, Dynamic Line Scan Thermography Parameter Design via Gaussian Process Emulation, Algorithms, 15, 4, 2022.

  6. Chen Jialei, Chen Zhehui, Zhang Chuck, Jeff Wu C. F., APIK: Active Physics-Informed Kriging Model with Partial Differential Equations, SIAM/ASA Journal on Uncertainty Quantification, 10, 1, 2022.

  7. De Boi Ivan, Sels Seppe, De Moor Olivier, Vanlanduit Steve, Penne Rudi, Input and Output Manifold Constrained Gaussian Process Regression for Galvanometric Setup Calibration, IEEE Transactions on Instrumentation and Measurement, 71, 2022.

  8. Maatouk H., Finite-dimensional approximation of Gaussian processes with linear inequality constraints and noisy observations, Communications in Statistics - Theory and Methods, 2022.

  9. Taw Eric, Neaton Jeffrey B., Accelerated Discovery of CH4 Uptake Capacity Metal–Organic Frameworks Using Bayesian Optimization, Advanced Theory and Simulations, 5, 3, 2022.

  10. Himmel Andreas, Findeisen Rolf, Sundmacher Kai, Closed-loop real-time optimization for unsteady operating production systems, Journal of Process Control, 113, 2022.

  11. Römer Ulrich, Liu Jintian, Böl Markus, Surrogate-based Bayesian calibration of biomechanical models with isotropic material behavior, International Journal for Numerical Methods in Biomedical Engineering, 38, 4, 2022.

  12. Randon Mathieu, Quost Benjamin, Boudaoud Nassim, von Wissel Dirk, Vehicle consumption estimation via calibrated Gaussian Process regression, 2022 IEEE Intelligent Vehicles Symposium (IV), 2022.

  13. Mondal Sudeepta, Sarkar Soumalya, Multi-fidelity prediction of spatiotemporal fluid flow, Physics of Fluids, 34, 8, 2022.

  14. Noack Marcus M., Sethian James A., Advanced stationary and nonstationary kernel designs for domain-aware Gaussian processes, Communications in Applied Mathematics and Computational Science, 17, 1, 2022.

  15. Cheung Damon H. T., Wong Kaze W. K., Hannuksela Otto A., Li Tjonnie G. F., Ho Shirley, Testing the robustness of simulation-based gravitational-wave population inference, Physical Review D, 106, 8, 2022.

  16. Jiao Ruwang, Xue Bing, Zhang Mengjie, Investigating the Correlation Amongst the Objective and Constraints in Gaussian Process-Assisted Highly Constrained Expensive Optimization, IEEE Transactions on Evolutionary Computation, 26, 5, 2022.

  17. Ranftl Sascha, A Connection between Probability, Physics and Neural Networks, MaxEnt 2022, 2022.

  18. VandenHeuvel Daniel J., Drovandi Christopher, Simpson Matthew J., Maini Philip K., Computationally efficient mechanism discovery for cell invasion with uncertainty quantification, PLOS Computational Biology, 18, 11, 2022.
