International Journal for Uncertainty Quantification

Published 6 issues per year

ISSN Print: 2152-5080

ISSN Online: 2152-5099

The Impact Factor measures the average number of citations received in a particular year by papers published in the journal during the two preceding years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) IF: 1.7

To calculate the five-year Impact Factor, citations are counted in 2017 to the previous five years and divided by the source items published in the previous five years. 2017 Journal Citation Reports (Clarivate Analytics, 2018) 5-Year IF: 1.9

The Immediacy Index is the average number of times an article is cited in the year it is published; it indicates how quickly articles in a journal are cited. Immediacy Index: 0.5

The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, rates the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to contribute more to the Eigenfactor than those from poorly ranked journals. Eigenfactor: 0.0007

The Journal Citation Indicator (JCI) is a single measurement of the field-normalized citation impact of journals in the Web of Science Core Collection across disciplines; the metric is both normalized and cross-disciplinary. JCI: 0.5

SJR: 0.584 | SNIP: 0.676 | CiteScore™: 3 | H-Index: 25
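Restated as formulas, a standard reading of the two citation-window definitions above (the symbols C and N and the 2017 year labels are introduced here only for illustration):

```latex
\mathrm{IF}_{2017} = \frac{C_{2017}(2015\text{--}2016)}{N_{2015\text{--}2016}},
\qquad
\mathrm{IF}^{(5)}_{2017} = \frac{C_{2017}(2012\text{--}2016)}{N_{2012\text{--}2016}},
```

where C_{2017}(·) counts citations received in 2017 to items published in the given window and N is the number of source items published in that window.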


A GENERAL FRAMEWORK FOR ENHANCING SPARSITY OF GENERALIZED POLYNOMIAL CHAOS EXPANSIONS

Volume 9, Issue 3, 2019, pp. 221-243
DOI: 10.1615/Int.J.UncertaintyQuantification.2019027864

ABSTRACT

Compressive sensing has become a powerful addition to uncertainty quantification when only limited data are available. In this paper, we provide a general framework to enhance the sparsity of the representation of uncertainty in the form of a generalized polynomial chaos expansion. We use an alternating direction method to identify new sets of random variables through iterative rotations so that the new representation of the uncertainty is sparser. Consequently, we increase both the efficiency and accuracy of the compressive-sensing-based uncertainty quantification method. We demonstrate that the previously developed rotation-based method for enhancing the sparsity of Hermite polynomial expansions is a special case of this general framework. Moreover, we use Legendre and Chebyshev polynomial expansions to demonstrate the effectiveness of this method with applications in solving stochastic partial differential equations and high-dimensional (O(100)) problems.
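The abstract describes an alternating scheme: fit a sparse gPC expansion from limited samples, then rotate the random variables so that the response concentrates on fewer directions, and repeat. The sketch below is a minimal illustration of that idea under simplifying assumptions, not the paper's exact alternating direction method: it assumes standard Gaussian inputs with a probabilists' Hermite basis (the special case of Yang et al. [36]), uses scikit-learn's Lasso as a stand-in for l1-minimization, and estimates each rotation from the eigenvectors of a gradient matrix computed by finite differences on the current surrogate. All function names are ad hoc.

```python
# Illustrative sketch only: Hermite gPC + Lasso (l1 stand-in) + iterative rotations.
import itertools
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from sklearn.linear_model import Lasso

def multi_indices(dim, order):
    """All multi-indices with total degree <= order."""
    return [a for a in itertools.product(range(order + 1), repeat=dim)
            if sum(a) <= order]

def design_matrix(xi, mis):
    """Evaluate the tensor-product probabilists' Hermite basis at the samples xi."""
    n, d = xi.shape
    Psi = np.ones((n, len(mis)))
    for j, alpha in enumerate(mis):
        for k, deg in enumerate(alpha):
            c = np.zeros(deg + 1); c[deg] = 1.0      # coefficients selecting He_deg
            Psi[:, j] *= hermeval(xi[:, k], c)
    return Psi

def surrogate(xi, mis, coef):
    """Evaluate the current gPC surrogate."""
    return design_matrix(xi, mis) @ coef

def rotation_from_gradients(xi, mis, coef, eps=1e-4):
    """Eigenvectors of an estimate of E[grad g grad g^T] (finite-difference gradients)."""
    n, d = xi.shape
    grads = np.zeros((n, d))
    for k in range(d):
        e = np.zeros(d); e[k] = eps
        grads[:, k] = (surrogate(xi + e, mis, coef)
                       - surrogate(xi - e, mis, coef)) / (2 * eps)
    G = grads.T @ grads / n
    _, vecs = np.linalg.eigh(G)
    return vecs[:, ::-1]                              # columns ordered by decreasing eigenvalue

def iterative_rotation_gpc(xi, y, order=3, n_iter=3, lam=1e-3):
    """Alternate a sparse (Lasso) fit with a rotation of the Gaussian input variables."""
    d = xi.shape[1]
    mis = multi_indices(d, order)
    A = np.eye(d)                                     # accumulated orthogonal rotation
    for _ in range(n_iter):
        eta = xi @ A                                  # rotated inputs (still standard Gaussian)
        model = Lasso(alpha=lam, fit_intercept=False, max_iter=50000)
        model.fit(design_matrix(eta, mis), y)
        A = A @ rotation_from_gradients(eta, mis, model.coef_)
    model = Lasso(alpha=lam, fit_intercept=False, max_iter=50000)
    model.fit(design_matrix(xi @ A, mis), y)          # final fit in the last rotated coordinates
    return A, mis, model.coef_

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xi = rng.standard_normal((30, 4))                 # 30 samples, 4 Gaussian inputs
    y = np.sin(xi @ np.array([0.8, 0.5, -0.3, 0.1]))  # response with hidden one-dimensional structure
    A, mis, coef = iterative_rotation_gpc(xi, y)
    print("active terms:", int(np.sum(np.abs(coef) > 1e-6)), "of", len(mis))
```

Because the test response depends on a single linear combination of the inputs, the rotated coordinates align with that direction and far fewer Hermite terms remain active than in the original coordinates, which is the effect the paper exploits to improve compressive-sensing recovery from limited data.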

REFERENCES
  1. Ghanem, R.G. and Spanos, P.D., Stochastic Finite Elements: A Spectral Approach, Berlin: Springer-Verlag, 1991.

  2. Xiu, D. and Karniadakis, G.E., The Wiener-Askey Polynomial Chaos for Stochastic Differential Equations, SIAM J. Sci. Comput., 24(2):619-644, 2002.

  3. Cameron, R.H. and Martin, W.T., The Orthogonal Development of Non-Linear Functionals in Series of Fourier-Hermite Functionals, Ann. Math., 48(2):385-392, 1947.

  4. Ogura, H., Orthogonal Functionals of the Poisson Process, IEEE Trans. Inf. Theory, 18(4):473-481, 1972.

  5. Ernst, O.G., Mugler, A., Starkloff, H.J., and Ullmann, E., On the Convergence of Generalized Polynomial Chaos Expansions, ESAIM: Math. Model. Numer. Anal., 46(2):317-339, 2012.

  6. Tatang, M.A., Pan, W., Prinn, R.G., and McRae, G.J., An Efficient Method for Parametric Uncertainty Analysis of Numerical Geophysical Models, J. Geophys. Res. Atmos. (1984-2012), 102(D18):21925-21932, 1997.

  7. Xiu, D. and Hesthaven, J.S., High-Order Collocation Methods for Differential Equations with Random Inputs, SIAM J. Sci. Comput., 27(3):1118-1139, 2005.

  8. Babuska, I., Nobile, F., and Tempone, R., A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data, SIAM Rev., 52(2):317-355, 2010.

  9. Candes, E.J., Romberg, J.K., and Tao, T., Stable Signal Recovery from Incomplete and Inaccurate Measurements, Comm. Pure Appl. Math., 59(8):1207-1223, 2006.

  10. Donoho, D.L., Elad, M., and Temlyakov, V.N., Stable Recovery of Sparse Overcomplete Representations in the Presence of Noise, IEEE Trans. Inf. Theory, 52(1):6-18, 2006.

  11. Candes, E.J., The Restricted Isometry Property and Its Implications for Compressed Sensing, C. R. Math. Acad. Sci. Paris, 346(9-10):589-592, 2008.

  12. Bruckstein, A.M., Donoho, D.L., and Elad, M., From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images, SIAM Rev., 51(1):34-81, 2009.

  13. Doostan, A. and Owhadi, H., A Non-Adapted Sparse Approximation of PDEs with Stochastic Inputs, J. Comput. Phys., 230(8):3015-3034, 2011.

  14. Yan, L., Guo, L., and Xiu, D., Stochastic Collocation Algorithms Using l1-Minimization, Int. J. Uncertain. Quantif., 2(3):279-293, 2012.

  15. Yang, X. and Karniadakis, G.E., Reweighted l1-Minimization Method for Stochastic Elliptic Differential Equations, J. Comput. Phys., 248(1):87-108, 2013.

  16. Lei, H., Yang, X., Zheng, B., Lin, G., and Baker, N.A., Constructing Surrogate Models of Complex Systems with Enhanced Sparsity: Quantifying the Influence of Conformational Uncertainty in Biomolecular Solvation, SIAM Multiscale Model. Simul., 13(4):1327-1353, 2015.

  17. Xu, Z. and Zhou, T., On Sparse Interpolation and the Design of Deterministic Interpolation Points, SIAM J. Sci. Comput., 36(4):A1752-A1769, 2014.

  18. Sargsyan, K., Safta, C., Najm, H.N., Debusschere, B.J., Ricciuto, D., and Thornton, P., Dimensionality Reduction for Complex Models via Bayesian Compressive Sensing, Int. J. Uncertain. Quantif., 4(1):63-93, 2014.

  19. Peng, J., Hampton, J., and Doostan, A., On Polynomial Chaos Expansion via Gradient-Enhanced l1-Minimization, J. Comput. Phys., 310:440-458, 2016.

  20. Lei, H., Yang, X., Li, Z., and Karniadakis, G.E., Systematic Parameter Inference in Stochastic Mesoscopic Modeling, J. Comput. Phys., 330:571-593, 2017.

  21. Candes, E.J., Wakin, M.B., and Boyd, S.P., Enhancing Sparsity by Reweighted l1-Minimization, J. Fourier Anal. Appl., 14(5-6):877-905, 2008.

  22. Peng, J., Hampton, J., and Doostan, A., A Weighted l1-Minimization Approach for Sparse Polynomial Chaos Expansions, J. Comput. Phys., 267:92-111, 2014.

  23. Rauhut, H. and Ward, R., Interpolation via Weighted l1-Minimization, Appl. Comput. Harmon. Anal., 40(2):321-351, 2016.

  24. Rauhut, H. and Ward, R., Sparse Legendre Expansions via l1-Minimization, J. Approx. Theory, 164(5):517-533, 2012.

  25. Hampton, J. and Doostan, A., Compressive Sampling of Polynomial Chaos Expansions: Convergence Analysis and Sampling Strategies, J. Comput. Phys., 280:363-386, 2015.

  26. Alemazkoor, N. and Meidani, H., Divide and Conquer: An Incremental Sparsity Promoting Compressive Sampling Approach for Polynomial Chaos Expansions, Comput. Methods Appl. Mech. Eng., 318:937-956, 2017.

  27. Jakeman, J.D., Narayan, A., and Zhou, T., A Generalized Sampling and Preconditioning Scheme for Sparse Approximation of Polynomial Chaos Expansions, SIAM J. Sci. Comput., 39(3):A1114-A1144, 2017.

  28. Jakeman, J.D., Eldred, M.S., and Sargsyan, K., Enhancing l1-Minimization Estimates of Polynomial Chaos Expansions Using Basis Selection, J. Comput. Phys., 289:18-34, 2015.

  29. Dai, W. and Milenkovic, O., Subspace Pursuit for Compressive Sensing: Closing the Gap between Performance and Complexity, University of Illinois at Urbana-Champaign, Urbana-Champaign, IL, Tech. Rep. 0704-0188, 2008.

  30. Blatman, G. and Sudret, B., Adaptive Sparse Polynomial Chaos Expansion Based on Least Angle Regression, J. Comput. Phys., 230(6):2345-2367, 2011.

  31. Alemazkoor, N. and Meidani, H., Divide and Conquer: An Incremental Sparsity Promoting Compressive Sampling Approach for Polynomial Chaos Expansions, Comput. Methods Appl. Mech. Eng., 318:937-956, 2017.

  32. Hampton, J. and Doostan, A., Basis Adaptive Sample Efficient Polynomial Chaos (BASE-PC), J. Comput. Phys., 371:20-49, 2018.

  33. Yang, X., Li, W., and Tartakovsky, A., Sliced-Inverse-Regression-Aided Rotated Compressive Sensing Method for Uncertainty Quantification, SIAM/ASA J. Uncertain. Quantif., 6(4):1532-1554, 2018.

  34. Tsilifis, P., Huan, X., Safta, C., Sargsyan, K., Lacaze, G., Oefelein, J.C., Najm, H.N., and Ghanem, R.G., Compressive Sensing Adaptation for Polynomial Chaos Expansions, J. Comput. Phys., 380(1):29-47, 2019.

  35. Yang, X., Wan, X., and Karniadakis, G.E., Generalized Polynomial Chaos: Approximation through Change of Measure, 4th Int. Cong. Comput. Eng. Sci., 2013.

  36. Yang, X., Lei, H., Baker, N., and Lin, G., Enhancing Sparsity of Hermite Polynomial Expansions by Iterative Rotations, J. Comput. Phys., 307:94-109, 2016.

  37. Donoho, D.L., Compressed Sensing, IEEE Trans. Inf. Theory, 52(4):1289-1306, 2006.

  38. Candes, E.J. and Tao, T., Decoding by Linear Programming, IEEE Trans. Inf. Theory, 51(12):4203-4215, 2005.

  39. Adcock, B., Infinite-Dimensional Compressed Sensing and Function Interpolation, Found. Comput. Math., 18(3):661-701, 2018.

  40. Needell, D., Noisy Signal Recovery via Iterative Reweighted l1-Minimization, in Proc. Asilomar Conf. on Signals, Systems and Computers, Pacific Grove, CA, pp. 113-117, 2009.

  41. Li, K.C., Sliced Inverse Regression for Dimension Reduction, J. Am. Stat. Assoc., 86(414):316-327, 1991.

  42. Russi, T.M., Uncertainty Quantification with Experimental Data and Complex System Models, PhD thesis, UC Berkeley, 2010.

  43. Constantine, P.G., Dow, E., and Wang, Q., Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces, SIAM J. Sci. Comput., 36(4):A1500-A1524, 2014.

  44. Tipireddy, R. and Ghanem, R., Basis Adaptation in Homogeneous Chaos Spaces, J. Comput. Phys., 259:304-317, 2014.

  45. Yang, X., Barajas-Solano, D.A., Rosenthal, W.S., and Tartakovsky, A.M., PDF Estimation for Power Grid Systems via Sparse Regression, arXiv preprint arXiv:1708.08378, 2017.

  46. Hoffman, A.J. and Wielandt, H.W., The Variation of the Spectrum of a Normal Matrix, in Selected Papers of Alan J. Hoffman: with Commentary, pp. 118-120, Singapore: World Scientific, 2003.

  47. Stewart, G.W., Matrix Perturbation Theory, Citeseer, 1990.

  48. Davis, C. and Kahan, W.M., The Rotation of Eigenvectors by a Perturbation. III, SIAM J. Numer. Anal., 7(1):1-46, 1970.

  49. Smolyak, S., Quadrature and Interpolation Formulas for Tensor Products of Certain Classes of Functions, Sov. Math. Dokl., 4:240-243, 1963.

  50. van den Berg, E. and Friedlander, M.P., Probing the Pareto Frontier for Basis Pursuit Solutions, SIAM J. Sci. Comput., 31(2):890-912, 2008.

  51. van den Berg, E. and Friedlander, M.P., SPGL1: A Solver for Large-Scale Sparse Reconstruction, from http://www.cs.ubc.ca/labs/scl/spgl1, 2007.

  52. Ma, X. and Zabaras, N., An Adaptive High-Dimensional Stochastic Model Representation Technique for the Solution of Stochastic Partial Differential Equations, J. Comput. Phys., 229(10):3884-3915, 2010.

  53. Yang, X., Choi, M., Lin, G., and Karniadakis, G.E., Adaptive ANOVA Decomposition of Stochastic Incompressible and Compressible Flows, J. Comput. Phys., 231(4):1587-1614, 2012.

  54. Zhang, Z., Yang, X., Oseledets, I.V., Karniadakis, G.E., and Daniel, L., Enabling High-Dimensional Hierarchical Uncertainty Quantification by ANOVA and Tensor-Train Decomposition, IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst., 34(1):63-76, 2015.

  55. Jardak, M., Su, C.H., and Karniadakis, G.E., Spectral Polynomial Chaos Solutions of the Stochastic Advection Equation, J. Sci. Comput., 17(1-4):319-338, 2002.

  56. Lin, G., Grinberg, L., and Karniadakis, G.E., Numerical Studies of the Stochastic Korteweg-de Vries Equation, J. Comput. Phys., 213(2):676-703, 2006.

  57. Yin, P., Lou, Y., He, Q., and Xin, J., Minimization of l1-2 for Compressed Sensing, SIAM J. Sci. Comput., 37(1):A536-A563, 2015.

  58. Guo, L., Li, J., and Liu, Y., Stochastic Collocation Methods via Minimization of Transformed l1 Penalty, arXiv preprint arXiv:1805.05416, 2018.

  59. Dey, S., Mukhopadhyay, T., Khodaparast, H.H., and Adhikari, S., Fuzzy Uncertainty Propagation in Composites Using Gram-Schmidt Polynomial Chaos Expansion, Appl. Math. Model., 40(7-8):4412-4428, 2016.

  60. Karagiannis, G., Konomi, B.A., and Lin, G., A Bayesian Mixed Shrinkage Prior Procedure for Spatial-Stochastic Basis Selection and Evaluation of gPC Expansions: Applications to Elliptic SPDEs, J. Comput. Phys., 284:528-546, 2015.

CITED BY
  1. Sun, L., Gao, H., Pan, S., and Wang, J.-X., Surrogate Modeling for Fluid Flows based on Physics-Constrained Deep Learning without Simulation Data, Computer Methods in Applied Mechanics and Engineering, 361, 2020.

  2. Adcock, B., Cardenas, J.M., Dexter, N., and Moraga, S., Towards Optimal Sampling for Learning Sparse Approximations in High Dimensions, in High-Dimensional Optimization and Probability, 191, 2022.
