International Journal for Uncertainty Quantification

Published 6 issues per year

ISSN Print: 2152-5080

ISSN Online: 2152-5099

Journal metrics (2017 Journal Citation Reports; Clarivate Analytics, 2018):

Impact Factor: 1.7. The Impact Factor measures the average number of citations received in a particular year by papers published in the journal during the two preceding years.
5-Year Impact Factor: 1.9. To calculate the five-year Impact Factor, citations received in 2017 to papers published in the previous five years are counted and divided by the source items published in those five years.
Immediacy Index: 0.5. The Immediacy Index is the average number of times an article is cited in the year it is published, and indicates how quickly articles in a journal are cited.
Eigenfactor: 0.0007. The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, rates the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to make a larger contribution to the Eigenfactor than those from poorly ranked journals.
Journal Citation Indicator (JCI): 0.5. The JCI is a single, field-normalized, cross-disciplinary measurement of the citation impact of journals in the Web of Science Core Collection.
SJR: 0.584
SNIP: 0.676
CiteScore™: 3
H-Index: 25
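As a worked illustration of the two-year Impact Factor arithmetic described above, the short Python sketch below divides citation counts by counts of citable items. The numbers are invented for illustration (chosen only to reproduce the reported value of 1.7) and are not actual journal data.

# Illustrative two-year Impact Factor arithmetic. The citation and item
# counts below are hypothetical, not actual journal data. The IF for year Y
# is the number of citations received in Y to items published in Y-1 and
# Y-2, divided by the number of citable items published in those two years.

citations_in_2017 = {2015: 38, 2016: 30}   # hypothetical citation counts
items_published   = {2015: 22, 2016: 18}   # hypothetical citable items

impact_factor_2017 = sum(citations_in_2017.values()) / sum(items_published.values())
print(f"Two-year IF (2017): {impact_factor_2017:.2f}")  # 68 / 40 = 1.70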


A BAYES NETWORK APPROACH TO UNCERTAINTY QUANTIFICATION IN HIERARCHICALLY DEVELOPED COMPUTATIONAL MODELS

Volume 2, Issue 2, 2012, pp. 173-193
DOI: 10.1615/Int.J.UncertaintyQuantification.v2.i2.70

Abstract

Performance assessment of complex systems is ideally accomplished through system-level testing, but because such tests are expensive, they are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex but very practical problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystems, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatoric uncertainty is included in the present uncertainty quantification. An example showing application of the techniques to uncertainty quantification of measures of response of a real, complex aerospace system is included.
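As a rough illustration of the kind of hierarchical, Bayes-network-style propagation of aleatoric uncertainty the abstract describes, the Python sketch below samples component-level inputs and pushes them through placeholder subsystem and system models by Monte Carlo sampling. The distributions and functions are invented stand-ins, not the models, data, or calibration procedure used in the paper.

# Minimal sketch of the general idea: aleatoric uncertainty in component-level
# inputs is propagated through a hierarchy (components -> subsystem -> system),
# mirroring the directed structure of a Bayes network. All functions and
# distributions below are invented placeholders.

import numpy as np

rng = np.random.default_rng(0)
n_samples = 10_000

# Component-level aleatoric inputs (placeholder distributions).
stiffness = rng.normal(loc=1.0, scale=0.05, size=n_samples)      # component A
damping = rng.lognormal(mean=-2.0, sigma=0.1, size=n_samples)    # component B

# Subsystem response as a function of component outputs
# (placeholder standing in for a calibrated subsystem model).
subsystem_resp = stiffness / (1.0 + 10.0 * damping)

# System-level response conditioned on the subsystem node
# (again a placeholder for the full system model).
system_resp = 2.0 * subsystem_resp + rng.normal(0.0, 0.01, size=n_samples)

# Summarize the propagated uncertainty in the predicted system response.
print(f"mean = {system_resp.mean():.3f}, std = {system_resp.std():.3f}")
print("95% interval:", np.percentile(system_resp, [2.5, 97.5]))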

Citations to this article
  1. DeCarlo Erin C., Mahadevan Sankaran, Smarslok Benjamin P., Bayesian Calibration of Aerothermal Models for Hypersonic Air Vehicles, 54th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, 2013. Crossref

  2. Nannapaneni Saideep, Mahadevan Sankaran, Uncertainty quantification in performance evaluation of manufacturing processes, 2014 IEEE International Conference on Big Data (Big Data), 2014. Crossref

  3. Nannapaneni Saideep, Mahadevan Sankaran, Rachuri Sudarsan, Performance evaluation of a manufacturing process under uncertainty using Bayesian networks, Journal of Cleaner Production, 113, 2016. Crossref

  4. Nagel Joseph B., Sudret Bruno, A unified framework for multilevel uncertainty quantification in Bayesian inverse problems, Probabilistic Engineering Mechanics, 43, 2016. Crossref

  5. Wang Haoliang, Li Qingdong, Ren Zhang, Control-oriented credibility assessment of air-breathing hypersonic vehicle model, 2017 29th Chinese Control And Decision Conference (CCDC), 2017. Crossref

  6. Honarmandi Pejman, Arróyave Raymundo, Uncertainty Quantification and Propagation in Computational Materials Science and Simulation-Assisted Materials Design, Integrating Materials and Manufacturing Innovation, 9, 1, 2020. Crossref

  7. Schroeder Benjamin B., Silva Humberto, Smith Kyle D., Separability of Mesh Bias and Parametric Uncertainty for a Full System Thermal Analysis, Journal of Verification, Validation and Uncertainty Quantification, 3, 3, 2018. Crossref

  8. Lu Lu, Anderson‐Cook Christine M., Input‐response space‐filling designs, Quality and Reliability Engineering International, 37, 8, 2021. Crossref

  9. Ye Jiahui, Mahmoudi Mohamad, Karayagiz Kubra, Johnson Luke, Seede Raiyan, Karaman Ibrahim, Arroyave Raymundo, Elwany Alaa, Bayesian Calibration of Multiple Coupled Simulation Models for Metal Additive Manufacturing: A Bayesian Network Approach, ASCE-ASME J Risk and Uncert in Engrg Sys Part B Mech Engrg, 8, 1, 2022. Crossref

  10. Jiang Chen, Vega Manuel A., Ramancha Mukesh K., Todd Michael D., Conte Joel P., Parno Matthew, Hu Zhen, Bayesian calibration of multi-level model with unobservable distributed response and application to miter gates, Mechanical Systems and Signal Processing, 170, 2022. Crossref

  11. Jia Xinyu, Papadimitriou Costas, Hierarchical Bayesian learning framework for multi-level modeling using multi-level data, Mechanical Systems and Signal Processing, 179, 2022. Crossref
