Paper Title


Machine learning based surrogate modeling with SVD enabled training for nonlinear civil structures subject to dynamic loading

Authors

Parida, Siddharth S., Bose, Supratik, Butcher, Megan, Apostolakis, Georgios, Shekhar, Prashant

Abstract


The computationally expensive estimation of engineering demand parameters (EDPs) via finite element (FE) models, while considering earthquake and parameter uncertainty, limits the use of the Performance Based Earthquake Engineering framework. Attempts have been made to substitute FE models with surrogate models; however, most of these models are a function of building parameters only. This necessitates re-training for earthquakes not previously seen by the surrogate. In this paper, the authors propose a machine learning based surrogate model framework which considers both these uncertainties in order to predict for unseen earthquakes. Accordingly, earthquakes are characterized by their projections on an orthonormal basis computed using SVD of a representative ground motion suite. This enables one to generate a large variety of earthquakes by randomly sampling these weights and multiplying them with the basis. The weights, along with the constitutive parameters, serve as inputs to a machine learning model with EDPs as the desired output. Four competing machine learning models were tested, and it was observed that a deep neural network (DNN) gave the most accurate predictions. The framework is validated by using it to successfully predict the peak response of one-story and three-story buildings represented using stick models, subjected to unseen far-field ground motions.
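As a rough illustration of the workflow described in the abstract, the sketch below shows how a suite of ground motions could be decomposed with SVD, how a record is characterized by its projection weights on the resulting orthonormal basis, and how new motions can be generated by sampling those weights. This is not the authors' code; the matrix shapes, the number of retained basis vectors, and the placeholder constitutive parameters are all assumptions made for illustration.

```python
import numpy as np

# Minimal sketch (not the paper's implementation) of SVD-based ground motion
# characterization. Assumes each of the n records in a representative suite is
# resampled to a common length m and stacked as a row of G (n x m).
rng = np.random.default_rng(0)
G = rng.standard_normal((40, 2000))        # placeholder suite of 40 records

# SVD of the suite: the rows of Vt form an orthonormal basis for the records.
U, s, Vt = np.linalg.svd(G, full_matrices=False)

k = 10                                     # number of basis vectors retained (assumed)
basis = Vt[:k]                             # (k x m) orthonormal basis

# Characterize a (possibly unseen) record by its k projection weights.
record = G[0]
weights = basis @ record

# Reconstruct the record, or generate a synthetic motion by randomly sampling
# weights and multiplying them with the basis, as the abstract describes.
reconstruction = weights @ basis
synthetic = (rng.standard_normal(k) * s[:k]) @ basis

# The weights, together with constitutive model parameters, would form the
# input vector to the ML surrogate predicting peak EDPs (parameters here are
# purely illustrative).
constitutive_params = np.array([0.05, 2.0])
surrogate_input = np.concatenate([weights, constitutive_params])
```

In this reading of the abstract, the surrogate never sees raw acceleration histories: an unseen earthquake only enters through its low-dimensional weight vector, which is what allows prediction without re-training.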
