Paper Title
Parametric estimation of stochastic differential equations via online gradient descent
Paper Authors
Paper Abstract
We propose an online parametric estimation method, based on online gradient descent, for stochastic differential equations with discrete observations and misspecified modelling. Our study provides uniform upper bounds on the risks of the estimators over a family of stochastic differential equations. The derivation of the bounds rests on three underlying theoretical results: an analysis of the stochastic mirror descent algorithm with dependent and biased subgradients, the simultaneous exponential ergodicity of classes of diffusion processes, and the construction of loss functions whose approximate stochastic subgradients depend only on the known model and the observations.
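As a rough illustration of the kind of procedure the abstract describes, the sketch below runs one-pass projected stochastic gradient descent (stochastic mirror descent with the Euclidean mirror map) on per-observation Euler-type contrasts to estimate the drift parameter of an Ornstein–Uhlenbeck process from discretely sampled data. Everything concrete here, the OU model, the quadratic contrast, the step-size schedule, and the projection interval, is an assumption made for illustration and is not the estimator analysed in the paper.

```python
# Minimal illustrative sketch (not the paper's algorithm): projected online SGD,
# i.e. stochastic mirror descent with the Euclidean mirror map, estimating the
# drift parameter theta of an Ornstein-Uhlenbeck process
#     dX_t = -theta * X_t dt + sigma dW_t
# from discrete observations. The model, contrast, step sizes and projection
# interval below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate discrete observations with the Euler-Maruyama scheme ---
theta_true, sigma, dt, n = 1.5, 0.8, 0.01, 50_000
x = np.empty(n + 1)
x[0] = 0.0
for i in range(n):
    x[i + 1] = x[i] - theta_true * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# --- One-pass online gradient descent on per-observation contrasts ---
# Per-pair Euler-type contrast: l_i(theta) = (x[i+1] - x[i] + theta*x[i]*dt)^2 / (2*dt).
# Its stochastic gradient uses only the assumed model and two consecutive observations.
theta_hat = 0.0                        # initial guess
theta_lo, theta_hi = -10.0, 10.0       # compact parameter set for the projection step
for i in range(n):
    residual = x[i + 1] - x[i] + theta_hat * x[i] * dt
    grad = residual * x[i]             # stochastic gradient of l_i at theta_hat
    eta = 5.0 / np.sqrt(i + 1)         # decaying step size (hand-tuned for this toy)
    theta_hat = float(np.clip(theta_hat - eta * grad, theta_lo, theta_hi))

print(f"true theta = {theta_true}, online estimate = {theta_hat:.3f}")
```

The point of the per-observation contrast is that its stochastic gradient at each step is computed from the assumed model and two consecutive observations alone, mirroring the abstract's requirement that the approximate subgradients depend only on the known model and the data.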