RLS Algorithm: A First Look at the Formulas
RLS Algorithm: Formula Derivation
For the derivation without a forgetting factor, see the Zhihu article 递推最小二乘法推导(RLS)——全网最简单易懂的推导过程 by 阿Q在江湖 ("Derivation of recursive least squares (RLS) — the most accessible derivation on the web"):
https://zhuanlan.zhihu.com/p/111758532
For a set of observations \((x_1, y_1)\), \((x_2, y_2)\), \(\cdots\), \((x_n, y_n)\) and forgetting factor \(\zeta\), consider the following optimization problem:
\[\mathrm{err}_{\min} = \min_{k,b} \sum_{i=1}^n \zeta^{n - i} (kx_i + b - y_i)^2 \]Let \(f(k,b) = \sum_{i=1}^n \zeta^{n - i} (kx_i + b - y_i)^2\). Taking the partial derivatives with respect to \(k\) and \(b\) and setting them to \(0\) gives
\[\begin{cases} \frac {\partial f } {\partial k } = 2\sum_{i=1}^n \zeta^{n - i} (kx_i + b - y_i)x_i = 0 \\ \quad \\ \frac {\partial f } {\partial b } = 2\sum_{i=1}^n \zeta^{n - i} (kx_i + b - y_i) = 0 \end{cases} \]Dropping the common factor of \(2\) and rewriting in matrix form:
\[\begin{pmatrix}\sum_{i=1}^n\zeta^{n-i}x_i^2 & \sum_{i=1}^n\zeta^{n-i}x_i \\ \quad \\ \sum_{i=1}^n\zeta^{n-i}x_i & \sum_{i=1}^n\zeta^{n-i} \end{pmatrix} \begin{pmatrix} k \\ \quad \\ b \end{pmatrix} = \begin{pmatrix} \sum_{i=1}^n\zeta^{n-i}x_iy_i \\ \quad \\ \sum_{i=1}^n\zeta^{n-i}y_i \end{pmatrix} \tag{1} \]More explicitly:
\[\begin{pmatrix} x_1&x_2& \cdots &x_n \\ 1 & 1 & \cdots &1 \end{pmatrix} \begin{pmatrix} \zeta^{n-1} & 0& \cdots & 0 \\ 0 & \zeta^{n-2} & \cdots &0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \zeta^0 \end{pmatrix} \begin{pmatrix} x_1 & 1 \\ x_2 & 1 \\ \vdots & \vdots \\ x_n &1 \end{pmatrix} \begin{pmatrix} k \\ b \end{pmatrix} = \begin{pmatrix} x_1&x_2& \cdots &x_n \\ 1 & 1 & \cdots &1 \end{pmatrix} \begin{pmatrix} \zeta^{n-1} & 0& \cdots & 0 \\ 0 & \zeta^{n-2} & \cdots &0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \zeta^0 \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix} \]
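As a sanity check on Equation \((1)\), here is a small numerical sketch (assuming NumPy; the data points and \(\zeta = 0.95\) are made-up values for illustration) that builds the design matrix, the diagonal weight matrix, and solves the weighted normal equations directly:

```python
import numpy as np

# Hypothetical data, roughly on y = 2x + 1 with a little noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
zeta = 0.95                                   # forgetting factor
n = len(x)

Phi = np.column_stack([x, np.ones(n)])        # row i is (x_i, 1)
W = np.diag(zeta ** (n - 1 - np.arange(n)))   # diag(zeta^{n-1}, ..., zeta^0)

# Weighted normal equations from Equation (1): (Phi^T W Phi) theta = Phi^T W y
A = Phi.T @ W @ Phi
d = Phi.T @ W @ y
theta = np.linalg.solve(A, d)                 # theta = (k, b)
print("batch k, b =", theta)
```

The recursion derived next reproduces this batch result without re-solving the full system at every new observation.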
Let \(R(n)\) denote the inverse of the coefficient matrix in Equation \((1)\), where \(n\) is the number of observations; then:
\[\begin{aligned} R(n)=\begin{pmatrix}\sum_{i=1}^n\zeta^{n-i}x_i^2 & \sum_{i=1}^n\zeta^{n-i}x_i \\ \quad \\ \sum_{i=1}^n\zeta^{n-i}x_i & \sum_{i=1}^n\zeta^{n-i} \end{pmatrix} ^{-1} &= \Bigg( \zeta \begin{pmatrix}\sum_{i=1}^{n-1}\zeta^{n-1-i}x_i^2 & \sum_{i=1}^{n-1}\zeta^{n-1-i}x_i \\ \quad \\ \sum_{i=1}^{n-1}\zeta^{n-1-i}x_i & \sum_{i=1}^{n-1}\zeta^{n-1-i} \end{pmatrix} + \begin{pmatrix} x_n^2 & x_n \\ \quad \\ x_n & 1 \end{pmatrix} \Bigg)^{-1} \\ \quad \\ &= \Bigg(\zeta R^{-1}(n-1) + \begin{pmatrix} x_n \\ \quad \\1 \end{pmatrix} \begin{pmatrix} x_n & 1 \end{pmatrix} \Bigg)^{-1} \end{aligned} \]Define
\[ \phi(i) = \begin{pmatrix} x_i \\ \quad \\ 1 \end{pmatrix} \]so that the recursion above reads \(R^{-1}(n) = \zeta R^{-1}(n-1) + \phi(n)\phi^T(n)\), which can be expanded using the matrix inversion lemma.
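For reference, the rank-one (Sherman–Morrison) form of the lemma used here, with \(A\), \(u\), \(v\) as placeholder symbols for this aside only:
\[(A + uv^T)^{-1} = A^{-1} - \frac{A^{-1}u\,v^TA^{-1}}{1 + v^TA^{-1}u} \]In our case \(A = \zeta R^{-1}(n-1)\), so \(A^{-1} = R(n-1)/\zeta\), and \(u = v = \phi(n)\).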
Expanding accordingly gives:
\[R(n) = \frac{R(n-1)}{\zeta} - \frac{R(n-1) \phi(n) \phi^T(n)R(n-1)}{\zeta^2 + \zeta \phi^T(n)R(n-1)\phi(n)} \tag{2} \]
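A minimal sketch of this covariance update (NumPy assumed; the function name `update_R` and the test numbers are illustrative, not from the original post), checked against the direct inverse of \(\zeta R^{-1}(n-1) + \phi(n)\phi^T(n)\):

```python
import numpy as np

def update_R(R_prev, phi, zeta):
    """One step of Equation (2): compute R(n) from R(n-1) and phi(n).

    Assumes R_prev is symmetric (always true here), so that
    Rphi @ Rphi.T equals R_prev phi phi^T R_prev.
    """
    Rphi = R_prev @ phi                                     # shape (2, 1)
    denom = zeta**2 + zeta * float(phi.T @ Rphi)
    return R_prev / zeta - (Rphi @ Rphi.T) / denom

# Consistency check against a brute-force inverse (illustrative values).
zeta = 0.95
R_prev = np.linalg.inv(np.array([[4.0, 1.5], [1.5, 2.0]]))  # some symmetric R(n-1)
phi = np.array([[2.0], [1.0]])                              # phi(n) = (x_n, 1)^T
R_direct = np.linalg.inv(zeta * np.linalg.inv(R_prev) + phi @ phi.T)
print(np.allclose(update_R(R_prev, phi, zeta), R_direct))   # expect True
```

Note that Equation \((2)\) avoids any explicit matrix inversion, which is the whole point of the recursive form.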
Let \(D(n)\) denote the right-hand-side vector of Equation \((1)\); then
\[\begin{aligned} D(n) = \begin{pmatrix} \sum_{i=1}^n\zeta^{n-i}x_iy_i \\ \quad \\ \sum_{i=1}^n\zeta^{n-i}y_i \end{pmatrix} &= \zeta \begin{pmatrix} \sum_{i=1}^{n-1}\zeta^{n-1-i}x_iy_i \\ \quad \\ \sum_{i=1}^{n-1}\zeta^{n-1-i}y_i \end{pmatrix} + \begin{pmatrix} x_ny_n \\ \quad \\ y_n \end{pmatrix} \\ \quad\\ &= \zeta D(n-1) + \phi(n) y_n \end{aligned} \]Define
\[\Theta = \begin{pmatrix} k \\ \quad \\ b \end{pmatrix} \]From Equation \((1)\) we obtain:
\[\begin{aligned} \Theta(n) = R(n)D(n) &= R(n)[\zeta D(n-1) + \phi(n) y_n] \\ &= R(n)[\zeta R^{-1}(n-1) \Theta(n-1) + \phi(n) y_n] \\ &= R(n)\Big[\zeta \frac{R^{-1}(n) - \phi(n)\phi^T(n)}{\zeta}\Theta (n-1) + \phi(n) y_n\Big] \\ &= \Theta(n-1) + R(n)\phi(n)[y_n - \phi^T(n)\Theta (n-1)] \end{aligned}\]Summary: once \(\Theta(n-1)\) and the updated \(R(n)\) are available, \(\Theta(n)\) follows directly from this recursion, with no need to re-solve the batch problem!
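Putting the pieces together, a minimal sketch of the complete recursion (NumPy assumed; the initialization \(R(0) = \delta I\) with a large \(\delta\) and \(\Theta(0) = 0\) is a common convention, not something derived in this post):

```python
import numpy as np

def rls_fit(x, y, zeta=0.95, delta=1e6):
    """Recursive least squares for y = k*x + b with forgetting factor zeta.

    R starts at delta*I and Theta at zero; these are conventional
    initial values, not part of the derivation above.
    """
    R = delta * np.eye(2)
    Theta = np.zeros((2, 1))                   # Theta = (k, b)^T
    for xi, yi in zip(x, y):
        phi = np.array([[xi], [1.0]])          # phi(n) = (x_n, 1)^T
        Rphi = R @ phi
        # Equation (2): R(n) from R(n-1); R stays symmetric throughout.
        R = R / zeta - (Rphi @ Rphi.T) / (zeta**2 + zeta * float(phi.T @ Rphi))
        # Theta(n) = Theta(n-1) + R(n) phi(n) [y_n - phi^T(n) Theta(n-1)]
        Theta = Theta + R @ phi * (yi - float(phi.T @ Theta))
    return Theta.ravel()                       # (k, b)

# Same illustrative data as the batch sketch above.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.2, 6.8, 9.1]
print("recursive k, b =", rls_fit(x, y))
```

With a sufficiently large \(\delta\), the recursive estimate should match the batch solution from the earlier sketch up to the small effect of the initialization.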
Source: https://www.cnblogs.com/zgglj-com/p/15747722.html