Hi everyone,

I'm simulating the RLS (recursive least squares) algorithm. My code is:
MATLAB
w(1) = 1;       % initial weight
lambda = 0.99;  % forgetting factor
for i = 2:length(x)
    y(i) = x(i) * transpose(w(i-1));                  % output of filter
    u(i) = w(i-1) * x(i);                             % w(i-1) should be the inverse correlation matrix here
    k(i) = u(i) / (lambda + transpose(x(i)) * u(i));  % gain ("Kalman") vector
    e(i) = d(i) - y(i);                               % error
    w(i) = transpose(w(i-1)) + k(i) * e(i);           % weight update
end


In this algorithm there are two signals:
1. the main signal
2. the noise signal
I mixed the two signals to get d, so d = main signal + noise signal, and x is the noise signal. I need to remove (or at least reduce) the noise in d using the algorithm above.
In this line: u(i)=w(i-1)*x(i);
w(i-1) must be the correlation matrix, but I don't know how to compute it.

Thanks in advance
Posted
Updated 17-Jan-15 0:19am
Comments
Kenneth Haugland 17-Jan-15 3:32am    
Are you talking about autocorrelation here?
Sa.Mo 17-Jan-15 7:33am    
Actually it's the inverse correlation matrix that I need to calculate, but I don't know how.
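In standard RLS you never compute the inverse correlation matrix directly; you maintain it recursively via the matrix inversion lemma. Below is a minimal, hypothetical sketch of that recursion in Python/NumPy (not the poster's exact code): the tap count M, the forgetting factor, and the delta used to initialize P = delta*I are all assumed values chosen for illustration.

```python
import numpy as np

def rls_filter(d, x, M=4, lam=0.99, delta=100.0):
    """Recursive Least Squares noise canceller (illustrative sketch).

    d     : desired signal (main signal + noise)
    x     : noise reference signal
    M     : number of filter taps (assumed value)
    lam   : forgetting factor
    delta : P is initialized to delta * I (assumed value)
    """
    N = len(x)
    w = np.zeros(M)            # filter weights
    P = delta * np.eye(M)      # inverse correlation matrix, updated recursively
    e = np.zeros(N)            # a priori error = estimate of the clean signal
    for i in range(M - 1, N):
        xv = x[i - M + 1:i + 1][::-1]   # most recent M noise samples
        y = w @ xv                       # filter output (noise estimate)
        e[i] = d[i] - y                  # error: d minus estimated noise
        u = P @ xv
        k = u / (lam + xv @ u)           # gain vector
        w = w + k * e[i]                 # weight update
        P = (P - np.outer(k, u)) / lam   # recursive inverse-correlation update
    return e, w

# Example: recover a sinusoid buried in white noise
rng = np.random.default_rng(0)
n = rng.standard_normal(2000)            # noise reference
s = np.sin(0.05 * np.arange(2000))       # main signal
d = s + n                                # mixed signal
e, w = rls_filter(d, n)                  # e should approach s after convergence
```

The key line is the last one in the loop: because u = P*x, the update `P = (P - k*u') / lam` keeps P equal to the inverse of the (exponentially weighted) correlation matrix of x without ever calling a matrix inverse.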

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


