x | 1   | 2   | 3   | 4   | 5
y | 0.5 | 1.7 | 3.4 | 5.7 | 8.4
Solution:
Fit with a Straight Line:
clc
clear all
close all
x = [1 2 3 4 5]; % provided x
y = [0.5 1.7 3.4 5.7 8.4]; % provided y
n = length(x); % determine the length of x
X = [n sum(x); sum(x) sum(x.^2)];
Y = [sum(y); sum(x.*y)];
A = inv(X) * Y % values of coefficients a, b in matrix form
y_bar = sum(y)/n;
s_t = 0; % set initial value of s_t as 0
for i = 1:n
    s_t = s_t + (y(i)-y_bar)^2; % sum of squares of the residuals between the actual data and their arithmetic mean
end
s_t
s_r = 0; % set initial value of s_r as 0
for i = 1:n
    s_r = s_r + (y(i)-A(1)-A(2)*x(i))^2; % sum of squares of the residuals between the actual data and the fitted model
end
s_r
r = sqrt(1-(s_r/s_t)); % correlation coefficient
r
scatter(x, y, '*'); % plot scatter of x and y
hold on
syms xi % define xi as a symbolic variable
yi = A(1)+A(2)*xi; % define the yi function
xi = linspace(x(1), x(n)); % generate linearly spaced vector of xi
yi = subs(yi); % symbolic substitution of yi
plot(xi, yi); % plot curve of xi vs yi
grid on % turn on grid view
Output:
A =
-2.0000
1.9800
s_t = 40.1320
s_r = 0.9280
r = 0.9884
Curve: [figure: scatter of the data with the fitted straight line]
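As a quick cross-check (a minimal sketch, not part of the original script), MATLAB's built-in polyfit should reproduce the same straight-line coefficients; polyfit returns them in descending powers, i.e. [b a]:

x = [1 2 3 4 5];
y = [0.5 1.7 3.4 5.7 8.4];
p = polyfit(x, y, 1) % expected p = [1.9800 -2.0000], matching A above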
Fit with a 2nd degree Polynomial:
clc
clear all
close all
x = [1 2 3 4 5]; % provided x
y = [0.5 1.7 3.4 5.7 8.4]; % provided y
n = length(x); % determine the length of x
X = [n sum(x) sum(x.^2); sum(x) sum(x.^2) sum(x.^3); sum(x.^2) sum(x.^3) sum(x.^4)];
Y = [sum(y); sum(x.*y); sum((x.^2).*y)];
A = inv(X) * Y % values of coefficients a, b, c in matrix form
y_bar = sum(y)/n;
s_t = 0; % set initial value of s_t as 0
for i = 1:n
    s_t = s_t + (y(i)-y_bar)^2; % sum of squares of the residuals between the actual data and their arithmetic mean
end
s_t
s_r = 0; % set initial value of s_r as 0
for i = 1:n
    s_r = s_r + (y(i)-A(1)-A(2)*x(i)-A(3)*x(i)^2)^2; % sum of squares of the residuals between the actual data and the fitted model
end
s_r
r = sqrt(1-(s_r/s_t)); % correlation coefficient
r
scatter(x, y, '*'); % plot scatter of x and y
hold on
syms xi % define xi as a symbolic variable
yi = A(1)+A(2)*xi+A(3)*xi^2; % define the yi function
xi = linspace(x(1), x(n)); % generate linearly spaced vector of xi
yi = subs(yi); % symbolic substitution of yi
plot(xi, yi); % plot curve of xi vs yi
grid on % turn on grid view
Output:
A =
-0.2000
0.4371
0.2571
s_t = 40.1320
s_r = 0.0023
r = 1.0000
Curve: [figure: scatter of the data with the fitted 2nd degree polynomial]
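As before, a minimal cross-check sketch (not part of the original script) using polyfit, which returns the coefficients in descending powers [c b a]:

x = [1 2 3 4 5];
y = [0.5 1.7 3.4 5.7 8.4];
p = polyfit(x, y, 2)              % expected p = [0.2571 0.4371 -0.2000], matching A above
s_r = sum((y - polyval(p, x)).^2) % residual sum of squares, expected to be close to 0.0023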
Fit with a 3rd degree Polynomial:
clc
clear all
close all
x = [1 2 3 4 5]; % provided x
y = [0.5 1.7 3.4 5.7 8.4]; % provided y
n = length(x); % determine the length of x
X = [n sum(x) sum(x.^2) sum(x.^3); sum(x) sum(x.^2) sum(x.^3) sum(x.^4); sum(x.^2) sum(x.^3) sum(x.^4) sum(x.^5); sum(x.^3) sum(x.^4) sum(x.^5) sum(x.^6)];
Y = [sum(y); sum(x.*y); sum((x.^2).*y); sum((x.^3).*y)];
A = inv(X) * Y % values of coefficients a, b, c, d in matrix form
y_bar = sum(y)/n;
s_t = 0; % set initial value of s_t as 0
for i = 1:n
    s_t = s_t + (y(i)-y_bar)^2; % sum of squares of the residuals between the actual data and their arithmetic mean
end
s_t
s_r = 0; % set initial value of s_r as 0
for i = 1:n
    s_r = s_r + (y(i)-A(1)-A(2)*x(i)-A(3)*x(i)^2-A(4)*x(i)^3)^2; % sum of squares of the residuals between the actual data and the fitted model
end
s_r
r = sqrt(1-(s_r/s_t)); % correlation coefficient
r
scatter(x, y, '*'); % plot scatter of x and y
hold on
syms xi % define xi as a symbolic variable
yi = A(1)+A(2)*xi+A(3)*xi^2+A(4)*xi^3; % define the yi function
xi = linspace(x(1), x(n)); % generate linearly spaced vector of xi
yi = subs(yi); % symbolic substitution of yi
plot(xi, yi); % plot curve of xi vs yi
grid on % turn on grid view
Output:
A =
-0.0600
0.2405
0.3321
-0.0083
s_t = 40.1320
s_r = 0.0013
r = 1.0000
Curve: [figure: scatter of the data with the fitted 3rd degree polynomial]
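A similar cross-check sketch (not part of the original script) for the cubic fit; polyfit returns the coefficients in descending powers [d c b a]:

x = [1 2 3 4 5];
y = [0.5 1.7 3.4 5.7 8.4];
p = polyfit(x, y, 3) % expected p = [-0.0083 0.3321 0.2405 -0.0600], matching A above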
Fit with an Exponential Curve:
clc
clear all
close all
x = [1 2 3 4 5]; % provided x
w = [0.5 1.7 3.4 5.7 8.4]; % set provided y as w
y = log(w); % take the natural logarithm of w
n = length(x); % determine the length of x
X = [n sum(x); sum(x) sum(x.^2)];
Y = [sum(y); sum(x.*y)];
B = inv(X) * Y % values of coefficients log(a), b in matrix form
A = [exp(B(1)); B(2)] % revert back to coefficients a and b in matrix form
y_bar = sum(w)/n;
s_t = 0; % set initial value of s_t as 0
for i = 1:n
    s_t = s_t + (w(i)-y_bar)^2; % sum of squares of the residuals between the actual data and their arithmetic mean
end
s_t
s_r = 0; % set initial value of s_r as 0
for i = 1:n
    s_r = s_r + (y(i)-log(A(1))-A(2)*x(i))^2; % sum of squares of the residuals between the log-transformed data and the fitted model
end
s_r
r = sqrt(1-(s_r/s_t)); % correlation coefficient
r
scatter(x, w, 'p'); % plot scatter of x and w
hold on
syms xi % define xi as a symbolic variable
yi = A(1)*exp(A(2)*xi); % define the yi function
xi = linspace(x(1), x(n), 100); % generate linearly spaced vector of xi
yi = subs(yi); % symbolic substitution of yi
plot(xi, yi); % plot curve of xi vs yi
grid on % turn grid view on
Output:
B =
-1.0698
0.6853
A =
0.3431
0.6853
s_t = 40.1320
s_r = 0.2615
r = 0.9967
Curve: [figure: scatter of the data with the fitted exponential curve]
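Since the exponential model y = a*exp(b*x) is fitted by a straight line to log(w), a minimal cross-check sketch (not part of the original script) is:

x = [1 2 3 4 5];
w = [0.5 1.7 3.4 5.7 8.4];
p = polyfit(x, log(w), 1); % straight-line fit in log space: p = [b log(a)]
a = exp(p(2))              % expected 0.3431, matching A(1) above
b = p(1)                   % expected 0.6853, matching A(2) above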
Comparison:
The correlation coefficient ‘r’ indicates how well a curve fits the data: for a well-fitted curve the value of ‘r’ is close to 1, while for a poorly fitted curve ‘r’ is close to 0. From the analysis of the above curve fittings we can see that the correlation coefficient ‘r’ is essentially 1 (1.0000 to four decimal places) for the 2nd order and 3rd order polynomials. Hence these two are the best-fitted curves.
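The same conclusion follows directly from r = sqrt(1 - s_r/s_t) using the values reported above (a minimal sketch that simply reuses those reported values):

s_t = 40.1320;
s_r = [0.9280 0.0023 0.0013 0.2615]; % straight line, 2nd degree, 3rd degree, exponential
r = sqrt(1 - s_r/s_t)                % approx. [0.9884 1.0000 1.0000 0.9967]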