InterviewSolution

1. Explain the method of fitting a linear equation to the given data using the method of least squares.
Answer» The data available in the form of a time series {y_t : t = 1, 2, …, n} are bivariate data, where time t is the independent variable and the variable quantity y is the dependent variable. From these data we have to obtain the linear trend that fits the time series. According to the linear regression model, we fit the linear trend model \(y_t = \alpha + \beta t + u_t\), where t = 1, 2, …, n and \(u_t\) is an error variable.

According to the method of least squares, we determine the values of the constants α and β in such a manner that the sum of the squares of the error variable, i.e., \(\Sigma u_t^2 = \Sigma(y_t - \alpha - \beta t)^2\), is minimised. If a and b denote the estimated values of α and β respectively, then a and b can be obtained by the following formulae:

\(b = \frac{n\Sigma ty - (\Sigma t)(\Sigma y)}{n\Sigma t^2 - (\Sigma t)^2}\) and \(a = \bar{y} - b\bar{t}\)

where \(\bar{y} = \frac{\Sigma y}{n}\); \(\bar{t} = \frac{\Sigma t}{n}\); n = number of observations.

We can obtain the estimates of the trend values for all the terms of the time series by this method. If the given time series has no linear trend, this method is not useful.
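The formulae for a and b follow from the normal equations. A brief sketch of the standard derivation (not spelled out in the answer above): differentiate the sum of squares with respect to each constant and set the derivatives to zero.

```latex
% Minimise S(\alpha,\beta) = \sum_{t=1}^{n} (y_t - \alpha - \beta t)^2.
% Setting the partial derivatives to zero gives the two normal equations:
\frac{\partial S}{\partial \alpha} = -2\sum_{t=1}^{n}(y_t - \alpha - \beta t) = 0
  \;\Rightarrow\; \Sigma y = n\alpha + \beta\,\Sigma t
\frac{\partial S}{\partial \beta} = -2\sum_{t=1}^{n} t\,(y_t - \alpha - \beta t) = 0
  \;\Rightarrow\; \Sigma ty = \alpha\,\Sigma t + \beta\,\Sigma t^2
```

Solving these two equations simultaneously for α and β yields exactly the expressions for b and a stated above.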
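A minimal numerical sketch of these formulae in Python. The five observations in `y` and the function name `fit_linear_trend` are illustrative assumptions, not part of the original answer.

```python
def fit_linear_trend(y):
    """Fit y_t = a + b*t (t = 1, 2, ..., n) by least squares; return (a, b)."""
    n = len(y)
    t = list(range(1, n + 1))
    sum_t = sum(t)
    sum_y = sum(y)
    sum_ty = sum(ti * yi for ti, yi in zip(t, y))
    sum_t2 = sum(ti * ti for ti in t)

    # b = [n*Σty - (Σt)(Σy)] / [n*Σt² - (Σt)²]
    b = (n * sum_ty - sum_t * sum_y) / (n * sum_t2 - sum_t ** 2)
    # a = ȳ - b*t̄
    a = sum_y / n - b * (sum_t / n)
    return a, b

# Hypothetical yearly observations (made-up data for illustration)
y = [12.0, 15.0, 14.5, 17.0, 19.5]
a, b = fit_linear_trend(y)

# Estimated trend values for every term of the series
trend = [a + b * t for t in range(1, len(y) + 1)]
print(f"a = {a:.3f}, b = {b:.3f}")          # a = 10.500, b = 1.700
print([round(v, 3) for v in trend])          # [12.2, 13.9, 15.6, 17.3, 19.0]
```

Note that the estimated trend passes through the point \((\bar{t}, \bar{y})\), here (3, 15.6), which is a direct consequence of the relation \(a = \bar{y} - b\bar{t}\).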