changed the position of the slope and independent term equations.
fabarrios committed Oct 18, 2024
1 parent b2827ff commit 1f896b7
Showing 2 changed files with 9 additions and 9 deletions.
8 changes: 4 additions & 4 deletions LinearModel/LinearModel.Rmd
@@ -43,7 +43,10 @@ The term "regression" was introduced by Francis Galton (Darwin's half-cousin) during

The general equation for the straight line is $y = mx + b_0$; this form is the "slope-intercept form". The slope is the rate of change that gives the change in $y$ for a unit change in $x$. Remember that the slope formula for two points $(x_1, y_1)$ and $(x_2, y_2)$ is:
$$ m = \frac{(y_2 - y_1)}{(x_2 - x_1)} $$
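For example, the line through the points $(1, 3)$ and $(4, 9)$ has slope $m = \frac{9 - 3}{4 - 1} = 2$.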
Using the expression $y_i = \beta_{0} + \beta_{1}x_i$ for a set of points $(x_i, y_i)$, and finding the minimum of the sum of the squared differences between the observed points and the fitted line, we estimate the expression for the "best" slope $\hat{\beta_{1}}$ and the independent term $\hat{\beta_{0}}$:

$$\hat{\beta_{1}} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i -\bar{y} )}{\sum_{i=1}^{n}(x_i - \bar{x})^2} $$
$$\hat{\beta_{0}} = \bar{y} - \hat{\beta_{1}} \bar x $$
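
As a quick numerical check of these two expressions, the estimates can be computed directly in R and compared with the output of `lm()`. This is only a sketch: the vectors `x` and `y` below are invented solely for illustration.

```r
# Toy data, invented only to illustrate the closed-form estimates
x <- c(1.2, 2.3, 3.1, 4.0, 5.4, 6.2)
y <- c(2.1, 3.9, 5.2, 6.1, 8.3, 9.0)

# Slope and intercept from the least-squares formulas above
beta1_hat <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
beta0_hat <- mean(y) - beta1_hat * mean(x)

# The same estimates as fitted by lm()
c(beta0_hat, beta1_hat)
coef(lm(y ~ x))
```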

The basic properties we know about one variable linear regression are:
The correlation measures the strength of the relationship between x and y (see this shiny app for an excellent visual overview of correlations).
@@ -52,9 +55,6 @@ The basic properties we know about one variable linear regression are:
The slope of the line is defined as the change in $y$ over the change in $x$; $m= \frac{\Delta y}{\Delta x}$.
For regression, use the ratio of the standard deviations, so that the slope is defined as $m=r\frac{s_y}{s_x}$, where $m$ is the slope, $r$ is the correlation, $\bar{x}$ and $\bar{y}$ are the sample means, and $s_x$ and $s_y$ the sample standard deviations (this relation is checked numerically in the short R sketch below).

$$\hat{\beta_{1}} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i -\bar{y} )}{\sum_{i=1}^{n}(x_i - \bar{x})^2} $$

$$\hat{\beta_{0}} = \bar{y} - \hat{\beta_{1}} \bar x $$
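
A minimal sketch of the relation $m = r\frac{s_y}{s_x}$ stated in the list above, again using invented vectors `x` and `y`: the slope computed from the correlation and the sample standard deviations matches the slope fitted by `lm()`.

```r
# Any numeric vectors work here; these are invented for illustration
x <- c(1.2, 2.3, 3.1, 4.0, 5.4, 6.2)
y <- c(2.1, 3.9, 5.2, 6.1, 8.3, 9.0)

# Slope from the correlation and the sample standard deviations
m <- cor(x, y) * sd(y) / sd(x)

# Compare with the slope estimated by lm(); the two agree
m
unname(coef(lm(y ~ x))["x"])
```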

## Example of linear regression

10 changes: 5 additions & 5 deletions LinearModel/LinearModel.html
@@ -1578,6 +1578,11 @@ <h2>The Least-Square linear regression</h2>
addition of all the differences with the ideal line we estimate the
expression for the “best” slope <span class="math inline">\(\hat{\beta_{1}}\)</span> and the independent term
<span class="math inline">\(\hat{\beta_{0}}\)</span>:</p>
<p><span class="math display">\[\hat{\beta_{1}} = \frac{\sum_{i=1}^{n}
(x_i - \bar{x})(y_i -\bar{y} )}{\sum_{i=1}^{n}(x_i - \bar{x})^2}
\]</span><br />
<span class="math display">\[\hat{\beta_{0}} = \bar{y} - \hat{\beta_{1}}
\bar x \]</span></p>
<p>The basic properties we know about one variable linear regression
are:<br />
The correlation measures the strength of the relationship between x and
@@ -1589,11 +1594,6 @@ <h2>The Least-Square linear regression</h2>
\frac{\Delta y}{\Delta x}\)</span>.<br />
For regression use the ratio of the standard deviations such that the
slope is defined as <span class="math inline">\(m=r\frac{s_y}{s_x}\)</span> where <span class="math inline">\(m\)</span> is the slope, <span class="math inline">\(r\)</span> is the correlation and <span class="math inline">\(\bar{x}\)</span> and <span class="math inline">\(\bar{y}\)</span> the means, and <span class="math inline">\(s\)</span> is the sample standard deviation.</p>
<p><span class="math display">\[\hat{\beta_{1}} = \frac{\sum_{i=1}^{n}
(x_i - \bar{x})(y_i -\bar{y} )}{\sum_{i=1}^{n}(x_i - \bar{x})^2}
\]</span></p>
<p><span class="math display">\[\hat{\beta_{0}} = \bar{y} -
\hat{\beta_{1}} \bar x \]</span></p>
</div>
<div id="example-of-linear-regression" class="section level2">
<h2>Example of linear regression</h2>
