From 5025c037d5f5d6ebe9d82e5fae0b15b967ae7147 Mon Sep 17 00:00:00 2001
From: Christoph Molnar
Date: Tue, 24 Apr 2018 18:31:18 +0200
Subject: [PATCH] removes part of sentence

---
 manuscript/04.2-interpretable-linear.Rmd | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/manuscript/04.2-interpretable-linear.Rmd b/manuscript/04.2-interpretable-linear.Rmd
index 9e8ffc2a3..fdb0a0174 100644
--- a/manuscript/04.2-interpretable-linear.Rmd
+++ b/manuscript/04.2-interpretable-linear.Rmd
@@ -316,4 +316,4 @@ Step-wise procedures:
 - Backward selection: Same as forward selection, but instead of adding features, start with the model that includes all features and try out which feature you have to remove to get the highest performance increase. Repeat until some stopping criterium is reached.
 
 I recommend using Lasso, because it can be automated, looks at all features at the same time and can be controlled via $\lambda$.
-It also works for the [logistic regression model](#logistic) for classification, which is the topic of Chapter 4.3.
+It also works for the [logistic regression model](#logistic) for classification.
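
As an illustrative aside to the Lasso recommendation quoted in the hunk above: a minimal R sketch, assuming the glmnet package and simulated data (the data, the lambda values, and the variable names are illustrative and not taken from the manuscript). It shows how the penalty strength lambda controls which coefficients are shrunk exactly to zero, and that the same penalty carries over to logistic regression for classification.

```r
# Illustrative sketch, not part of the patch: Lasso feature selection with glmnet.
library(glmnet)

# Simulated data: 100 observations, 10 features, only 3 truly relevant.
set.seed(42)
x <- matrix(rnorm(100 * 10), nrow = 100)
y <- x[, 1] - 2 * x[, 2] + 0.5 * x[, 3] + rnorm(100)

# alpha = 1 selects the L1 (Lasso) penalty; lambda controls sparsity.
fit <- glmnet(x, y, alpha = 1)

# A larger lambda shrinks more coefficients to exactly zero,
# which is how the amount of feature selection is controlled.
coef(fit, s = 0.5)

# The same penalty works for the logistic regression model (classification)
# by switching to the binomial family.
y_bin <- as.numeric(y > 0)
fit_bin <- glmnet(x, y_bin, alpha = 1, family = "binomial")
coef(fit_bin, s = 0.1)
```

Inspecting the coefficient vectors at different values of `s` (the lambda at which coefficients are extracted) makes the trade-off concrete: larger values keep fewer features, smaller values keep more.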