
FIX: pdf build for keras lecture (#197)
* FIX: pdf build for keras lecture

* remove ipython checkpoints

* add to git ignore

* remove virtual documents
mmcky authored Nov 19, 2024
1 parent 14b6ace commit 3796656
Showing 2 changed files with 6 additions and 5 deletions.
2 changes: 2 additions & 0 deletions .gitignore
```diff
@@ -1,3 +1,5 @@
 .DS_Store
 _build/
 lectures/_build/
+.ipython_checkpoints
+.virtual_documents
```
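Note that `.gitignore` rules only stop Git from picking up *untracked* files; anything already committed stays tracked until it is removed from the index, which is why the commit message separately mentions removing the checkpoint and virtual-document files. A minimal sketch in a throwaway repository (all paths below are illustrative, not taken from this commit):

```shell
# Sketch: verify that an ignore rule matches, in a scratch repository.
# Assumption: git is available on PATH; paths are illustrative.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
printf '.virtual_documents\n.ipython_checkpoints\n' > .gitignore
mkdir .virtual_documents
touch .virtual_documents/scratch.py
# check-ignore exits 0 (and echoes the path) when an ignore rule matches
git check-ignore .virtual_documents/scratch.py
# files that were already tracked must be untracked explicitly, e.g.:
#   git rm -r --cached .virtual_documents
```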
9 changes: 4 additions & 5 deletions lectures/keras.md
````diff
@@ -211,13 +211,13 @@ Let's print the final MSE on the cross-validation data.
 
 ```{code-cell} ipython3
 print("Testing loss on the validation set.")
-regression_model.evaluate(x_validate, y_validate)
+regression_model.evaluate(x_validate, y_validate, verbose=2)
 ```
 
 Here's our output predictions on the cross-validation data.
 
 ```{code-cell} ipython3
-y_predict = regression_model.predict(x_validate)
+y_predict = regression_model.predict(x_validate, verbose=2)
 ```
 
 We use the following function to plot our predictions along with the data.
@@ -265,7 +265,7 @@ Here's the final MSE for the deep learning model.
 
 ```{code-cell} ipython3
 print("Testing loss on the validation set.")
-nn_model.evaluate(x_validate, y_validate)
+nn_model.evaluate(x_validate, y_validate, verbose=2)
 ```
 
 You will notice that this loss is much lower than the one we achieved with
@@ -274,7 +274,7 @@ linear regression, suggesting a better fit.
 
 To confirm this, let's look at the fitted function.
 
 ```{code-cell} ipython3
-y_predict = nn_model.predict(x_validate)
+y_predict = nn_model.predict(x_validate, verbose=2)
 ```
 
 ```{code-cell} ipython3
@@ -290,4 +290,3 @@ fig, ax = plt.subplots()
 plot_results(x_validate, y_validate, y_predict, ax)
 plt.show()
 ```
-
````
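Every hunk makes the same change: passing `verbose=2` to Keras `evaluate` and `predict`. With the interactive default, Keras draws a carriage-return progress bar, and that animation tends to garble non-interactive PDF build logs; `verbose=2` prints a single plain line per call instead. A minimal sketch of the idea (the toy model and data below are illustrative, not from the lecture, and assume TensorFlow 2.x):

```python
# Sketch of Keras verbose levels in evaluate/predict.
# Assumption: TensorFlow 2.x; the tiny model and data are illustrative only.
def demo_verbose():
    try:
        import numpy as np
        from tensorflow import keras
    except ImportError:
        return "tensorflow not installed; sketch only"

    # Tiny linear model, built lazily on first use.
    model = keras.Sequential([keras.layers.Dense(1)])
    model.compile(loss="mse", optimizer="sgd")

    x = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
    y = 2.0 * x

    # verbose=0 is silent; verbose=1 draws a carriage-return progress bar
    # (fine in a notebook, noisy in a build log); verbose=2 emits one
    # plain line per call, which suits non-interactive builds.
    model.evaluate(x, y, verbose=2)
    model.predict(x, verbose=2)
    return "done"

print(demo_verbose())
```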