updated backtest environment references
stefan-jansen committed Feb 27, 2021
1 parent 255fa9f commit 5dd3891
Showing 4 changed files with 8 additions and 4 deletions.
4 changes: 2 additions & 2 deletions 08_ml4t_workflow/04_ml4t_workflow_with_zipline/README.md
@@ -25,7 +25,7 @@ Quantopian first released Zipline in 2012 as version 0.5, and the latest version

Please follow the instructions in the [installation](../../installation/) directory to use the patched Zipline version that we'll use for the examples in this book.

> To run the code examples in this section, activate the `ml4t-zipline` `conda` environment, or otherwise install and use the patched Zipline version reference above.
> This notebook uses the `conda` environment `backtest`. Please see the installation [instructions](../../installation/README.md) for downloading the latest Docker image or alternative ways to set up your environment.
## Zipline Architecture

@@ -49,7 +49,7 @@ Zipline relies on the [Trading Calendars](https://www.zipline.io/trading-calenda

After installation, the command `zipline ingest -b bundle` lets you install the Quandl Wiki dataset (daily frequency) right away. The result ends up in the `.zipline` directory that by default resides in your home folder, but you can modify the location by setting the `ZIPLINE_ROOT` environment variable. In addition, you can design your own bundles with OHLCV data.
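To illustrate where ingested bundle data lands, the following sketch resolves the data directory as just described; the fallback path and the `quandl` bundle name reflect Zipline's documented defaults, but treat them as assumptions for your setup:

```python
import os
from pathlib import Path

def zipline_root() -> Path:
    """Return the directory Zipline uses for ingested bundle data:
    the ZIPLINE_ROOT environment variable if set, else ~/.zipline."""
    return Path(os.environ.get('ZIPLINE_ROOT', Path.home() / '.zipline'))

# After e.g. `zipline ingest -b quandl`, the bundle's price data and
# asset metadata live under subdirectories of this root.
print(zipline_root())
```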

A shortcoming of bundles is that they do not let you store data other than price and volume information. However, two alternatives let you accomplish this: the `fetch_csv()` function downloads DataFrames from a URL and was designed for other Quandl data sources, e.g. fundamentals. Zipline reasonably expects the data to refer to the same securities for which you have provided OHLCV data and aligns the bars accordingly. It’s not very difficult to make minor changes to the library's source code to load from local CSV or HDF5 using pandas instead, and the [patched version](https://github.com/stefan-jansen/zipline) included in the `conda` environment `ml4t-zipline` includes this modification.
A shortcoming of bundles is that they do not let you store data other than price and volume information. However, two alternatives let you accomplish this: the `fetch_csv()` function downloads DataFrames from a URL and was designed for other Quandl data sources, e.g. fundamentals. Zipline reasonably expects the data to refer to the same securities for which you have provided OHLCV data and aligns the bars accordingly. It’s not very difficult to make minor changes to the library's source code to load from local CSV or HDF5 using pandas instead, and the [patched version](https://github.com/stefan-jansen/zipline) included in the `conda` environment `backtest` includes this modification.
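The alignment behavior described above can be mimicked with plain pandas: the sketch below forward-fills a sparse fundamentals column onto a daily bar index, which is roughly what Zipline does with data retrieved via `fetch_csv()`. The column name, dates, and values are made up for illustration:

```python
import pandas as pd

# Stand-in OHLCV calendar (business days) and a hypothetical fundamentals
# attribute published on only two of those dates.
dates = pd.bdate_range('2020-01-01', periods=10)
fundamentals = pd.DataFrame(
    {'book_value': [10.0, 12.0]},
    index=pd.to_datetime(['2020-01-02', '2020-01-09']))

# Forward-fill the sparse attribute onto the daily bar index so each
# session sees the latest known value, mirroring how Zipline aligns
# fetched data with the bundle's bars.
aligned = fundamentals.reindex(dates).ffill()
print(aligned)
```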

In addition, the `DataFrameLoader` and the `BlazeLoader` permit you to feed additional attributes to a `Pipeline` (see the `DataFrameLoader` demo later in the chapter). The `BlazeLoader` can interface with numerous sources, including databases. However, since the Pipeline API is limited to daily data, `fetch_csv()` will be critical to adding features at minute frequency as we will do in later chapters.
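For the `DataFrameLoader`, the key requirement is the input layout: one wide DataFrame per `Pipeline` column, indexed by session with asset sids as columns. A minimal pandas sketch of that layout follows; the sids and values are hypothetical, and the `DataFrameLoader(...)` call shown in the comment is a sketch of the API covered in the demo notebook, not a runnable algorithm:

```python
import pandas as pd

# Wide layout expected by Zipline's DataFrameLoader: trading sessions
# on the index, asset sids as columns, one frame per Pipeline column.
sessions = pd.bdate_range('2020-01-01', periods=5)
sids = [0, 1, 2]  # hypothetical Zipline asset ids
signal = pd.DataFrame(0.5, index=sessions, columns=sids)

# Inside an algorithm, this frame would back a custom dataset column,
# roughly: DataFrameLoader(column=MySignal.value, baseline=signal),
# wired into the pipeline engine via a get_loader callback.
print(signal.shape)
```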

2 changes: 1 addition & 1 deletion 09_time_series_models/02_arima_models.ipynb
@@ -13646,7 +13646,7 @@
"name": "stderr",
"output_type": "stream",
"text": [
" 12%|█ | 10/81 [10:43<1:24:30, 71.42s/it]"
" 21%|█ | 17/81 [20:56<1:44:53, 98.34s/it]"
]
}
],
2 changes: 2 additions & 0 deletions README.md
@@ -44,6 +44,8 @@ Another innovation of the second edition is to replicate several trading applica

All applications now use the latest available (at the time of writing) software versions such as pandas 1.0 and TensorFlow 2.2. There is also a customized version of Zipline that makes it easy to include machine learning model predictions when designing a trading strategy.

> Update: release 2.0 updates to Python 3.8, Pandas 1.2, and TensorFlow 2.2, among others; the Zipline backtesting environment now uses Python 3.6.
## Installation instructions, Data Sources and Bug Reports

The code examples rely on a wide range of Python libraries from the data science and finance domains. To facilitate installation, we use [Docker](https://www.docker.com/get-started) to provide containerized [conda](https://docs.conda.io/en/latest/) environments.
4 changes: 3 additions & 1 deletion installation/README.md
@@ -1,6 +1,8 @@
# Installation instructions

This book uses (mostly) Python 3.7 and various ML- and trading-related libraries available in three different [conda environments](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html) based on the [Miniconda](https://docs.conda.io/en/latest/miniconda.html) distribution. I developed the content on Ubuntu 20.04 while also testing on Mac OS 10.15 (Catalina).
This book uses (mostly) Python 3.7 and various ML- and trading-related libraries available in three different [conda environments](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html) based on the [Miniconda](https://docs.conda.io/en/latest/miniconda.html) distribution. I developed the content on Ubuntu 20.04 while also testing on Mac OS 10.15 (Catalina).

> Update: release 2.0 reduces the number of environments to 2 and bumps the Python version to 3.8 for the main `ml4t` and to 3.6 for the `backtest` environment.
Depending on your OS, you may have several options to create these environments. These are, in increasing order of complexity:
1. **Recommended**: use [Docker](https://www.docker.com/) Desktop to pull an image from [Docker Hub](https://www.docker.com/products/docker-hub) and create a local container with the requisite software to run the notebooks.
