@@ -133,7 +133,7 @@ API key we may run:
 though we may still run ``ingest`` as an anonymous quandl user (with no API
 key). We may also set the ``QUANDL_DOWNLOAD_ATTEMPTS`` environment variable to
 an integer which is the number of attempts that should be made to download data
-from quandls servers. By default ``QUANDL_DOWNLOAD_ATTEMPTS`` will be 5, meaning
+from quandl's servers. By default, ``QUANDL_DOWNLOAD_ATTEMPTS`` will be 5, meaning
 that we will attempt each download up to 5 times.
 
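For reference, the same settings can also be supplied when ingesting programmatically. A minimal sketch, assuming the ``ingest`` entry point exported by ``zipline.data.bundles`` and using a placeholder API key:

.. code-block:: python

    import os

    from zipline.data.bundles import ingest

    # Copy the current environment and add the quandl settings.
    # '<your-api-key>' is a placeholder, and environment values are
    # strings, so the attempt count is passed as a string as well.
    env = dict(os.environ,
               QUANDL_API_KEY='<your-api-key>',
               QUANDL_DOWNLOAD_ATTEMPTS='3')
    ingest('quandl', environ=env, show_progress=True)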
 .. note::
@@ -180,7 +180,7 @@ The signature of the ingest function should be:
 
 ``environ`` is a mapping representing the environment variables to use. This is
 where any custom arguments needed for the ingestion should be passed, for
-example: the ``quandl`` bundle uses the enviornment to pass the API key and the
+example: the ``quandl`` bundle uses the environment to pass the API key and the
 download retry attempt count.
 
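To make this concrete, here is a minimal sketch of a custom ingest function, using the signature shown above, that reads its configuration out of ``environ``; the bundle name and the ``CUSTOM_*`` variable names are hypothetical:

.. code-block:: python

    def custom_bundle(environ,
                      asset_db_writer,
                      minute_bar_writer,
                      daily_bar_writer,
                      adjustment_writer,
                      calendar,
                      start_session,
                      end_session,
                      cache,
                      show_progress,
                      output_dir):
        # Pull bundle-specific settings out of the environment mapping,
        # falling back to a default when a variable is unset.
        api_key = environ.get('CUSTOM_API_KEY')
        attempts = int(environ.get('CUSTOM_DOWNLOAD_ATTEMPTS', 5))
        ...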
 ``asset_db_writer``
@@ -227,7 +227,7 @@ used to convert data into zipline's internal bcolz format to later be read by a
 provided, users should call
 :meth:`~zipline.data.minute_bars.BcolzDailyBarWriter.write` with an iterable of
 (sid, dataframe) tuples. The ``show_progress`` argument should also be forwarded
-to this method. If the data shource does not provide daily data, then there is
+to this method. If the data source does not provide daily data, then there is
 no need to call the write method. It is also acceptable to pass an empty
 iterable to :meth:`~zipline.data.minute_bars.BcolzMinuteBarWriter.write` to
 signal that there is no daily data. If no daily data is provided but minute data
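As a rough sketch of the daily write path, assuming a hypothetical ``raw_frames`` mapping from symbol to an OHLCV dataframe that the ingest function has already loaded:

.. code-block:: python

    def _daily_iter(raw_frames):
        # Assign ascending sids and yield the (sid, dataframe) pairs
        # that the daily bar writer expects.
        for sid, (symbol, frame) in enumerate(raw_frames.items()):
            yield sid, frame

    # Forward show_progress so the writer can report its progress.
    daily_bar_writer.write(_daily_iter(raw_frames),
                           show_progress=show_progress)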
@@ -280,7 +280,7 @@ an ingestion crashes part way through. The idea is that the ingest function
 should check the cache for raw data; if it doesn't exist in the cache, it should
 acquire it and then store it in the cache. Then it can parse and write the
 data. The cache will be cleared only after a successful load; this prevents the
-ingest function from needing to redownload all the data if there is some bug in
+ingest function from needing to re-download all the data if there is some bug in
 the parsing. If it is very fast to get the data, for example if it is coming
 from another local file, then there is no need to use this cache.
 
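A minimal sketch of this check-then-store pattern, assuming the ``cache`` argument behaves as a mutable mapping of dataframes; the cache key and the ``fetch_raw_data`` helper are hypothetical:

.. code-block:: python

    key = 'my-bundle-raw-data'  # hypothetical cache key
    try:
        raw_data = cache[key]
    except KeyError:
        # Not cached yet: download and store the raw data *before*
        # parsing, so a crash while parsing doesn't force a re-download.
        raw_data = cache[key] = fetch_raw_data()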
@@ -331,7 +331,7 @@ Once you have your data in the correct format, you can edit your ``extension.py`
 .. code-block:: python
 
     import pandas as pd
-
+
     from zipline.data.bundles import register
     from zipline.data.bundles.csvdir import csvdir_equities
 
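The hunk above only touches the imports; for context, the registration that follows in ``extension.py`` looks roughly like this sketch, where the bundle name, dates, and csv directory are placeholders:

.. code-block:: python

    start_session = pd.Timestamp('2015-1-2', tz='utc')
    end_session = pd.Timestamp('2020-1-1', tz='utc')

    register(
        'custom-csvdir-bundle',
        csvdir_equities(
            ['daily'],             # frequencies to ingest
            '/path/to/your/csvs',  # directory holding the csv files
        ),
        calendar_name='NYSE',  # US equities
        start_session=start_session,
        end_session=end_session,
    )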
@@ -367,8 +367,8 @@ To finally ingest our data, we can run:
     Loading custom pricing data: [########################------------]  66% | FAKE1: sid 1
     Loading custom pricing data: [####################################] 100% | FAKE2: sid 2
     Loading custom pricing data: [####################################] 100%
-    Merging daily equity files: [####################################]
-
+    Merging daily equity files: [####################################]
+
     # optionally, we can pass the location of our csvs via the command line
     $ CSVDIR=/path/to/your/csvs zipline ingest -b custom-csvdir-bundle
 