
Commit bd60eb0

Merge pull request #268 from UBC-DSCI/from-import
From import
2 parents 823a3a6 + 15a144d commit bd60eb0


2 files changed: +10 -1 lines changed


source/classification1.md

Lines changed: 9 additions & 0 deletions
@@ -869,6 +869,15 @@ in the `scikit-learn` package (named `sklearn` in Python) will help keep our cod
less we have to code ourselves, the fewer mistakes we will likely make.
Before getting started with $K$-nearest neighbors, we need to tell the `sklearn` package
that we prefer using `pandas` data frames over regular arrays via the `set_config` function.
+```{note}
+You will notice a new way of importing functions in the code below: `from ... import ...`. This lets us
+import *just* `set_config` from `sklearn`, and then call `set_config` without any package prefix.
+We will import functions using `from` extensively throughout
+this and subsequent chapters to avoid very long names from `scikit-learn`
+that clutter the code
+(like `sklearn.neighbors.KNeighborsClassifier`, which has 38 characters!).
+```
+
```{code-cell} ipython3
from sklearn import set_config

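For readers skimming the diff, here is a minimal sketch of the two import styles the new note contrasts. It is illustrative only and not part of this commit; the `transform_output="pandas"` call and the `n_neighbors=5` setting are assumptions for the example (the pandas output option requires scikit-learn 1.2 or newer).

```python
# Fully qualified style: every call carries the long package path.
import sklearn.neighbors

knn_long = sklearn.neighbors.KNeighborsClassifier(n_neighbors=5)

# `from ... import ...` style described in the new note: import just the
# names we need and call them without any package prefix.
from sklearn import set_config
from sklearn.neighbors import KNeighborsClassifier

set_config(transform_output="pandas")  # ask sklearn to return pandas data frames
knn_short = KNeighborsClassifier(n_neighbors=5)
```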
source/reading.md

Lines changed: 1 addition & 1 deletion
@@ -1268,7 +1268,7 @@ no_official_lang_data.to_csv("data/no_official_languages.csv", index=False)
%
% ```{code-cell} ipython3
% import requests
-% from bs4 import BeautifulSoup
+% import bs4
% ```
%
% Next, we tell Python what page we want to scrape by providing the webpage's URL in quotations to the function `requests.get` and pass it into the `BeautifulSoup` function for parsing:
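
Downstream of this change, the scraping snippet reaches `BeautifulSoup` through the `bs4` package prefix rather than an imported name. A minimal sketch, assuming `requests` and `beautifulsoup4` are installed; the URL is a placeholder, not necessarily the page used in the chapter.

```python
import requests
import bs4

# Placeholder URL for illustration only.
url = "https://en.wikipedia.org/wiki/Canada"
page = requests.get(url)

# With `import bs4`, the parser class is reached through the package prefix.
soup = bs4.BeautifulSoup(page.content, "html.parser")
print(soup.title)
```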
