
Commit 08cd533

Author: hfhoffman1144
Commit message: materials first commit
1 parent a7286ce

File tree

9 files changed: +3098 -988 lines


huggingface-transformers/README.md

Lines changed: 24 additions & 0 deletions
@@ -0,0 +1,24 @@
# HuggingFace Transformers: Leverage Open-Source AI in Python

This repo contains the materials for the tutorial [HuggingFace Transformers: Leverage Open-Source AI in Python](https://realpython.com/huggingface-transformers-open-source-ai-in-python/).

Transformers is available on [PyPI](https://pypi.org/), and you can install it with [pip](https://realpython.com/what-is-pip/). Open a terminal or command prompt, create a new virtual environment, and then run the following command:

```console
(venv) $ python -m pip install transformers
```

This command will install the latest version of Transformers from PyPI onto your machine. Throughout this tutorial, you'll also leverage [PyTorch](https://realpython.com/pytorch-vs-tensorflow/) to interact with models at a lower level. You can install PyTorch with the following command:

```console
(venv) $ python -m pip install torch
```

To verify that the installations were successful, start a [Python REPL](https://realpython.com/python-repl/) and import `transformers` and `torch`:

```pycon
>>> import transformers
>>> import torch
```

If the imports run without errors, then you've successfully installed the dependencies needed for this tutorial.
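
Beyond importing the two packages, a quick way to confirm what got installed is to print the version strings. This is a small sketch on top of the README's verification step, not part of the repo itself; it relies only on the standard `__version__` attribute that both packages expose:

```python
# Not part of the repo's README: a quick sanity check of the installed versions.
import transformers
import torch

# Both libraries expose a standard __version__ string; the values depend
# on whatever pip resolved at install time.
print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
```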

huggingface-transformers/auto_classes.py

Lines changed: 5 additions & 7 deletions
@@ -2,11 +2,12 @@
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
-    pipeline,
+    AutoConfig,
)

model_name = "cardiffnlp/twitter-roberta-base-sentiment-latest"

+config = AutoConfig.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

@@ -18,10 +19,7 @@

scores = output.logits[0]
probabilities = torch.softmax(scores, dim=0)
-predicted_class = probabilities.argmax().item()

-print(f"Predicted class: {predicted_class}")
-print(f"Probabilities: {probabilities.tolist()}")
-
-full_pipeline = pipeline(model=model_name)
-print(full_pipeline(text))
+for i, prob in enumerate(probabilities):
+    label = config.id2label[i]
+    print(f"{i + 1}) {label}: {prob}")

0 commit comments
