
Commit a035d2a

Update keras-nlp landing page for 0.4 release (#1168)
* Update keras-nlp landing page for 0.4 release
* Update ecosystem page
* Address comments
* Address comments 2
1 parent 29be689 commit a035d2a

File tree: 2 files changed, +74 −65 lines

templates/getting_started/ecosystem.md

Lines changed: 6 additions & 7 deletions
@@ -17,13 +17,12 @@ KerasTuner is an easy-to-use, scalable hyperparameter optimization framework tha
 [KerasNLP Documentation](/keras_nlp/) - [KerasNLP GitHub repository](https://github.com/keras-team/keras-nlp)
 
-KerasNLP is a simple and powerful API for building Natural Language
-Processing (NLP) models. KerasNLP provides modular building blocks following
-standard Keras interfaces (layers, metrics) that allow you to quickly and
-flexibly iterate on your task. Engineers working in applied NLP can leverage the
-library to assemble training and inference pipelines that are both
-state-of-the-art and production-grade. KerasNLP is maintained directly by the
-Keras team.
+KerasNLP is a natural language processing library that supports users through
+their entire development cycle. Our workflows are built from modular components
+that have state-of-the-art preset weights and architectures when used
+out-of-the-box and are easily customizable when more control is needed. We
+emphasize in-graph computation for all workflows so that developers can expect
+easy productionization using the TensorFlow ecosystem.
 
 ---
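The in-graph computation the new copy emphasizes is concrete: KerasNLP tokenizers are built from TensorFlow ops, so preprocessing can run inside a `tf.data` pipeline and ship inside a SavedModel. A minimal sketch, reusing the toy vocabulary from the pre-0.4 quickstart rather than a real preset vocabulary:

```python
import tensorflow as tf
import keras_nlp

# Toy vocabulary for illustration only; real workflows load a preset vocabulary.
vocab = ["[UNK]", "the", "qu", "##ick", "br", "##own", "fox", "."]
tokenizer = keras_nlp.tokenizers.WordPieceTokenizer(
    vocabulary=vocab,
    sequence_length=10,
)

# The tokenizer is composed of TensorFlow graph ops, so it can be mapped over
# a tf.data pipeline with no Python-side preprocessing step.
ds = tf.data.Dataset.from_tensor_slices(["The quick brown fox jumped."])
ds = ds.map(tokenizer)
```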

templates/keras_nlp/index.md

Lines changed: 68 additions & 58 deletions
@@ -2,20 +2,24 @@
 
 <a class="github-button" href="https://github.com/keras-team/keras-nlp" data-size="large" data-show-count="true" aria-label="Star keras-team/keras-nlp on GitHub">Star</a>
 
-KerasNLP is a simple and powerful API for building Natural Language Processing
-(NLP) models within the Keras ecosystem.
-
-KerasNLP provides modular building blocks following standard Keras interfaces
-(layers, metrics) that allow you to quickly and flexibly iterate on your task.
-Engineers working in applied NLP can leverage the library to assemble training
-and inference pipelines that are both state-of-the-art and production-grade.
-
-KerasNLP can be understood as a horizontal extension of the Keras API:
-components are first-party Keras objects that are too specialized to be
-added to core Keras, but that receive the same level of polish as the rest of
-the Keras API.
-
-KerasNLP is also new and growing! If you are interested in contributing, please
+KerasNLP is a natural language processing library that supports users through
+their entire development cycle. Our workflows are built from modular components
+that have state-of-the-art preset weights and architectures when used
+out-of-the-box and are easily customizable when more control is needed. We
+emphasize in-graph computation for all workflows so that developers can expect
+easy productionization using the TensorFlow ecosystem.
+
+This library is an extension of the core Keras API; all high-level modules are
+[`Layers`](/api/layers/) or [`Models`](/api/models/) that receive the same
+level of polish as core Keras. If you are familiar with Keras, congratulations!
+You already understand most of KerasNLP.
+
+See our [Getting Started guide](/guides/keras_nlp/getting_started)
+for example usage of our modular API starting with evaluating pretrained models
+and building up to designing a novel transformer architecture and training a
+tokenizer from scratch.
+
+KerasNLP is new and growing! If you are interested in contributing, please
 check out our
 [contributing guide](https://github.com/keras-team/keras-nlp/blob/master/CONTRIBUTING.md).
 
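The claim that familiar Keras knowledge transfers directly is easy to illustrate: KerasNLP layers drop into the standard functional API like any core Keras layer. A sketch assembled from the components the pre-0.4 quickstart used (the vocabulary and sequence sizes below are placeholders):

```python
from tensorflow import keras
import keras_nlp

# KerasNLP layers compose with core Keras layers in an ordinary functional model.
inputs = keras.Input(shape=(None,), dtype="int32")
x = keras_nlp.layers.TokenAndPositionEmbedding(
    vocabulary_size=1000,  # placeholder size
    sequence_length=64,    # placeholder length
    embedding_dim=16,
)(inputs)
x = keras_nlp.layers.TransformerEncoder(num_heads=4, intermediate_dim=32)(x)
x = keras.layers.GlobalAveragePooling1D()(x)
outputs = keras.layers.Dense(1, activation="sigmoid")(x)
model = keras.Model(inputs, outputs)  # just a keras.Model, nothing special
```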

@@ -24,10 +28,10 @@ check out our
 
 * [KerasNLP API reference](/api/keras_nlp/)
 * [KerasNLP on GitHub](https://github.com/keras-team/keras-nlp)
-
 ---
 ## Guides
 
+* [Getting Started with KerasNLP](/guides/keras_nlp/getting_started/)
 * [Pretraining a Transformer from scratch](/guides/keras_nlp/transformer_pretraining/)
 
 ---
@@ -39,64 +43,69 @@ check out our
 ---
 ## Installation
 
-KerasNLP requires **Python 3.7+** and **TensorFlow 2.9+**.
-
-Install the latest release:
+To install the latest official release:
 
 ```
 pip install keras-nlp --upgrade
 ```
 
-You can check out release notes and versions on our
-[releases page](https://github.com/keras-team/keras-nlp/releases).
+To install the latest unreleased changes to the library, we recommend using
+pip to install directly from the master branch on GitHub:
 
-KerasNLP is currently in pre-release (0.y.z) development. Until version 1.0, we
-may break compatibility at any time and APIs should not be considered stable.
+```
+pip install git+https://github.com/keras-team/keras-nlp.git --upgrade
+```
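Either install can be sanity-checked from Python; a generic version check, not something from this page:

```python
import keras_nlp

# Pre-1.0 releases follow a 0.y.z scheme, so expect something like "0.4.0".
print(keras_nlp.__version__)
```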
 
----
-## Quick introduction
+## Quickstart
 
-The following snippet will tokenize some text, build a tiny transformer, and
-train a single batch.
+Fine-tune BERT on a small sentiment analysis task using the
+[`keras_nlp.models`](/api/keras_nlp/models/) API:
 
 ```python
 import keras_nlp
-import tensorflow as tf
 from tensorflow import keras
+import tensorflow_datasets as tfds
 
-# Tokenize some inputs with a binary label.
-vocab = ["[UNK]", "the", "qu", "##ick", "br", "##own", "fox", "."]
-sentences = ["The quick brown fox jumped.", "The fox slept."]
-tokenizer = keras_nlp.tokenizers.WordPieceTokenizer(
-    vocabulary=vocab,
-    sequence_length=10,
+imdb_train, imdb_test = tfds.load(
+    "imdb_reviews",
+    split=["train", "test"],
+    as_supervised=True,
+    batch_size=16,
+)
+classifier = keras_nlp.models.BertClassifier.from_preset(
+    "bert_base_en_uncased",
 )
-x, y = tokenizer(sentences), tf.constant([1, 0])
-
-# Create a tiny transformer.
-inputs = keras.Input(shape=(None,), dtype="int32")
-outputs = keras_nlp.layers.TokenAndPositionEmbedding(
-    vocabulary_size=len(vocab),
-    sequence_length=10,
-    embedding_dim=16,
-)(inputs)
-outputs = keras_nlp.layers.TransformerEncoder(
-    num_heads=4,
-    intermediate_dim=32,
-)(outputs)
-outputs = keras.layers.GlobalAveragePooling1D()(outputs)
-outputs = keras.layers.Dense(1, activation="sigmoid")(outputs)
-model = keras.Model(inputs, outputs)
-
-# Run a single batch of gradient descent.
-model.compile(optimizer="rmsprop", loss="binary_crossentropy", jit_compile=True)
-model.train_on_batch(x, y)
+classifier.compile(
+    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
+    optimizer=keras.optimizers.experimental.AdamW(5e-5),
+    metrics=keras.metrics.SparseCategoricalAccuracy(),
+    jit_compile=True,
+)
+classifier.fit(
+    imdb_train,
+    validation_data=imdb_test,
+    epochs=1,
+)
+
+# Predict a new example
+classifier.predict(["What an amazing movie, three hours of pure bliss!"])
 ```
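The new copy also promises components are "easily customizable when more control is needed". As a hedged sketch of that path, assuming this release's `BertPreprocessor` class and the task models' `preprocessor=None` argument (check the API reference to confirm both): run preprocessing yourself inside `tf.data` and pass pre-tokenized inputs to the classifier.

```python
import tensorflow as tf
import keras_nlp

# Assumed API: BertPreprocessor and preprocessor=None; verify against the docs.
preprocessor = keras_nlp.models.BertPreprocessor.from_preset("bert_base_en_uncased")
classifier = keras_nlp.models.BertClassifier.from_preset(
    "bert_base_en_uncased",
    preprocessor=None,  # disable built-in preprocessing; we supply token ids
)

# Tokenize ahead of time in the data pipeline, e.g. to cache or shard the work.
ds = tf.data.Dataset.from_tensor_slices((["A great film!"], [1])).batch(1)
ds = ds.map(lambda x, y: (preprocessor(x), y), num_parallel_calls=tf.data.AUTOTUNE)
```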
 
-To see an end-to-end example using KerasNLP, check out our guide on
-[pre-training a transfomer from scratch](/guides/keras_nlp/transformer_pretraining/).
+## Compatibility
+
+We follow [Semantic Versioning](https://semver.org/), and plan to
+provide backwards compatibility guarantees both for code and saved models built
+with our components. While we continue with pre-release `0.y.z` development, we
+may break compatibility at any time and APIs should not be considered stable.
+
+## Disclaimer
+
+KerasNLP provides access to pre-trained models via the `keras_nlp.models` API.
+These pre-trained models are provided on an "as is" basis, without warranties
+or conditions of any kind. The following underlying models are provided by third
+parties, and subject to separate licenses:
+DistilBERT, RoBERTa, XLM-RoBERTa, DeBERTa, and GPT-2.
 
----
 ## Citing KerasNLP
 
 If KerasNLP helps your research, we appreciate your citations.
@@ -105,8 +114,9 @@ Here is the BibTeX entry:
 ```bibtex
 @misc{kerasnlp2022,
   title={KerasNLP},
-  author={Watson, Matthew, and Qian, Chen, and Zhu, Scott and Chollet, Fran\c{c}ois and others},
+  author={Watson, Matthew, and Qian, Chen, and Bischof, Jonathan and Chollet,
+  Fran\c{c}ois and others},
   year={2022},
   howpublished={\url{https://github.com/keras-team/keras-nlp}},
 }
-```
+```
