# Example inputs for all widget types

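These `widget` entries are not standalone files: they go in the YAML metadata block (the front matter between `---` markers) at the top of a model repository's `README.md`, and they pre-fill the inference widget on the model page. A minimal sketch of where the block sits; the surrounding metadata fields are only illustrative:

```yaml
---
# Model card metadata: the widget examples sit alongside the other fields.
language: en
tags:
- example
widget:
- text: "Paris is the <mask> of France."
  example_title: "Capital"
---
```

Each entry under `widget` is one selectable example, and `example_title` is the label shown for that example in the widget.
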
## Natural Language Processing

### Fill-Mask
```yaml
widget:
- text: "Paris is the <mask> of France."
  example_title: "Capital"
- text: "The goal of life is <mask>."
  example_title: "Philosophy"
```

### Question Answering
```yaml
widget:
- text: "What's my name?"
  context: "My name is Clara and I live in Berkeley."
  example_title: "Name"
- text: "Where do I live?"
  context: "My name is Sarah and I live in London"
  example_title: "Location"
```

### Summarization
```yaml
widget:
- text: "The tower is 324 metres (1,063 ft) tall, about the same height as an 81-storey building, and the tallest structure in Paris. Its base is square, measuring 125 metres (410 ft) on each side. During its construction, the Eiffel Tower surpassed the Washington Monument to become the tallest man-made structure in the world, a title it held for 41 years until the Chrysler Building in New York City was finished in 1930. It was the first structure to reach a height of 300 metres. Due to the addition of a broadcasting aerial at the top of the tower in 1957, it is now taller than the Chrysler Building by 5.2 metres (17 ft). Excluding transmitters, the Eiffel Tower is the second tallest free-standing structure in France after the Millau Viaduct."
  example_title: "Eiffel Tower"
- text: "Laika, a dog that was the first living creature to be launched into Earth orbit, on board the Soviet artificial satellite Sputnik 2, on November 3, 1957. It was always understood that Laika would not survive the mission, but her actual fate was misrepresented for decades. Laika was a small (13 pounds [6 kg]), even-tempered, mixed-breed dog about two years of age. She was one of a number of stray dogs that were taken into the Soviet spaceflight program after being rescued from the streets. Only female dogs were used because they were considered to be anatomically better suited than males for close confinement."
  example_title: "First in Space"
```

### Table Question Answering
```yaml
widget:
- text: "How many stars does the transformers repository have?"
  table:
    Repository:
    - "Transformers"
    - "Datasets"
    - "Tokenizers"
    Stars:
    - 36542
    - 4512
    - 3934
    Contributors:
    - 651
    - 77
    - 34
    Programming language:
    - "Python"
    - "Python"
    - "Rust, Python and NodeJS"
  example_title: "GitHub stars"
```

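The `table` value is column-oriented: each key is a column header and its list holds that column's cells in row order, so all of the lists should have the same length. A minimal sketch with hypothetical columns and values:

```yaml
widget:
- text: "Which city has the largest population?"
  table:
    City:
    - "Tokyo"
    - "Delhi"
    - "Shanghai"
    Population:
    - 37400068
    - 31181376
    - 27795702
  example_title: "City populations"
```
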
### Text Classification
```yaml
widget:
- text: "I love football so much"
  example_title: "Positive"
- text: "I don't really like this type of food"
  example_title: "Negative"
```

### Text Generation
```yaml
widget:
- text: "My name is Julien and I like to"
  example_title: "Julien"
- text: "My name is Merve and my favorite"
  example_title: "Merve"
```

### Text2Text Generation
```yaml
widget:
- text: "My name is Julien and I like to"
  example_title: "Julien"
- text: "My name is Merve and my favorite"
  example_title: "Merve"
```

### Token Classification
```yaml
widget:
- text: "My name is Sylvain and I live in Paris"
  example_title: "Parisian"
- text: "My name is Sarah and I live in London"
  example_title: "Londoner"
```

### Translation
```yaml
widget:
- text: "My name is Sylvain and I live in Paris"
  example_title: "Parisian"
- text: "My name is Sarah and I live in London"
  example_title: "Londoner"
```

### Zero-Shot Classification
```yaml
widget:
- text: "I have a problem with my car that needs to be resolved asap!!"
  candidate_labels: "urgent, not urgent, phone, tablet, computer"
  multi_class: true
  example_title: "Car problem"
- text: "Last week I upgraded my iOS version and ever since then my phone has been overheating whenever I use your app."
  candidate_labels: "mobile, website, billing, account access"
  multi_class: false
  example_title: "Phone issue"
```

### Sentence Similarity
```yaml
widget:
- source_sentence: "That is a happy person"
  sentences:
    - "That is a happy dog"
    - "That is a very happy person"
    - "Today is a sunny day"
  example_title: "Happy"
```

### Conversational
```yaml
widget:
- text: "Hey my name is Julien! How are you?"
  example_title: "Julien"
- text: "Hey my name is Clara! How are you?"
  example_title: "Clara"
```

### Feature Extraction
```yaml
widget:
- text: "My name is Sylvain and I live in Paris"
  example_title: "Parisian"
- text: "My name is Sarah and I live in London"
  example_title: "Londoner"
```

## Audio

### Text-to-Speech
```yaml
widget:
- text: "My name is Sylvain and I live in Paris"
  example_title: "Parisian"
- text: "My name is Sarah and I live in London"
  example_title: "Londoner"
```

### Automatic Speech Recognition
```yaml
widget:
- src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
  example_title: Librispeech sample 1
- src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
  example_title: Librispeech sample 2
```

### Audio-to-Audio
```yaml
widget:
- src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
  example_title: Librispeech sample 1
- src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
  example_title: Librispeech sample 2
```

### Audio Classification
```yaml
widget:
- src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
  example_title: Librispeech sample 1
- src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
  example_title: Librispeech sample 2
```

### Voice Activity Detection
```yaml
widget:
- src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
  example_title: Librispeech sample 1
- src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
  example_title: Librispeech sample 2
```

## Computer Vision

### Image Classification
```yaml
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg
  example_title: Tiger
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg
  example_title: Teapot
```

### Object Detection
```yaml
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg
  example_title: Football Match
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg
  example_title: Airport
```

### Image Segmentation
```yaml
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg
  example_title: Football Match
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg
  example_title: Airport
```

### Text-to-Image
```yaml
widget:
- text: "A cat playing with a ball"
  example_title: "Cat"
- text: "A dog jumping over a fence"
  example_title: "Dog"
```

## Other

### Structured Data Classification
```yaml
widget:
  structuredData:
    fixed_acidity:
    - 7.4
    - 7.8
    - 10.3
    volatile_acidity:
    - 0.7
    - 0.88
    - 0.32
    citric_acid:
    - 0
    - 0
    - 0.45
    residual_sugar:
    - 1.9
    - 2.6
    - 6.4
    chlorides:
    - 0.076
    - 0.098
    - 0.073
    free_sulfur_dioxide:
    - 11
    - 25
    - 5
    total_sulfur_dioxide:
    - 34
    - 67
    - 13
    density:
    - 0.9978
    - 0.9968
    - 0.9976
    pH:
    - 3.51
    - 3.2
    - 3.23
    sulphates:
    - 0.56
    - 0.68
    - 0.82
    alcohol:
    - 9.4
    - 9.8
    - 12.6
  example_title: "Wine"
```

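As with Table Question Answering, `structuredData` is column-oriented: each key is a feature name and the i-th entries of the lists together form one input row (the example above describes three wine samples). A minimal sketch with hypothetical features:

```yaml
widget:
  structuredData:
    # each key is a feature; the lists are parallel, one entry per row
    feature_a:
    - 1.0   # row 1
    - 2.0   # row 2
    feature_b:
    - 0.5   # row 1
    - 0.7   # row 2
  example_title: "Two rows, two features"
```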