---
display_name: Transformer
short_description: A transformer is a deep learning architecture based on self-attention mechanisms, designed to process sequential data in parallel.
topic: transformer
wikipedia_url:
---

A transformer is a deep learning architecture based on self-attention mechanisms, designed to process sequential data in parallel. Transformers are the foundation of modern large language models and are widely used in natural language processing, computer vision, and generative AI.