Generative AI applications are built on *language models*. The development process usually starts with an exploration and comparison of available *foundation* models to find the one that best suits the particular needs of your application. After selecting a suitable model, you deploy it to an endpoint where it can be consumed by a client application or AI agent.
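For example, one common way for a client application to consume a deployed chat model is through an OpenAI-compatible endpoint. The sketch below uses the Python OpenAI SDK against an Azure OpenAI deployment; the endpoint, key, API version, and deployment name are placeholders you would replace with the values from your own deployment, not values provided by this module.

```python
# Minimal sketch: a client application calling a deployed chat model.
# All connection values below are placeholders for your own deployment.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder endpoint
    api_key="<your-api-key>",                                    # placeholder key
    api_version="2024-02-15-preview",                            # example API version
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the name you gave the deployed model
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
)

print(response.choices[0].message.content)
```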
In some cases, you might seek to optimize model responses for your application by applying *prompt engineering* techniques, implementing a *retrieve, augment, and generate (RAG)* solution that uses your own data to contextualize prompts, or by *fine-tuning* your chosen model with example prompts and responses that represent the conversational behavior you need. But it all begins with choosing the right model to start with.
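To make the RAG idea concrete, the following simplified sketch shows how retrieved passages from your own data can be added to a prompt so the model answers in the context of that data. The `retrieve` function and the example passages are hypothetical stand-ins; a real solution would query a search index or vector store.

```python
# Simplified "retrieve, augment, generate" sketch: ground the prompt in your own data.

def retrieve(question: str) -> list[str]:
    # Placeholder retrieval step: return passages relevant to the question.
    return [
        "Contoso support hours are 9am-5pm, Monday through Friday.",
        "Contoso offers a 30-day return policy on all products.",
    ]

def build_rag_prompt(question: str) -> list[dict]:
    context = "\n".join(retrieve(question))
    return [
        {
            "role": "system",
            "content": "Answer only from the context below. If the answer is "
                       f"not in the context, say you don't know.\n\nContext:\n{context}",
        },
        {"role": "user", "content": question},
    ]

messages = build_rag_prompt("What are your support hours?")
# `messages` can now be sent to the deployed model, as in the earlier example.
```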
## Foundation models
Foundation models, such as the GPT family of models, are state-of-the-art language models designed to understand, generate, and interact with natural language. Some common use cases for these models are:
- **Speech-to-text and text-to-speech conversion**. For example, generate subtitles for videos.
- **Machine translation**. For example, translate text from English to Japanese.
- **Question answering**. For example, provide answers to questions like "What is the capital of France?"
- **Reasoning**. For example, solve a mathematical problem.
In this module, you focus on exploring foundation models used for question answering. The foundation models you explore can be used for chat applications in which you use a language model to generate a response to a user's question.
> [!NOTE]
> The latest breakthrough in generative AI models is owed to the development of the **Transformer** architecture. Transformers were introduced in the [*Attention is all you need* paper by Vaswani, et al. from 2017](https://arxiv.org/abs/1706.03762?azure-portal=true). The Transformer architecture provided two innovations to NLP that resulted in the emergence of foundation models:
>
> - Instead of processing words sequentially, Transformers process each word independently and in parallel by using **attention**.
> - Next to the semantic similarity between words, Transformers use **positional encoding** to include the information about the position of a word in a sentence.
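As an illustration of those two ideas, here is a minimal NumPy sketch of scaled dot-product attention and sinusoidal positional encoding. It is a toy example under simplifying assumptions (random placeholder embeddings, a single attention head, no learned weights), not the implementation used by any particular foundation model.

```python
# Toy illustration of attention and positional encoding with NumPy.
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding: injects word-order information."""
    positions = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                           # (1, d_model)
    angles = positions / np.power(10000, (2 * (dims // 2)) / d_model)
    encoding = np.zeros((seq_len, d_model))
    encoding[:, 0::2] = np.sin(angles[:, 0::2])                  # even dimensions
    encoding[:, 1::2] = np.cos(angles[:, 1::2])                  # odd dimensions
    return encoding

def attention(q: np.ndarray, k: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Scaled dot-product attention: every token attends to every other token."""
    scores = q @ k.T / np.sqrt(q.shape[-1])                      # pairwise similarities
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
    return weights @ v                                           # weighted mix of values

# Toy example: 4 "tokens" with 8-dimensional placeholder embeddings.
x = np.random.rand(4, 8) + positional_encoding(4, 8)
print(attention(x, x, x).shape)  # (4, 8): each output row mixes information from all tokens
```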
Foundation models designed for NLP use cases are often referred to as **Large Language Models** (**LLMs**), or simply language models. In this module, you explore the available language models, how to select a model for your use case, and how to use a language model with the Azure AI Foundry portal. You focus on language models that help you develop generative AI apps that serve as chat applications, answering your users' questions.