For a guide that shows how to use these connector operations in your Standard workflow, see [Connect to Azure AI services from Standard workflows in Azure Logic Apps](/azure/logic-apps/connectors/azure-ai).
## Ingest documents and chat with data
Data is the cornerstone for any AI application and is unique to each organization. When you build an AI application, efficient data ingestion is critical for success. Regardless of where your data resides, and with little or no code, you can more easily integrate AI into new and existing business processes by building Standard workflows with Azure Logic Apps.
With over 1,400 enterprise connectors and operations, Azure Logic Apps makes it possible for you to quickly access and perform tasks with a wide range of services, systems, applications, and databases. When you use these connectors alongside AI services, your organization can transform workloads by automating routine tasks, enhancing customer interactions, and generating intelligent insights.
Along with these operations, Azure Logic Apps also offers prebuilt workflow templates for data ingestion from many common data sources, such as SharePoint, Azure File Storage, Blob Storage, SFTP, and more, to help you quickly build and deploy your applications.
For example, when you use operations from connectors such as **Azure OpenAI** and **Azure AI Search** in your workflows, your organization can seamlessly implement the retrieval-augmented generation (RAG) pattern. This architecture makes it easier to ingest and retrieve data from multiple sources.
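
The connector operations cover these steps for you in the workflow designer, but if you want a concrete picture of the pattern itself, the following Python sketch outlines the same RAG flow in code. The endpoint values, deployment names, and the `docs-index` index schema (with a `contentVector` vector field) are hypothetical placeholders, not part of any Logic Apps connector.

```python
# Illustrative RAG sketch: embed document chunks, index them in Azure AI Search,
# then retrieve the most relevant chunks to ground a chat completion.
from openai import AzureOpenAI
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

aoai = AzureOpenAI(azure_endpoint="https://<your-openai>.openai.azure.com",
                   api_key="<openai-key>", api_version="2024-06-01")
search = SearchClient(endpoint="https://<your-search>.search.windows.net",
                      index_name="docs-index",
                      credential=AzureKeyCredential("<search-key>"))

def embed(text: str) -> list[float]:
    """Create an embedding for one chunk of text (deployment name is hypothetical)."""
    return aoai.embeddings.create(model="text-embedding-3-small", input=text).data[0].embedding

# 1. Ingest: embed each chunk and upload it to the search index.
chunks = ["Contoso travel policy ...", "Contoso expense policy ..."]
search.upload_documents([
    {"id": str(i), "content": c, "contentVector": embed(c)}
    for i, c in enumerate(chunks)
])

# 2. Retrieve: find the chunks most similar to the user's question.
question = "What is the travel policy?"
hits = search.search(
    search_text=question,
    vector_queries=[VectorizedQuery(vector=embed(question),
                                    fields="contentVector",
                                    k_nearest_neighbors=3)])
context = "\n".join(doc["content"] for doc in hits)

# 3. Generate: ground the chat completion on the retrieved context.
answer = aoai.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "system", "content": f"Answer using only this context:\n{context}"},
              {"role": "user", "content": question}])
print(answer.choices[0].message.content)
```

In a Standard workflow, the corresponding **Azure OpenAI** and **Azure AI Search** connector operations cover these embed, index, and retrieve steps as designer actions, so no SDK code is required.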
| Resource type | Link |
|---------------|------|
|**Blog article**|[Automate responses to StackOverflow queries using Azure OpenAI and Azure Logic Apps](https://techcommunity.microsoft.com/blog/integrationsonazureblog/automate-responses-to-stackoverflow-queries-using-openai-and-logic-apps/4182590)|
|**GitHub sample**|[Automate responses to unanswered StackOverflow questions](https://github.com/Azure/logicapps/tree/helpdesk-sample-1/testAILA)|
## Build Assistants with Azure Logic Apps
With Azure OpenAI, you can easily build agent-like features into your applications by using the Assistants API. Although the capability to build agents previously existed, the process often required significant engineering, external libraries, and multiple integrations. With Assistants, you can now rapidly create customized, stateful copilots that are trained on your enterprise data and can handle diverse tasks by using the latest GPT models, tools, and knowledge. The current release includes features such as the File Search and Browse tools, enhanced data security, improved controls, new models, expanded region support, and various enhancements that make it easier to go from prototyping to production.
You can now build Assistants by calling Azure Logic Apps workflows as AI functions. Without writing any code, you can discover, import, and invoke workflows in Azure OpenAI Studio from the Azure OpenAI Assistants playground. The Assistants playground lists all the workflows in your subscription that are eligible for function calling.
To test Assistants with function calling, you can import workflows as AI functions by using a browse-and-select experience. The function specification and other configuration are automatically generated from the Swagger (OpenAPI) description for your workflow. Function calling then invokes workflows based on user prompts and passes in the appropriate parameters based on that definition.
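
The playground generates the function specification and handles invocation for you, so the following Python sketch only illustrates roughly what that plumbing corresponds to: a function definition that mirrors a workflow's Swagger description, and a handler that posts the model's arguments to the workflow's HTTP request trigger. The workflow name, parameter schema, and trigger URL are hypothetical, and the sketch assumes an Azure OpenAI resource with an Assistants-capable API version.

```python
# Hypothetical sketch of exposing a Logic Apps workflow (HTTP Request trigger)
# to an Azure OpenAI Assistant as a callable function.
import json, time, requests
from openai import AzureOpenAI

client = AzureOpenAI(azure_endpoint="https://<your-openai>.openai.azure.com",
                     api_key="<openai-key>",
                     api_version="2024-05-01-preview")  # a version that supports Assistants

# Callback URL of the workflow's Request trigger, including its SAS token.
WORKFLOW_URL = "https://<workflow-request-trigger-url>"

# Function specification that mirrors the workflow's Swagger description
# (the Assistants playground generates this part for you).
assistant = client.beta.assistants.create(
    model="gpt-4o",
    instructions="Use the get_weather workflow to answer weather questions.",
    tools=[{"type": "function",
            "function": {"name": "get_weather",
                         "description": "Logic Apps workflow that returns weather for a city.",
                         "parameters": {"type": "object",
                                        "properties": {"city": {"type": "string"}},
                                        "required": ["city"]}}}])

thread = client.beta.threads.create(
    messages=[{"role": "user", "content": "What's the weather in Seattle?"}])
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)

# Poll the run; when the model requests the function, invoke the workflow
# trigger with the model's arguments and return the response as tool output.
while run.status not in ("completed", "failed", "cancelled", "expired"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
    if run.status == "requires_action":
        outputs = []
        for call in run.required_action.submit_tool_outputs.tool_calls:
            args = json.loads(call.function.arguments)       # parameters from the user prompt
            result = requests.post(WORKFLOW_URL, json=args)   # invoke the workflow
            outputs.append({"tool_call_id": call.id, "output": result.text})
        run = client.beta.threads.runs.submit_tool_outputs(
            thread_id=thread.id, run_id=run.id, tool_outputs=outputs)

for message in client.beta.threads.messages.list(thread_id=thread.id):
    print(message.role, ":", message.content[0].text.value)
    break  # messages are listed newest first, so this is the assistant's answer
```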
| Resource type | Link |
|---------------|------|
|**Blog article**|[Build Azure OpenAI assistants with function calling](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/announcing-azure-openai-service-assistants-public-preview-refresh/4143217)|
For more information, see [Call Azure Logic Apps workflows as functions using Azure OpenAI Assistants](/azure/ai-services/openai/how-to/assistants-logic-apps).
## Integrate with Semantic Kernel
This lightweight, open-source development kit helps you easily build AI agents and integrate the latest AI models into your C#, Python, or Java codebase. At the simplest level, the kernel is a dependency injection container that manages all services and plugins that your AI application needs to run. If you provide all your services and plugins to the kernel, the AI seamlessly uses these components as needed. As the central component, the kernel serves as an efficient middleware that helps you quickly deliver enterprise-grade solutions. For more information, see [Introduction to Semantic Kernel](/semantic-kernel/overview/).
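
To make that description concrete, here's a minimal Python sketch, assuming a recent 1.x release of the `semantic-kernel` package; the deployment name, endpoint, and plugin are hypothetical. It registers an Azure OpenAI chat service and a small plugin with the kernel, then lets the model call the plugin while answering a prompt.

```python
# Minimal Semantic Kernel sketch: the kernel holds the AI service and plugins,
# and the model can call plugin functions automatically while answering.
import asyncio
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.functions import KernelArguments, kernel_function

class WeatherPlugin:
    """A tiny plugin that the kernel exposes to the model as a callable tool."""
    @kernel_function(description="Gets the current weather for a city.")
    def get_weather(self, city: str) -> str:
        return f"It is 18 C and cloudy in {city}."  # stubbed data for the sketch

async def main():
    kernel = Kernel()
    # The kernel acts as the container for services and plugins.
    kernel.add_service(AzureChatCompletion(              # hypothetical deployment and endpoint
        deployment_name="gpt-4o",
        endpoint="https://<your-openai>.openai.azure.com",
        api_key="<openai-key>"))
    kernel.add_plugin(WeatherPlugin(), plugin_name="weather")

    # Let the model decide when to call the plugin while answering the prompt.
    settings = OpenAIChatPromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto())
    result = await kernel.invoke_prompt(
        "What's the weather in Seattle?", arguments=KernelArguments(settings=settings))
    print(result)

asyncio.run(main())
```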