
Commit 152a16c

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into donet_core_fix
2 parents f90485a + b5a8dfa commit 152a16c

12 files changed (+231 -38 lines)

articles/application-gateway/key-vault-certs.md

Lines changed: 2 additions & 5 deletions
@@ -11,12 +11,9 @@ ms.author: victorh
 
 # SSL termination with Key Vault certificates
 
-[Azure Key Vault](../key-vault/key-vault-overview.md) is a platform-managed secret store that you can use to safeguard secrets, keys, and SSL certificates. Azure Application Gateway supports integration with Key Vault (in public preview) for server certificates that are attached to HTTPS-enabled listeners. This support is limited to the v2 SKU of Application Gateway.
+[Azure Key Vault](../key-vault/key-vault-overview.md) is a platform-managed secret store that you can use to safeguard secrets, keys, and SSL certificates. Azure Application Gateway supports integration with Key Vault for server certificates that are attached to HTTPS-enabled listeners. This support is limited to the v2 SKU of Application Gateway.
 
-> [!IMPORTANT]
-> Integration of Application Gateway with Key Vault is currently in public preview. This preview is provided without a service-level agreement (SLA) and isn't recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
-This public preview offers two models for SSL termination:
+Key Vault integration offers two models for SSL termination:
 
 - You can explicitly provide SSL certificates attached to the listener. This model is the traditional way to pass SSL certificates to Application Gateway for SSL termination.
 - You can optionally provide a reference to an existing Key Vault certificate or secret when you create an HTTPS-enabled listener.

articles/azure-monitor/app/opencensus-python-dependency.md

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ ms.date: 10/15/2019
 
 # Track dependencies with OpenCensus Python
 
-A dependency is an external component that is called by your application. Dependency data is collected using OpenCensus Python and its various integrations. The data is then sent to Application Insights under Azure Monitor.
+A dependency is an external component that is called by your application. Dependency data is collected using OpenCensus Python and its various integrations. The data is then sent to Application Insights under Azure Monitor as `dependencies` telemetry.
 
 First, instrument your Python application with the latest [OpenCensus Python SDK](../../azure-monitor/app/opencensus-python.md).
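To make the flow concrete, here's a minimal sketch of sending a dependency span with the Azure Monitor exporter. It assumes `opencensus-ext-azure` is installed; the connection string is a placeholder, and the manual span stands in for what integrations such as `opencensus-ext-requests` record for you.

```python
# A minimal sketch, assuming opencensus-ext-azure is installed and
# <your-ikey-here> is replaced with a real instrumentation key.
from opencensus.ext.azure.trace_exporter import AzureExporter
from opencensus.trace.samplers import ProbabilitySampler
from opencensus.trace.tracer import Tracer

tracer = Tracer(
    exporter=AzureExporter(connection_string="InstrumentationKey=<your-ikey-here>"),
    sampler=ProbabilitySampler(rate=1.0),
)

# Spans recorded this way surface in Application Insights as `dependencies` telemetry.
with tracer.span(name="example-external-call"):
    pass  # call your external component (HTTP request, database query, ...) here
```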

articles/azure-monitor/app/opencensus-python-request.md

Lines changed: 136 additions & 0 deletions
@@ -0,0 +1,136 @@
---
title: Incoming Request Tracking in Azure Application Insights with OpenCensus Python | Microsoft Docs
description: Monitor request calls for your Python apps via OpenCensus Python.
ms.service: azure-monitor
ms.subservice: application-insights
ms.topic: conceptual
author: lzchen
ms.author: lechen
ms.date: 10/15/2019

---

# Track incoming requests with OpenCensus Python

Incoming request data is collected using OpenCensus Python and its various integrations. You can track incoming request data sent to web applications built on top of the popular web frameworks `django`, `flask`, and `pyramid`. The data is then sent to Application Insights under Azure Monitor as `requests` telemetry.

First, instrument your Python application with the latest [OpenCensus Python SDK](../../azure-monitor/app/opencensus-python.md).

## Tracking Django applications

1. Download and install `opencensus-ext-django` from [PyPI](https://pypi.org/project/opencensus-ext-django/) and instrument your application with the `django` middleware. Incoming requests sent to your `django` application will be tracked.

2. Include `opencensus.ext.django.middleware.OpencensusMiddleware` in your `settings.py` file under `MIDDLEWARE`.

    ```python
    MIDDLEWARE = (
        ...
        'opencensus.ext.django.middleware.OpencensusMiddleware',
        ...
    )
    ```

3. Make sure `AzureExporter` is properly configured in your `settings.py` under `OPENCENSUS`.

    ```python
    OPENCENSUS = {
        'TRACE': {
            'SAMPLER': 'opencensus.trace.samplers.ProbabilitySampler(rate=0.5)',
            'EXPORTER': '''opencensus.ext.azure.trace_exporter.AzureExporter(
                service_name='foobar',
            )''',
        }
    }
    ```

4. You can also add URLs under `BLACKLIST_PATHS` in `settings.py` for requests that you don't want to track.

    ```python
    OPENCENSUS = {
        'TRACE': {
            'SAMPLER': 'opencensus.trace.samplers.ProbabilitySampler(rate=0.5)',
            'EXPORTER': '''opencensus.ext.azure.trace_exporter.AzureExporter(
                service_name='foobar',
            )''',
            'BLACKLIST_PATHS': ['https://example.com'],  # Requests sent to this URL won't be tracked.
        }
    }
    ```

## Tracking Flask applications

1. Download and install `opencensus-ext-flask` from [PyPI](https://pypi.org/project/opencensus-ext-flask/) and instrument your application with the `flask` middleware. Incoming requests sent to your `flask` application will be tracked.

    ```python
    from flask import Flask
    from opencensus.ext.azure.trace_exporter import AzureExporter
    from opencensus.ext.flask.flask_middleware import FlaskMiddleware
    from opencensus.trace.samplers import ProbabilitySampler

    app = Flask(__name__)
    middleware = FlaskMiddleware(
        app,
        exporter=AzureExporter(connection_string="InstrumentationKey=<your-ikey-here>"),
        sampler=ProbabilitySampler(rate=1.0),
    )

    @app.route('/')
    def hello():
        return 'Hello World!'

    if __name__ == '__main__':
        app.run(host='localhost', port=8080, threaded=True)
    ```

2. You can configure your `flask` middleware directly in the code. For requests sent to URLs that you don't want to track, add the URLs to `BLACKLIST_PATHS`.

    ```python
    app.config['OPENCENSUS'] = {
        'TRACE': {
            'SAMPLER': 'opencensus.trace.samplers.ProbabilitySampler(rate=1.0)',
            'EXPORTER': '''opencensus.ext.azure.trace_exporter.AzureExporter(
                service_name='foobar',
            )''',
            'BLACKLIST_PATHS': ['https://example.com'],  # Requests sent to this URL won't be tracked.
        }
    }
    ```

## Tracking Pyramid applications

1. Download and install `opencensus-ext-pyramid` from [PyPI](https://pypi.org/project/opencensus-ext-pyramid/) and instrument your application with the `pyramid` tween. Incoming requests sent to your `pyramid` application will be tracked.

    ```python
    def main(global_config, **settings):
        config = Configurator(settings=settings)

        config.add_tween('opencensus.ext.pyramid'
                         '.pyramid_middleware.OpenCensusTweenFactory')
    ```

2. You can configure your `pyramid` tween directly in the code. For requests sent to URLs that you don't want to track, add the URLs to `BLACKLIST_PATHS`.

    ```python
    settings = {
        'OPENCENSUS': {
            'TRACE': {
                'SAMPLER': 'opencensus.trace.samplers.ProbabilitySampler(rate=1.0)',
                'EXPORTER': '''opencensus.ext.azure.trace_exporter.AzureExporter(
                    service_name='foobar',
                )''',
                'BLACKLIST_PATHS': ['https://example.com'],  # Requests sent to this URL won't be tracked.
            }
        }
    }
    config = Configurator(settings=settings)
    ```

## Next steps

* [Application Map](../../azure-monitor/app/app-map.md)
* [Availability](../../azure-monitor/app/monitor-web-app-availability.md)
* [Search](../../azure-monitor/app/diagnostic-search.md)
* [Log (Analytics) query](../../azure-monitor/log-query/log-query-overview.md)
* [Transaction diagnostics](../../azure-monitor/app/transaction-diagnostics.md)

articles/azure-monitor/app/opencensus-python.md

Lines changed: 8 additions & 2 deletions
@@ -20,8 +20,6 @@ Azure Monitor supports distributed tracing, metric collection, and logging of Py
 - An Azure subscription. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/) before you begin.
 - Python installation. This article uses [Python 3.7.0](https://www.python.org/downloads/), though earlier versions will likely work with minor changes.
 
-
-
 ## Sign in to the Azure portal
 
 Sign in to the [Azure portal](https://portal.azure.com/).
@@ -52,6 +50,8 @@ Install the OpenCensus Azure Monitor exporters:
 python -m pip install opencensus-ext-azure
 ```
 
+For a full list of packages and integrations, see [OpenCensus packages](https://docs.microsoft.com/azure/azure-monitor/app/nuget#common-packages-for-python-using-opencensus).
+
 > [!NOTE]
 > The `python -m pip install opencensus-ext-azure` command assumes that you have a `PATH` environment variable set for your Python installation. If you haven't configured this variable, you need to give the full directory path to where your Python executable is located. The result is a command like this: `C:\Users\Administrator\AppData\Local\Programs\Python\Python37-32\python.exe -m pip install opencensus-ext-azure`.
@@ -124,6 +124,10 @@ The SDK uses three Azure Monitor exporters to send different types of telemetry
 
 4. Now when you run the Python script, you should still be prompted to enter values, but only the value is being printed in the shell. The created `SpanData` will be sent to Azure Monitor. You can find the emitted span data under `dependencies`.
 
+5. For information on sampling in OpenCensus, take a look at [sampling in OpenCensus](https://docs.microsoft.com/azure/azure-monitor/app/sampling#configuring-fixed-rate-sampling-in-opencensus-python).
+
+6. For details on telemetry correlation in your trace data, take a look at OpenCensus [telemetry correlation](https://docs.microsoft.com/azure/azure-monitor/app/correlation#telemetry-correlation-in-opencensus-python).
+
 ### Metrics
 
 1. First, let's generate some local metric data. We'll create a simple metric to track the number of times the user presses Enter.
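As a rough illustration of that step, here's a sketch of the counter using the `opencensus` stats API; the measure and view names are illustrative placeholders.

```python
# A minimal sketch, assuming the opencensus package is installed;
# the measure and view names are illustrative placeholders.
from opencensus.stats import aggregation as aggregation_module
from opencensus.stats import measure as measure_module
from opencensus.stats import stats as stats_module
from opencensus.stats import view as view_module
from opencensus.tags import tag_map as tag_map_module

stats = stats_module.stats
view_manager = stats.view_manager
stats_recorder = stats.stats_recorder

# A measure that counts how many times the user presses Enter.
prompt_measure = measure_module.MeasureInt("prompts", "number of prompts", "prompts")
prompt_view = view_module.View("prompt view", "number of prompts",
                               [], prompt_measure,
                               aggregation_module.CountAggregation())
view_manager.register_view(prompt_view)
mmap = stats_recorder.new_measurement_map()
tmap = tag_map_module.TagMap()

def prompt():
    input("Press enter.")
    mmap.measure_int_put(prompt_measure, 1)
    mmap.record(tmap)  # each call bumps the aggregated count
```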
@@ -288,6 +292,8 @@ The SDK uses three Azure Monitor exporters to send different types of telemetry
 
 4. The exporter will send log data to Azure Monitor. You can find the data under `traces`.
 
+5. For details on how to enrich your logs with trace context data, see OpenCensus Python [logs integration](https://docs.microsoft.com/azure/azure-monitor/app/correlation#logs-correlation).
+
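To illustrate step 4, here's a minimal sketch; it assumes `opencensus-ext-azure` is installed, and the instrumentation key is a placeholder.

```python
# A minimal sketch, assuming opencensus-ext-azure is installed and
# <your-ikey-here> is replaced with a real instrumentation key.
import logging

from opencensus.ext.azure.log_exporter import AzureLogHandler

logger = logging.getLogger(__name__)
logger.addHandler(AzureLogHandler(
    connection_string="InstrumentationKey=<your-ikey-here>"))

# This record is exported to Azure Monitor and shows up under `traces`.
logger.warning("Hello, World!")
```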
 ## Start monitoring in the Azure portal
 
 1. You can now reopen the Application Insights **Overview** pane in the Azure portal, to view details about your currently running application. Select **Live Metrics Stream**.

articles/azure-monitor/toc.yml

Lines changed: 2 additions & 0 deletions
@@ -318,6 +318,8 @@
   href: app/opencensus-python.md
 - name: Dependencies
   href: app/opencensus-python-dependency.md
+- name: Requests
+  href: app/opencensus-python-request.md
 - name: Web pages
   items:
   - name: Client-side JavaScript

articles/firewall/threat-intel.md

Lines changed: 4 additions & 7 deletions
@@ -5,20 +5,17 @@ services: firewall
 author: vhorne
 ms.service: firewall
 ms.topic: article
-ms.date: 3/11/2019
+ms.date: 11/05/2019
 ms.author: victorh
 ---
 
-# Azure Firewall threat intelligence-based filtering - Public Preview
+# Azure Firewall threat intelligence-based filtering
 
 Threat intelligence-based filtering can be enabled for your firewall to alert and deny traffic from/to known malicious IP addresses and domains. The IP addresses and domains are sourced from the Microsoft Threat Intelligence feed. [Intelligent Security Graph](https://www.microsoft.com/en-us/security/operations/intelligence) powers Microsoft threat intelligence and is used by multiple services including Azure Security Center.
 
 ![Firewall threat intelligence](media/threat-intel/firewall-threat.png)
 
-> [!IMPORTANT]
-> Threat intelligence based filtering is currently in public preview and is provided with a preview service level agreement. Certain features may not be supported or may have constrained capabilities. See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for details.
-
-If threat intelligence-based filtering is enabled, the associated rules are processed before any of the NAT rules, network rules, or application rules. During the preview, only highest confidence records are included.
+If you've enabled threat intelligence-based filtering, the associated rules are processed before any of the NAT rules, network rules, or application rules.
 
 You can choose to just log an alert when a rule is triggered, or you can choose alert and deny mode.

@@ -46,7 +43,7 @@ The following log excerpt shows a triggered rule:
 
 - **Outbound testing** - Outbound traffic alerts should be a rare occurrence, as it means that your environment has been compromised. To help test outbound alerts are working, a test FQDN has been created that triggers an alert. Use **testmaliciousdomain.eastus.cloudapp.azure.com** for your outbound tests.
 
-- **Inbound testing** - You can expect to see alerts on incoming traffic if DNAT rules are configured on the firewall. This is true even if only specific sources are allowed on the DNAT rule and traffic is otherwise denied. Azure Firewall does not alert on all known port scanners; only on scanners that are known to also engage in malicious activity.
+- **Inbound testing** - You can expect to see alerts on incoming traffic if DNAT rules are configured on the firewall. This is true even if only specific sources are allowed on the DNAT rule and traffic is otherwise denied. Azure Firewall doesn't alert on all known port scanners; only on scanners that are known to also engage in malicious activity.
 
 ## Next steps
articles/search/TOC.yml

Lines changed: 2 additions & 0 deletions
@@ -195,6 +195,8 @@
   href: search-blob-storage-integration.md
 - name: Set up a blob indexer
   href: search-howto-indexing-azure-blob-storage.md
+- name: Set up an Azure Data Lake Storage Gen2 indexer
+  href: search-howto-index-azure-data-lake-storage.md
 - name: Index one-to-many blobs
   href: search-howto-index-one-to-many-blobs.md
 - name: Index CSV blobs

articles/search/search-api-preview.md

Lines changed: 4 additions & 3 deletions
@@ -20,7 +20,7 @@ This article describes the `api-version=2019-05-06-Preview` version of Search se
 
 ## New in 2019-05-06-Preview
 
-[**Incremental indexing](cognitive-search-incremental-indexing-conceptual.md) is a new mode for indexing that adds state and caching, allowing you to reuse existing output when data, indexer, and skillset definitions are unchanged. This feature applies only to enrichments through a cognitive skillset.
+[**Incremental indexing**](cognitive-search-incremental-indexing-conceptual.md) is a new mode for indexing that adds state and caching, allowing you to reuse existing output when data, indexer, and skillset definitions are unchanged. This feature applies only to enrichments through a cognitive skillset.
 
 [**Knowledge store**](knowledge-store-concept-intro.md) is a new destination of an AI-based enrichment pipeline. In addition to an index, you can now persist populated data structures created during indexing in Azure storage. You control the physical structures of your data through elements in a Skillset, including how data is shaped, whether data is stored in Table storage or Blob storage, and whether there are multiple views.
 
@@ -31,8 +31,9 @@ This article describes the `api-version=2019-05-06-Preview` version of Search se
 Features announced in earlier previews are still in public preview. If you're calling an API with an earlier preview api-version, you can continue to use that version or switch to `2019-05-06-Preview` with no changes to expected behavior.
 
 + [moreLikeThis query parameter](search-more-like-this.md) finds documents that are relevant to a specific document. This feature has been in earlier previews.
-* [CSV blob indexing](search-howto-index-csv-blobs.md) creates one document per line, as opposed to one document per text blob.
-* [MongoDB API support for Cosmos DB indexers](search-howto-index-cosmosdb.md) is in preview.
++ [CSV blob indexing](search-howto-index-csv-blobs.md) creates one document per line, as opposed to one document per text blob.
++ [Cosmos DB indexer](search-howto-index-cosmosdb.md) supports MongoDB API, Gremlin API, and Cassandra API.
++ [Azure Data Lake Storage Gen2 indexer](search-howto-index-azure-data-lake-storage.md) can index content and metadata from Data Lake Storage Gen2.
 
 
 ## How to call a preview API
articles/search/search-howto-index-azure-data-lake-storage.md

Lines changed: 44 additions & 0 deletions
@@ -0,0 +1,44 @@
---
title: Indexing documents in Azure Data Lake Storage Gen2 (preview)
titleSuffix: Azure Cognitive Search
description: Learn how to index content and metadata in Azure Data Lake Storage Gen2.

manager: nitinme
author: markheff
ms.author: maheff
ms.devlang: rest-api
ms.service: cognitive-search
ms.topic: conceptual
ms.date: 11/04/2019
---

# Indexing documents in Azure Data Lake Storage Gen2 (preview)

When setting up an Azure storage account, you have the option to enable [hierarchical namespace](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-namespace). This allows the collection of content in an account to be organized into a hierarchy of directories and nested subdirectories. By enabling hierarchical namespace, you enable [Azure Data Lake Storage Gen2](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-introduction).

This article describes how to get started with indexing documents that are in Azure Data Lake Storage Gen2.

> [!Note]
> The Azure Data Lake Storage Gen2 indexer is in preview and is not intended for production use.

## Set up Azure Data Lake Storage Gen2 indexer

There are a few steps you'll need to complete to index content from Data Lake Storage Gen2.

### Step 1: Sign up for the preview

Sign up for the Data Lake Storage Gen2 indexer preview by filling out [this form](https://aka.ms/azure-cognitive-search/indexer-preview). You will receive a confirmation email once you have been accepted into the preview.

### Step 2: Follow the Azure Blob storage indexing setup steps

Once you've received confirmation that your preview sign-up was successful, you're ready to create the indexing pipeline.

You can index content and metadata from Data Lake Storage Gen2 by using the [REST API version 2019-05-06-Preview](search-api-preview.md). There is no .NET SDK support at this time.

Indexing content in Data Lake Storage Gen2 is identical to indexing content in Azure Blob storage. So to understand how to set up the Data Lake Storage Gen2 data source, index, and indexer, refer to [How to index documents in Azure Blob Storage with Azure Cognitive Search](search-howto-indexing-azure-blob-storage.md). The Blob storage article also provides information about what document formats are supported, what blob metadata properties are extracted, incremental indexing, and more. This information will be the same for Data Lake Storage Gen2.
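Because setup in this preview is REST-only, here's a rough sketch of creating the data source from Python with the `requests` library. Everything in it is illustrative: the service name, admin key, connection string, and filesystem name are placeholders, and the `adlsgen2` data source type is an assumption about the preview API rather than something this article specifies.

```python
# A hedged sketch only: endpoint, key, and names below are placeholders,
# and the "adlsgen2" data source type is an assumption about the preview API.
import requests

ENDPOINT = "https://<your-search-service>.search.windows.net"
API_KEY = "<your-admin-api-key>"

datasource = {
    "name": "adlsgen2-datasource",
    "type": "adlsgen2",
    "credentials": {"connectionString": "<your-storage-connection-string>"},
    "container": {"name": "<your-filesystem-name>"},
}

response = requests.post(
    f"{ENDPOINT}/datasources?api-version=2019-05-06-Preview",
    headers={"Content-Type": "application/json", "api-key": API_KEY},
    json=datasource,
)
response.raise_for_status()

# The index and indexer are then created the same way as for Blob storage,
# per the article linked above.
```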
## Access control

Azure Data Lake Storage Gen2 implements an [access control model](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-access-control) that supports both Azure role-based access control (RBAC) and POSIX-like access control lists (ACLs). When indexing content from Data Lake Storage Gen2, Azure Cognitive Search will not extract the RBAC and ACL information from the content. As a result, this information will not be included in your Azure Cognitive Search index.

If maintaining access control on each document in the index is important, it is up to the application developer to implement [security trimming](https://docs.microsoft.com/azure/search/search-security-trimming-for-azure-search).
