Commit bf0d03a

[Monitor][Ingestion] Update samples (Azure#28466)

Added a samples README to give users more information on how to set up environment variables and run the samples. Also added samples for uploading file contents and uploading data from a pandas DataFrame.

Signed-off-by: Paul Van Eck <paulvaneck@microsoft.com>

1 parent bfb1726 commit bf0d03a

10 files changed: +510 −11 lines changed
sdk/monitor/azure-monitor-ingestion/README.md

Lines changed: 5 additions & 1 deletion

@@ -136,12 +136,16 @@ except HttpResponseError as e:

To upload logs with custom error handling, you can pass a callback function to the `on_error` parameter of the `upload` method. The callback function will be called for each error that occurs during the upload and should expect one argument that corresponds to an `UploadLogsError` object. This object contains the error encountered and the list of logs that failed to upload.

```python
# Example 1: Collect all logs that failed to upload.
failed_logs = []

def on_error(error):
    print("Log chunk failed to upload with error: ", error.error)
    failed_logs.extend(error.failed_logs)

# Example 2: Ignore all errors.
def on_error_pass(error):
    pass

client.upload(rule_id=rule_id, stream_name=os.environ['LOGS_DCR_STREAM_NAME'], logs=body, on_error=on_error)
```
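The two callbacks above differ only in what they do with the `UploadLogsError` they receive. That contract can be sketched without the SDK, using a hypothetical `FakeUploadLogsError` stand-in and a pretend uploader — neither is part of azure-monitor-ingestion, they only mirror the shape of the real callback:

```python
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class FakeUploadLogsError:
    """Stand-in for the SDK's UploadLogsError: the error plus the chunk that failed."""
    error: Exception
    failed_logs: List[Any]

def upload_with_callback(chunks: List[List[Any]], on_error: Callable) -> int:
    """Pretend uploader: even-indexed chunks succeed, odd-indexed chunks fail."""
    uploaded = 0
    for i, chunk in enumerate(chunks):
        if i % 2 == 1:
            on_error(FakeUploadLogsError(RuntimeError("503 from service"), chunk))
        else:
            uploaded += len(chunk)
    return uploaded

failed_logs = []

def on_error(error):
    # Same shape as the README callback: inspect error.error, keep error.failed_logs.
    failed_logs.extend(error.failed_logs)

chunks = [[{"id": 1}, {"id": 2}], [{"id": 3}], [{"id": 4}]]
uploaded = upload_with_callback(chunks, on_error)
print(uploaded)      # 3
print(failed_logs)   # [{'id': 3}] — collected for a later retry
```

Collecting `error.failed_logs` this way is what enables the retry-once pattern shown in the async callback sample below.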
Lines changed: 92 additions & 0 deletions

@@ -0,0 +1,92 @@

---
page_type: sample
languages:
- python
products:
- azure
- azure-monitor
urlFragment: ingestion-azuremonitor-samples
---

# Azure Monitor Ingestion client library Python samples

This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in Log Analytics workspaces. The following code samples show common scenarios with the Azure Monitor Ingestion client library.

|**File Name**|**Description**|
|-------------|---------------|
|[sample_send_small_logs.py][sample_send_small_logs] and [sample_send_small_logs_async.py][sample_send_small_logs_async]|Send a small number of logs to a Log Analytics workspace.|
|[sample_custom_error_callback.py][sample_custom_error_callback] and [sample_custom_error_callback_async.py][sample_custom_error_callback_async]|Use error callbacks to customize how errors are handled during upload.|
|[sample_upload_file_contents.py][sample_upload_file_contents] and [sample_upload_file_contents_async.py][sample_upload_file_contents_async]|Upload the contents of a file to a Log Analytics workspace.|
|[sample_upload_pandas_dataframe.py][sample_upload_pandas_dataframe] and [sample_upload_pandas_dataframe_async.py][sample_upload_pandas_dataframe_async]|Upload data in a pandas DataFrame to a Log Analytics workspace.|

## Prerequisites

- Python 3.7 or later
- An [Azure subscription][azure_subscription]
- An [Azure Log Analytics workspace][azure_monitor_create_using_portal]
- A [Data Collection Endpoint (DCE)][data_collection_endpoint]
- A [Data Collection Rule (DCR)][data_collection_rule]

## How to run the samples

### Install the dependencies

To run the samples, install the following dependencies:

```bash
pip install azure-monitor-ingestion azure-identity pandas
```

To run the async samples, you also need an asynchronous HTTP framework such as `aiohttp`:

```bash
pip install aiohttp
```

### Set up authentication

The samples use [azure-identity][azure_identity]'s [DefaultAzureCredential][azure_identity_default_azure_credential] to authenticate. Ensure that your service principal or managed identity has the `Monitoring Metrics Publisher` role assigned on the Data Collection Rule resource. If you are using a service principal, set the following environment variables:

```bash
export AZURE_TENANT_ID="your Azure AD tenant (directory) ID"
export AZURE_CLIENT_ID="your Azure AD client (application) ID"
export AZURE_CLIENT_SECRET="your Azure AD client secret"
```

### Set up additional environment variables

Set the following environment variables to match your configuration:

```bash
export DATA_COLLECTION_ENDPOINT="your data collection endpoint"
export LOGS_DCR_RULE_ID="your data collection rule immutable ID"
export LOGS_DCR_STREAM_NAME="your data collection rule stream name"
```

### Run the samples

Navigate to the directory that the samples are saved in, and follow the usage described in each file. For example: `python sample_send_small_logs.py`.

## Next steps

To learn more about Azure Monitor, see the [Azure Monitor service documentation][azure_monitor_docs] and the [Logs Ingestion API overview][azure_monitor_logs_ingestion_overview].

<!-- Sample links -->
[sample_send_small_logs]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/sample_send_small_logs.py
[sample_send_small_logs_async]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_send_small_logs_async.py
[sample_custom_error_callback]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/sample_custom_error_callback.py
[sample_custom_error_callback_async]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_custom_error_callback_async.py
[sample_upload_file_contents]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/sample_upload_file_contents.py
[sample_upload_file_contents_async]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_upload_file_contents_async.py
[sample_upload_pandas_dataframe]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/sample_upload_pandas_dataframe.py
[sample_upload_pandas_dataframe_async]: https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_upload_pandas_dataframe_async.py

<!-- External links -->
[azure_identity]: https://pypi.org/project/azure-identity/
[azure_identity_default_azure_credential]: https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/identity/azure-identity#defaultazurecredential
[azure_monitor_create_using_portal]: https://docs.microsoft.com/azure/azure-monitor/logs/quick-create-workspace
[azure_monitor_docs]: https://docs.microsoft.com/azure/azure-monitor/
[azure_monitor_logs_ingestion_overview]: https://learn.microsoft.com/azure/azure-monitor/logs/logs-ingestion-api-overview
[azure_subscription]: https://azure.microsoft.com/free/
[data_collection_endpoint]: https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview
[data_collection_rule]: https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview
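The samples read these variables via `os.environ` at startup. A small sketch of that pattern with a fail-fast check — the `load_sample_config` helper and the placeholder values are illustrative, not part of the samples:

```python
import os

REQUIRED_VARS = ("DATA_COLLECTION_ENDPOINT", "LOGS_DCR_RULE_ID", "LOGS_DCR_STREAM_NAME")

def load_sample_config(environ=os.environ) -> dict:
    """Collect the sample's required settings, failing fast with a clear message."""
    missing = [name for name in REQUIRED_VARS if name not in environ]
    if missing:
        raise EnvironmentError(f"Missing environment variables: {', '.join(missing)}")
    return {name: environ[name] for name in REQUIRED_VARS}

# Example with placeholder values passed in place of the real environment:
config = load_sample_config({
    "DATA_COLLECTION_ENDPOINT": "https://example.ingest.monitor.azure.com",
    "LOGS_DCR_RULE_ID": "dcr-00000000000000000000000000000000",
    "LOGS_DCR_STREAM_NAME": "Custom-MyTableRawData",
})
print(config["LOGS_DCR_STREAM_NAME"])
```

Failing before the client is constructed gives a clearer error than the `KeyError` raised by a bare `os.environ['...']` lookup.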

sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_custom_error_callback_async.py

Lines changed: 37 additions & 4 deletions

@@ -1,9 +1,35 @@

```python
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------

"""
FILE: sample_custom_error_callback_async.py

DESCRIPTION:
    This sample demonstrates how to use error callbacks to customize how errors are handled during upload.

    Note: This sample requires the azure-identity library.

USAGE:
    python sample_custom_error_callback_async.py

    Set the environment variables with your own values before running the sample:
    1) DATA_COLLECTION_ENDPOINT - your data collection endpoint
    2) LOGS_DCR_RULE_ID - your data collection rule immutable ID
    3) LOGS_DCR_STREAM_NAME - your data collection rule stream name

    If using an application service principal for authentication, set the following:
    1) AZURE_TENANT_ID - your Azure AD tenant (directory) ID
    2) AZURE_CLIENT_ID - your Azure AD client (application) ID
    3) AZURE_CLIENT_SECRET - your Azure AD client secret
"""

import asyncio
import os

from azure.core.exceptions import HttpResponseError
from azure.identity.aio import DefaultAzureCredential
from azure.monitor.ingestion import UploadLogsError
from azure.monitor.ingestion.aio import LogsIngestionClient
```

@@ -29,17 +55,24 @@ async def send_logs():

```python
    failed_logs = []

    # Sample callback that stores the logs that failed to upload.
    async def on_error_save(error: UploadLogsError) -> None:
        print("Log chunk failed to upload with error: ", error.error)
        failed_logs.extend(error.failed_logs)

    # Sample callback that just ignores the error.
    async def on_error_pass(_) -> None:
        pass

    # Sample callback that raises the error if it corresponds to a specific HTTP error code.
    # This aborts the rest of the upload.
    async def on_error_abort(error: UploadLogsError) -> None:
        if isinstance(error.error, HttpResponseError) and error.error.status_code in (400, 401, 403):
            print("Aborting upload...")
            raise error.error

    client = LogsIngestionClient(endpoint=endpoint, credential=credential, logging_enable=True)
    async with client:
        await client.upload(rule_id=rule_id, stream_name=os.environ['LOGS_DCR_STREAM_NAME'], logs=body, on_error=on_error_save)

        # Retry once with any failed logs, and this time ignore any errors.
        if failed_logs:
```
sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_send_small_logs_async.py

Lines changed: 26 additions & 1 deletion

@@ -1,6 +1,31 @@

```python
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------

"""
FILE: sample_send_small_logs_async.py

DESCRIPTION:
    This sample demonstrates how to send a small number of logs to a Log Analytics workspace.

    Note: This sample requires the azure-identity library.

USAGE:
    python sample_send_small_logs_async.py

    Set the environment variables with your own values before running the sample:
    1) DATA_COLLECTION_ENDPOINT - your data collection endpoint
    2) LOGS_DCR_RULE_ID - your data collection rule immutable ID
    3) LOGS_DCR_STREAM_NAME - your data collection rule stream name

    If using an application service principal for authentication, set the following:
    1) AZURE_TENANT_ID - your Azure AD tenant (directory) ID
    2) AZURE_CLIENT_ID - your Azure AD client (application) ID
    3) AZURE_CLIENT_SECRET - your Azure AD client secret
"""

import asyncio
import os
```
sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_upload_file_contents_async.py

Lines changed: 77 additions & 0 deletions

@@ -0,0 +1,77 @@

```python
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------

"""
FILE: sample_upload_file_contents_async.py

DESCRIPTION:
    This sample demonstrates how to upload the contents of a file to a Log Analytics workspace. The file is
    expected to contain a list of JSON objects.

    Note: This sample requires the azure-identity library.

USAGE:
    python sample_upload_file_contents_async.py

    Set the environment variables with your own values before running the sample:
    1) DATA_COLLECTION_ENDPOINT - your data collection endpoint
    2) LOGS_DCR_RULE_ID - your data collection rule immutable ID
    3) LOGS_DCR_STREAM_NAME - your data collection rule stream name

    If using an application service principal for authentication, set the following:
    1) AZURE_TENANT_ID - your Azure AD tenant (directory) ID
    2) AZURE_CLIENT_ID - your Azure AD client (application) ID
    3) AZURE_CLIENT_SECRET - your Azure AD client secret
"""

import asyncio
from datetime import datetime
import json
import os

from azure.core.exceptions import HttpResponseError
from azure.identity.aio import DefaultAzureCredential
from azure.monitor.ingestion.aio import LogsIngestionClient


async def upload_file_contents() -> None:
    endpoint = os.environ['DATA_COLLECTION_ENDPOINT']
    credential = DefaultAzureCredential()

    client = LogsIngestionClient(endpoint=endpoint, credential=credential, logging_enable=True)

    # Update this to point to a file containing a list of JSON objects.
    FILE_PATH = "../../test-logs.json"

    async with client:
        # Option 1: Upload the file contents by passing in the file stream. With this option, no chunking is done,
        # and the file contents are uploaded as is through one request. Subject to service size limits.
        with open(FILE_PATH, "r") as f:
            try:
                await client.upload(
                    rule_id=os.environ['LOGS_DCR_RULE_ID'],
                    stream_name=os.environ['LOGS_DCR_STREAM_NAME'],
                    logs=f)
            except HttpResponseError as e:
                print(f"File stream upload failed: {e}")

        # Option 2: Upload the file contents by passing in the list of JSON objects. Chunking is done automatically,
        # and the file contents are uploaded through multiple requests.
        with open(FILE_PATH, "r") as f:
            logs = json.load(f)
        try:
            await client.upload(
                rule_id=os.environ['LOGS_DCR_RULE_ID'],
                stream_name=os.environ['LOGS_DCR_STREAM_NAME'],
                logs=logs)
        except HttpResponseError as e:
            print(f"List upload failed: {e}")

    await credential.close()


if __name__ == '__main__':
    asyncio.run(upload_file_contents())
```
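The automatic chunking mentioned in Option 2 can be pictured as grouping logs until a size budget is hit. A simplified sketch, assuming a size-only budget — the real client also considers entry counts and compression, so this is not the SDK's actual algorithm:

```python
import json
from typing import Any, Iterator, List

def chunk_logs(logs: List[Any], max_bytes: int) -> Iterator[List[Any]]:
    """Greedily group logs so each chunk's JSON encoding stays within max_bytes."""
    chunk: List[Any] = []
    size = 0
    for log in logs:
        entry_size = len(json.dumps(log).encode("utf-8"))
        if chunk and size + entry_size > max_bytes:
            yield chunk
            chunk, size = [], 0
        chunk.append(log)
        size += entry_size
    if chunk:
        yield chunk

# Six identical ~57-byte entries with a 120-byte budget pack two per chunk.
logs = [{"Computer": f"Computer{i}", "AdditionalContext": "context"} for i in range(6)]
chunks = list(chunk_logs(logs, max_bytes=120))
print(len(chunks))  # 3
```

This is why Option 2 can exceed the single-request size limit that constrains Option 1: each chunk becomes its own request.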
sdk/monitor/azure-monitor-ingestion/samples/async_samples/sample_upload_pandas_dataframe_async.py

Lines changed: 81 additions & 0 deletions

@@ -0,0 +1,81 @@

```python
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------

"""
FILE: sample_upload_pandas_dataframe_async.py

DESCRIPTION:
    This sample demonstrates how to upload data stored in a pandas DataFrame to Azure Monitor. For example,
    a user might have the result of a query stored in a pandas DataFrame and want to upload it to a Log Analytics
    workspace.

    Note: This sample requires the azure-identity and pandas libraries.

USAGE:
    python sample_upload_pandas_dataframe_async.py

    Set the environment variables with your own values before running the sample:
    1) DATA_COLLECTION_ENDPOINT - your data collection endpoint
    2) LOGS_DCR_RULE_ID - your data collection rule immutable ID
    3) LOGS_DCR_STREAM_NAME - your data collection rule stream name

    If using an application service principal for authentication, set the following:
    1) AZURE_TENANT_ID - your Azure AD tenant (directory) ID
    2) AZURE_CLIENT_ID - your Azure AD client (application) ID
    3) AZURE_CLIENT_SECRET - your Azure AD client secret
"""

import asyncio
from datetime import datetime
import json
import os

import pandas as pd

from azure.core.exceptions import HttpResponseError
from azure.identity.aio import DefaultAzureCredential
from azure.monitor.ingestion.aio import LogsIngestionClient


async def upload_dataframe() -> None:

    # Set up example DataFrame.
    data = [
        {
            "Time": datetime.now().astimezone(),
            "Computer": "Computer1",
            "AdditionalContext": "context-2"
        },
        {
            "Time": datetime.now().astimezone(),
            "Computer": "Computer2",
            "AdditionalContext": "context"
        }
    ]
    df = pd.DataFrame.from_dict(data)

    # If you have a populated DataFrame that you want to upload, one approach is to use the DataFrame `to_json`
    # method, which will convert any datetime objects to ISO 8601 strings. The `json.loads` method will then
    # convert the JSON string into a Python object that can be used for upload.
    body = json.loads(df.to_json(orient='records', date_format='iso'))

    endpoint = os.environ['DATA_COLLECTION_ENDPOINT']
    credential = DefaultAzureCredential()

    client = LogsIngestionClient(endpoint=endpoint, credential=credential, logging_enable=True)
    async with client:
        try:
            await client.upload(
                rule_id=os.environ['LOGS_DCR_RULE_ID'],
                stream_name=os.environ['LOGS_DCR_STREAM_NAME'],
                logs=body)
        except HttpResponseError as e:
            print(f"Upload failed: {e}")
    await credential.close()


if __name__ == '__main__':
    asyncio.run(upload_dataframe())
```
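The `to_json`/`json.loads` round trip in the sample above is what makes the `datetime` values JSON-safe for upload. If the data is not in a DataFrame to begin with, a plain `json.dumps` default hook achieves the same ISO 8601 conversion — a standalone sketch, not part of the sample:

```python
import json
from datetime import datetime, timezone

def to_jsonable(value):
    """Serialize datetimes as ISO 8601 strings, mirroring to_json(date_format='iso')."""
    if isinstance(value, datetime):
        return value.isoformat()
    raise TypeError(f"Unserializable type: {type(value)!r}")

data = [
    {"Time": datetime(2023, 1, 17, 12, 0, tzinfo=timezone.utc), "Computer": "Computer1"},
]

# Round-trip through JSON text, just like json.loads(df.to_json(...)) in the sample.
body = json.loads(json.dumps(data, default=to_jsonable))
print(body[0]["Time"])  # 2023-01-17T12:00:00+00:00
```

Either way, the resulting `body` is a plain list of JSON-serializable dicts, which is the shape `client.upload` expects for its `logs` argument.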
