diff --git a/content/learning-paths/iot/azure-iot/Figures/01.png b/content/learning-paths/iot/azure-iot/Figures/01.png new file mode 100644 index 0000000000..66a5062cc1 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/01.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/02.png b/content/learning-paths/iot/azure-iot/Figures/02.png new file mode 100644 index 0000000000..9be7f5d23f Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/02.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/03.png b/content/learning-paths/iot/azure-iot/Figures/03.png new file mode 100644 index 0000000000..5d94b4f288 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/03.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/04.png b/content/learning-paths/iot/azure-iot/Figures/04.png new file mode 100644 index 0000000000..94527c1c30 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/04.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/05.png b/content/learning-paths/iot/azure-iot/Figures/05.png new file mode 100644 index 0000000000..26b59136da Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/05.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/06.png b/content/learning-paths/iot/azure-iot/Figures/06.png new file mode 100644 index 0000000000..e2cb00bbb9 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/06.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/07.png b/content/learning-paths/iot/azure-iot/Figures/07.png new file mode 100644 index 0000000000..b7723dcfc9 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/07.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/08.png b/content/learning-paths/iot/azure-iot/Figures/08.png new file mode 100644 index 0000000000..3d84949c63 Binary files /dev/null and 
b/content/learning-paths/iot/azure-iot/Figures/08.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/09.png b/content/learning-paths/iot/azure-iot/Figures/09.png new file mode 100644 index 0000000000..554f0c6e34 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/09.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/10.png b/content/learning-paths/iot/azure-iot/Figures/10.png new file mode 100644 index 0000000000..e2d212b0cd Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/10.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/11.png b/content/learning-paths/iot/azure-iot/Figures/11.png new file mode 100644 index 0000000000..589161892a Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/11.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/12.png b/content/learning-paths/iot/azure-iot/Figures/12.png new file mode 100644 index 0000000000..de66df9318 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/12.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/13.png b/content/learning-paths/iot/azure-iot/Figures/13.png new file mode 100644 index 0000000000..e1f51e4f42 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/13.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/14.png b/content/learning-paths/iot/azure-iot/Figures/14.png new file mode 100644 index 0000000000..e8ef07b354 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/14.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/15.png b/content/learning-paths/iot/azure-iot/Figures/15.png new file mode 100644 index 0000000000..6fc6770bb6 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/15.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/16.png b/content/learning-paths/iot/azure-iot/Figures/16.png new file mode 100644 
index 0000000000..bd98921c51 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/16.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/17.png b/content/learning-paths/iot/azure-iot/Figures/17.png new file mode 100644 index 0000000000..4df5c42524 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/17.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/18.png b/content/learning-paths/iot/azure-iot/Figures/18.png new file mode 100644 index 0000000000..9078556baa Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/18.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/19.png b/content/learning-paths/iot/azure-iot/Figures/19.png new file mode 100644 index 0000000000..e7edd6d92f Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/19.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/20.png b/content/learning-paths/iot/azure-iot/Figures/20.png new file mode 100644 index 0000000000..00d2fb6a29 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/20.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/21.png b/content/learning-paths/iot/azure-iot/Figures/21.png new file mode 100644 index 0000000000..d5b05befc7 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/21.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/22.png b/content/learning-paths/iot/azure-iot/Figures/22.png new file mode 100644 index 0000000000..c30cbecd4b Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/22.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/23.png b/content/learning-paths/iot/azure-iot/Figures/23.png new file mode 100644 index 0000000000..be7c334707 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/23.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/24.png 
b/content/learning-paths/iot/azure-iot/Figures/24.png new file mode 100644 index 0000000000..8bfaffd4cc Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/24.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/25.png b/content/learning-paths/iot/azure-iot/Figures/25.png new file mode 100644 index 0000000000..d591132b4b Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/25.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/26.png b/content/learning-paths/iot/azure-iot/Figures/26.png new file mode 100644 index 0000000000..291e58ab3f Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/26.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/27.png b/content/learning-paths/iot/azure-iot/Figures/27.png new file mode 100644 index 0000000000..35ac504298 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/27.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/28.png b/content/learning-paths/iot/azure-iot/Figures/28.png new file mode 100644 index 0000000000..a777b20a95 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/28.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/29.png b/content/learning-paths/iot/azure-iot/Figures/29.png new file mode 100644 index 0000000000..3879728434 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/29.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/30.png b/content/learning-paths/iot/azure-iot/Figures/30.png new file mode 100644 index 0000000000..c5777a8cc5 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/30.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/31.png b/content/learning-paths/iot/azure-iot/Figures/31.png new file mode 100644 index 0000000000..bfdd84141b Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/31.png differ diff 
--git a/content/learning-paths/iot/azure-iot/Figures/32.png b/content/learning-paths/iot/azure-iot/Figures/32.png new file mode 100644 index 0000000000..6253ca2376 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/32.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/33.png b/content/learning-paths/iot/azure-iot/Figures/33.png new file mode 100644 index 0000000000..6d4c37e058 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/33.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/34.png b/content/learning-paths/iot/azure-iot/Figures/34.png new file mode 100644 index 0000000000..7cee58606c Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/34.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/35.png b/content/learning-paths/iot/azure-iot/Figures/35.png new file mode 100644 index 0000000000..328f441531 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/35.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/36.png b/content/learning-paths/iot/azure-iot/Figures/36.png new file mode 100644 index 0000000000..742f1241f9 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/36.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/37.png b/content/learning-paths/iot/azure-iot/Figures/37.png new file mode 100644 index 0000000000..30ada344c3 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/37.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/38.png b/content/learning-paths/iot/azure-iot/Figures/38.png new file mode 100644 index 0000000000..d228398ac7 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/38.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/39.png b/content/learning-paths/iot/azure-iot/Figures/39.png new file mode 100644 index 0000000000..3cb7a83cb0 Binary files /dev/null and 
b/content/learning-paths/iot/azure-iot/Figures/39.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/40.png b/content/learning-paths/iot/azure-iot/Figures/40.png new file mode 100644 index 0000000000..86d1a6846e Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/40.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/41.png b/content/learning-paths/iot/azure-iot/Figures/41.png new file mode 100644 index 0000000000..75c68f9fa1 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/41.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/42.png b/content/learning-paths/iot/azure-iot/Figures/42.png new file mode 100644 index 0000000000..de9106e6c5 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/42.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/43.png b/content/learning-paths/iot/azure-iot/Figures/43.png new file mode 100644 index 0000000000..07839de9f0 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/43.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/44.png b/content/learning-paths/iot/azure-iot/Figures/44.png new file mode 100644 index 0000000000..f540b53000 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/44.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/45.png b/content/learning-paths/iot/azure-iot/Figures/45.png new file mode 100644 index 0000000000..07dd1d9839 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/45.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/46.png b/content/learning-paths/iot/azure-iot/Figures/46.png new file mode 100644 index 0000000000..500dc00069 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/46.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/47.png b/content/learning-paths/iot/azure-iot/Figures/47.png new file mode 100644 
index 0000000000..1ba6ec0b4b Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/47.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/48.png b/content/learning-paths/iot/azure-iot/Figures/48.png new file mode 100644 index 0000000000..4ebfaaba49 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/48.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/49.png b/content/learning-paths/iot/azure-iot/Figures/49.png new file mode 100644 index 0000000000..0527909e9c Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/49.png differ diff --git a/content/learning-paths/iot/azure-iot/Figures/50.png b/content/learning-paths/iot/azure-iot/Figures/50.png new file mode 100644 index 0000000000..67f55020ba Binary files /dev/null and b/content/learning-paths/iot/azure-iot/Figures/50.png differ diff --git a/content/learning-paths/iot/azure-iot/_index.md b/content/learning-paths/iot/azure-iot/_index.md new file mode 100644 index 0000000000..6757d7f5f4 --- /dev/null +++ b/content/learning-paths/iot/azure-iot/_index.md @@ -0,0 +1,65 @@ +--- +title: Creating IoT Solutions in Azure for Arm64-Powered Devices + +minutes_to_complete: 320 + +who_is_this_for: This is an advanced topic for software developers interested in learning how to build a comprehensive IoT solution in Azure that streams, stores, monitors, aggregates, and visualizes data from Arm64-powered IoT devices. + +learning_objectives: + - Set up and configure an Azure IoT Hub. + - Register an IoT device and stream data using the Azure IoT SDK. + - Stream IoT data into Azure services using Azure Stream Analytics. + - Store and persist streamed IoT data in Azure Cosmos DB by configuring a Stream Analytics job. + - Implement data monitoring and alerts by creating an Azure Function that checks sensor data from Cosmos DB and sends notifications when thresholds are exceeded. 
+ - Aggregate sensor readings by developing an Azure Function that calculates average values from data stored in Cosmos DB. + - Publish aggregated IoT data to a public-facing web portal by deploying a Static Web App hosted on Azure Blob Storage. + +prerequisites: + - A machine that can run Python 3 and Visual Studio Code. + - An Azure account and subscription. + - Azure CLI (Command Line Interface). + - Azure IoT SDK for Python. + +author: Dawid Borycki + +### Tags +skilllevels: Advanced +subjects: Internet of Things +armips: + - Cortex-A + - Neoverse +operatingsystems: + - Windows + - Linux + - macOS +tools_software_languages: + - Coding + - VS Code +shared_path: true +shared_between: + - servers-and-cloud-computing + - laptops-and-desktops + - mobile-graphics-and-gaming + +further_reading: + - resource: + title: Official Azure IoT SDK for Python Documentation + link: https://learn.microsoft.com/azure/iot-hub/iot-hub-devguide-sdks + type: documentation + - resource: + title: Azure IoT Hub Python Quickstart + link: https://learn.microsoft.com/azure/iot-hub/quickstart-send-telemetry-python + type: website + - resource: + title: End-to-End IoT Solution Tutorial with Python and Azure + link: https://github.com/Azure-Samples/azure-iot-samples-python + type: website + + + +### FIXED, DO NOT MODIFY +# ================================================================================ +weight: 1 # _index.md always has weight of 1 to order correctly +layout: "learningpathall" # All files under learning paths have this same wrapper +learning_path_main_page: "yes" # This should be surfaced when looking for related content. Only set for _index.md of learning path content.
+--- diff --git a/content/learning-paths/iot/azure-iot/_next-steps.md b/content/learning-paths/iot/azure-iot/_next-steps.md new file mode 100644 index 0000000000..c3db0de5a2 --- /dev/null +++ b/content/learning-paths/iot/azure-iot/_next-steps.md @@ -0,0 +1,8 @@ +--- +# ================================================================================ +# FIXED, DO NOT MODIFY THIS FILE +# ================================================================================ +weight: 21 # Set to always be larger than the content in this path to be at the end of the navigation. +title: "Next Steps" # Always the same, html page title. +layout: "learningpathall" # All files under learning paths have this same wrapper for Hugo processing. +--- diff --git a/content/learning-paths/iot/azure-iot/aggregation.md b/content/learning-paths/iot/azure-iot/aggregation.md new file mode 100644 index 0000000000..e89871f593 --- /dev/null +++ b/content/learning-paths/iot/azure-iot/aggregation.md @@ -0,0 +1,214 @@ +--- +# User change +title: "Sensor Data Aggregation Using Azure Functions" + +weight: 8 + +layout: "learningpathall" +--- + +## Objective +In the previous section, you configured Azure Stream Analytics to securely store incoming IoT telemetry data in Azure Cosmos DB, making sensor data readily available for further processing. In this section, you’ll enhance your IoT solution by implementing real-time data aggregation capabilities using Azure Functions. Azure Functions is a powerful, event-driven, serverless compute service provided by Azure that allows you to execute custom code in response to events, such as HTTP requests or timer schedules, without managing infrastructure. You’ll create an Azure Function that queries recent sensor data from Cosmos DB on demand and computes aggregated metrics, such as average, minimum, and maximum values, enabling you to derive actionable insights and monitor sensor performance more effectively.
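Before wiring this into Azure, it helps to see the aggregation itself in isolation. The following minimal sketch (plain Python, not part of the function you will deploy) computes the same kind of metrics over a batch of readings:

```python
def aggregate(readings):
    """Compute average, minimum, and maximum from a list of numeric sensor readings."""
    if not readings:
        # Mirror the "no data" case: return None instead of raising on an empty batch.
        return {"average": None, "minimum": None, "maximum": None}
    return {
        "average": round(sum(readings) / len(readings), 2),
        "minimum": min(readings),
        "maximum": max(readings),
    }

temps = [21.4, 22.1, 20.9, 23.0]
print(aggregate(temps))  # {'average': 21.85, 'minimum': 20.9, 'maximum': 23.0}
```

The deployed function in this section computes only the average, but extending it to the other metrics follows the same pattern.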
+ +## Data Aggregation +As your IoT solution matures, the volume of sensor data continuously captured and securely stored in Azure Cosmos DB grows rapidly. However, raw telemetry data alone may not effectively communicate actionable insights, especially when quick decision-making and proactive management are required. Transforming this raw sensor data into meaningful, summarized information becomes essential for efficient monitoring, accurate analysis, and rapid response. + +Aggregating sensor readings into various metrics such as average, minimum, and maximum values helps reveal underlying patterns, trends, and anomalies that might otherwise remain hidden. By identifying these trends early, you can proactively manage your devices, maintain optimal operational efficiency, and swiftly detect conditions that require immediate attention, such as overheating or other critical environmental issues. + +In this section, you will leverage Azure Functions to implement data aggregation. This Azure Function will respond to an HTTP trigger and return aggregated sensor data. + +### Azure Function with HTTP Trigger +Building upon the sensor data aggregation strategy, this section demonstrates how to implement a serverless Azure Function using an HTTP trigger to calculate real-time insights from sensor data stored in Azure Cosmos DB. Specifically, you’ll create an HTTP-triggered function that queries temperature readings from the past minute, computes the average temperature, and returns this aggregated value as a JSON response. This HTTP-triggered approach provides an on-demand method to access up-to-date metrics. + +To implement this functionality, open function_app.py and modify it as follows: +1. Add the following import statements: +```python +from azure.cosmos import CosmosClient +import datetime +import json +``` + +2.
Define the following constants: +```python +DATABASE_NAME = "IoTDatabase" +CONTAINER_NAME = "SensorReadings" +CONNECTION_ENV_VAR = "armiotcosmosdb_DOCUMENTDB" +``` + +3. Add the following function: + +```python +@app.function_name(name="GetAverageTemperature") +@app.route(route="averagetemperature", methods=["GET"], auth_level=func.AuthLevel.ANONYMOUS) +def get_average_temperature(req: func.HttpRequest) -> func.HttpResponse: + """ + This HTTP-triggered Function queries Cosmos DB for documents + from the last 1 minute, calculates an average temperature, + and returns it as a JSON response. + """ + logging.info("Received request for average temperature from the last minute.") + + cosmos_conn_str = os.environ.get(CONNECTION_ENV_VAR) + if not cosmos_conn_str: + logging.error(f"Environment variable '{CONNECTION_ENV_VAR}' not set.") + return func.HttpResponse( + "Internal server error: Missing Cosmos DB connection string.", + status_code=500 + ) + + # Initialize the Cosmos DB client and get the database and container clients. 
+ try: + client = CosmosClient.from_connection_string(cosmos_conn_str) + database = client.get_database_client(DATABASE_NAME) + container = database.get_container_client(CONTAINER_NAME) + except Exception as e: + logging.error(f"Error initializing Cosmos DB client: {e}") + return func.HttpResponse( + "Internal server error: Failed to initialize Cosmos DB client.", + status_code=500 + ) + + # Calculate the epoch time (in seconds) for 1 minute ago using timezone-aware datetime + one_minute_ago_epoch = int( + (datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(minutes=1)).timestamp() + ) + + # Query for all documents where _ts >= one_minute_ago_epoch + query = """ + SELECT VALUE c.temperature + FROM c + WHERE c._ts >= @minTs + """ + parameters = [{"name": "@minTs", "value": one_minute_ago_epoch}] + + try: + items = list(container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + except Exception as e: + logging.error(f"Error querying Cosmos DB: {e}") + return func.HttpResponse( + "Internal server error: Query to Cosmos DB failed.", + status_code=500 + ) + + if not items: + response_body = { + "message": "No temperature readings found in the last minute.", + "averageTemperature": None + } + return func.HttpResponse( + json.dumps(response_body), + status_code=200, + mimetype="application/json" + ) + + # Compute the average temperature safely + try: + average_temp = sum(items) / len(items) + except Exception as e: + logging.error(f"Error calculating average temperature: {e}") + return func.HttpResponse( + "Internal server error: Failed to compute average temperature.", + status_code=500 + ) + + response_body = { + "message": "Success", + "averageTemperature": round(average_temp, 2) + } + + return func.HttpResponse( + json.dumps(response_body), + status_code=200, + mimetype="application/json", + headers={"Access-Control-Allow-Origin": "*"} + ) +``` + +The GetAverageTemperature function is triggered by an HTTP 
GET request sent to the route /averagetemperature. Upon invocation, it first logs that a request has been received for calculating the average temperature based on data from the last minute. + +The function then retrieves the Cosmos DB connection string from an environment variable. If the connection string is not available, the function logs an error and returns a 500 Internal Server Error response, indicating that essential configuration details are missing. + +Next, it initializes the Cosmos DB client using the provided connection string. It accesses the appropriate database (IoTDatabase) and container (SensorReadings) to perform the subsequent data retrieval. If there are issues initializing the Cosmos DB client, it logs an error and responds with a 500 Internal Server Error. + +To determine the relevant data points, the function calculates a timestamp representing exactly one minute before the current UTC time. It then constructs and executes a query against Cosmos DB to retrieve temperature values with a timestamp (_ts) greater than or equal to this calculated value, effectively fetching all recent temperature data from the last minute. + +If no recent temperature data is found, the function returns a JSON response stating that no readings are available for that period, along with a 200 OK status. + +When data points are available, the function computes the average temperature from the retrieved readings. In case of unexpected errors during calculation, it logs the issue and responds with a 500 Internal Server Error. Note that, as written, only the final success response carries the CORS header {"Access-Control-Allow-Origin": "*"}, which permits cross-origin requests from the web portal built later in this Learning Path; in a production deployment you would typically attach this header to the error and no-data responses as well, so that a browser-based client can read those responses too.
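The time-window filter in the query relies on Cosmos DB’s built-in _ts property, which records each document’s last-modified time as Unix epoch seconds. The cutoff calculation and the filtering it drives can be sanity-checked in isolation (a standalone sketch, independent of any Cosmos DB connection; the sample documents and values are illustrative):

```python
import datetime

def cutoff_epoch(minutes=1, now=None):
    """Return the Unix epoch (in seconds) for `minutes` ago, using timezone-aware UTC."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    return int((now - datetime.timedelta(minutes=minutes)).timestamp())

# The SQL query keeps documents whose _ts is at or after the cutoff;
# the same filter applied to an in-memory batch:
docs = [
    {"_ts": 1700000000, "temperature": 21.5},  # before the cutoff: excluded
    {"_ts": 1700000050, "temperature": 22.0},  # at or after the cutoff: included
]
min_ts = 1700000030
recent = [d["temperature"] for d in docs if d["_ts"] >= min_ts]
print(recent)  # [22.0]
```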
+ +Finally, if the average calculation succeeds, the function constructs a JSON response containing the calculated average temperature (rounded to two decimal places) along with a success message. It then sends this response back to the caller with a status code of 200 OK and the configured CORS header {"Access-Control-Allow-Origin": "*"}, which is required to ensure that the portal can successfully retrieve and display the data from the function. + +Before running the function, the azure-cosmos dependency needs to be installed. Open the requirements.txt file and include the following line (the datetime and json modules imported earlier are part of the Python standard library and do not need to be listed): + +```text +azure-cosmos +``` + +Save the file and open your command prompt, then execute the following command to install the dependencies: + +```console +pip install -r requirements.txt +``` + +You are now ready to launch the function. Run the following command: + +```console +func start +``` + +Once running, observe the HTTP trigger endpoint, which should appear similar to the following: +![img36 alt-text#center](Figures/36.png) + +Next, start the simulator to stream sensor data and open the HTTP trigger endpoint URL in your web browser. You will see the calculated average temperature displayed: +![img37 alt-text#center](Figures/37.png) + +## Deploy to Azure Function App +Now that your Azure Function is fully tested and ready, it's time to deploy it to Azure, making it accessible online and available for integration with other services and applications. Visual Studio Code provides an easy and efficient way to deploy Azure Functions directly from your local development environment. Follow these steps to deploy your function: +1. In Visual Studio Code, open the Command Palette (Ctrl+Shift+P on Windows/Linux, or Cmd+Shift+P on macOS) and search for "Azure Functions: Deploy to Function App": +![img38 alt-text#center](Figures/38.png) +2.
The deployment wizard will guide you through the following selections: +* Subscription: choose the Azure subscription you wish to use. +* Select a function app: select the Function App that you previously created in Azure (in this example, "IoTTemperatureAlertFunc"). +* Confirm your deployment: +![img39 alt-text#center](Figures/39.png) +3. Wait for the deployment to complete. This process typically takes a few moments. Once deployed, your Azure Function is hosted in Azure and ready for use. +4. Open the Azure Portal, and go to your function app (in this example, "IoTTemperatureAlertFunc"). You will see the deployed functions: +![img40 alt-text#center](Figures/40.png) + +## Configure Function App Settings +We have just deployed the functions to Azure. Previously, when testing the functions locally, we used the local.settings.json file to store the Cosmos DB connection string. However, this local configuration file is not deployed to Azure. Therefore, we need to update the corresponding settings directly within the Azure portal. + +Azure Function App settings, which are also known as application settings or environment variables, are designed to securely store sensitive configuration information, such as database connection strings, API keys, and other confidential details. Storing the Cosmos DB connection string as an app setting in Azure ensures secure management of your database credentials, allowing your function to safely access Cosmos DB without exposing sensitive information within your source code. + +Follow these steps to configure the Cosmos DB connection string: +1. In the Azure Portal, navigate to your Azure Function App. +2. Click Environment variables (under Settings). +3. Click the + Add button. +4. Enter the name you used in your code (e.g., armiotcosmosdb_DOCUMENTDB). +5. Paste the Cosmos DB connection string into the Value field: +![img41 alt-text#center](Figures/41.png) +6. Click Apply to add the setting. +7. Press Apply at the bottom to apply changes.
Then, confirm to save the changes. + +## Testing the Azure Function +Once you've configured the connection string, test your deployed Azure Function as follows: +1. Return to the Overview page of your Azure Function App. +2. Click on your HTTP-triggered function (GetAverageTemperature). +3. Click Get function URL and copy the displayed URL (under default): +![img42 alt-text#center](Figures/42.png) +4. Open this URL in your web browser. +5. Start your IoT simulator to begin streaming telemetry data to Cosmos DB. +6. Refresh or access the function URL again, and you should see the calculated average temperature displayed: +![img43 alt-text#center](Figures/43.png) + +This confirms your Azure Function successfully connects to Cosmos DB, retrieves real-time data, and calculates the average temperature as intended. + +## Summary and next steps +In this section, you created an HTTP-triggered Azure Function that retrieves and aggregates records from Cosmos DB. You then deployed your Azure Function to Azure, configured secure application settings to safely store the Cosmos DB connection string, and verified the functionality. You also learned that the local configuration file (local.settings.json) is not automatically deployed to Azure, making it necessary to manually set up these sensitive settings within the Azure portal. Securely managing these application settings in Azure ensures that your functions can reliably connect to Cosmos DB, facilitating the accurate retrieval and processing of IoT telemetry data. + +In the next step, you’ll create a static website that leverages this HTTP-triggered function to display the average temperature in a web-based portal, thus completing your IoT solution.
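The contract the portal will depend on consists of just the two fields returned by the function. A client only needs to parse them; the sketch below uses a canned payload in the shape GetAverageTemperature returns (the values are illustrative), in place of a live HTTP call:

```python
import json

# Canned example payload, matching the function's response shape (values are illustrative).
payload = '{"message": "Success", "averageTemperature": 22.47}'

data = json.loads(payload)
if data["averageTemperature"] is None:
    # The "no readings in the last minute" case.
    print("No temperature readings available.")
else:
    print(f"Average temperature: {data['averageTemperature']} C")
```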
\ No newline at end of file diff --git a/content/learning-paths/iot/azure-iot/device_registration.md b/content/learning-paths/iot/azure-iot/device_registration.md new file mode 100644 index 0000000000..603a60e3aa --- /dev/null +++ b/content/learning-paths/iot/azure-iot/device_registration.md @@ -0,0 +1,250 @@ +--- +# User change +title: "Device registration" + +weight: 4 + +layout: "learningpathall" +--- + +## Intro +In this section, you’ll learn how to build a comprehensive IoT simulation using Azure IoT Hub and Python. You’ll create a reusable SensorReading class designed to simulate realistic sensor readings, including temperature, pressure, humidity, and timestamps. Additionally, you’ll implement a telemetry simulator that periodically sends these sensor readings to Azure IoT Hub, enabling you to observe real-time data streaming and cloud integration. + +Finally, you’ll configure your Python application to connect securely to Azure IoT Hub, allowing you to monitor and validate the continuous data flow. By the end of this section, you’ll have hands-on experience simulating IoT telemetry, providing a solid foundation for developing more advanced data analytics and visualization solutions in Azure. + +## Azure IoT device SDK +Begin by installing the Azure IoT Device SDK for Python, which provides essential tools and libraries needed to develop IoT applications that communicate seamlessly with Azure IoT Hub. This SDK enables secure device connectivity, message transmission, and management functionalities directly from Python code. + +You can install the SDK easily using Python’s package manager pip. 
Open a terminal or command prompt and run the following command: +```console +pip install azure-iot-device +``` +The output should look similar to the following: +```output +Collecting azure-iot-device + Downloading azure_iot_device-2.14.0-py3-none-any.whl.metadata (15 kB) +Requirement already satisfied: urllib3<3.0.0,>=2.2.2 in /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages (from azure-iot-device) (2.2.3) +Collecting deprecation<3.0.0,>=2.1.0 (from azure-iot-device) + Downloading deprecation-2.1.0-py2.py3-none-any.whl.metadata (4.6 kB) +Collecting paho-mqtt<2.0.0,>=1.6.1 (from azure-iot-device) + Downloading paho-mqtt-1.6.1.tar.gz (99 kB) + Preparing metadata (setup.py) ... done +Requirement already satisfied: requests<3.0.0,>=2.32.3 in /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages (from azure-iot-device) (2.32.3) +Collecting requests-unixsocket2>=0.4.1 (from azure-iot-device) + Downloading requests_unixsocket2-0.4.2-py3-none-any.whl.metadata (3.9 kB) +Collecting janus (from azure-iot-device) + Downloading janus-2.0.0-py3-none-any.whl.metadata (5.3 kB) +Collecting PySocks (from azure-iot-device) + Downloading PySocks-1.7.1-py3-none-any.whl.metadata (13 kB) +Requirement already satisfied: typing-extensions in /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages (from azure-iot-device) (4.12.2) +Requirement already satisfied: packaging in /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages (from deprecation<3.0.0,>=2.1.0->azure-iot-device) (24.1) +Requirement already satisfied: charset-normalizer<4,>=2 in /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages (from requests<3.0.0,>=2.32.3->azure-iot-device) (3.3.2) +Requirement already satisfied: idna<4,>=2.5 in /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages (from requests<3.0.0,>=2.32.3->azure-iot-device) (3.10) +Requirement already
satisfied: certifi>=2017.4.17 in /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages (from requests<3.0.0,>=2.32.3->azure-iot-device) (2024.8.30) +Downloading azure_iot_device-2.14.0-py3-none-any.whl (168 kB) +Downloading deprecation-2.1.0-py2.py3-none-any.whl (11 kB) +Downloading requests_unixsocket2-0.4.2-py3-none-any.whl (7.8 kB) +Downloading janus-2.0.0-py3-none-any.whl (12 kB) +Downloading PySocks-1.7.1-py3-none-any.whl (16 kB) +Building wheels for collected packages: paho-mqtt + Building wheel for paho-mqtt (setup.py) ... done + Created wheel for paho-mqtt: filename=paho_mqtt-1.6.1-py3-none-any.whl size=62116 sha256=2ef0547e1a8e9d70c8e2e10bf98593bdeed291e1ceb03ab19a3f47189da31a6c + Stored in directory: /Users/db/Library/Caches/pip/wheels/8b/bb/0c/79444d1dee20324d442856979b5b519b48828b0bd3d05df84a +Successfully built paho-mqtt +Installing collected packages: paho-mqtt, PySocks, janus, deprecation, requests-unixsocket2, azure-iot-device +Successfully installed PySocks-1.7.1 azure-iot-device-2.14.0 deprecation-2.1.0 janus-2.0.0 paho-mqtt-1.6.1 requests-unixsocket2-0.4.2 +``` + +## Creating a Python IoT Simulator Application +In this section, you’ll create a Python application that simulates realistic sensor data generated by an Arm64-powered IoT device and streams this data securely to Azure IoT Hub. You’ll define a reusable and structured SensorReading class, capable of generating randomized yet realistic sensor measurements, including temperature, pressure, humidity, and timestamps. + +Following this, you’ll implement an asynchronous telemetry simulator method, which continuously generates sensor readings at predefined intervals and transmits them to Azure IoT Hub. 
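Before creating the simulator, you can optionally confirm the SDK is importable from Python. This quick check uses only the standard library; the helper name `sdk_installed` is purely illustrative:

```python
import importlib.util

def sdk_installed(package: str = "azure.iot.device") -> bool:
    """Check whether a package can be imported, without fully importing it."""
    try:
        return importlib.util.find_spec(package) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. "azure") is not installed at all.
        return False

if __name__ == "__main__":
    print("azure-iot-device available:", sdk_installed())
```

If this prints `False`, re-run the `pip install` command above in the same Python environment you plan to run the simulator from.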
+ +To achieve the above, create the iot_simulator.py file and modify it as follows: + +```python +import asyncio +import json +import random +import logging +from datetime import datetime, timezone +from azure.iot.device.aio import IoTHubDeviceClient + +# Configure logging +logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') + +class SensorReading: + """ + Represents a sensor reading from an IoT device. + """ + def __init__(self, device_id, temperature, pressure, humidity, timestamp): + self.device_id = device_id + self.temperature = temperature + self.pressure = pressure + self.humidity = humidity + self.timestamp = timestamp + + def to_json(self): + """ + Serialize the sensor reading to a JSON string. + """ + return json.dumps({ + "deviceId": self.device_id, + "temperature": self.temperature, + "pressure": self.pressure, + "humidity": self.humidity, + "timestamp": self.timestamp.isoformat() + }) + + @staticmethod + def generate_random(device_id): + """ + Generate a sensor reading with random values for temperature, pressure, and humidity. 
+ """ + return SensorReading( + device_id=device_id, + temperature=round(random.uniform(20.0, 30.0), 2), + pressure=round(random.uniform(990.0, 1020.0), 2), + humidity=round(random.uniform(30.0, 80.0), 2), + timestamp=datetime.now(timezone.utc) + ) + +async def send_telemetry(connection_string, device_id, interval_seconds=1): + device_client = IoTHubDeviceClient.create_from_connection_string(connection_string) + + try: + await device_client.connect() + logging.info("Connected to Azure IoT Hub.") + + while True: + reading = SensorReading.generate_random(device_id) + message = reading.to_json() + await device_client.send_message(message) + logging.info("Telemetry sent: %s", message) + await asyncio.sleep(interval_seconds) + + except KeyboardInterrupt: + logging.info("Telemetry sending stopped by user.") + except Exception as e: + logging.error("An error occurred: %s", e) + finally: + await device_client.disconnect() + logging.info("Disconnected from Azure IoT Hub.") + +def main(): + # Replace with your actual device connection string from Azure IoT Hub. + CONNECTION_STRING = "" + DEVICE_ID = "arm64Device01" + INTERVAL_SECONDS = 1 + + try: + asyncio.run(send_telemetry(CONNECTION_STRING, DEVICE_ID, INTERVAL_SECONDS)) + except Exception as e: + logging.error("Error running the telemetry sender: %s", e) + +if __name__ == "__main__": + main() +``` + +The SensorReading class is designed to encapsulate and manage data from an IoT sensor. It models a single sensor reading by holding key attributes such as the device’s unique identifier, the measured temperature, pressure, and humidity, along with a timestamp that records exactly when the reading was taken. This structured representation allows you to easily handle and process sensor data within your application. + +In addition to storing these values, the class provides a to_json method, which converts the sensor reading into a JSON-formatted string. 
This makes it particularly useful for transmitting data over the network, as JSON is a widely accepted format in IoT communications and web services. + +Furthermore, the class includes a static method called generate_random. This method is a utility that simulates sensor data by generating random, yet realistic, values for temperature, pressure, and humidity. It also automatically sets the current UTC time (with proper timezone awareness) as the timestamp for the reading. This feature is especially useful when you need to simulate sensor output for testing or demonstration purposes, allowing you to mimic the behavior of a real IoT device without requiring actual sensor hardware. + +Next comes the send_telemetry function, an asynchronous function designed to connect an IoT device to Azure IoT Hub and continuously transmit telemetry data at specified intervals. When invoked, it begins by creating a device client instance using the provided connection string, which contains the necessary credentials to authenticate with the IoT Hub. Once the connection is established, send_telemetry logs a confirmation message indicating a successful connection. + +Inside an infinite loop, the function repeatedly generates a new sensor reading by calling the generate_random method of the SensorReading class, which simulates realistic sensor data for temperature, pressure, and humidity along with a current timestamp. This sensor data is then converted to a JSON string using the to_json method, making it suitable for transmission. The JSON message is sent to the Azure IoT Hub using the device client, and a log entry records each transmission for monitoring purposes. + +The function includes error handling to gracefully manage interruptions. For instance, if the user stops the process (via a keyboard interrupt), it logs that the telemetry sending has been halted. Additionally, any unexpected errors are caught and logged.
Finally, regardless of how the loop is exited, the function ensures that the device client is properly disconnected from the IoT Hub, logging this disconnection to maintain clear operational records. + +The main function serves as the entry point of the application, where essential configuration values for connecting to Azure IoT Hub are defined. Here, it sets the CONNECTION_STRING—which you need to replace with your actual device connection string from Azure IoT Hub—along with a unique DEVICE_ID and an INTERVAL_SECONDS value that determines how frequently the telemetry data is sent (in this case, every second). + +Within the main function, the asynchronous send_telemetry function is executed using asyncio.run(), which manages the event loop and ensures that the asynchronous operations run correctly. This function call initiates the process of connecting to the IoT Hub, generating sensor readings, and transmitting telemetry data at regular intervals. The entire operation is wrapped in a try-except block to catch and log any errors that might occur during execution. + +Finally, the conditional check if __name__ == "__main__": ensures that the main function is called only when the script is executed directly, rather than when it is imported as a module in another script. This structure provides a clear and organized starting point for the application, making it easier to understand and maintain. + +## Connecting IoT Device to Azure IoT Hub +To connect the Python application you developed earlier to Azure IoT Hub, follow these detailed steps. + +1. Register a Device on Azure IoT Hub: +* Open the Azure Portal, go to your IoT Hub, and click Devices under Device management: +![img6 alt-text#center](Figures/06.png) +* Click “Add Device”, enter a device ID (e.g., arm64Device01), and leave the authentication type as “Symmetric key.” +![img7 alt-text#center](Figures/07.png) +* Click “Save”. + +2. 
Next, you’ll need to retrieve the connection string to integrate your Python application with Azure IoT Hub: +* From the device list, select your newly created device (arm64Device01). +* Copy the Primary Connection String from the device details page. You’ll need this connection string to authenticate your Python application when connecting and streaming telemetry data. +![img8 alt-text#center](Figures/08.png) + +Ensure this connection string is stored securely, as it provides authentication credentials for your device. In the next step, you’ll integrate this connection string into your Python simulator app, enabling secure communication and real-time data streaming to Azure IoT Hub. + +## Streaming Telemetry Data to Azure IoT Hub +Now you’re ready to stream telemetry data from your Python application directly to Azure IoT Hub. Follow these steps to configure and run the application: +1. Open your Python IoT simulator script (iot_simulator.py). +2. Update the connection string. Replace the placeholder with your actual Azure IoT device primary connection string, which you retrieved from the Azure portal. A device connection string has the form HostName=<hub-name>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>, for example: +```python +CONNECTION_STRING = "HostName=iot-hub-arm64.azure-devices.net;DeviceId=arm64Device01;SharedAccessKey=<your-device-key>" +``` +Ensure this connection string exactly matches the string copied from Azure IoT Hub to establish a secure connection. + +3. Run your Python application. Open a terminal or command prompt, navigate to the script location, and execute: +``` +python3 iot_simulator.py +``` + +Upon successful execution, you should see output similar to the following, confirming a stable connection and periodic telemetry data transmission to Azure IoT Hub: + +```output +2025-03-16 19:39:12,944 - INFO - Creating client for connecting using MQTT over TCP +2025-03-16 19:39:12,949 - INFO - Connecting to Hub...
+2025-03-16 19:39:12,949 - INFO - Connect using port 8883 (TCP) +2025-03-16 19:39:13,617 - INFO - connected with result code: 0 +2025-03-16 19:39:13,617 - INFO - _on_mqtt_connected called +2025-03-16 19:39:13,618 - INFO - Connection State - Connected +2025-03-16 19:39:13,618 - INFO - Successfully connected to Hub +2025-03-16 19:39:13,618 - INFO - Connected to Azure IoT Hub. +2025-03-16 19:39:13,619 - INFO - Sending message to Hub... +2025-03-16 19:39:13,620 - INFO - publishing on devices/arm64Device01/messages/events/ +2025-03-16 19:39:13,922 - INFO - payload published for 1 +2025-03-16 19:39:13,922 - INFO - Successfully sent message to Hub +2025-03-16 19:39:13,922 - INFO - Telemetry sent: {"deviceId": "arm64Device01", "temperature": 27.96, "pressure": 1013.59, "humidity": 41.94, "timestamp": "2025-03-16T18:39:13.619102+00:00"} +2025-03-16 19:39:14,924 - INFO - Sending message to Hub... +2025-03-16 19:39:14,925 - INFO - publishing on devices/arm64Device01/messages/events/ +2025-03-16 19:39:15,165 - INFO - payload published for 2 +2025-03-16 19:39:15,166 - INFO - Successfully sent message to Hub +2025-03-16 19:39:15,167 - INFO - Telemetry sent: {"deviceId": "arm64Device01", "temperature": 25.28, "pressure": 1006.62, "humidity": 79.14, "timestamp": "2025-03-16T18:39:14.924209+00:00"} +2025-03-16 19:39:16,168 - INFO - Sending message to Hub... +2025-03-16 19:39:16,170 - INFO - publishing on devices/arm64Device01/messages/events/ +2025-03-16 19:39:16,401 - INFO - payload published for 3 +2025-03-16 19:39:16,402 - INFO - Successfully sent message to Hub +2025-03-16 19:39:16,402 - INFO - Telemetry sent: {"deviceId": "arm64Device01", "temperature": 28.87, "pressure": 994.86, "humidity": 50.39, "timestamp": "2025-03-16T18:39:16.168566+00:00"} +2025-03-16 19:39:17,404 - INFO - Sending message to Hub... 
+2025-03-16 19:39:17,405 - INFO - publishing on devices/arm64Device01/messages/events/ +2025-03-16 19:39:17,634 - INFO - payload published for 4 +2025-03-16 19:39:17,635 - INFO - Successfully sent message to Hub +2025-03-16 19:39:17,635 - INFO - Telemetry sent: {"deviceId": "arm64Device01", "temperature": 24.44, "pressure": 1015.0, "humidity": 70.05, "timestamp": "2025-03-16T18:39:17.404173+00:00"} +2025-03-16 19:39:18,636 - INFO - Sending message to Hub... +2025-03-16 19:39:18,637 - INFO - publishing on devices/arm64Device01/messages/events/ +2025-03-16 19:39:18,873 - INFO - payload published for 5 +2025-03-16 19:39:18,874 - INFO - Successfully sent message to Hub +2025-03-16 19:39:18,874 - INFO - Telemetry sent: {"deviceId": "arm64Device01", "temperature": 26.26, "pressure": 1002.21, "humidity": 43.71, "timestamp": "2025-03-16T18:39:18.636419+00:00"} +2025-03-16 19:39:19,875 - INFO - Sending message to Hub... +2025-03-16 19:39:19,875 - INFO - publishing on devices/arm64Device01/messages/events/ +2025-03-16 19:39:20,108 - INFO - payload published for 6 +2025-03-16 19:39:20,109 - INFO - Successfully sent message to Hub +2025-03-16 19:39:20,109 - INFO - Telemetry sent: {"deviceId": "arm64Device01", "temperature": 25.71, "pressure": 996.22, "humidity": 54.84, "timestamp": "2025-03-16T18:39:19.874801+00:00"} +``` + +Each telemetry message contains randomized sensor data (temperature, pressure, humidity), device ID, and a timestamp, providing realistic simulated data for IoT applications. + +To stop streaming telemetry data, press Ctrl+C in the terminal. The application will gracefully disconnect from Azure IoT Hub. 
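Each telemetry payload shown in the log is plain JSON, so you can sanity-check its shape offline before building downstream processing on top of it. A minimal validator sketch using only the standard library (the `REQUIRED_FIELDS` set mirrors the `to_json` output; the function name is illustrative):

```python
import json
from datetime import datetime

REQUIRED_FIELDS = {"deviceId", "temperature", "pressure", "humidity", "timestamp"}

def validate_telemetry(payload: str) -> dict:
    """Parse a telemetry JSON string and verify it has the expected fields."""
    data = json.loads(payload)
    missing = REQUIRED_FIELDS - data.keys()
    if missing:
        raise ValueError(f"telemetry is missing fields: {sorted(missing)}")
    # isoformat() output from the simulator parses back with fromisoformat().
    datetime.fromisoformat(data["timestamp"])
    return data

sample = ('{"deviceId": "arm64Device01", "temperature": 27.96, '
          '"pressure": 1013.59, "humidity": 41.94, '
          '"timestamp": "2025-03-16T18:39:13.619102+00:00"}')
print(validate_telemetry(sample)["deviceId"])  # prints: arm64Device01
```

A check like this is handy when you later consume the same documents from Stream Analytics or Cosmos DB and want to fail fast on malformed messages.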
+ +This step completes the telemetry-streaming component of your Azure IoT application, laying the groundwork for subsequent steps like data processing, monitoring, and visualization + +## Summary +In this part, you’ve successfully created and configured a Python IoT simulator application designed specifically for streaming realistic sensor data from Arm64-powered IoT devices. You’ve implemented a robust and reusable SensorReading class, generating randomized values for key sensor metrics—temperature, pressure, humidity—and timestamping these readings accurately. Additionally, you connected the Python application securely to your newly created Azure IoT Hub using the Azure IoT device SDK, establishing real-time telemetry data streaming capabilities. + +With this setup in place, your simulated IoT device continuously transmits data to Azure IoT Hub, providing a solid foundation to explore more advanced IoT scenarios, such as real-time data analytics, cloud storage, monitoring, alerts, and data visualization within the Azure ecosystem. \ No newline at end of file diff --git a/content/learning-paths/iot/azure-iot/image.png b/content/learning-paths/iot/azure-iot/image.png new file mode 100644 index 0000000000..dfd41ec262 Binary files /dev/null and b/content/learning-paths/iot/azure-iot/image.png differ diff --git a/content/learning-paths/iot/azure-iot/intro.md b/content/learning-paths/iot/azure-iot/intro.md new file mode 100644 index 0000000000..57e15490ae --- /dev/null +++ b/content/learning-paths/iot/azure-iot/intro.md @@ -0,0 +1,27 @@ +--- +# User change +title: "Azure IoT" + +weight: 2 + +layout: "learningpathall" +--- + +## Introduction to Internet of Things +The Internet of Things (IoT) is a technological landscape where physical devices, vehicles, buildings, and everyday objects become interconnected, enabling them to communicate, exchange data, and operate collaboratively without direct human intervention. 
IoT has immense potential across various industries, including manufacturing, healthcare, agriculture, logistics, and smart homes, where it enhances operational efficiency, productivity, safety, and convenience. By collecting and analyzing real-time data from interconnected sensors and devices, IoT solutions help businesses make informed decisions, predict maintenance needs, optimize resource usage, and create innovative user experiences. + +## IoT and Arm Devices +In the context of IoT applications, Arm64-powered devices play a crucial role due to their strong balance of performance and energy efficiency. Arm64 (also called AArch64) is the 64-bit execution state of the Armv8-A architecture, widely adopted in mobile devices, embedded systems, edge computing devices, and single-board computers such as Raspberry Pi 4, NVIDIA Jetson, and similar hardware platforms. These devices are characterized by their low power consumption, compact size, and cost-effectiveness, making them ideally suited for battery-operated scenarios, remote monitoring systems, edge analytics, and IoT deployments in environments where power and computational efficiency are critical. Leveraging Arm64-based IoT solutions enables developers and organizations to build intelligent, scalable, and energy-efficient systems, thereby significantly reducing operational costs while maximizing performance. + +## Azure IoT +Azure IoT is a cloud platform provided by Microsoft, designed to build, deploy, and manage scalable Internet of Things (IoT) solutions across various industries. It offers a suite of managed services and tools that facilitate secure device connectivity, data ingestion, real-time analytics, data storage, monitoring, and visualization.
By leveraging Azure IoT, organizations can seamlessly integrate diverse IoT devices, sensors, and applications into robust cloud-based solutions, making it ideal for scenarios ranging from predictive maintenance and smart manufacturing to remote asset monitoring and smart cities. + +At the heart of Azure IoT is the Azure IoT Hub, a fully managed, secure communication gateway enabling reliable and bi-directional communication between millions of IoT devices and the cloud. IoT Hub simplifies device management through secure provisioning, authentication, and connectivity. Complementary services like Azure IoT Central provide ready-to-use IoT solutions with minimal coding, allowing rapid prototyping and deployment of IoT applications, especially suitable for businesses looking to accelerate time-to-value. + +Azure IoT’s powerful analytics capabilities are delivered through services such as Azure Stream Analytics and integration with Azure Cosmos DB. These tools enable real-time processing, storage, and analysis of high-velocity IoT data streams, facilitating timely decision-making and proactive monitoring. Additionally, serverless offerings such as Azure Functions further enhance flexibility, allowing businesses to build event-driven applications that react instantly to IoT events and sensor readings. + +Overall, Azure IoT offers an extensive, secure, and highly scalable environment, empowering organizations to transform data from connected devices into actionable insights, operational efficiencies, and innovative solutions, all while simplifying the complexities inherent in building and managing IoT infrastructures. + +In this learning path, you’ll learn how to effectively leverage the Azure IoT ecosystem by building a complete, end-to-end IoT solution tailored specifically for Arm64-powered devices using Python. We’ll start by setting up and configuring an Azure IoT Hub, the central component that facilitates secure communication and device management. 
Next, we’ll register our Arm64 IoT device and use the Azure IoT Python SDK to stream real-time sensor data to the cloud. + +Once data streaming is established, you’ll explore real-time analytics capabilities with Azure Stream Analytics, enabling immediate processing and transformation of incoming telemetry. We’ll store this streaming IoT data securely and efficiently in Azure Cosmos DB, configuring Stream Analytics to ensure seamless data persistence. To enhance our solution’s robustness, you’ll implement a serverless data monitoring and alerting system using Azure Functions, automatically notifying users when sensor data exceeds predefined thresholds. Additionally, you’ll learn how to aggregate sensor readings by creating an Azure Function that calculates critical statistics like averages, minimums, and maximums. Finally, we’ll visualize and share our aggregated IoT data by publishing it to a publicly accessible web portal, built as a static web application hosted on Azure Blob Storage. \ No newline at end of file diff --git a/content/learning-paths/iot/azure-iot/iot-hub.md b/content/learning-paths/iot/azure-iot/iot-hub.md new file mode 100644 index 0000000000..f215e28989 --- /dev/null +++ b/content/learning-paths/iot/azure-iot/iot-hub.md @@ -0,0 +1,62 @@ +--- +# User change +title: "IoT Hub" + +weight: 3 + +layout: "learningpathall" +--- + +## Azure IoT Hub +Azure IoT Hub is a fully managed cloud service provided by Microsoft Azure, designed as a secure, scalable communication gateway for connecting IoT devices to cloud-hosted applications and analytics systems. It acts as the core element of Azure-based IoT solutions, facilitating reliable two-way communication between millions of IoT devices and the cloud. IoT Hub supports bi-directional messaging, enabling not only device-to-cloud telemetry data transfer but also cloud-to-device commands, configuration updates, and remote device management. 
+ +A key advantage of Azure IoT Hub is its built-in device provisioning, authentication, and management capabilities, which allow you to securely onboard, register, and manage IoT devices at scale. It supports multiple communication protocols, including MQTT, AMQP, and HTTPS, making it versatile and compatible with a broad range of devices. IoT Hub integrates seamlessly with other Azure services, such as Azure Stream Analytics, Azure Cosmos DB, Azure Functions, and Azure Blob Storage, facilitating the development of sophisticated IoT solutions with minimal complexity. + +Additionally, Azure IoT Hub provides monitoring and diagnostics capabilities, making it easier to identify connectivity issues, analyze device performance, and maintain operational efficiency. Its built-in security features, such as per-device authentication and secure device identity management, ensure that sensitive data remains protected throughout its lifecycle. + +In the following sections of this tutorial, you’ll learn how to create and configure an Azure IoT Hub, register an Arm64-based IoT device, and utilize Python to stream sensor data securely and efficiently into Azure. + +## Create Azure IoT Hub +Start by creating an Azure IoT Hub +1. Sign in to the Azure Portal: +* Open your web browser and go to portal.azure.com. +* Sign in using your Azure account credentials. + +2. Create a new Azure IoT Hub resource +* On the Azure Portal home page, select “Create a resource” at the top left as shown below +![img1 alt-text#center](Figures/01.png) + +* In the Search services and marketplace box, type “IoT Hub” and press Enter. +* Click on IoT Hub from the search results: +![img2 alt-text#center](Figures/02.png) + +3. Click the “Create” button: +![img3 alt-text#center](Figures/03.png) + +4. Configure Basic IoT Hub Settings +* Subscription: Select your Azure subscription. +* Resource group: Choose an existing resource group or click “Create new” to create one (e.g., rg-arm). 
+* IoT Hub Name: Enter a unique name for your IoT Hub (must be globally unique, e.g., iot-hub-arm-64). +* Region: Select a region closest to your location or users. +* Tier: Free. This will update the daily message limit accordingly: +![img4 alt-text#center](Figures/04.png) + +5. Click “Next: Networking”. +6. Configure Networking: +* Keep the default setting (Public access) unless specific network restrictions apply. +* Select 1.0 for the minimum TLS version. +* Click “Next: Management”. +7. Management Settings (Optional) +* Under Management, you can keep default settings. +* Click “Next: Add-ons". +8. Add-ons - keep default settings. Then, click “Next: Tags”. +9. Add tags as needed and then click "Next: Review + Create". +10. Wait for the configuration to be validated, and click Create. +11. Verify IoT Hub Deployment: +* Once deployed, you’ll see a message stating “Your deployment is complete”. +* Click “Go to resource” to open the newly created Azure IoT Hub. +12. Check IoT Hub Overview and Details. From the IoT Hub overview page, verify important details such as the hub name, region, status, and hostname, which you’ll use to connect devices: +![img5 alt-text#center](Figures/05.png) + +## Next steps +Now that your Azure IoT Hub is ready, you can proceed to register and configure your IoT devices. In the next step, you’ll learn how to register an Arm64-based IoT device and start streaming data using Python and Azure IoT SDK. 
\ No newline at end of file diff --git a/content/learning-paths/iot/azure-iot/monitoring.md b/content/learning-paths/iot/azure-iot/monitoring.md new file mode 100644 index 0000000000..13c71fa98e --- /dev/null +++ b/content/learning-paths/iot/azure-iot/monitoring.md @@ -0,0 +1,299 @@ +--- +# User change +title: "Set Up Data Monitoring and Alerts with Azure Functions" + +weight: 7 + +layout: "learningpathall" +--- + +## Objective +In the previous section, you successfully configured Azure Stream Analytics to store incoming IoT telemetry data securely in Azure Cosmos DB. The stored sensor data is now readily accessible for further analysis, monitoring, and action. In this section, you’ll enhance your IoT solution by implementing real-time data monitoring and alerting capabilities using Azure Functions. + +Azure Functions is a powerful, event-driven, serverless compute service provided by Azure, enabling you to execute custom code in response to specific events or triggers without the need to manage infrastructure. You’ll create an Azure Function that regularly queries temperature data from Cosmos DB, evaluates sensor readings against predefined thresholds, and sends notifications when critical values are exceeded—such as detecting overheating or environmental anomalies. By adding this functionality, you’ll build proactive monitoring into your IoT pipeline, ensuring timely responses to sensor data events and improving overall operational reliability. + +## Azure Functions +Azure Functions is a serverless computing platform provided by Microsoft Azure, designed to enable developers to run event-driven code without having to provision or manage infrastructure. With Azure Functions, you can easily create small, focused applications or services that automatically respond to events, such as database updates, HTTP requests, IoT sensor data events, or scheduled tasks. 
Because Azure Functions is serverless, it automatically scales based on workload, providing elasticity, rapid deployment, and simplified maintenance—developers only pay for resources actually consumed during execution. + +In IoT scenarios, Azure Functions are particularly valuable for responding to real-time data events, such as sensor readings exceeding specific thresholds. You can integrate Azure Functions seamlessly with services like Azure Cosmos DB, Azure IoT Hub, or Azure Notification Hubs, enabling functions to trigger automatically when new data is received or when certain conditions are met. This flexibility allows you to build responsive, cost-effective, and efficient IoT applications that require minimal setup yet offer highly scalable, real-time processing capabilities. + +### Event-driven Architecture +Azure Functions are inherently event-driven, meaning your code is automatically executed in direct response to specific events or triggers, without manual intervention. Each Azure Function remains dormant and consumes no resources until activated by a defined event, at which point the function is instantly executed. Common triggers include events like new data being written to Azure Cosmos DB, telemetry messages arriving in Azure IoT Hub, incoming HTTP requests, scheduled timers, or even queue-based messages. + +This event-driven design ensures real-time responsiveness, making Azure Functions especially well-suited for IoT scenarios, where timely reactions to incoming sensor data or critical events are crucial. For instance, an Azure Function can immediately activate upon detecting new sensor data in Cosmos DB, evaluate the data (such as checking whether temperature thresholds are exceeded), and promptly send alerts or trigger follow-up actions. 
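The evaluation step itself is ordinary application logic, independent of any Azure API, so it can be written and tested in isolation. A sketch of the kind of threshold check such a function might perform (the threshold value and all names here are illustrative, not part of the Azure Functions API):

```python
from typing import Optional

TEMPERATURE_THRESHOLD_C = 28.0  # illustrative alert threshold

def check_reading(reading: dict) -> Optional[str]:
    """Return an alert message if the temperature exceeds the threshold, else None."""
    temperature = reading.get("temperature")
    if temperature is not None and temperature > TEMPERATURE_THRESHOLD_C:
        device = reading.get("deviceId", "unknown")
        return f"ALERT: device {device} reported {temperature} C (limit {TEMPERATURE_THRESHOLD_C} C)"
    return None

print(check_reading({"deviceId": "arm64Device01", "temperature": 29.5}))
print(check_reading({"deviceId": "arm64Device01", "temperature": 24.1}))
```

Keeping the check as a pure function like this makes it easy to unit-test locally before wiring it into a trigger.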
+ +### Serverless and Scalability +Azure Functions is built on a serverless computing model, meaning you can execute custom code in response to specific events without having to provision or maintain any underlying server infrastructure. This approach enables developers to focus purely on application logic rather than spending time on managing servers, operating systems, or runtime environments. When an event—such as an HTTP request, database update, or new IoT sensor reading—occurs, Azure Functions automatically triggers your custom code, scales the necessary resources dynamically, and executes the function. + +In the context of IoT solutions, the serverless model offered by Azure Functions is especially valuable because it can efficiently handle unpredictable workloads, scaling instantly as data volume fluctuates. Functions can scale out horizontally to accommodate spikes in IoT data without manual intervention, providing real-time responsiveness and reliability. This automatic scaling, coupled with a consumption-based billing model (paying only for resources actually consumed), makes Azure Functions an optimal choice for cost-effective, efficient, and responsive IoT monitoring, alerting, and analytics applications. + +### Triggers and Events +In Azure Functions, the concepts of triggers and events are central to how functions are executed. A trigger defines how and when an Azure Function is executed. Triggers respond to specific events, such as data arrival, HTTP requests, scheduled timers, or changes to database content. When the defined event occurs, the trigger initiates the execution of your custom code automatically, without manual intervention. + +Examples of triggers include: +* HTTP Trigger - executes functions via HTTP requests. +* Timer Trigger - executes code at scheduled intervals. +* Cosmos DB Trigger - runs whenever new data is added or updated in Cosmos DB. 
+* IoT Hub/Event Hub Triggers - respond immediately to events like incoming IoT device messages. + +In IoT scenarios, triggers tied to Cosmos DB are particularly powerful. For example, an Azure Function can automatically activate when new sensor readings are stored in Cosmos DB, allowing you to implement real-time monitoring, send immediate notifications, or perform analytics. Each function’s event-driven execution ensures your application remains highly responsive, efficient, and scalable—crucial for maintaining performance and cost-effectiveness in IoT solutions. + +### Azure Functions Bindings +In addition to triggers, Azure Functions provide another powerful feature called bindings. Bindings allow you to effortlessly connect your functions to other Azure services or external resources, simplifying both input and output integration. Using bindings, you can directly access data from services like Azure Cosmos DB, Azure Blob Storage, or Azure Queue Storage, without the need to write custom integration code or manage connection logic manually. + +Bindings greatly accelerate development, as you can easily read from or write to external services declaratively—just by defining simple configurations. For instance, when working with IoT solutions, you can configure a Cosmos DB input binding to automatically retrieve sensor data as documents, or set up an output binding to seamlessly persist aggregated data or alerts back to Cosmos DB or Azure Storage. This eliminates repetitive integration code, reduces complexity, and significantly enhances productivity. + +Overall, Azure Function bindings simplify and speed up your development workflow, allowing you to focus entirely on your application logic rather than managing integration details. + +### Deployment Options +When working with Azure Functions, there are two primary deployment approaches: using the Azure Portal directly or developing and deploying locally from your development environment.
+ +Azure Portal provides a user-friendly interface to quickly create, configure, and manage Azure Functions directly from your web browser. It’s particularly suitable for simple scenarios or quick prototypes, as it requires minimal setup and no installation of additional tools. You can easily define triggers, bindings, and environment variables, and monitor function execution directly through the portal interface. + +However, there’s an important limitation for Python-based Azure Functions. If you choose Python with a Linux-based hosting plan (required for Python functions), the Azure Portal does not support in-portal code editing. In other words, while you can manage and monitor your Python functions in the portal, you can’t directly edit or modify the function’s Python source code there. + +To overcome this limitation, local development is highly recommended for Python-based Azure Functions. Local development involves developing and testing your Azure Functions on your own computer using the Azure Functions Core Tools and an IDE like Visual Studio Code. After development and local testing, you deploy your function to Azure using command-line tools (func CLI), IDE integrations, or continuous integration solutions such as GitHub Actions. + +For Python functions on Linux-based plans, local development and deployment represent the best-practice approach, enabling you to efficiently create, debug, test, and manage more sophisticated IoT solutions. Therefore, in this section we will use local development. + +## Create an Azure Function App +We will start by creating an Azure Function App, in which we will create an Azure Function that reads temperature data from Cosmos DB. In the next step, we will add the capability to send notifications whenever the temperature reading exceeds a predefined threshold. Proceed as follows: +1. Sign in to the Azure Portal. +2.
Click “Create a resource”, type “Function App”, and select it: +![img24 alt-text#center](Figures/24.png) +3. Click Create, then select Consumption as a hosting option: +![img25 alt-text#center](Figures/25.png) +4. Provide the required details: +* Subscription: Your Azure subscription. +* Resource Group: Select your existing IoT resource group. +* Function App Name: Provide a unique name (e.g., IoTTemperatureAlertFunc). +* Runtime Stack: Select Python. +* Version: Select 3.11. +* Region: Select the same region as your Cosmos DB and IoT Hub. +* Operating System: Select Linux (Windows is not available for Python). +5. Click Review + Create, and then Create. + +![img26 alt-text#center](Figures/26.png) + +## Install Prerequisites +Before writing the code, make sure you have the following tools installed: +1. Python (≥3.8 recommended) ([download](https://www.python.org/downloads/)) +2. Azure Functions Core Tools ([installation guide](https://learn.microsoft.com/en-gb/azure/azure-functions/functions-run-local?tabs=macos%2Cisolated-process%2Cnode-v4%2Cpython-v2%2Chttp-trigger%2Ccontainer-apps&pivots=programming-language-python)) +3. Azure CLI ([installation guide](https://learn.microsoft.com/en-gb/cli/azure/install-azure-cli)) +4. Visual Studio Code. + +Ensure Azure Functions Core Tools are properly installed by running: +```console +func --version +``` + +The output should be similar to: +```output +4.0.6821 +``` + +Ensure the version is 4.x, which indicates compatibility with the Python v2 programming model. + +## Create Azure Function to Read Cosmos DB Data +Follow these steps to create an Azure Function locally using Visual Studio Code: +1. In Visual Studio Code, click View->Command Palette... +2. Type "Create Function": +![img27 alt-text#center](Figures/27.png) +3. Select Azure Functions: Create Function... +4. Select a folder for your new function. For example, create a new folder named Arm.AzureIoT.AzureFunctions. +5.
Visual Studio Code will display a wizard that enables you to configure your function: +![img28 alt-text#center](Figures/28.png) +6. Use this wizard to configure the function: +* Select a language: pick Python +* Select a Python programming model: Model V2 (Recommended) +* Select a Python interpreter to create a virtual environment: select python3 +* Select a template for your project's first function: CosmosDB trigger +* Name of the function you want to create: keep default (cosmosdb_trigger) +* Name of the container to be monitored: SensorReadings (or the one you created during Azure Cosmos DB provisioning) +* Name of the Cosmos DB database that includes the container to be monitored: IoTDatabase (or the one you created during Azure Cosmos DB provisioning) +* Select the app setting with your Cosmos DB account connection string from "local.settings.json": Select + Create new local app setting +* Select your Azure subscription, and then select a database account (armiotcosmosdb or the one you used during Azure Cosmos DB provisioning) +* Select how you would like to open your project: Open in current window + +Visual Studio Code will create the following files: +* function_app.py - the primary function code file. In the Azure Functions Python V2 programming model, bindings and triggers are defined using Python decorators directly within this file. +* local.settings.json - this file is specifically used for local development, storing connection strings, app settings, and environment variables. It’s not deployed to Azure, so it’s safe to include sensitive data (like connection strings) locally for testing purposes. +* host.json - defines global configuration settings that affect the behavior of your entire Azure Functions application. Examples include function timeout limits, logging levels, and concurrency settings. +* requirements.txt - this file lists all Python packages and dependencies required by your function app.
Azure Functions uses this file to automatically install the necessary Python packages when deploying the function to Azure. + +### Modify Function Code +You will now modify the function code to check whether the temperature of the new sensor readings is above a threshold. To do so, open function_app.py and modify it as follows: + +```Python +import azure.functions as func +import logging + +app = func.FunctionApp() + +# Temperature (in °C) above which a warning is logged +TEMPERATURE_THRESHOLD = 28.0 + +# Runs whenever documents are inserted or updated in the monitored container +@app.cosmos_db_trigger(arg_name="azcosmosdb", container_name="SensorReadings", + database_name="IoTDatabase", connection="armiotcosmosdb_DOCUMENTDB") +def cosmosdb_trigger(azcosmosdb: func.DocumentList): + logging.info('Python CosmosDB triggered.') + if azcosmosdb: + for doc in azcosmosdb: + # Extract the telemetry fields stored by Stream Analytics + device_id = doc.get("deviceId") + temperature = float(doc.get("temperature", 0)) + timestamp = doc.get("timestamp") + + if temperature > TEMPERATURE_THRESHOLD: + logging.warning( + f"High temperature detected! Device: {device_id}, " + f"Temperature: {temperature}°C at {timestamp}" + ) + else: + logging.info( + f"Temperature within limits. Device: {device_id}, " + f"Temperature: {temperature}°C at {timestamp}" + ) +``` + +This Azure Function is designed to automatically process and monitor sensor data stored in Azure Cosmos DB. It begins by importing the required Azure Functions libraries and Python’s built-in logging module. Then, it initializes a function app object, which serves as a container for defining and managing Azure Functions. + +Next, the code defines a global temperature threshold (TEMPERATURE_THRESHOLD) of 28.0°C, which is used to determine whether incoming sensor readings require attention. + +The core of the function is triggered automatically by Azure Cosmos DB whenever new sensor readings are added or updated in the specified database (IoTDatabase) and container (SensorReadings). When triggered, the function receives these new documents (items) as a list of sensor readings, represented by azcosmosdb.
For each document in this list, the function extracts deviceId, temperature, and timestamp. + +After extracting these values, the function compares each sensor’s temperature reading against the predefined threshold. If the temperature exceeds the threshold, a warning is logged. If the temperature remains within safe limits, an informational log message is recorded instead, confirming normal operating conditions for the device. + +### Run and Test Your Function Locally +To run your function locally, proceed as follows: +1. In the terminal, navigate to the folder where you saved the function (here, Arm.AzureIoT.AzureFunctions). +2. Type the following command: + +```console +func start +``` + +3. Run iot_simulator.py in a separate terminal window. + +You should then see the following logs, depending on the generated temperature values: + +![img29 alt-text#center](Figures/29.png) + +## Monitoring and Notifications +In this section, you'll extend your existing Azure Function to send email notifications using SendGrid whenever the temperature exceeds the defined threshold. + +### Create a SendGrid Account in Azure +Follow these steps to create a SendGrid account: +1. Sign in to the Azure Portal. +2. Click “Create a resource” and search for SendGrid. +![img30 alt-text#center](Figures/30.png) +3. Select Twilio SendGrid, choose the Free 100 (2022) plan, and then click Subscribe. +4. Provide the following details: +* Subscription: Select your Azure subscription. +* Resource group: Choose your existing IoT project resource group. +* Name: Enter a descriptive name (e.g., iot-alerts-sendgrid). +![img31 alt-text#center](Figures/31.png) +5. Click Review + subscribe and then Subscribe. +6. On the next screen, click Configure account now: +![img32 alt-text#center](Figures/32.png) +7. Accept any permissions required by SendGrid, and then enter your details to create a sender identity: +![img33 alt-text#center](Figures/33.png) +8.
Fill out the required details, such as your name and email address. +9. After the sender identity is verified, click API Keys in the left menu: +![img34 alt-text#center](Figures/34.png) +10. Click Create API Key. In the popup window, enter a key name (e.g., iot-api-key), and select Full Access. +![img35 alt-text#center](Figures/35.png) +11. Copy the generated API key securely. You will not be able to retrieve it later. + +### Configure SendGrid API Key in your Azure Function +Update your local.settings.json file to include the SendGrid API key: +```json +{ + "IsEncrypted": false, + "Values": { + "AzureWebJobsStorage": "", + "FUNCTIONS_WORKER_RUNTIME": "python", + "armiotcosmosdb_DOCUMENTDB": "<____>", + "SENDGRID_API_KEY": "" + } +} +``` + +Replace the empty SENDGRID_API_KEY value with the actual key obtained earlier. + +### Install SendGrid Python Library +Update your project's requirements.txt to include the SendGrid Python SDK: +```txt +azure-functions +sendgrid +``` + +Then, install these dependencies locally: +```console +pip install -r requirements.txt +``` + +### Extend Your Azure Function to Send Email Notifications +Modify your existing function_app.py file as follows: +```python +import azure.functions as func +import logging +import os +from sendgrid import SendGridAPIClient +from sendgrid.helpers.mail import Mail + +app = func.FunctionApp() + +TEMPERATURE_THRESHOLD = 28.0 + +def send_email_alert(device_id, temperature, timestamp): + message = Mail( + from_email='dawid@borycki.com.pl', # Use your actual email + to_emails='dawid@borycki.com.pl', # Use your actual email + subject=f'⚠️ High Temperature Alert: Device {device_id}', + html_content=f""" + <strong>Temperature Alert!</strong><br>
+ Device ID: {device_id}<br>
+ Temperature: {temperature}°C<br>
+ Timestamp: {timestamp} + """ + ) + + try: + sg = SendGridAPIClient(os.environ["SENDGRID_API_KEY"]) + response = sg.send(message) + logging.info(f"SendGrid email sent, status: {response.status_code}") + except Exception as e: + logging.error(f"SendGrid email failed: {e}") + +@app.cosmos_db_trigger(arg_name="azcosmosdb", container_name="SensorReadings", + database_name="IoTDatabase", connection="armiotcosmosdb_DOCUMENTDB") +def cosmosdb_trigger(azcosmosdb: func.DocumentList): + logging.info('Python CosmosDB triggered.') + if azcosmosdb: + for doc in azcosmosdb: + device_id = doc.get("deviceId") + temperature = float(doc.get("temperature", 0)) + timestamp = doc.get("timestamp") + + if temperature > TEMPERATURE_THRESHOLD: + logging.warning( + f"High temperature detected! Device: {device_id}, " + f"Temperature: {temperature}°C at {timestamp}" + ) + send_email_alert(device_id, temperature, timestamp) + else: + logging.info( + f"Temperature within limits. Device: {device_id}, " + f"Temperature: {temperature}°C at {timestamp}" + ) +``` + +The send_email_alert function is responsible for sending an email notification through SendGrid whenever a sensor reading exceeds the specified temperature threshold. It constructs an email message using details about the IoT device, including the device_id, current temperature, and the event timestamp. The function utilizes SendGrid’s Python SDK (SendGridAPIClient) to send the email message. If the email is successfully sent, it logs a confirmation with the status code. If the email fails, it captures and logs the error details, ensuring that any issues with email delivery can be easily identified and resolved. 
This function enables proactive monitoring by immediately alerting the user when potentially critical temperature conditions are detected, significantly enhancing the reliability and responsiveness of the IoT system. + +Now, start your function: +```console +func start +``` + +Then run iot_simulator.py to send telemetry data, and wait for alert notifications. + +## Summary and Next Steps +You have now successfully configured real-time monitoring with email notifications. You can proceed to enhance your IoT solution by aggregating data and creating dashboards or visualizations. \ No newline at end of file diff --git a/content/learning-paths/iot/azure-iot/portal.md b/content/learning-paths/iot/azure-iot/portal.md new file mode 100644 index 0000000000..9db8dbcc38 --- /dev/null +++ b/content/learning-paths/iot/azure-iot/portal.md @@ -0,0 +1,183 @@ +--- +# User change +title: "IoT Portal" + +weight: 9 + +layout: "learningpathall" +--- + +## Objective +We have successfully established the core backend components for our IoT solution. An IoT simulator continuously generates sensor data, streaming it securely to the cloud via Azure IoT Hub. These sensor readings are stored in Cosmos DB, ensuring data persistence and scalability. Additionally, we’ve implemented an Azure Function that can be triggered through HTTP requests to query Cosmos DB and calculate the average temperature from recent sensor data. With these underlying services fully operational, we’re now prepared to build a web portal that will visually present real-time temperature information to our end-users. + +## Website +Start by creating a new folder named Arm.AzureIoT.Portal, inside which you’ll create three files: index.html, main.js, and styles.css. The first file, index.html, will define the structure of the webpage and contain the HTML markup and links to the JavaScript and CSS files. The second file, main.js, will include the logic and interactivity of the webpage.
In this project, it will handle fetching temperature data from your Azure Function and updating the displayed content dynamically. The last file, styles.css, will contain all the styling information, controlling the visual appearance of your webpage. + +### styles.css +Modify your styles.css file by adding the following CSS: +```css +body, html { + margin: 0; + padding: 0; + font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; + background-color: #121212; + color: #ffffff; + height: 100%; + display: flex; + justify-content: center; + align-items: center; + } + .container { + text-align: center; + padding: 2rem; + border-radius: 8px; + box-shadow: 0 4px 15px rgba(0, 0, 0, 0.5); + background-color: #1e1e1e; + } + h1 { + margin-bottom: 1.5rem; + font-size: 2.5rem; + } + button { + background-color: #1e88e5; + color: #ffffff; + border: none; + padding: 0.75rem 1.5rem; + font-size: 1rem; + border-radius: 4px; + cursor: pointer; + transition: background-color 0.3s ease; + } + button:hover { + background-color: #1565c0; + } + .result { + margin-top: 1.5rem; + font-size: 1.25rem; + } +``` + +The provided CSS sets a modern, dark-themed appearance for your IoT portal webpage. Here’s a breakdown of its styling: +* body and html - these styles remove default margins and paddings, define a dark background color (#121212), set the text color to white for high contrast, and center content both horizontally and vertically using Flexbox. +* .container - this creates a central container element with padding for spacing, rounded corners (border-radius: 8px) for a softer look, a subtle shadow effect for depth, and a slightly lighter dark background (#1e1e1e) to distinguish the content area from the main page background. +* h1 - this defines the main title style with increased font size (2.5rem) and additional spacing below to clearly separate the title from other content.
+* button - styles the interactive “Get Temperature” button, giving it a blue color (#1e88e5), white text for readability, rounded corners for a friendly appearance, and smooth color-transition effects when hovered to improve user experience. +* .result - formats the text area where the temperature reading will appear, adding sufficient margin for clear spacing and slightly larger text size to make the results easily readable. + +### main.js +Now, open your main.js file and update it with the following JavaScript code: +```JavaScript +const functionUrl = ""; + +document.getElementById("getTempBtn").addEventListener("click", async () => { + const resultElement = document.getElementById("result"); + resultElement.textContent = "Fetching temperature..."; + try { + const response = await fetch(functionUrl); + if (!response.ok) { + throw new Error("Network response was not ok"); + } + const data = await response.json(); + if (data && data.averageTemperature !== null) { + resultElement.textContent = "Temperature: " + data.averageTemperature + " °C"; + } else { + resultElement.textContent = "No temperature data available."; + } + } catch (error) { + console.error("Error fetching temperature:", error); + resultElement.textContent = "Error fetching temperature."; + } +}); +``` + +This JavaScript provides the interactive functionality for the webpage. It connects the portal to the Azure Function previously deployed. Here’s how it works step-by-step. First, replace the placeholder "" with the actual URL of your Azure Function that calculates and returns the average temperature. The code uses an event listener for the button. Specifically, it attaches a click event listener to your button (getTempBtn). Each time the button is clicked, it triggers the async JavaScript function that retrieves data. + +When the button is clicked, the label (element with id "result") displays a temporary message, “Fetching temperature...”, to inform the user that the request is in progress.
The script sends a GET request to your Azure Function URL. If the request succeeds, it parses the JSON response. If the response contains valid temperature data (averageTemperature), it updates the label to show the current temperature. If no data is returned, it notifies the user accordingly. + +If any error occurs (e.g., network issues, or a problem in fetching or parsing the data), the script logs the error to the browser console and updates the UI to inform the user (“Error fetching temperature.”). + +### index.html +Finally, open the index.html file and replace its content with the following HTML code: +```HTML +<!DOCTYPE html> +<html lang="en"> +<head> + <meta charset="UTF-8"> + <meta name="viewport" content="width=device-width, initial-scale=1.0"> + <title>IoT Solution</title> + <link rel="stylesheet" href="styles.css"> +</head> +<body> + <div class="container"> + <h1>IoT Solution</h1> + <button id="getTempBtn">Get Temperature</button> + <div class="result" id="result">Temperature: -- °C</div> + </div> + <script src="main.js"></script> +</body> +</html> +``` + +This HTML file represents the main structure and entry point of your IoT web portal. It is divided into head and body sections. The head section defines basic metadata such as the character set (UTF-8) and viewport configuration for responsive design. Then, it sets the title of your webpage to "IoT Solution" and links your CSS stylesheet (styles.css), which defines the appearance of the page. + +In the body section we have: +* a centered container (div) with a clear heading (h1) labeled "IoT Solution". +* a button (id="getTempBtn") that users click to trigger the JavaScript logic retrieving temperature data from your Azure Function. +* a placeholder label (div) with the id "result" initially showing "Temperature: -- °C". The JavaScript updates this label dynamically with the actual temperature retrieved from your backend. + +Finally, the index.html includes the JavaScript file (main.js) placed at the end of the body to ensure the HTML elements are fully loaded before executing scripts. + +## Testing the Implementation +Make sure you’ve saved all three files (index.html, main.js, and styles.css). Next: +1. Start the IoT Simulator to begin streaming data to the Azure IoT Hub. +2. Open the index.html file locally in your web browser. +3. Click the "Get Temperature" button. + +You should now see real-time temperature readings displayed: + +![img44 alt-text#center](Figures/44.png) + +## Deployment to Azure Blob Storage +You will now deploy the web portal you’ve created to Azure Blob Storage, making it accessible online. + +### Create and Configure Azure Blob Storage +1. Sign in to the Azure Portal. +2. Create a Storage Account: +* Click “Create a resource” +* Search for “Storage account” +![img45 alt-text#center](Figures/45.png) +* Click “Create”. +![img46 alt-text#center](Figures/46.png) +3. Provide required details: +* Subscription, resource group, storage account name (e.g. armiotstorage).
+* For Primary service, choose Azure Blob Storage or Azure Data Lake Storage Gen 2. +* Select Standard performance and Locally-redundant storage (LRS). +![img47 alt-text#center](Figures/47.png) +* Click "Review + create", then "Create". +4. Enable Static Website Hosting: +* Navigate to your newly created storage account. +* Under Data management, click “Static website”. +* Select “Enabled”. +* Set index.html as the index document name. +![img48 alt-text#center](Figures/48.png) +* Click Save. + +After saving, Azure provides you with a URL like: https://.z22.web.core.windows.net/. +Make sure to save this URL, as it will serve as the public endpoint for your website. + +### Upload Files to Azure Blob Storage +You can upload your website files directly using the Azure Portal or via Azure Storage Explorer. Here, we’ll use the Azure Portal: +1. Navigate to your storage account. +2. Under Data storage, select “Containers”. +3. Open the container named “$web” (created automatically when enabling static websites). +4. Click Upload, select your three website files (index.html, main.js, and styles.css), and upload them. + +![img49 alt-text#center](Figures/49.png) + +### Verify the Deployment +After uploading your files, open a browser and navigate to https://.z22.web.core.windows.net/. Your static website should load, allowing you to test the “Get Temperature” button (to see temperatures, make sure to start the IoT simulator): + +![img50 alt-text#center](Figures/50.png) + +## Summary +In this learning path, we successfully built a complete, end-to-end prototype of an IoT solution. Our journey began with a simulator streaming realistic telemetry data to Azure through IoT Hub. We leveraged Azure Stream Analytics to process and route this streaming data directly into Cosmos DB, providing scalable and reliable storage.
Additionally, we developed two Azure Functions: the first continuously monitors incoming temperature readings, sending email notifications whenever the temperature exceeds a predefined threshold, ensuring proactive alerts. The second Azure Function aggregates recent temperature data from the last minute and provides this information via an HTTP endpoint. Finally, we utilized this aggregation function within our user-friendly web portal, enabling real-time visualization of temperature data, thus completing our robust and interactive IoT solution. diff --git a/content/learning-paths/iot/azure-iot/stream-analytics-dynamo-db.md b/content/learning-paths/iot/azure-iot/stream-analytics-dynamo-db.md new file mode 100644 index 0000000000..6672e3d3b0 --- /dev/null +++ b/content/learning-paths/iot/azure-iot/stream-analytics-dynamo-db.md @@ -0,0 +1,150 @@ +--- +# User change +title: "Store Data in Azure Cosmos DB with Azure Stream Analytics" + +weight: 6 + +layout: "learningpathall" +--- + +## Objective +In the previous section, you successfully set up an Azure Stream Analytics job and configured Azure IoT Hub as an input source. You implemented a simple query to stream real-time sensor data directly from IoT Hub, establishing a seamless flow of telemetry data into Azure. Now, you’ll take the next step to persist this streaming data in Azure Cosmos DB. + +## Azure Cosmos DB +Azure Cosmos DB is a fully managed, globally distributed NoSQL database service designed for scalability, reliability, and high availability. Its flexible data schema allows for easy storage of diverse data types from multiple IoT devices without requiring rigid schema definitions. This schema flexibility is especially valuable in IoT scenarios, where sensors and devices may send varied or evolving data structures. + +Consider a scenario where your existing IoT devices stream telemetry data (temperature, pressure, humidity) to Azure Cosmos DB via Azure Stream Analytics. 
Now imagine that you need to integrate a new sensor type—such as an air-quality sensor that provides an additional measurement (e.g., “AirQualityIndex”). + +With Azure Cosmos DB’s NoSQL architecture, you don’t have to explicitly modify or migrate database schemas when introducing new data fields. The new sensor data can simply be included in your Stream Analytics query, and Cosmos DB will automatically store the additional field alongside existing data entries without any extra setup. + +### Partitioning +Azure Cosmos DB uses partitioning for efficiently managing large-scale data and high-throughput operations. Partitioning distributes data across multiple servers (physical partitions), allowing Cosmos DB to scale seamlessly as data volume and query demands grow. + +Cosmos DB uses a partition key—a field chosen by the user—to determine how data is distributed across partitions. The partition key should ideally have a high cardinality (many distinct values) and should evenly distribute read and write workloads. Each unique value of the partition key corresponds to a logical grouping called a logical partition. Documents with the same partition key value reside within the same logical partition. + +When you select a suitable partition key, Cosmos DB ensures that operations (reads and writes) targeting a specific logical partition perform efficiently because queries can quickly locate and retrieve data without scanning the entire dataset. If your data and workload scale significantly, Cosmos DB transparently manages splitting and distributing data across additional physical partitions automatically. Therefore, selecting an effective partition key (such as deviceId for IoT scenarios) can greatly optimize performance, scalability, and cost efficiency. 
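The routing idea behind partition keys can be made concrete with a small sketch. This is plain Python with no Cosmos DB connection; the hash function and partition count are invented for illustration, since Cosmos DB's internal partition-hashing scheme is not exposed:

```python
import hashlib

# Illustrative only: we model partition routing with 4 simulated
# physical partitions and an MD5 hash of the partition key value.
NUM_PHYSICAL_PARTITIONS = 4

def route(partition_key_value: str) -> int:
    """Map a partition key value (e.g., a deviceId) to a simulated partition."""
    digest = hashlib.md5(partition_key_value.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_PHYSICAL_PARTITIONS

readings = [
    {"deviceId": "arm64Device01", "temperature": 25.6},
    {"deviceId": "arm64Device01", "temperature": 26.1},
    {"deviceId": "arm64Device02", "temperature": 24.9},
]

# Group documents by the partition their key value hashes to.
placement = {}
for doc in readings:
    placement.setdefault(route(doc["deviceId"]), []).append(doc)

# Every reading with the same deviceId lands in the same partition,
# so a query scoped to one device touches only one partition.
for partition, docs in sorted(placement.items()):
    print(f"partition {partition}: {[d['deviceId'] for d in docs]}")
```

Because all readings for arm64Device01 hash to the same partition, a device-scoped query never needs to fan out across partitions, which is exactly why deviceId works well as a partition key.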
+ +For IoT data specifically, choosing a device ID or similar attribute as your partition key ensures efficient data retrieval, balanced storage, and evenly distributed workload across the infrastructure, resulting in faster queries and reliable performance at any scale. + +### Scaling +In Azure Cosmos DB, scaling relies on partitions, allowing the database to handle increasing amounts of data and throughput demands smoothly. Cosmos DB partitions your data automatically into smaller manageable segments known as logical partitions based on the partition key you specify. Each logical partition can store up to 20 GB of data and has a throughput limit (typically around 10,000 RU/s per partition). + +When data volume or throughput requirements grow beyond the capacity of a single logical partition, Cosmos DB transparently distributes these logical partitions across multiple physical partitions (servers). As data volume and workload increase, Cosmos DB dynamically creates additional physical partitions, automatically redistributing your logical partitions across them. This horizontal scaling ensures that read and write operations remain fast and efficient, even as the database size and traffic significantly grow. + +Efficient scaling is directly linked to choosing a suitable partition key. Selecting a good partition key ensures that your data and workload are evenly balanced across physical partitions. Good partitioning prevents hotspots—scenarios where a single partition disproportionately handles more workload than others—which can lead to performance bottlenecks. Thus, careful selection of partition keys (such as device ID in IoT scenarios) allows Cosmos DB to scale smoothly and maintain high performance and reliability, regardless of how much your data or traffic grows. + +### Data Operations +In Azure Cosmos DB, all data operations—such as retrieval, update, insert, and delete—consume units of measure called Request Units (RUs). 
Request Units represent the resources required to perform operations such as reads, writes, queries, and updates. Each operation consumes a certain number of RUs, and Cosmos DB uses this concept to provide predictable and consistent performance. + +When you perform a data operation in Cosmos DB, the request is routed to the appropriate partition based on the partition key provided. If you’re retrieving or updating a single document by specifying its unique id and partition key value, Cosmos DB efficiently locates the data within a single logical partition, resulting in minimal RU consumption. However, cross-partition queries—queries spanning multiple logical partitions—consume more RUs because Cosmos DB must query multiple partitions simultaneously. + +As your application workload increases, Cosmos DB manages scalability through partitions. Increasing throughput (i.e., RU/s) allocates more resources to your container, automatically distributing the load across existing or newly created physical partitions. Conversely, if your workload decreases, you can scale down to reduce costs. Since Cosmos DB automatically handles partition management behind the scenes, data remains available and responsive with minimal intervention. + +In IoT scenarios, choosing an optimal partition key (e.g., device ID) helps evenly distribute load across partitions, ensuring consistent and reliable performance as your IoT application scales up and down. + +### Importance in IoT Solutions +Azure Cosmos DB plays an important role in IoT solutions due to its ability to handle vast volumes of diverse, rapidly streaming sensor data. Its NoSQL architecture allows IoT applications to adapt to schema changes, accommodating new sensor fields or evolving data structures without complex migrations. Moreover, Cosmos DB’s automatic partitioning and elastic scaling enable efficient handling of data at global scale, ensuring low latency, high availability, and predictable performance. 
With built-in partitioning, Cosmos DB seamlessly manages massive data growth, enabling IoT solutions to scale smoothly and maintain consistent performance. These capabilities make Azure Cosmos DB an essential component for building production-grade IoT solutions. + +## Configure Stream Analytics to write data into Azure Cosmos DB +You will now configure the Stream Analytics job so that the telemetry data is automatically written to a container in Azure Cosmos DB. + +### Cosmos DB account and database +Start by creating the Cosmos DB account and database: +1. Log in to the Azure Portal. +2. Select “Create a resource”, search for “Azure Cosmos DB”, and click Create: +![img14 alt-text#center](Figures/14.png) +![img15 alt-text#center](Figures/15.png) + +3. Select Azure Cosmos DB for NoSQL, then click Create. +![img16 alt-text#center](Figures/16.png) + +4. Fill in the required details: +* Subscription: Select your subscription. +* Resource Group: Use your existing IoT resource group or create a new one. +* Account Name: Provide a unique name (e.g., armiotcosmosdb). +* Availability Zones: Disable. +* Region: Choose the same region as your IoT Hub and Stream Analytics job. +* Capacity mode: Select Serverless. +* Apply Free Tier Discount: Apply. +* Check Limit total account throughput. +![img17 alt-text#center](Figures/17.png) +5. Click Review + create, then click Create. + +Once the deployment completes: +* Navigate to your Cosmos DB account and select “Data Explorer”. +* Click New Container, create a database (e.g., named IoTDatabase), and create a container named SensorReadings. +* Select an appropriate partition key (recommended: /deviceId). +* Analytical store capability (enables near real-time analytics on your operational data without impacting the performance of transactional workloads): select Off. +* Click OK at the bottom.
![img18 alt-text#center](Figures/18.png)

### Modify the Stream Analytics Job
Now update your Stream Analytics job to write data from IoT Hub directly into Cosmos DB:
1. Go to IoTStreamAnalyticsJob.
2. Under Job topology, select Outputs.
3. Click Add output, and select Cosmos DB:
![img19 alt-text#center](Figures/19.png)
4. In the Cosmos DB pane, type CosmosDBOutput as the output alias, leave the other fields at their default values, and click Save:
![img20 alt-text#center](Figures/20.png)

### Update Your Stream Analytics Query
Now that you have the output configured, modify the query. To do so, select Query under Job topology, then update your existing query to explicitly specify your Cosmos DB output alias:

```SQL
SELECT
    deviceId,
    temperature,
    pressure,
    humidity,
    timestamp
INTO
    CosmosDBOutput
FROM
    IoTHubInput
```

![img21 alt-text#center](Figures/21.png)

Afterwards, click Start job, and then Start:

![img22 alt-text#center](Figures/22.png)

## Verify data flow in Cosmos DB
To verify that your data pipeline is working correctly, first start your Python IoT simulator application (iot_simulator.py) and ensure it's actively sending telemetry data. Next, open the Azure Portal and navigate to your Azure Cosmos DB resource. Under Data Explorer, select your database and then your container (e.g., SensorReadings). Once selected, click Items to view your stored data. Sensor readings streamed from your IoT device will appear on the right-hand side of the Data Explorer interface, similar to the screenshot below:

![img23 alt-text#center](Figures/23.png)

Azure Cosmos DB stores data as JSON documents within a NoSQL (document-based) structure, making it ideal for flexible and dynamic data such as IoT telemetry. Each record (also called a document) is stored in a container (or collection) that doesn't enforce a rigid schema.
As a result, each document can contain different fields without requiring schema changes or migrations, which is particularly valuable when collecting data from diverse IoT devices with evolving attributes.

Consider this sample item containing a sensor reading:

```JSON
{
  "deviceId": "arm64Device01",
  "temperature": 25.63,
  "pressure": 1017.45,
  "humidity": 79.64,
  "timestamp": "2025-03-18T09:30:15.040263+00:00",
  "id": "71cb92d4-63f3-44c2-880d-7d9bf0746661",
  "_rid": "vAxGAPBgq2sBAAAAAAAAAA==",
  "_self": "dbs/vAxGAA==/colls/vAxGAPBgq2s=/docs/vAxGAPBgq2sBAAAAAAAAAA==/",
  "_etag": "\"4000555d-0000-4d00-0000-67d93d470000\"",
  "_attachments": "attachments/",
  "_ts": 1742290247
}
```

Our defined fields (deviceId, temperature, pressure, humidity, and timestamp) contain the actual telemetry data from your IoT device. In addition, every document has a unique id field, either supplied by you or autogenerated by Cosmos DB, that identifies the document within its container. Azure Cosmos DB then automatically adds metadata fields to each stored document:
* _rid - a resource identifier used internally by Cosmos DB.
* _self - a unique self-link URI providing the path to access the document via the Cosmos DB API.
* _etag - a version tag used for optimistic concurrency control.
* _attachments - the path to the document's attachments resource.
* _ts - an epoch timestamp indicating when the document was last modified.

When storing data, Cosmos DB uses the provided partition key (such as deviceId) to evenly distribute documents across logical partitions. This optimizes retrieval speed and scalability, especially in IoT scenarios where queries often target specific devices, and it helps ensure efficient data management and high performance even as your data volume and throughput requirements increase.
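To make the distinction between your data and the system-managed metadata concrete, the following Python sketch parses a trimmed version of the sample document above and separates the telemetry fields from the fields Cosmos DB manages itself, which are all prefixed with an underscore:

```python
import json

# A trimmed copy of the sample Cosmos DB item shown above.
raw = """{
  "deviceId": "arm64Device01",
  "temperature": 25.63,
  "pressure": 1017.45,
  "humidity": 79.64,
  "timestamp": "2025-03-18T09:30:15.040263+00:00",
  "id": "71cb92d4-63f3-44c2-880d-7d9bf0746661",
  "_rid": "vAxGAPBgq2sBAAAAAAAAAA==",
  "_etag": "\\"4000555d-0000-4d00-0000-67d93d470000\\"",
  "_ts": 1742290247
}"""

document = json.loads(raw)

# System-managed fields start with an underscore; everything else
# (including "id") belongs to your own data model.
telemetry = {k: v for k, v in document.items() if not k.startswith("_")}
metadata = {k: v for k, v in document.items() if k.startswith("_")}

print(sorted(telemetry))  # ['deviceId', 'humidity', 'id', 'pressure', 'temperature', 'timestamp']
print(sorted(metadata))   # ['_etag', '_rid', '_ts']
```

A filter like this is handy when post-processing exported documents, because the underscore-prefixed fields are regenerated by Cosmos DB and should not be copied between accounts.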
## Summary and Next Steps
In this section, you configured your Azure Stream Analytics job to persist real-time sensor data from Azure IoT Hub into Azure Cosmos DB. You defined and executed a query that captures and stores streaming telemetry in a flexible, schema-free JSON document structure, taking advantage of Cosmos DB's powerful and dynamic data-storage capabilities. By completing these steps, you've built a data pipeline that streams sensor readings from your simulated IoT devices directly into a scalable, highly performant NoSQL database. You're now prepared to extend your solution further, enabling data analytics, alerting, monitoring, and visualization.

Now that you've configured Azure Stream Analytics to store telemetry data in Azure Cosmos DB, the next step is to implement data monitoring and alerting. You'll use Azure Functions, a serverless compute service, to automatically monitor stored sensor data. Specifically, you'll create an Azure Function that periodically reads temperature values from Cosmos DB, evaluates them against defined thresholds, and sends notifications whenever these thresholds are exceeded.

diff --git a/content/learning-paths/iot/azure-iot/stream-analytics.md b/content/learning-paths/iot/azure-iot/stream-analytics.md
new file mode 100644
index 0000000000..b453c8fc2a
--- /dev/null
+++ b/content/learning-paths/iot/azure-iot/stream-analytics.md
@@ -0,0 +1,92 @@

---
# User change
title: "Azure Stream Analytics"

weight: 5

layout: "learningpathall"
---

## Objective
In the previous step, you successfully established a Python application to stream real-time telemetry data from an Arm64-powered IoT device directly to Azure IoT Hub. Now, you'll leverage Azure Stream Analytics, a powerful real-time analytics and complex-event processing engine, to process and route the streaming sensor data.
Stream Analytics allows you to easily analyze incoming data streams, run queries in real time, and seamlessly integrate the processed data into other Azure services. With Stream Analytics, you'll define custom queries to view or transform sensor readings, such as temperature, pressure, humidity, and timestamps, and efficiently direct this information to storage, analytics services, or visualization tools. In this section, you'll set up Stream Analytics to ingest telemetry data from IoT Hub and run continuous queries to process this data.

## Azure Stream Analytics
Azure Stream Analytics is a real-time analytics and event-processing service provided by Microsoft Azure, specifically designed to handle large volumes of streaming data from IoT devices, applications, sensors, and other real-time sources. It enables developers and data engineers to create sophisticated analytics pipelines without the complexity of managing infrastructure. By processing data on the fly, Azure Stream Analytics helps users extract immediate insights, detect patterns, trigger alerts, and feed processed information into other Azure services like Azure Cosmos DB, Azure Functions, or Power BI dashboards.

With Azure Stream Analytics, you can write queries using a simple SQL-like syntax, making it straightforward to filter, aggregate, and transform streaming data in real time. The service provides built-in scalability, fault tolerance, and low latency, ensuring that critical insights are available immediately as data flows into the system. Stream Analytics supports integration with multiple data inputs (such as IoT Hubs and Event Hubs) and outputs, enabling seamless creation of comprehensive, end-to-end IoT data pipelines that can quickly adapt to evolving business needs and handle complex scenarios involving massive data volumes.

Azure Stream Analytics organizes real-time data processing through four main architectural components: Jobs, Inputs, Queries, and Outputs.
A Job in Azure Stream Analytics serves as a logical container that encapsulates all aspects of your stream-processing workflow. Each job manages streaming data from start to finish and can be independently started, stopped, or scaled as needed. Within a job, Inputs define the sources of streaming data, typically coming from services like Azure IoT Hub, Event Hubs, or Blob Storage. Queries, written in a familiar SQL-like syntax, specify how the incoming data should be filtered, aggregated, or transformed in real time. Finally, the processed data flows into Outputs, which can include Azure Cosmos DB, Blob Storage, Azure SQL Database, Azure Functions, or Power BI, making it readily available for further analysis, monitoring, or visualization.

## Create a Stream Analytics Job
To process and analyze the telemetry data we're streaming to Azure IoT Hub, we'll first create an Azure Stream Analytics job. Follow these steps to set it up:
1. Sign in to the Azure Portal.
2. Click "Create a resource", type "Stream Analytics job" into the search box, and press Enter.
3. From the search results, select Stream Analytics job, then click Create:
![img9 alt-text#center](Figures/09.png)
4. Provide the necessary information:
* Subscription: Choose the Azure subscription you want to use for this job.
* Resource group: Select the resource group you previously created (e.g., your IoT project's resource group).
* Name: Provide a meaningful, unique name for your Stream Analytics job (e.g., IoTStreamAnalyticsJob).
* Region: Choose the same Azure region as your IoT Hub for optimal performance and minimal latency.
* Hosting environment: Select Cloud for Azure-managed infrastructure.
* Streaming units: Set this to 1 (appropriate for initial testing and smaller workloads; you can scale up later).

![img10 alt-text#center](Figures/10.png)

5. After reviewing your settings carefully, click Review + create, confirm that all details are correct, and then click Create to deploy your Azure Stream Analytics job.

Your Stream Analytics job will deploy within a few minutes. Once the deployment is complete, you'll be ready to configure inputs, define queries, and set outputs for real-time data processing and analytics.

## Configure Azure IoT Hub as an Input for the Stream Analytics Job
After successfully creating the Stream Analytics job, you'll need to configure your Azure IoT Hub as an input source. This configuration allows Stream Analytics to read real-time telemetry data directly from your IoT devices. Follow these steps:
1. Navigate to your newly created Stream Analytics job in the Azure Portal.
2. In the left-hand menu, under the Job topology section, select Inputs.
3. Click "Add input", and choose "IoT Hub" as the input type.
![img11 alt-text#center](Figures/11.png)
4. Enter the following configuration details:
* Input Alias: Provide a name, such as IoTHubInput.
* IoT Hub: Select the Azure IoT Hub you created earlier.
* Consumer group: Choose $Default, unless you've explicitly created a custom consumer group.
* Shared access policy name: Select iothubowner (provides full access for reading data from IoT Hub).
* Endpoint: Select Messaging.
* Partition key: Type deviceId (this helps ensure the data streams are partitioned by device identifiers).
* Event serialization format: Select JSON, as our telemetry data is transmitted in JSON format.
* Encoding: Choose UTF-8.
* Event compression type: Set this to None.

![img12 alt-text#center](Figures/12.png)

5. After entering these details, carefully verify them for accuracy and completeness. Click "Save" to apply the changes and link your Azure IoT Hub as the input source for your Stream Analytics job.
Your job is now configured to ingest streaming IoT telemetry data in real time, preparing it for further analysis, storage, and visualization.

## Define the Stream Analytics Query
Now that you have configured your Azure IoT Hub as an input source, the next step is to create and run a Stream Analytics query. This query defines how incoming IoT data will be filtered, transformed, or routed for further processing. Follow these steps:
1. Navigate to your Stream Analytics job in the Azure Portal.
2. Under the Job topology menu on the left, select Query.
3. In the query editor, enter the following simple SQL-like query to stream all incoming data from your IoT device:
```SQL
SELECT
    deviceId,
    temperature,
    pressure,
    humidity,
    timestamp
FROM
    IoTHubInput
```

This straightforward query selects all relevant fields (deviceId, temperature, pressure, humidity, and timestamp) directly from your configured input (IoTHubInput), which corresponds to the Azure IoT Hub you previously connected.

Before running this query, ensure your Python IoT simulator (iot_simulator.py) is actively sending telemetry data. After the simulator begins transmitting data, you can test and verify your Stream Analytics query within the Azure Portal using the built-in Test query feature. Doing this allows you to view live-streamed sensor data in real time and confirm that your streaming pipeline is working as expected:

![img13 alt-text#center](Figures/13.png)

Constructing queries in Azure Stream Analytics involves a straightforward SQL-like syntax specifically optimized for real-time stream processing. Typically, a query contains a SELECT statement to specify which fields from the incoming telemetry data to process, and a FROM statement to indicate the source stream. Queries can be expanded with advanced features such as filtering, aggregations, and temporal window functions to handle more complex scenarios.
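To illustrate what a temporal window does, the Python sketch below reproduces the effect of a 60-second tumbling-window average over a few in-memory readings. This is a local simulation of the concept, not Stream Analytics itself; in a real job you would express the same aggregation in the query language with a clause such as `GROUP BY deviceId, TumblingWindow(second, 60)`. The sample readings are hypothetical.

```python
from collections import defaultdict
from datetime import datetime, timezone

# Hypothetical telemetry; in the real pipeline these arrive from IoTHubInput.
readings = [
    {"deviceId": "arm64Device01", "temperature": 24.0, "timestamp": "2025-03-18T09:30:05+00:00"},
    {"deviceId": "arm64Device01", "temperature": 26.0, "timestamp": "2025-03-18T09:30:45+00:00"},
    {"deviceId": "arm64Device01", "temperature": 30.0, "timestamp": "2025-03-18T09:31:10+00:00"},
]

def tumbling_avg(events, window_seconds=60):
    """Group events into fixed, non-overlapping time windows and average
    the temperature per (device, window) pair."""
    buckets = defaultdict(list)
    for e in events:
        ts = datetime.fromisoformat(e["timestamp"])
        # Align each event to the start of its window (windows never overlap).
        window_start = int(ts.timestamp()) // window_seconds * window_seconds
        buckets[(e["deviceId"], window_start)].append(e["temperature"])
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

for (device, window_start), avg in sorted(tumbling_avg(readings).items()):
    start = datetime.fromtimestamp(window_start, tz=timezone.utc)
    print(f"{device} window starting {start.isoformat()}: avg {avg:.1f}")
```

The first two readings fall into the window starting at 09:30:00 (average 25.0), while the third opens a new window at 09:31:00 (average 30.0), mirroring how a tumbling window emits one aggregate per device per interval.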
In our current example, we implemented a simple query using SELECT and FROM clauses to view and forward all incoming sensor data from our IoT device without any additional filtering or transformation. This straightforward query effectively demonstrates how Azure Stream Analytics ingests and processes real-time IoT data, establishing a foundation upon which you can build more advanced and powerful data analytics workflows in the future.

## Summary and Next Steps
In this section, you set up an Azure Stream Analytics job to analyze and process real-time telemetry data streamed from your Python-based IoT simulator. You configured Azure IoT Hub as the data input source for Stream Analytics, defined a SQL-like query to select sensor readings, and confirmed that data was streaming correctly from your simulated Arm64-powered device into Azure.

You established a real-time analytics pipeline that integrates with Azure IoT Hub, enabling immediate analysis of sensor data as it arrives in the cloud. In the next step, you'll build upon this foundation by defining an additional query within your Azure Stream Analytics job. This new query will direct processed sensor data into Azure Cosmos DB. By writing the streaming IoT data into Cosmos DB, you'll securely persist sensor telemetry, making it readily available for long-term storage, efficient retrieval, further analysis, and integration into applications or dashboards.