- Azure Functions Project
- TODO
- Advantages of the solution
- Disadvantages of the solution
- Settings
- Sample data
- EventHubTriggerFunction
- Setup
- Local Development
- Sample functions not relevant in project
This project contains Azure Functions implemented in Python. The functions are designed to handle HTTP requests, Blob storage triggers, and Event Hub messages.
Simplified architecture:

## TODO

- Implement error handling to store error responses in receipts/incoming_err and receipts/outgoing_err.
- Implement the Storage SDK to store receipts in Blob storage with dynamic names, e.g., 2025012912332102231.json.
- Complete the documentation about creating resources.
- Add documentation about send.py and recive.py.
## Advantages of the solution

The solution is highly configurable. Incoming JSON receipts are described by a JSON schema that can be stored in the function application settings, so the input schema can be updated by changing configuration alone, without costly development effort (a minimal sketch of reading the schema from settings follows the list below).
Possible configurations include:
- Input and output JSON schemas
- Naming conventions for archived blobs in the storage account
- URLs of REST APIs
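For instance, a minimal sketch of reading the configured schema, assuming the jsonschema package and the json_schema setting shown under Settings below:

```python
import json
import os

from jsonschema import ValidationError, validate  # jsonschema package

# Function app settings are exposed as environment variables, so the
# input schema can be swapped by changing configuration only.
SCHEMA = json.loads(os.environ["json_schema"])

def is_valid_receipt(receipt: dict) -> bool:
    """Return True when the incoming receipt matches the configured schema."""
    try:
        validate(instance=receipt, schema=SCHEMA)
        return True
    except ValidationError:
        return False
```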
## Disadvantages of the solution

One limitation of the solution is that the Blob storage output binding does not allow flexible naming of blobs. To implement different naming conventions, the Azure Storage SDK must be used (a sketch follows the snippet below).
```python
@app.blob_output(arg_name="blobout",
                 path="receipts/outgoing/extracted.json",  # fixed blob name
                 connection="AzureWebJobsStorage")
```
## Settings

local.settings.json:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "tkeventnamespace_RootManageSharedAccessKey_EVENTHUB": "<connection string>",
    "tkeventnamespace_ListenPolicy_EVENTHUB": "<connection string>",
    "json_schema": "<schema of the input json>"
  }
}
```

## Sample data

Sample data and the input receipt JSON schema can be found in the ./data folder (a quick validation sketch follows the list):
- receipt_schema.json
- sample_receipt.json
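A quick way to check the sample against the schema, assuming the jsonschema package:

```python
import json

from jsonschema import validate

with open("data/receipt_schema.json") as f:
    schema = json.load(f)
with open("data/sample_receipt.json") as f:
    receipt = json.load(f)

validate(instance=receipt, schema=schema)  # raises ValidationError on mismatch
print("sample_receipt.json matches receipt_schema.json")
```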
## EventHubTriggerFunction

This function is triggered by messages in an Event Hub. It validates each message against a JSON schema and logs the validation result; a sketch of the trigger signature follows this section.

Logic:

- Azure Function triggered by Event Hub events.
- Saves the event to Blob Storage.
- Validates the data against a JSON schema.
- Extracts specific fields.
- Sends the extracted data to a Service Bus.

Raises: Exception: if any error occurs during the processing of the event.

Event Hub Name: eh1

Connection: tkeventnamespace_ListenPolicy_EVENTHUB
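A minimal sketch of the trigger signature in the Python v2 programming model; the hub name and connection setting come from this README, the body is only illustrative:

```python
import json
import logging

import azure.functions as func

app = func.FunctionApp()

@app.event_hub_message_trigger(arg_name="event",
                               event_hub_name="eh1",
                               connection="tkeventnamespace_ListenPolicy_EVENTHUB")
def EventHubTriggerFunction(event: func.EventHubEvent):
    # Decode the raw event; archiving, schema validation, field extraction
    # and the Service Bus hand-off described above would follow here.
    body = json.loads(event.get_body().decode("utf-8"))
    logging.info("Received event: %s", body)
```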
This function is triggered by messages in a Service Bus queue. It processes the message, saves it to Blob storage, and sends the data to an HTTP endpoint; a sketch of the trigger signature follows below.

Queue Name: mysbqueue

Connection: servicebusnamesapace_SERVICEBUS

Logic:

- Azure Function triggered by Service Bus queue messages.
- Saves the message to Blob Storage.
- Sends the data to an HTTP endpoint.
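A minimal sketch of this trigger's signature (queue name and connection setting from this README; the function name and body are illustrative):

```python
import logging

import azure.functions as func

app = func.FunctionApp()

@app.service_bus_queue_trigger(arg_name="msg",
                               queue_name="mysbqueue",
                               connection="servicebusnamesapace_SERVICEBUS")
def ServiceBusQueueTrigger(msg: func.ServiceBusMessage):
    # Saving to Blob Storage and the HTTP POST described above would follow here.
    body = msg.get_body().decode("utf-8")
    logging.info("Received queue message: %s", body)
```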
## Setup

- Install the required packages (a plausible minimal requirements.txt is sketched after this list):

  ```
  pip install -r requirements.txt
  ```

- Deploy the functions to Azure:

  ```
  func azure functionapp publish <FunctionAppName>
  ```
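The authoritative list lives in requirements.txt in the repo; a plausible minimal set, where the jsonschema entry is an assumption based on the validation described above:

```
azure-functions
jsonschema
```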
## Local Development

To run the functions locally:

- Log in to Azure:

  ```
  azd auth login
  ```

  Complete the login to your Azure subscription in the browser.

- Use the Azure Functions Core Tools:

  ```
  func start
  ```

Note that AzureWebJobsStorage is set to UseDevelopmentStorage=true, which expects a local storage emulator such as Azurite to be running.

## Sample functions not relevant in project

This function is triggered by an HTTP request. It responds with a generic success message.
Route: http_trigger_func1
Auth Level: Anonymous
Example Request:

```
GET /api/http_trigger_func1
```

Example Response:

```
This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.
```

This function is triggered by an HTTP request. It responds with a greeting message. If a name parameter is provided in the query string or request body, it includes the name in the greeting.
Route: http_trigger_func2
Auth Level: Anonymous
Example Request:

```
GET /api/http_trigger_func2?name=John
```

Example Response:

```
Hello, John. This HTTP triggered function executed successfully.
```

This function is triggered when a new blob is added to the specified Blob storage path. It logs the details of the processed blob; sketches of both trigger signatures follow below.
Path: demo
Connection: AzureWebJobsStorage
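Minimal sketches of the http_trigger_func2 and blob trigger signatures described above, in the v2 programming model (bodies are condensed illustrations):

```python
import logging

import azure.functions as func

app = func.FunctionApp()

@app.route(route="http_trigger_func2", auth_level=func.AuthLevel.ANONYMOUS)
def http_trigger_func2(req: func.HttpRequest) -> func.HttpResponse:
    # Read the optional name from the query string or the JSON body.
    name = req.params.get("name")
    if not name:
        try:
            name = req.get_json().get("name")
        except ValueError:
            name = None
    if name:
        return func.HttpResponse(
            f"Hello, {name}. This HTTP triggered function executed successfully.")
    return func.HttpResponse(
        "This HTTP triggered function executed successfully. Pass a name in the "
        "query string or in the request body for a personalized response.")

@app.blob_trigger(arg_name="myblob", path="demo", connection="AzureWebJobsStorage")
def blob_trigger(myblob: func.InputStream):
    logging.info("Processed blob: name=%s size=%s bytes", myblob.name, myblob.length)
```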
Creating the storage account:

```bash
RESOURCE_GROUP="draft-sg"
REGION="North Europe"
STORAGE_ACCOUNT="draftsgadc3"

az storage account create --name $STORAGE_ACCOUNT \
    --location "$REGION" \
    --resource-group $RESOURCE_GROUP \
    --sku Standard_LRS
```

Create two blob containers in the storage account you've created: inputitems and outputitems (a sketch for these follows the receipts snippet below).
```bash
# Get Storage Key
ACCESS_KEY=$(az storage account keys list --account-name $STORAGE_ACCOUNT --resource-group $RESOURCE_GROUP --output tsv | head -1 | awk '{print $3}')

az storage container create \
    --name "receipts" \
    --account-name $STORAGE_ACCOUNT \
    --account-key $ACCESS_KEY
```
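The snippet above creates only the receipts container; a sketch for the two containers named earlier, reusing the same variables:

```bash
for CONTAINER in inputitems outputitems; do
  az storage container create \
      --name "$CONTAINER" \
      --account-name $STORAGE_ACCOUNT \
      --account-key $ACCESS_KEY
done
```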
TODO
TODO
