Azure Storage + Function App + Open Framework + Cosmos DB
Costa Rica
Last updated: 2025-05-20
Important
This example is based on a public network setup and is intended for demonstration purposes only. It showcases how several Azure resources can work together to achieve the desired result. Consider the section below about Important Considerations for Production Environment. Please note that these demos are intended as a guide and are based on my personal experiences. For official guidance, support, or more detailed information, please refer to Microsoft's official documentation or contact Microsoft directly: Microsoft Sales and Support
Table of Contents (Click to expand)
- Prerequisites
- Where to start?
- Important Considerations for Production Environment
- Overview
- Function App Hosting Options
- Step 1: Set Up Your Azure Environment
- Step 2: Set Up Azure Blob Storage for PDF Ingestion
- Step 3: Set Up Azure Cosmos DB
- Step 4: Set Up Azure Functions for Document Ingestion and Processing
- Step 5: Test the solution
How to parse PDFs from an Azure Storage Account, process them using an Open Framework (needs manual configuration), and store the results in Cosmos DB for further analysis.
- Upload your PDFs to an Azure Blob Storage container.
- An Azure Function is triggered by the upload; it uses an Open Framework, plus several customizations as part of an API call, to analyze the PDFs.
- The extracted data is parsed and subsequently stored in a Cosmos DB database, ensuring a seamless and automated workflow from document upload to data storage.
Note
Limitations of this approach:
- Requires significant manual effort to structure and format extracted data.
- Limited in handling complex layouts and non-text elements like images and charts.
Private Network Configuration
For enhanced security, consider configuring your Azure resources to operate within a private network. This can be achieved using Azure Virtual Network (VNet) to isolate your resources and control inbound and outbound traffic. Implementing private endpoints for services like Azure Blob Storage and Azure Functions can further secure your data by restricting access to your VNet.
Security
Ensure that you implement appropriate security measures when deploying this solution in a production environment. This includes:
- Securing Access: Use Microsoft Entra ID (formerly known as Azure Active Directory or Azure AD) for authentication and role-based access control (RBAC) to manage permissions.
- Managing Secrets: Store sensitive information such as connection strings and API keys in Azure Key Vault.
- Data Encryption: Enable encryption for data at rest and in transit to protect sensitive information.
Scalability
While this example provides a basic setup, you may need to scale the resources based on your specific requirements. Azure services offer various scaling options to handle increased workloads. Consider using:
- Auto-scaling: Configure auto-scaling for Azure Functions and other services to automatically adjust based on demand.
- Load Balancing: Use Azure Load Balancer or Application Gateway to distribute traffic and ensure high availability.
Cost Management
Monitor and manage the costs associated with your Azure resources. Use Azure Cost Management and Billing to track usage and optimize resource allocation.
Compliance
Ensure that your deployment complies with relevant regulations and standards. Use Azure Policy to enforce compliance and governance policies across your resources.
Disaster Recovery
Implement a disaster recovery plan to ensure business continuity in case of failures. Use Azure Site Recovery and backup solutions to protect your data and applications.
- An Azure subscription is required. All other resources, including instructions for creating a Resource Group, are provided in this workshop.
- Contributor role assigned, or any custom role that allows: access to manage all resources, and the ability to deploy resources within the subscription.
- If you choose to use the Terraform approach, please ensure that:
  - Terraform is installed on your local machine.
  - The Azure CLI is installed to work with both Terraform and Azure commands.
Please follow as described below.

- If you're choosing the Infrastructure via Azure Portal approach, please start here in this section.
- If you're choosing the Infrastructure via Terraform approach:
  - Please follow the Terraform guide to deploy the necessary Azure resources for the workshop.
  - Then, follow each section but skip the creation of each resource.
Important
Regarding Networking, this example will cover a Public access configuration and a system-managed identity. However, please ensure you review your privacy requirements and adjust network and access settings as necessary for your specific case.
Using Cosmos DB provides you with a flexible, scalable, and globally distributed database solution that can handle both structured and semi-structured data efficiently.
- Azure Blob Storage: Store the PDF invoices.
- Azure Functions: Trigger on new PDF uploads, extract data, and process it.
- Azure SQL Database or Cosmos DB: Store the extracted data for querying and analytics.
| Resource | Recommendation |
| --- | --- |
| Azure Blob Storage | Use for storing the PDF files. This keeps your file storage separate from your data storage, which is a common best practice. |
| Azure SQL Database | Use if your data is highly structured and you need complex queries and transactions. |
| Azure Cosmos DB | Use if you need a globally distributed database with low latency and the ability to handle semi-structured data. |
In the context of Azure Function Apps, a hosting option refers to the plan you choose to run your function app. This choice affects how your function app is scaled, the resources available to each function app instance, and the support for advanced functionalities like virtual network connectivity and container support.
Tip

- Scale to Zero: Indicates whether the service can automatically scale down to zero instances when idle, that is, when the application is not actively handling requests or events (it's in a low-activity or paused state).
- Scale Behavior: Describes how the service scales (e.g., event-driven, dedicated, or containerized).
- Virtual Networking: Whether the service supports integration with virtual networks for secure communication.
- Dedicated Compute & Reserved Cold Start: Availability of always-on compute to avoid cold starts and ensure low latency.
- Max Scale Out (Instances): Maximum number of instances the service can scale out to.
- Example AI Use Cases: Real-world scenarios where each plan excels.
Flex Consumption

| Feature | Description |
| --- | --- |
| Scale to Zero | Yes |
| Scale Behavior | Fast event-driven |
| Virtual Networking | Optional |
| Dedicated Compute & Reserved Cold Start | Optional (Always Ready) |
| Max Scale Out (Instances) | 1000 |
| Example AI Use Cases | Real-time data processing for AI models, high-traffic AI-powered APIs, event-driven AI microservices. Ideal for fraud detection, real-time recommendations, NLP, and computer vision services. |

Consumption

| Feature | Description |
| --- | --- |
| Scale to Zero | Yes |
| Scale Behavior | Event-driven |
| Virtual Networking | Optional |
| Dedicated Compute & Reserved Cold Start | No |
| Max Scale Out (Instances) | 200 |
| Example AI Use Cases | Lightweight AI APIs, scheduled AI tasks, low-traffic AI event processing. Great for sentiment analysis, simple image recognition, and batch ML tasks. |

Functions Premium

| Feature | Description |
| --- | --- |
| Scale to Zero | No |
| Scale Behavior | Event-driven with premium options |
| Virtual Networking | Yes |
| Dedicated Compute & Reserved Cold Start | Yes |
| Max Scale Out (Instances) | 100 |
| Example AI Use Cases | Enterprise AI applications, low-latency AI APIs, VNet integration. Ideal for secure, high-performance AI services like customer support and analytics. |

App Service

| Feature | Description |
| --- | --- |
| Scale to Zero | No |
| Scale Behavior | Dedicated VMs |
| Virtual Networking | Yes |
| Dedicated Compute & Reserved Cold Start | Yes |
| Max Scale Out (Instances) | Varies |
| Example AI Use Cases | AI-powered web applications, dedicated resources. Great for chatbots, personalized content, and intensive AI inference. |

Container Apps Env.

| Feature | Description |
| --- | --- |
| Scale to Zero | No |
| Scale Behavior | Containerized microservices environment |
| Virtual Networking | Yes |
| Dedicated Compute & Reserved Cold Start | Yes |
| Max Scale Out (Instances) | Varies |
| Example AI Use Cases | AI microservices architecture, containerized AI workloads, complex AI workflows. Ideal for orchestrating AI services like image processing, text analysis, and real-time analytics. |
An Azure Resource Group is a container that holds related resources for an Azure solution. It can include all the resources for the solution or only those you want to manage as a group. Typically, resources that share the same lifecycle are added to the same resource group, allowing for easier deployment, updating, and deletion as a unit. Resource groups also store metadata about the resources, and you can apply access control, locks, and tags to them for better management and organization.
- Create an Azure Account: If you don't have one, sign up for an Azure account.
- Create a Resource Group:
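If you prefer scripting this step, here is a minimal Azure CLI sketch (the group name and region are illustrative placeholders, reused in the other CLI sketches in this workshop):

```bash
# Sign in and create a resource group to hold the workshop resources
az login
az group create \
    --name "RG-ContosoAIDemo" \
    --location "eastus"
```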
An Azure Storage Account provides a unique namespace in Azure for your data, allowing you to store and manage various types of data such as blobs, files, queues, and tables. It serves as the foundation for all Azure Storage services, ensuring high availability, scalability, and security for your data.

A Blob Container is a logical grouping of blobs within an Azure Storage Account, similar to a directory in a file system. Containers help organize and manage blobs, which can be any type of unstructured data like text or binary data. Each container can store an unlimited number of blobs, and you must create a container before uploading any blobs.
- Create a Storage Account:
  - In the Azure portal, navigate to your Resource Group.
  - Click + Create.
  - Search for Storage Account.
  - Select the Resource Group you created.
  - Enter a Storage Account name (e.g., `contosostorageaidemo`).
  - Choose the region and performance options, and click Next to continue.
  - If you need to modify anything related to Security, Access protocols, Blob Storage Tier, you can do that in the Advanced tab.
  - Regarding Networking, this example will cover the Public access configuration. However, please ensure you review your privacy requirements and adjust network and access settings as necessary for your specific case.
  - Click Review + create and then Create. Once it's done, you'll be able to see it in your Resource Group.
- Create a Blob Container: (see the CLI sketch after this step for an equivalent of both resources)
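A hedged CLI sketch for both resources (the account name is this workshop's example value, and the container name `pdfinvoices` matches the blob trigger path used later; SKU and kind are assumptions for a simple demo):

```bash
# Create the storage account that will hold the PDF invoices
az storage account create \
    --name "contosostorageaidemo" \
    --resource-group "RG-ContosoAIDemo" \
    --location "eastus" \
    --sku Standard_LRS \
    --kind StorageV2

# Create the blob container the function will watch
az storage container create \
    --name "pdfinvoices" \
    --account-name "contosostorageaidemo" \
    --auth-mode login
```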
Azure Cosmos DB is a globally distributed, multi-model database service provided by Microsoft Azure. It is designed to offer high availability, scalability, and low-latency access to data for modern applications. Unlike traditional relational databases, Cosmos DB is a NoSQL database, meaning it can handle unstructured, semi-structured, and structured data types. It supports multiple data models, including document, key-value, graph, and column-family, making it versatile for various use cases.

An Azure Cosmos DB container is a logical unit within a Cosmos DB database where data is stored. Containers are schema-agnostic, meaning they can store items with different structures. Each container is automatically partitioned to scale out across multiple servers, providing virtually unlimited throughput and storage. Containers are the primary scalability unit in Cosmos DB, and they use a partition key to distribute data efficiently across partitions.
- Create a Cosmos DB Account:
  - In the Azure portal, navigate to your Resource Group.
  - Click + Create.
  - Search for Cosmos DB, click on Create:
  - Choose your desired API type; for this we will be using Azure Cosmos DB for NoSQL. This option supports a SQL-like query language, which is familiar and powerful for querying and analyzing your invoice data. It also integrates well with various client libraries, making development easier and more flexible.
  - Please enter an account name (e.g., `contosocosmosdbaidemo`). As with the previously configured resources, we will use the Public network for this example. Ensure that you adjust the architecture to include your networking requirements.
  - Select the region and other settings.
  - Click Review + create and then Create.
- Create a Database and Container: (a CLI sketch follows below)
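A hedged CLI sketch for this step. The database and container names match the ones the function code uses later, and the partition key `/invoice_number` mirrors what the code passes when it creates the container on first run; the throughput value is an assumption for a small demo:

```bash
# Create the Cosmos DB account (the NoSQL API is the default)
az cosmosdb create \
    --name "contosocosmosdbaidemo" \
    --resource-group "RG-ContosoAIDemo" \
    --locations regionName="eastus"

# Create the database and the container for the parsed invoices
az cosmosdb sql database create \
    --account-name "contosocosmosdbaidemo" \
    --resource-group "RG-ContosoAIDemo" \
    --name "ContosoDBAIDemo"

az cosmosdb sql container create \
    --account-name "contosocosmosdbaidemo" \
    --resource-group "RG-ContosoAIDemo" \
    --database-name "ContosoDBAIDemo" \
    --name "Invoices" \
    --partition-key-path "/invoice_number" \
    --throughput 400
```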
An Azure Function App is a container for hosting individual Azure Functions. It provides the execution context for your functions, allowing you to manage, deploy, and scale them together. Each function app can host multiple functions, which are small pieces of code that run in response to various triggers or events, such as HTTP requests, timers, or messages from other Azure services.

Azure Functions are designed to be lightweight and event-driven, enabling you to build scalable and serverless applications. You only pay for the resources your functions consume while they are running, making it a cost-effective solution for many scenarios.
- In the Azure portal, go to your Resource Group.
- Click + Create.
- Search for Function App, click on Create:
- Choose a hosting option; for this example, we will use Consumption. Click here for a quick overview of hosting options:
- Enter a name for the Function App (e.g., `ContosoFunctionAppAI`).
- Choose your runtime stack (e.g., .NET or Python).
- Select the region and other settings.
- Select Review + create and then Create. Verify the resources created in your Resource Group. (A CLI sketch of this step follows below.)
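A hedged CLI sketch for a Python Function App on the Consumption plan (the Python version and OS are assumptions based on the choices above; `<runtime-storage-account>` is a placeholder for the runtime storage account, which is typically separate from the invoices account):

```bash
# Create the Function App on the Consumption plan
az functionapp create \
    --name "ContosoFunctionAppAI" \
    --resource-group "RG-ContosoAIDemo" \
    --consumption-plan-location "eastus" \
    --runtime python \
    --runtime-version 3.11 \
    --functions-version 4 \
    --os-type Linux \
    --storage-account "<runtime-storage-account>"
```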
Important
This example uses a system-assigned managed identity to assign RBAC (role-based access control) roles.
- Please assign the `Storage Blob Data Contributor` and `Storage File Data SMB Share Contributor` roles to the Function App within the Storage Account related to the runtime (the one created with the function app).
- Assign `Storage Blob Data Reader` to the Function App within the Storage Account that will contain the invoices, and click Next. Then, click on Select members and search for your Function App identity. Finally, click on Review + assign:
: -
Also add
Cosmos DB Operator
,DocumentDB Account Contributor
,Cosmos DB Account Reader Role
,Contributor
: -
To assign the
Microsoft.DocumentDB/databaseAccounts/readMetadata
permission, you need to create a custom role in Azure Cosmos DB. This permission is required for accessing metadata in Cosmos DB. Click here to understand more about it.Aspect Data Plane Access Control Plane Access Scope Focuses on data operations
within databases and containers. This includes actions such as reading, writing, and querying data in your databases and containers.Focuses on management operations
at the account level. This includes actions such as creating, deleting, and configuring databases and containers.Roles - Cosmos DB Built-in Data Reader
: Provides read-only access to data within the databases and containers.
-Cosmos DB Built-in Data Contributor
: Allows read and write access to data within the databases and containers.
-Cosmos DB Built-in Data Owner
: Grants full access to manage data within the databases and containers.- Contributor
: Grants full access to manage all Azure resources, including Cosmos DB.
-Owner
: Grants full access to manage all resources, including the ability to assign roles in Azure RBAC.
-Cosmos DB Account Contributor
: Allows management of Cosmos DB accounts, including creating and deleting databases and containers.
-Cosmos DB Account Reader
: Provides read-only access to Cosmos DB account metadata.Permissions - Reading documents
-Writing documents
- Managing data within containers.- Creating or deleting databases and containers
- Configuring settings
- Managing account-level configurations.Authentication Uses Azure Active Directory (AAD) tokens
orresource tokens
for authentication.Uses Azure Active Directory (AAD)
for authentication.
Steps to assign it:
- Open Azure CLI: Go to the Azure portal and click on the icon for the Azure CLI.
- List Role Definitions: Run the following command to list all of the role definitions associated with your Azure Cosmos DB for NoSQL account. Review the output and locate the role definition named `Cosmos DB Built-in Data Contributor`.

  ```bash
  az cosmosdb sql role definition list \
      --resource-group "<your-resource-group>" \
      --account-name "<your-account-name>"
  ```

- Get Cosmos DB Account ID: Run this command to get the ID of your Cosmos DB account. Record the value of the `id` property as it is required for the next step.

  ```bash
  az cosmosdb show \
      --resource-group "<your-resource-group>" \
      --name "<your-account-name>" \
      --query "{id:id}"
  ```

  Example output:

  ```json
  {
    "id": "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.DocumentDB/databaseAccounts/{cosmos-account-name}"
  }
  ```
- Assign the Role: Assign the new role using `az cosmosdb sql role assignment create`. Use the previously recorded role definition ID for the `--role-definition-id` argument, the unique identifier for your identity for the `--principal-id` argument, and your account's ID for the `--scope` argument. You need to do this both for the Function App (so it can read metadata from Cosmos DB) and for your own ID (so you can access and view the information). You can extract the `principal-id` from the Identity section of the Function App:

  ```bash
  az cosmosdb sql role assignment create \
      --resource-group "<your-resource-group>" \
      --account-name "<your-account-name>" \
      --role-definition-id "<role-definition-id>" \
      --principal-id "<principal-id>" \
      --scope "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.DocumentDB/databaseAccounts/{cosmos-account-name}"
  ```

  After a few minutes, you will see something like this:

- Verify Role Assignment: Use `az cosmosdb sql role assignment list` to list all role assignments for your Azure Cosmos DB for NoSQL account. Review the output to ensure your role assignment was created.

  ```bash
  az cosmosdb sql role assignment list \
      --resource-group "<your-resource-group>" \
      --account-name "<your-account-name>"
  ```
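The portal-based storage role assignments earlier in this section can also be scripted. A hedged sketch (the role name comes from the steps above; the resource names are this workshop's example values):

```bash
# Ensure the Function App has a system-assigned identity and capture its principal ID
principalId=$(az functionapp identity assign \
    --name "ContosoFunctionAppAI" \
    --resource-group "RG-ContosoAIDemo" \
    --query principalId --output tsv)

# Look up the resource ID of the storage account that holds the invoices
storageId=$(az storage account show \
    --name "contosostorageaidemo" \
    --resource-group "RG-ContosoAIDemo" \
    --query id --output tsv)

# Grant the Function App read access to the invoice blobs
az role assignment create \
    --assignee "$principalId" \
    --role "Storage Blob Data Reader" \
    --scope "$storageId"
```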
- Under Settings, go to Environment variables, and + Add the following variables:
  - `COSMOS_DB_ENDPOINT`: Your Cosmos DB account endpoint.
  - `COSMOS_DB_KEY`: Your Cosmos DB account key.
  - `contosostorageaidemo_STORAGE`: Your Storage Account connection string.
- Click on Apply to save your configuration. (A CLI sketch of this step follows below.)
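The same settings can be applied from the CLI. A hedged sketch for the two Cosmos DB variables (the lookups are standard Azure CLI commands; the names are this workshop's example values):

```bash
# Look up the Cosmos DB endpoint and primary key
endpoint=$(az cosmosdb show \
    --name "contosocosmosdbaidemo" \
    --resource-group "RG-ContosoAIDemo" \
    --query documentEndpoint --output tsv)
key=$(az cosmosdb keys list \
    --name "contosocosmosdbaidemo" \
    --resource-group "RG-ContosoAIDemo" \
    --query primaryMasterKey --output tsv)

# Apply them as app settings on the Function App
az functionapp config appsettings set \
    --name "ContosoFunctionAppAI" \
    --resource-group "RG-ContosoAIDemo" \
    --settings "COSMOS_DB_ENDPOINT=$endpoint" "COSMOS_DB_KEY=$key"
```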
- You need to install VSCode.
- Install Python from the Microsoft Store:
- Open VSCode, and install some extensions: Python, and Azure Tools.
- Click on the Azure icon, and sign in into your account. Allow the extension Azure Resources to sign in using Microsoft; it will open a browser window. After doing so, you will be able to see your subscription and resources.
- Under Workspace, click on Create Function Project, and choose a path on your local computer to develop your function.
- Choose the language; in this case it's Python:
- Select the model version; for this example, let's use v2:
- For the Python interpreter, let's use the one installed via the Microsoft Store:
- Choose a template (e.g., Blob trigger) and configure it to trigger on new PDF uploads in your Blob container.
- Provide a function name, like `BlobTriggerContosoPDFInvoicesRaw`:
- Next, it will prompt you for the path of the blob container where you expect the function to be triggered after a file is uploaded. In this case it's `pdfinvoices`, as was previously created.
- Click on Create new local app settings, and then choose your subscription.
- Choose Azure Storage Account for remote storage, and select one. I'll be using `contosostorageaidemo`.
- Then click on Open in the current window. You will see something like this:
- Now we need to update the function code to extract data from PDFs and store it in Cosmos DB. Use this as an example:
  - Blob Trigger: The function is triggered when a new PDF file is uploaded to the `pdfinvoices` container.
  - PDF Processing: The `read_pdf_content` function uses `pdfminer.six` to read and extract text from the PDF.
  - Data Extraction: The extracted text is processed to extract invoice data. The `generate_id` function generates a unique ID for each invoice.
  - Data Storage: The processed invoice data is saved to Azure Cosmos DB in the `ContosoDBAIDemo` database and `Invoices` container.
`pdfminer.six` is an open-source framework. It is a community-maintained fork of the original PDFMiner, designed for extracting and analyzing text data from PDF documents. The framework is built in a modular way, allowing each component to be easily replaced or extended for various purposes.
- Update the `function_app.py`:

```python
import azure.functions as func
import logging
import json
import os
import uuid
import io
from pdfminer.high_level import extract_text
from azure.cosmos import CosmosClient, PartitionKey

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

def read_pdf_content(myblob):
    # Read the blob content into a BytesIO stream
    blob_bytes = myblob.read()
    pdf_stream = io.BytesIO(blob_bytes)
    # Extract text from the PDF stream
    text = extract_text(pdf_stream)
    return text

def extract_invoice_data(text):
    # Walk the extracted text line by line, keying on the labels
    # produced by the sample invoice layout
    lines = text.split('\n')
    invoice_data = {
        "id": generate_id(),
        "customer_name": "",
        "customer_email": "",
        "customer_address": "",
        "company_name": "",
        "company_phone": "",
        "company_address": "",
        "rentals": []
    }

    for i, line in enumerate(lines):
        if "BILL TO:" in line:
            invoice_data["customer_name"] = lines[i + 1].strip()
            invoice_data["customer_email"] = lines[i + 2].strip()
            invoice_data["customer_address"] = lines[i + 3].strip()
        elif "Company Information:" in line:
            invoice_data["company_name"] = lines[i + 1].strip()
            invoice_data["company_phone"] = lines[i + 2].strip()
            invoice_data["company_address"] = lines[i + 3].strip()
        elif "Rental Date" in line:
            # Table rows follow the header until the first blank line
            for j in range(i + 1, len(lines)):
                if lines[j].strip() == "":
                    break
                rental_details = lines[j].split()
                rental_date = rental_details[0]
                title = " ".join(rental_details[1:-3])
                description = rental_details[-3]
                quantity = rental_details[-2]
                total_price = rental_details[-1]
                invoice_data["rentals"].append({
                    "rental_date": rental_date,
                    "title": title,
                    "description": description,
                    "quantity": quantity,
                    "total_price": total_price
                })
    logging.info("Successfully extracted invoice data.")
    return invoice_data

def save_invoice_data_to_cosmos(invoice_data, blob_name):
    try:
        endpoint = os.getenv("COSMOS_DB_ENDPOINT")
        key = os.getenv("COSMOS_DB_KEY")
        client = CosmosClient(endpoint, key)
        logging.info("Successfully connected to Cosmos DB.")
    except Exception as e:
        logging.error(f"Error connecting to Cosmos DB: {e}")
        return

    database_name = 'ContosoDBAIDemo'
    container_name = 'Invoices'
    try:
        database = client.create_database_if_not_exists(id=database_name)
        # Note: the extracted items do not include an "invoice_number" field,
        # so documents land under the undefined partition-key value;
        # consider partitioning on "/id" instead.
        container = database.create_container_if_not_exists(
            id=container_name,
            partition_key=PartitionKey(path="/invoice_number"),
            offer_throughput=400
        )
        logging.info("Successfully ensured database and container exist.")
    except Exception as e:
        logging.error(f"Error creating database or container: {e}")
        return

    try:
        response = container.upsert_item(invoice_data)
        logging.info(f"Saved processed invoice data to Cosmos DB: {response}")
    except Exception as e:
        logging.error(f"Error inserting item into Cosmos DB: {e}")

def generate_id():
    return str(uuid.uuid4())

@app.blob_trigger(arg_name="myblob", path="pdfinvoices/{name}",
                  connection="contosostorageaidemo_STORAGE")
def BlobTriggerContosoPDFInvoicesRaw(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob\n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

    try:
        text = read_pdf_content(myblob)
        logging.info("Successfully read and extracted text from PDF.")
    except Exception as e:
        logging.error(f"Error reading PDF: {e}")
        return

    logging.info(f"Extracted text from PDF: {text}")

    try:
        invoice_data = extract_invoice_data(text)
        logging.info(f"Extracted invoice data: {invoice_data}")
    except Exception as e:
        logging.error(f"Error extracting invoice data: {e}")
        return

    try:
        save_invoice_data_to_cosmos(invoice_data, myblob.name)
        logging.info("Successfully saved invoice data to Cosmos DB.")
    except Exception as e:
        logging.error(f"Error saving invoice data to Cosmos DB: {e}")
```
- Now, let's update the `requirements.txt`:

```text
azure-functions
pdfminer.six
azure-cosmos==4.3.0
```
- Since this function has already been tested, you can deploy your code to the Function App in your subscription. If you want to test first, you can run your function locally (see the sketch below).
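A hedged sketch of both options, assuming the Azure Functions Core Tools are installed and you're in the project folder (the app name is this workshop's example value):

```bash
# Run the function locally (requires a local.settings.json with the storage and Cosmos DB settings)
func start

# Or deploy the project to the Function App in your subscription
func azure functionapp publish ContosoFunctionAppAI
```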
Important
If you need further assistance with the code, please click here to view all the function code.
Upload sample PDF invoices to the Blob container and verify that data is correctly ingested and stored in Cosmos DB.
- Click on Upload, then select Browse for files and choose your PDF invoices to be stored in the blob container, which will trigger the function app to parse them. (A CLI alternative is sketched after this list.)
Check the logs, and traces from your function with
Application Insights
: -
Under
Investigate
, click onPerformance
. Filter by time range, anddrill into the samples
. Sort the results by date (if you have many, like in my case) and click on the last one. -
Click on
View all
: -
Check all the logs, and traces generated. Also review the information parsed:
-
Validate that the information was uploaded to the Cosmos DB. Under
Data Explorer
, check yourDatabase
:
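As referenced above, a hedged CLI alternative for the upload step (the file name is a placeholder; the container and account names are the workshop's example values):

```bash
# Upload a sample invoice; the blob landing in pdfinvoices triggers the function
az storage blob upload \
    --account-name "contosostorageaidemo" \
    --container-name "pdfinvoices" \
    --name "invoice-001.pdf" \
    --file "./invoice-001.pdf" \
    --auth-mode login
```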