Example of how to extract PDFs from an Azure Storage Account, process them using Azure resources and an open framework, and store the results in Cosmos DB for further analysis.

Demo: Automated PDF Invoice Processing with Open Framework (full-code approach)

Azure Storage + Function App + Open Framework + Cosmos DB

Last updated: 2025-05-20


Important

This example is based on a public network configuration and is intended for demonstration purposes only. It showcases how several Azure resources can work together to achieve the desired result. Consider the section below about Important Considerations for Production Environment. Please note that these demos are intended as a guide and are based on my personal experiences. For official guidance, support, or more detailed information, please refer to Microsoft's official documentation or contact Microsoft directly: Microsoft Sales and Support


How to parse PDFs from an Azure Storage Account, process them using an open framework (which needs manual configuration), and store the results in Cosmos DB for further analysis.

  1. Upload your PDFs to an Azure Blob Storage container.
  2. An Azure Function is triggered by the upload; it uses an open framework (pdfminer.six) together with custom parsing logic to analyze the PDFs.
  3. The extracted data is parsed and subsequently stored in a Cosmos DB database, ensuring a seamless and automated workflow from document upload to data storage.

Note

Limitations of this approach:

  • Requires significant manual effort to structure and format extracted data.
  • Limited in handling complex layouts and non-text elements like images and charts.

Important Considerations for Production Environment

Private Network Configuration

For enhanced security, consider configuring your Azure resources to operate within a private network. This can be achieved using Azure Virtual Network (VNet) to isolate your resources and control inbound and outbound traffic. Implementing private endpoints for services like Azure Blob Storage and Azure Functions can further secure your data by restricting access to your VNet.

Security

Ensure that you implement appropriate security measures when deploying this solution in a production environment. This includes:

  • Securing Access: Use Microsoft Entra ID (formerly Azure Active Directory, or Azure AD) for authentication and role-based access control (RBAC) to manage permissions.
  • Managing Secrets: Store sensitive information such as connection strings and API keys in Azure Key Vault.
  • Data Encryption: Enable encryption for data at rest and in transit to protect sensitive information.
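
As an illustration of the Managing Secrets point above, here is a minimal sketch of reading a secret from Azure Key Vault with the azure-identity and azure-keyvault-secrets libraries (the vault URL and secret name are hypothetical placeholders):

    import os
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    # Authenticate as the app's managed identity (or your dev credentials locally)
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url="https://<your-vault-name>.vault.azure.net/", credential=credential)

    # Fetch a secret (e.g., the Cosmos DB key) instead of storing it in app settings
    cosmos_key = client.get_secret("cosmos-db-key").value
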
Scalability

While this example provides a basic setup, you may need to scale the resources based on your specific requirements. Azure services offer various scaling options to handle increased workloads. Consider using:

  • Auto-scaling: Configure auto-scaling for Azure Functions and other services to automatically adjust based on demand.
  • Load Balancing: Use Azure Load Balancer or Application Gateway to distribute traffic and ensure high availability.
Cost Management

Monitor and manage the costs associated with your Azure resources. Use Azure Cost Management and Billing to track usage and optimize resource allocation.

Compliance

Ensure that your deployment complies with relevant regulations and standards. Use Azure Policy to enforce compliance and governance policies across your resources.

Disaster Recovery

Implement a disaster recovery plan to ensure business continuity in case of failures. Use Azure Site Recovery and backup solutions to protect your data and applications.

Prerequisites

  • An Azure subscription is required. All other resources, including instructions for creating a Resource Group, are provided in this workshop.
  • A Contributor role assignment, or any custom role that allows you to manage all resources and deploy resources within the subscription.
  • If you choose to use the Terraform approach, please ensure the prerequisites in the Terraform guide below are met.

Where to start?

Please follow the steps described below.

  • If you're choosing the Infrastructure via Azure Portal, please start here in this section.
  • If you're choosing the Infrastructure via Terraform approach:
    1. Please follow the Terraform guide to deploy the necessary Azure resources for the workshop.
    2. Then, follow each section but skip the creation of each resource.

Important

Regarding Networking, this example will cover Public access configuration and a system-assigned managed identity. However, please ensure you review your privacy requirements and adjust network and access settings as necessary for your specific case.

Overview

Using Cosmos DB provides you with a flexible, scalable, and globally distributed database solution that can handle both structured and semi-structured data efficiently.

  • Azure Blob Storage: Store the PDF invoices.
  • Azure Functions: Trigger on new PDF uploads, extract data, and process it.
  • Azure SQL Database or Cosmos DB: Store the extracted data for querying and analytics.
| Resource | Recommendation |
| --- | --- |
| Azure Blob Storage | Use for storing the PDF files. This keeps your file storage separate from your data storage, which is a common best practice. |
| Azure SQL Database | Use if your data is highly structured and you need complex queries and transactions. |
| Azure Cosmos DB | Use if you need a globally distributed database with low latency and the ability to handle semi-structured data. |

Function App Hosting Options

In the context of Azure Function Apps, a hosting option refers to the plan you choose to run your function app. This choice affects how your function app is scaled, the resources available to each function app instance, and the support for advanced functionalities like virtual network connectivity and container support.

Tip

  • Scale to Zero: Indicates whether the service can automatically scale down to zero instances when idle, i.e., when the application is not actively handling requests or events (a low-activity or paused state).
  • Scale Behavior: Describes how the service scales (e.g., event-driven, dedicated, or containerized).
  • Virtual Networking: Whether the service supports integration with virtual networks for secure communication.
  • Dedicated Compute & Reserved Cold Start: Availability of always-on compute to avoid cold starts and ensure low latency.
  • Max Scale Out (Instances): Maximum number of instances the service can scale out to.
  • Example AI Use Cases: Real-world scenarios where each plan excels.
Flex Consumption

| Feature | Description |
| --- | --- |
| Scale to Zero | Yes |
| Scale Behavior | Fast event-driven |
| Virtual Networking | Optional |
| Dedicated Compute & Reserved Cold Start | Optional (Always Ready) |
| Max Scale Out (Instances) | 1000 |
| Example AI Use Cases | Real-time data processing for AI models, high-traffic AI-powered APIs, event-driven AI microservices. Ideal for fraud detection, real-time recommendations, NLP, and computer vision services. |

Consumption

| Feature | Description |
| --- | --- |
| Scale to Zero | Yes |
| Scale Behavior | Event-driven |
| Virtual Networking | Optional |
| Dedicated Compute & Reserved Cold Start | No |
| Max Scale Out (Instances) | 200 |
| Example AI Use Cases | Lightweight AI APIs, scheduled AI tasks, low-traffic AI event processing. Great for sentiment analysis, simple image recognition, and batch ML tasks. |

Functions Premium

| Feature | Description |
| --- | --- |
| Scale to Zero | No |
| Scale Behavior | Event-driven with premium options |
| Virtual Networking | Yes |
| Dedicated Compute & Reserved Cold Start | Yes |
| Max Scale Out (Instances) | 100 |
| Example AI Use Cases | Enterprise AI applications, low-latency AI APIs, VNet integration. Ideal for secure, high-performance AI services like customer support and analytics. |

App Service

| Feature | Description |
| --- | --- |
| Scale to Zero | No |
| Scale Behavior | Dedicated VMs |
| Virtual Networking | Yes |
| Dedicated Compute & Reserved Cold Start | Yes |
| Max Scale Out (Instances) | Varies |
| Example AI Use Cases | AI-powered web applications, dedicated resources. Great for chatbots, personalized content, and intensive AI inference. |

Container Apps Env.

| Feature | Description |
| --- | --- |
| Scale to Zero | No |
| Scale Behavior | Containerized microservices environment |
| Virtual Networking | Yes |
| Dedicated Compute & Reserved Cold Start | Yes |
| Max Scale Out (Instances) | Varies |
| Example AI Use Cases | AI microservices architecture, containerized AI workloads, complex AI workflows. Ideal for orchestrating AI services like image processing, text analysis, and real-time analytics. |

Step 1: Set Up Your Azure Environment

An Azure Resource Group is a container that holds related resources for an Azure solution. It can include all the resources for the solution or only those you want to manage as a group. Typically, resources that share the same lifecycle are added to the same resource group, allowing for easier deployment, updating, and deletion as a unit. Resource groups also store metadata about the resources, and you can apply access control, locks, and tags to them for better management and organization.

  1. Create an Azure Account: If you don't have one, sign up for an Azure account.
  2. Create a Resource Group:
    • Go to the Azure portal.

    • Navigate to Resource groups.

    • Click + Create.

      image
    • Enter the Resource Group name (e.g., RGContosoAI) and select a region (e.g., East US 2). You can add tags if needed.

    • Click Review + create and then Create.

      image
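
If you prefer the CLI, a minimal sketch of the same step (assuming the example names above):

    az group create --name "RGContosoAI" --location "eastus2"
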

Step 2: Set Up Azure Blob Storage for PDF Ingestion

An Azure Storage Account provides a unique namespace in Azure for your data, allowing you to store and manage various types of data such as blobs, files, queues, and tables. It serves as the foundation for all Azure Storage services, ensuring high availability, scalability, and security for your data.

A Blob Container is a logical grouping of blobs within an Azure Storage Account, similar to a directory in a file system. Containers help organize and manage blobs, which can be any type of unstructured data like text or binary data. Each container can store an unlimited number of blobs, and you must create a container before uploading any blobs.

  1. Create a Storage Account:

    • In the Azure portal, navigate to your Resource Group.

    • Click + Create.

      image
    • Search for Storage Account.

      image
    • Select the Resource Group you created.

    • Enter a Storage Account name (e.g., contosostorageaidemo).

    • Choose the region and performance options, and click Next to continue.

      image
    • If you need to modify anything related to Security, Access protocols, Blob Storage Tier, you can do that in the Advanced tab.

      image
    • Regarding Networking, this example will cover Public access configuration. However, please ensure you review your privacy requirements and adjust network and access settings as necessary for your specific case.

      image
    • Click Review + create and then Create. Once it's done, you'll be able to see it in your Resource Group.

      image
  2. Create a Blob Container:

    • Go to your Storage Account.

    • Under Data storage, select Containers.

    • Click + Container.

    • Enter a name for the container (e.g., pdfinvoices) and set the public access level to Private.

    • Click Create.

      image
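
A minimal CLI sketch of the same step (assuming the example names above):

    # Create the storage account, then the private container for the invoices
    az storage account create \
        --resource-group "RGContosoAI" \
        --name "contosostorageaidemo" \
        --location "eastus2" \
        --sku Standard_LRS
    az storage container create \
        --account-name "contosostorageaidemo" \
        --name "pdfinvoices" \
        --auth-mode login
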

Step 3: Set Up Azure Cosmos DB

Azure Cosmos DB is a globally distributed, multi-model database service provided by Microsoft Azure. It is designed to offer high availability, scalability, and low-latency access to data for modern applications. Unlike traditional relational databases, Cosmos DB is a NoSQL database, meaning it can handle unstructured, semi-structured, and structured data types. It supports multiple data models, including document, key-value, graph, and column-family, making it versatile for various use cases.

An Azure Cosmos DB container is a logical unit within a Cosmos DB database where data is stored. Containers are schema-agnostic, meaning they can store items with different structures. Each container is automatically partitioned to scale out across multiple servers, providing virtually unlimited throughput and storage. Containers are the primary scalability unit in Cosmos DB, and they use a partition key to distribute data efficiently across partitions.

  1. Create a Cosmos DB Account:

    • In the Azure portal, navigate to your Resource Group.

    • Click + Create.

    • Search for Cosmos DB, click on Create:

      image
    • Choose your desired API type; for this example we will be using Azure Cosmos DB for NoSQL. This option supports a SQL-like query language, which is familiar and powerful for querying and analyzing your invoice data. It also integrates well with various client libraries, making development easier and more flexible.

      image
    • Please enter an account name (e.g., contosocosmosdbaidemo). As with the previously configured resources, we will use the Public network for this example. Ensure that you adjust the architecture to include your networking requirements.

    • Select the region and other settings.

    • Click Review + create and then Create.

      image
  2. Create a Database and Container:

    • Go to your Cosmos DB account.

    • Under Data Explorer, click New Database.

      image
    • Enter a database name (e.g., ContosoDBAIDemo) and click OK.

      image
    • Click New Container.

    • Enter a container name (e.g., Invoices) and set the partition key (e.g., /invoice_number).

    • Click OK.

      image image
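
A minimal CLI sketch of the same step (assuming the example names above):

    # Create the account, then the database, then the container with its partition key
    az cosmosdb create --resource-group "RGContosoAI" --name "contosocosmosdbaidemo"
    az cosmosdb sql database create \
        --resource-group "RGContosoAI" \
        --account-name "contosocosmosdbaidemo" \
        --name "ContosoDBAIDemo"
    az cosmosdb sql container create \
        --resource-group "RGContosoAI" \
        --account-name "contosocosmosdbaidemo" \
        --database-name "ContosoDBAIDemo" \
        --name "Invoices" \
        --partition-key-path "/invoice_number"
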

Step 4: Set Up Azure Functions for Document Ingestion and Processing

An Azure Function App is a container for hosting individual Azure Functions. It provides the execution context for your functions, allowing you to manage, deploy, and scale them together. Each function app can host multiple functions, which are small pieces of code that run in response to various triggers or events, such as HTTP requests, timers, or messages from other Azure services.

Azure Functions are designed to be lightweight and event-driven, enabling you to build scalable and serverless applications. You only pay for the resources your functions consume while they are running, making it a cost-effective solution for many scenarios.

Create a Function App

  • In the Azure portal, go to your Resource Group.

  • Click + Create.

    image
  • Search for Function App, click on Create:

    image
  • Choose a hosting option; for this example, we will use Consumption. Click here for a quick overview of hosting options:

    image
  • Enter a name for the Function App (e.g., ContosoFunctionAppAI).

  • Choose your runtime stack (e.g., .NET or Python).

  • Select the region and other settings.

    image
  • Select Review + create and then Create. Verify the resources created in your Resource Group.

    image
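
A minimal CLI sketch of the same step (assuming the example names above; for brevity this reuses contosostorageaidemo as the runtime storage account, while the portal flow creates a separate one):

    az functionapp create \
        --resource-group "RGContosoAI" \
        --name "ContosoFunctionAppAI" \
        --storage-account "contosostorageaidemo" \
        --consumption-plan-location "eastus2" \
        --runtime python \
        --functions-version 4 \
        --os-type Linux
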

Important

This example is using a system-assigned managed identity to assign RBAC (role-based access control) roles. image

  • Please assign the Storage Blob Data Contributor and Storage File Data SMB Share Contributor roles to the Function App within the Storage Account related to the runtime (the one created with the function app).

    image
  • Assign Storage Blob Data Reader to the Function App within the Storage Account that will contain the invoices, then click Next. Click Select members, search for your Function App identity, and finally click Review + assign:

    image
  • Also add Cosmos DB Operator, DocumentDB Account Contributor, Cosmos DB Account Reader Role, Contributor:

    image
  • To assign the Microsoft.DocumentDB/databaseAccounts/readMetadata permission, you need to create a data plane role assignment in Azure Cosmos DB. This permission is required for accessing metadata in Cosmos DB. Click here to understand more about it.

    | Aspect | Data Plane Access | Control Plane Access |
    | --- | --- | --- |
    | Scope | Focuses on data operations within databases and containers: reading, writing, and querying data. | Focuses on management operations at the account level: creating, deleting, and configuring databases and containers. |
    | Roles | Cosmos DB Built-in Data Reader: read-only access to data within the databases and containers.<br>Cosmos DB Built-in Data Contributor: read and write access to data within the databases and containers.<br>Cosmos DB Built-in Data Owner: full access to manage data within the databases and containers. | Contributor: full access to manage all Azure resources, including Cosmos DB.<br>Owner: full access to manage all resources, including the ability to assign roles in Azure RBAC.<br>Cosmos DB Account Contributor: management of Cosmos DB accounts, including creating and deleting databases and containers.<br>Cosmos DB Account Reader: read-only access to Cosmos DB account metadata. |
    | Permissions | Reading documents, writing documents, managing data within containers. | Creating or deleting databases and containers, configuring settings, managing account-level configurations. |
    | Authentication | Uses Azure Active Directory (AAD) tokens or resource tokens. | Uses Azure Active Directory (AAD). |

Steps to assign it:

  1. Open Azure CLI: Go to the Azure portal and click on the icon for the Azure CLI.

    image
  2. List Role Definitions: Run the following command to list all of the role definitions associated with your Azure Cosmos DB for NoSQL account. Review the output and locate the role definition named Cosmos DB Built-in Data Contributor.

    az cosmosdb sql role definition list \
        --resource-group "<your-resource-group>" \
        --account-name "<your-account-name>"
    image
  3. Get Cosmos DB Account ID: Run this command to get the ID of your Cosmos DB account. Record the value of the id property as it is required for the next step.

    az cosmosdb show --resource-group "<your-resource-group>" --name "<your-account-name>" --query "{id:id}"

    Example output:

    {                                                               
     "id": "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.DocumentDB/databaseAccounts/{cosmos-account-name}"
    }     
    image
  4. Assign the Role: Assign the new role using az cosmosdb sql role assignment create. Use the previously recorded role definition ID for the --role-definition-id argument, the unique identifier for your identity for the --principal-id argument, and your Cosmos DB account's resource ID for the --scope argument. You need to do this both for the Function App (so it can read metadata from Cosmos DB) and for your own ID (so you can access and view the information).

    You can extract the principal-id, from Identity of the Function App:

    image
    az cosmosdb sql role assignment create \
        --resource-group "<your-resource-group>" \
        --account-name "<your-account-name>" \
        --role-definition-id "<role-definition-id>" \
        --principal-id "<principal-id>" \
        --scope "/subscriptions/{subscriptions-id}/resourceGroups/{resource-group-name}/providers/Microsoft.DocumentDB/databaseAccounts/{cosmos-account-name}"
    image

    After a few minutes, you will see something like this:

    image
  5. Verify Role Assignment: Use az cosmosdb sql role assignment list to list all role assignments for your Azure Cosmos DB for NoSQL account. Review the output to ensure your role assignment was created.

    az cosmosdb sql role assignment list \
        --resource-group "<your-resource-group>" \
        --account-name "<your-account-name>"
    image
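
The portal role assignments from earlier can also be scripted. A minimal sketch for one of the storage roles (assumes you already have the Function App's principal ID and the storage account's resource ID):

    az role assignment create \
        --assignee "<principal-id>" \
        --role "Storage Blob Data Reader" \
        --scope "<storage-account-resource-id>"
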

Configure/Validate the Environment variables

  • Under Settings, go to Environment variables, and + Add the following variables:

  • COSMOS_DB_ENDPOINT: Your Cosmos DB account endpoint.

  • COSMOS_DB_KEY: Your Cosmos DB account key.

  • contosostorageaidemo_STORAGE: Your Storage Account connection string.

    image image image
  • Click on Apply to save your configuration.
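
For local development, the same values go into local.settings.json (created automatically in a later step). A minimal sketch with placeholder values:

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "<runtime-storage-connection-string>",
        "FUNCTIONS_WORKER_RUNTIME": "python",
        "COSMOS_DB_ENDPOINT": "https://<your-account-name>.documents.azure.com:443/",
        "COSMOS_DB_KEY": "<your-cosmos-db-key>",
        "contosostorageaidemo_STORAGE": "<invoices-storage-connection-string>"
      }
    }
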

Develop the Function

  • You need to install VS Code.

  • Install Python from the Microsoft Store:

    image
  • Open VS Code, and install these extensions: Python and Azure Tools.

    image image
  • Click on the Azure icon, and sign in into your account. Allow the extension Azure Resources to sign in using Microsoft, it will open a browser window. After doing so, you will be able to see your subscription and resources.

    image
  • Under Workspace, click on Create Function Project, and choose a path in your local computer to develop your function.

    image
  • Choose the language; in this case it is Python:

    image
  • Select the programming model version; for this example let's use v2:

    image
  • For the Python interpreter, let's use the one installed via the Microsoft Store:

    image
  • Choose a template (e.g., Blob trigger) and configure it to trigger on new PDF uploads in your Blob container.

    image
  • Provide a function name, like BlobTriggerContosoPDFInvoicesRaw:

    image
  • Next, it will prompt you for the path of the blob container where you expect the function to be triggered after a file is uploaded. In this case it is pdfinvoices, as created previously.

    image
  • Click on Create new local app settings, and then choose your subscription.

    image
  • Choose Azure Storage Account for remote storage, and select one. I'll be using contosostorageaidemo.

    image
  • Then click on Open in the current window. You will see something like this:

    image
  • Now we need to update the function code to extract data from PDFs and store it in Cosmos DB; use this as an example:

  1. Blob Trigger: The function is triggered when a new PDF file is uploaded to the pdfinvoices container.
  2. PDF Processing: The read_pdf_content function uses pdfminer.six to read and extract text from the PDF.
  3. Data Extraction: The extracted text is processed to extract invoice data. The generate_id function generates a unique ID for each invoice.
  4. Data Storage: The processed invoice data is saved to Azure Cosmos DB in the ContosoDBAIDemo database and Invoices container.

pdfminer.six is an open-source framework. It is a community-maintained fork of the original PDFMiner, designed for extracting and analyzing text data from PDF documents. The framework is built in a modular way, allowing each component to be easily replaced or extended for various purposes.

  • Update the function_app.py:

    Template Blob Trigger Function code vs. updated code:
    image image
    import azure.functions as func
    import logging
    import json
    import os
    import uuid
    import io
    from pdfminer.high_level import extract_text
    from azure.cosmos import CosmosClient, PartitionKey
    
    app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)
    
    def read_pdf_content(myblob):
        # Read the blob content into a BytesIO stream
        blob_bytes = myblob.read()
        pdf_stream = io.BytesIO(blob_bytes)
        
        # Extract text from the PDF stream
        text = extract_text(pdf_stream)
        return text
    
    def extract_invoice_data(text):
        lines = text.split('\n')
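        # NOTE: the Cosmos container's partition key is /invoice_number, but this
        # parser does not populate an "invoice_number" property, so documents are
        # stored under the implicit "undefined" partition key value; add the field
        # here if your invoice layout includes a parsable invoice number.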
        invoice_data = {
            "id": generate_id(),
            "customer_name": "",
            "customer_email": "",
            "customer_address": "",
            "company_name": "",
            "company_phone": "",
            "company_address": "",
            "rentals": []
        }
    
        for i, line in enumerate(lines):
            if "BILL TO:" in line:
                invoice_data["customer_name"] = lines[i + 1].strip()
                invoice_data["customer_email"] = lines[i + 2].strip()
                invoice_data["customer_address"] = lines[i + 3].strip()
            elif "Company Information:" in line:
                invoice_data["company_name"] = lines[i + 1].strip()
                invoice_data["company_phone"] = lines[i + 2].strip()
                invoice_data["company_address"] = lines[i + 3].strip()
            elif "Rental Date" in line:
                for j in range(i + 1, len(lines)):
                    if lines[j].strip() == "":
                        break
                    rental_details = lines[j].split()
                    rental_date = rental_details[0]
                    title = " ".join(rental_details[1:-3])
                    description = rental_details[-3]
                    quantity = rental_details[-2]
                    total_price = rental_details[-1]
                    invoice_data["rentals"].append({
                        "rental_date": rental_date,
                        "title": title,
                        "description": description,
                        "quantity": quantity,
                        "total_price": total_price
                    })
    
        logging.info("Successfully extracted invoice data.")
        return invoice_data
    
    def save_invoice_data_to_cosmos(invoice_data, blob_name):
        try:
            endpoint = os.getenv("COSMOS_DB_ENDPOINT")
            key = os.getenv("COSMOS_DB_KEY")
            client = CosmosClient(endpoint, key)
            logging.info("Successfully connected to Cosmos DB.")
        except Exception as e:
            logging.error(f"Error connecting to Cosmos DB: {e}")
            return
        
        database_name = 'ContosoDBAIDemo'
        container_name = 'Invoices'
        
        try:
            database = client.create_database_if_not_exists(id=database_name)
            container = database.create_container_if_not_exists(
                id=container_name,
                partition_key=PartitionKey(path="/invoice_number"),
                offer_throughput=400
            )
            logging.info("Successfully ensured database and container exist.")
        except Exception as e:
            logging.error(f"Error creating database or container: {e}")
            return
        
        try:
            response = container.upsert_item(invoice_data)
            logging.info(f"Saved processed invoice data to Cosmos DB: {response}")
        except Exception as e:
            logging.error(f"Error inserting item into Cosmos DB: {e}")
    
    def generate_id():
        return str(uuid.uuid4())
    
    @app.blob_trigger(arg_name="myblob", path="pdfinvoices/{name}",
                      connection="contosostorageaidemo_STORAGE")
    def BlobTriggerContosoPDFInvoicesRaw(myblob: func.InputStream):
        logging.info(f"Python blob trigger function processed blob\n"
                     f"Name: {myblob.name}\n"
                     f"Blob Size: {myblob.length} bytes")
    
        try:
            text = read_pdf_content(myblob)
            logging.info("Successfully read and extracted text from PDF.")
        except Exception as e:
            logging.error(f"Error reading PDF: {e}")
            return
    
        logging.info(f"Extracted text from PDF: {text}")
    
        try:
            invoice_data = extract_invoice_data(text)
            logging.info(f"Extracted invoice data: {invoice_data}")
        except Exception as e:
            logging.error(f"Error extracting invoice data: {e}")
            return
    
        try:
            save_invoice_data_to_cosmos(invoice_data, myblob.name)
            logging.info("Successfully saved invoice data to Cosmos DB.")
        except Exception as e:
            logging.error(f"Error saving invoice data to Cosmos DB: {e}")
  • Now, let's update the requirements.txt:

Template requirements.txt vs. updated requirements.txt:
image image
azure-functions
pdfminer.six
azure-cosmos==4.3.0
  • Since this function has already been tested, you can deploy your code to the Function App in your subscription. If you want to test it first, you can run your function locally (see the local-run sketch after these steps).
    • Click on the Azure icon.

    • Under workspace, click on the Function App icon.

    • Click on Deploy to Azure.

      image
    • Select your subscription, your function app, and accept the prompt to overwrite:

      image
    • After it completes, you'll see the status in your terminal:

      image image
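
If you'd rather test locally before deploying (as mentioned above), a minimal sketch assuming Azure Functions Core Tools is installed and local.settings.json is populated:

    # From the function project folder: install dependencies, then start the local host
    pip install -r requirements.txt
    func start
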

Important

If you need further assistance with the code, please click here to view all the function code.

Step 5: Test the solution

Upload sample PDF invoices to the Blob container and verify that data is correctly ingested and stored in Cosmos DB.

  • Click on Upload, then select Browse for files and choose your PDF invoices to be stored in the blob container, which will trigger the function app to parse them.

    image
  • Check the logs, and traces from your function with Application Insights:

    image
  • Under Investigate, click on Performance. Filter by time range, and drill into the samples. Sort the results by date (if you have many, like in my case) and click on the last one.

    image
  • Click on View all:

    image
  • Check all the logs, and traces generated. Also review the information parsed:

    image
  • Validate that the information was uploaded to Cosmos DB. Under Data Explorer, check your database:

    image
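
  • Optionally, verify from code as well. A minimal sketch that queries the container with the Cosmos SDK (assumes the same COSMOS_DB_ENDPOINT and COSMOS_DB_KEY variables and the names used above):

    import os
    from azure.cosmos import CosmosClient

    # Connect with the same endpoint/key the function uses
    client = CosmosClient(os.environ["COSMOS_DB_ENDPOINT"], os.environ["COSMOS_DB_KEY"])
    container = client.get_database_client("ContosoDBAIDemo").get_container_client("Invoices")

    # Cross-partition query listing every stored invoice document
    for item in container.query_items("SELECT * FROM c", enable_cross_partition_query=True):
        print(item["id"], item.get("customer_name"))
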
