This project provides a user-friendly AI-powered text summarization bot that helps users quickly condense long texts while preserving key information. The system is designed as an API-based solution with a Streamlit frontend for interaction.
The backend is powered by BCG AgentKit, a robust framework for building AI agents, and uses OpenTelemetry to export traces to Dynatrace for monitoring AI performance and system health.
It leverages Ollama and OpenAI language models for accurate, context-aware text summarization, with the entire workflow orchestrated via Langchain.
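To make the orchestration concrete, here is a minimal sketch of what a Langchain summarization chain can look like. It is illustrative only: the model name, prompt wording, and the assumption that `OPENAI_API_KEY` is already set in the environment are mine, not the exact code used in this repository.

```python
# Minimal sketch of a Langchain summarization chain (illustrative, not the repo's exact code).
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Assumes OPENAI_API_KEY is set in the environment (e.g., via the .env file described below).
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

prompt = ChatPromptTemplate.from_template(
    "Summarize the following text in a few sentences, preserving the key information:\n\n{text}"
)

# Chain: prompt -> model -> plain-string output
summarize_chain = prompt | llm | StrOutputParser()

if __name__ == "__main__":
    long_text = "..."  # the text to condense
    print(summarize_chain.invoke({"text": long_text}))
```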
| Component | Technology Used |
|---|---|
| Frontend | Streamlit (Python-based web framework) |
| Backend | Python + BCG AgentKit |
| AI/ML | Ollama + OpenAI API |
| Monitoring | Dynatrace + OpenTelemetry |
| Orchestration | Langchain |
| Containerization | Docker |
| Cloud Platform | Microsoft Azure (Azure Container Instances) |
| Version Control | Git |
✅ AI-based summarization: Uses Ollama and OpenAI models to produce concise, meaningful summaries.
✅ User-friendly interface: Simple web app built with Streamlit.
✅ Backend powered by BCG AgentKit: Handles AI agent interactions efficiently.
✅ Scalable & cloud-deployable: Runs on Azure Container Instances for seamless scaling.
✅ End-to-end monitoring: Integrated with Dynatrace via OpenTelemetry for exporting traces and performance insights.
✅ Efficient orchestration: Uses Langchain to manage AI tasks efficiently.
✅ Containerized deployment: Packaged with Docker for portability.
```bash
git clone https://github.com/Chaimaaorg/SummarizeBot.git
cd SummarizeBot
```
- Rename `.env.example` to `.env`:
  ```bash
  mv .env.example .env
  ```
- Open `.env` and fill in your personal API keys (e.g., OpenAI API key, Dynatrace credentials, etc.); the sketch below shows how the backend might read them.
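As a hedged illustration of this configuration step, the snippet below shows how the backend could load these values with `python-dotenv`. The variable names (`OPENAI_API_KEY`, `DYNATRACE_ENDPOINT`, `DYNATRACE_API_TOKEN`) are assumptions based on the services listed above; check `.env.example` for the actual keys.

```python
# Illustrative only: variable names are assumptions; see .env.example for the real keys.
import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads the .env file in the current working directory

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")            # assumed key for the OpenAI client
DYNATRACE_ENDPOINT = os.getenv("DYNATRACE_ENDPOINT")    # assumed Dynatrace OTLP endpoint
DYNATRACE_API_TOKEN = os.getenv("DYNATRACE_API_TOKEN")  # assumed Dynatrace API token

missing = [name for name, value in {
    "OPENAI_API_KEY": OPENAI_API_KEY,
    "DYNATRACE_ENDPOINT": DYNATRACE_ENDPOINT,
    "DYNATRACE_API_TOKEN": DYNATRACE_API_TOKEN,
}.items() if not value]
if missing:
    raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
```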
Use `docker-compose` to set up and launch the services:

```bash
docker-compose up --build
```
Once the Ollama image is built, you need to add the required models:

```bash
docker exec -it ollama_summarizer bash
ollama pull llama3.2:1b  # Example model
exit
```

(Replace `llama3.2:1b` with the model of your choice if needed.)
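Once a model is pulled, a local Langchain call against the Ollama container might look like the sketch below. This is a hedged example: the `langchain-ollama` integration, the model name, and the default port `11434` are assumptions about how this setup is wired, not code taken from the repo.

```python
# Illustrative sketch: talking to the local Ollama container through Langchain.
from langchain_ollama import ChatOllama  # pip install langchain-ollama

# Assumes the Ollama container exposes the default port 11434 on localhost
# and that the model pulled above (llama3.2:1b) is available.
llm = ChatOllama(model="llama3.2:1b", base_url="http://localhost:11434")

response = llm.invoke("Summarize in one sentence: Ollama serves local language models over HTTP.")
print(response.content)
```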
After pulling the models, restart the service:

```bash
docker-compose up --build
```
Once running, navigate to:
➡️ http://localhost:8501/ to interact with the summarization bot! 🎯🚀
This project exports traces to Dynatrace using OpenTelemetry, ensuring real-time observability and debugging capabilities.
✔ Traces AI interactions and API requests.
✔ Monitors response times, latency, and errors.
✔ Provides performance insights into AI-generated summaries.
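For reference, a minimal OpenTelemetry setup exporting traces to Dynatrace over OTLP/HTTP could look like the following. This is a sketch under assumptions: the endpoint URL pattern and the `Api-Token` header follow Dynatrace's generic OTLP ingest convention, the placeholder values come from your own `.env`, and the project's actual wiring (via BCG AgentKit) may differ.

```python
# Sketch: exporting traces to Dynatrace via OTLP/HTTP (placeholder endpoint and token).
import os
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    # e.g. https://<your-environment-id>.live.dynatrace.com/api/v2/otlp/v1/traces
    endpoint=os.getenv("DYNATRACE_ENDPOINT", ""),
    headers={"Authorization": f"Api-Token {os.getenv('DYNATRACE_API_TOKEN', '')}"},
)

provider = TracerProvider(resource=Resource.create({"service.name": "summarize-bot"}))
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("summarize-request"):
    pass  # the summarization call would run inside this span
```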
```
+-----------------------+
|      User Input       |
+-----------------------+
            ↓
+-----------------------+
|     Streamlit UI      |
+-----------------------+
            ↓
+-----------------------+
|    FastAPI Backend    |
| (BCG AgentKit-based)  |
+-----------------------+
            ↓
+-----------------------+
|    Langchain Agent    |
+-----------------------+
            ↓
+-----------------------+
|  Ollama + OpenAI API  |
+-----------------------+
            ↓
+-----------------------+
|   Dynatrace Tracing   |
|  (via OpenTelemetry)  |
+-----------------------+
            ↓
+-----------------------+
|    Summarized Text    |
+-----------------------+
```
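To show how the flow above could map onto the FastAPI layer, here is a hedged, minimal sketch of a summarization endpoint. The endpoint path, request/response schema, and placeholder logic are assumptions for illustration; the real backend delegates this step to a BCG AgentKit / Langchain agent.

```python
# Hedged sketch of the request flow shown above (endpoint name and schema are assumptions).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="SummarizeBot API (sketch)")

class SummarizeRequest(BaseModel):
    text: str

class SummarizeResponse(BaseModel):
    summary: str

@app.post("/summarize", response_model=SummarizeResponse)
def summarize(request: SummarizeRequest) -> SummarizeResponse:
    # In the real backend, a BCG AgentKit / Langchain agent produces the summary;
    # here a truncated echo stands in for it, purely for illustration.
    summary = request.text[:200]
    return SummarizeResponse(summary=summary)
```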
Developed by Chaimaâ Ourgani 🚀
Feel free to contribute via Pull Requests!