A locally run AI assistant that uses Ollama (WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B, or an LLM of your choice) and scrapes the web using Bright Data's Unlocker MCP tools, all orchestrated in Python and Streamlit with some Node.js under the hood. No cloud LLMs. No data sharing. Just local compute magic.
Local, Private, and Powerful, because your prompts are nobody else's business.
- Python + Asyncio
- LangChain
- Streamlit
- Bright Data MCP
- Ollama
- Terminal spinner magic courtesy of `itertools`
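The terminal spinner can be sketched with `itertools.cycle`; the frame set and timing below are illustrative assumptions, not necessarily what this repo ships:

```python
import itertools
import sys
import time

FRAMES = "|/-\\"  # assumed frame set; the repo may use different glyphs

def spin(cycles: int = 12, delay: float = 0.05) -> None:
    """Draw a rotating spinner by redrawing frames over the same terminal cell."""
    for frame in itertools.islice(itertools.cycle(FRAMES), cycles):
        sys.stdout.write(f"\r{frame}")
        sys.stdout.flush()
        time.sleep(delay)
    sys.stdout.write("\r \r")  # erase the spinner when done
```

`itertools.cycle` repeats the frames forever, and `islice` caps how many are drawn; in the real app the loop would instead run until the scrape-and-summarize task finishes.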
- Go to brightdata.com
- Under Proxies and Scraping, create an `Unlocker_MCP` zone
- Make sure to:
  - Allow Admin Access
  - Set the token to never expire
- Paste the token into your terminal session:
export BRD_API_KEY=your_token_here
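To confirm the variable actually landed in the current shell session, a quick check (with a placeholder token value) looks like this:

```shell
# Export the Bright Data token (placeholder value) and confirm it is set.
export BRD_API_KEY=your_token_here
echo "${BRD_API_KEY:+BRD_API_KEY is set}"
```

If nothing prints, the export did not take effect in this shell; note the variable only lives for the current session unless you add it to your shell profile.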
They offer free credits on signup (no credit card needed). Ignore the payment screens and click through.
git clone https://github.com/drewesk/ai-mcp-py.git
cd ai-mcp-py
conda create -n mcp_env python=3.12
conda activate mcp_env
conda install nodejs
Make sure `node`, `npx`, and `conda` work from your terminal. You may need to update your `.zshrc` or `.bashrc`.
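A quick way to check all three at once, before moving on, is a small loop that reports anything missing from your PATH:

```shell
# Report any of the required CLIs that do not resolve on PATH.
for cmd in node npx conda; do
  command -v "$cmd" >/dev/null 2>&1 || echo "missing: $cmd"
done
```

No output means all three commands resolve; otherwise, re-source your `.zshrc`/`.bashrc` or re-run the conda install step.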
pip install -r requirements.txt
npx @brightdata/mcp API_TOKEN=$BRD_API_KEY
In a separate terminal:
ollama serve
ollama run WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B
Make sure the model runs smoothly on your machine (GPU preferred).
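Once `ollama serve` is up, you can smoke-test the model outside Streamlit via Ollama's HTTP API on port 11434. The helper names below are illustrative, not part of this repo:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B"

def build_payload(prompt: str, stream: bool = False) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": MODEL, "prompt": prompt, "stream": stream}

def ask_local_llm(prompt: str) -> str:
    """POST a prompt to the local Ollama server and return the generated text."""
    data = json.dumps(build_payload(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.loads(resp.read())["response"]
```

Calling `ask_local_llm("Say hello in one sentence.")` should return text promptly on a GPU; a very slow response here will also show up as slow summaries in the app.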
streamlit run mcp_app.py
- Paste URL: https://medium.com/ayuth/install-anaconda-on-macos-with-homebrew-c94437d63a37
- Prompt: "What does this article tell me to do?"
- Hit Submit and wait for the spinner to complete
- Voilà! You'll see a local LLM-generated summary right on screen
Open PRs, make suggestions, or fork at your leisure.