Run Local LLM Offline

LetMeDoIt AI supports open-source Hugging Face language models via the local LLM server 'Ollama'.

An Ollama chatbot is included so that users can chat with various open-source models offline.

We also created plugins that seamlessly integrate the offline chatbot with the LetMeDoIt AI prompt.

Requirements

LLM server: Install 'Ollama' first. See https://ollama.com for instructions.

Memory: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
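
As a quick check, assuming Ollama's standard command-line interface, you can confirm the server is installed and optionally pre-download a model from a terminal (LetMeDoIt AI can also download a chosen model automatically on first use; 'mistral' below is only an example):

```
# Confirm the Ollama CLI is installed
ollama --version

# Optionally pre-download a model; otherwise LetMeDoIt AI
# downloads the chosen model on first use
ollama pull mistral

# List models already downloaded locally
ollama list
```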

Ollama Chat

Launch Ollama Chat and select a model.

Find model information at https://ollama.com/library

[Screenshot: hugging_face_models]

During the initial execution of a newly chosen model, LetMeDoIt AI will automatically download it for you.

[Screenshot: gemma1]

After the initial download, you can enter your query in the Ollama chat prompt.

[Screenshot: gemma2]

Plugins

Related plugins are: ask ollama, ask gemma, ask mistral, ask llama2, ask llava

The 'ask ollama' plugin is the wildcard option, as it lets you choose any language model supported by Ollama. Enabling these plugins allows you to ask Ollama / Gemma / Mistral / Llama2 / Llava directly from the LetMeDoIt AI prompt, e.g.:

ask mistral the same question

[Screenshot: ask_mistral]
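
For illustration, entries like the following can be typed directly into the LetMeDoIt AI prompt once the corresponding plugins are enabled (the queries here are examples; replies depend on which models you have set up in Ollama):

```
ask gemma What is the capital of France?
ask mistral the same question
ask ollama Summarize our previous conversation in one sentence
```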

Launch from Tray Menu

You can launch Ollama chat models from the tray menu.

[Screenshot: tray_menu_updated]

Edit Tray Menu

You can modify the extended items displayed on the LetMeDoIt AI tray menu:

  1. Run '.editconfigs' in LetMeDoIt AI prompt

  2. Locate and edit the item 'customTrayCommands' # default: ['mistral', 'llama2', 'gemma7b', 'llava'] (see the example after this list)

  3. Press ctrl+s to save, then ctrl+q to return to the LetMeDoIt AI prompt
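
As a sketch, assuming the Python-list format suggested by the default value above, the edited entry might look like this ('phi' is added here purely as an example; each name must be a model supported by Ollama):

```
customTrayCommands = ['mistral', 'llama2', 'gemma7b', 'llava', 'phi']
```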

Launch from CLI Commands

Available CLI commands: 'ollamachat', 'mistral', 'llama2', 'llama213b', 'llama270b', 'gemma2b', 'gemma7b', 'llava', 'phi', 'vicuna'

All of these commands accept an optional argument, which is used as the default entry.

Only 'ollamachat' accepts the -m option to specify a model.
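
For example, assuming these commands are on your PATH after installing LetMeDoIt AI and that the optional default entry is passed as a quoted argument, typical invocations might look like:

```
# Start a chat with the Mistral model
mistral

# Start a chat and supply a default entry
gemma7b "Explain what a context window is"

# Use the generic launcher and pick a model with -m
ollamachat -m phi
```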
