This is a simple chat application built with Express, Socket.io, and a local Ollama model.

Features:
- Real-time chat using Socket.io
- Integration with a local Ollama model to generate responses
- Serves a React frontend
Hardware Requirements:
- Minimum 8 GB RAM
- Minimum 2 GHz dual-core processor
- 500 MB of free disk space
Software Requirements:
- Node.js
- npm
- Ollama model installed and accessible via the command line
- Linux OS (built and tested on Ubuntu)
Clone the repository:
git clone <repository-url>
cd <repository-directory>
Install the dependencies:
npm install
Build the React frontend:
cd /home/subodhi/Desktop/Apps\ 2025/reactjsnew/chat-app
npm install
npm run build
cd -
Install Ollama:
curl -fsSL https://ollama.com/install.sh | sh
Verify Ollama installation:
which ollama
# Expected output: /usr/local/bin/ollama
Check installed models:
ls ~/.ollama/models
# Expected output:
# phi -> ~2.2 GB
# mistral -> ~4–5 GB
Start the server:
node src/server.js
Open your browser and navigate to http://localhost:5000.
- Open the chat application in your browser.
- Enter a message in the chat input and press Enter.
- The message is sent to the server, processed by the local Ollama model, and the response is displayed in the chat.
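The round trip above could be wired roughly like this on the server (the event names, payload shape, and `generate` helper are assumptions for illustration, not the project's actual identifiers):

```javascript
// Pure helper: shape the payload the client renders (illustrative).
function toChatResponse(text) {
  return { role: "assistant", text, at: Date.now() };
}

// Hypothetical Socket.io wiring (requires the socket.io package):
// io.on("connection", (socket) => {
//   socket.on("chat:message", async (msg) => {
//     const reply = await generate(msg.text); // call the local Ollama model
//     socket.emit("chat:response", toChatResponse(reply));
//   });
// });
```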
Run the Ollama model:
ollama run phi
Check if Ollama is already running:
ps aux | grep ollama
Restart Ollama:
kill <pid>
ollama serve
- Ensure the Ollama model is installed and accessible via the command line.
- Check the server logs for any errors related to the exec function.
This project is licensed under the MIT License.