perpendicularai/LlaMail

📬 LlaMail :octocat:

A Windows console email client powered by LlamaCpp

📰 Project Background

Reading through endless amounts of email is cumbersome, and many would agree that their inbox is a sheer mess. To address this dilemma, LlaMail uses generative AI to formulate a response to each email read over IMAP. This arguably eliminates skipping or missing an email and can help drive toward inbox zero. The project is strongly inspired by Microsoft's Semantic Kernel Function Calling Stepwise Planner.

🗺️ Getting started

  • Firstly, you'll need (1) the address of your mail server, (2) the email address of the account you would like to log into, and (3) its password. Sending mail uses SMTP on port 587 (not IMAP, which is only used for reading), and this port is hardcoded as the default.
  • Ensure that you have LlamaCpp_Python installed
  • Clone the APIinaShell repo and serve it with your GGUF model of choice. You may obtain a GGUF model from Hugging Face.
  • Serve the model with your desired host and port configuration.
  • Once that has been done, navigate to Releases. First download the zip archive named "Source code" and extract its contents, then download the msixbundle and store it in the same directory. From there you can install it either with PowerShell or by double-clicking the msixbundle package. Once the program has been installed, it can be launched from the Start menu. See below:
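As a rough illustration of the first step, the three credentials map onto two connections: IMAP for reading the inbox and SMTP (port 587, the hardcoded default mentioned above) for sending replies. The sketch below uses Python's standard `imaplib` and `smtplib`; the helper names and the IMAP port are assumptions for illustration, not LlaMail's actual code:

```python
import imaplib
import smtplib

IMAP_PORT = 993  # conventional IMAP-over-SSL port for reading mail (assumption)
SMTP_PORT = 587  # SMTP submission port for sending, the hardcoded default

def open_inbox(host: str, user: str, password: str) -> imaplib.IMAP4_SSL:
    """Log into the mail server and select the inbox.

    Hypothetical helper: host, user and password are the three
    credentials listed in the first step above.
    """
    conn = imaplib.IMAP4_SSL(host, IMAP_PORT)
    conn.login(user, password)
    conn.select("INBOX")
    return conn

def open_smtp(host: str, user: str, password: str) -> smtplib.SMTP:
    """Open an SMTP connection for sending replies (STARTTLS on port 587)."""
    conn = smtplib.SMTP(host, SMTP_PORT)
    conn.starttls()
    conn.login(user, password)
    return conn
```

Many providers also expose IMAP on a different port or require app-specific passwords, so check your mail server's documentation for the exact host names.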
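Once a message has been fetched over IMAP, the generative step amounts to turning the raw email into a prompt for the served GGUF model. A minimal sketch of that transformation, using only the standard `email` module (the helper name and prompt wording are illustrative assumptions, not LlaMail's actual implementation):

```python
import email

def build_reply_prompt(raw_bytes: bytes) -> str:
    """Turn a raw RFC 822 message into a prompt asking the model for a reply.

    Hypothetical helper: LlaMail's real prompt format may differ.
    """
    msg = email.message_from_bytes(raw_bytes)
    subject = msg.get("Subject", "(no subject)")
    sender = msg.get("From", "(unknown sender)")
    # Extract the plain-text body, skipping attachments in multipart messages
    body = ""
    if msg.is_multipart():
        for part in msg.walk():
            if part.get_content_type() == "text/plain":
                body = part.get_payload(decode=True).decode(errors="replace")
                break
    else:
        body = msg.get_payload(decode=True).decode(errors="replace")
    return (
        f"Write a polite reply to this email from {sender} "
        f"with subject '{subject}':\n\n{body}\n\nReply:"
    )
```

The returned string would then be sent to the model served by APIinaShell, and the completion used as the draft reply.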

📷 Photo Gallery

📹 Short Films

llamail_0.mp4