MoodleGPT Local Runtime
Supercharge your extension with offline, strictly confidential AI computation hosted entirely on your own machine.
100% Private, Local Inference
Leverage local LLMs (such as Llama 3 via Ollama) directly on your device. Zero queries, answers, or quiz data are transmitted to external servers: everything stays isolated on your machine, keeping your data entirely under your control.
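To make the "nothing leaves your machine" claim concrete, here is a minimal sketch of what a local query might look like, assuming Ollama's default HTTP endpoint on localhost:11434 (the model name and prompt are illustrative, and the Local Runtime's actual internals may differ):

```python
import json
import urllib.request

# Ollama's default local endpoint; requests never leave localhost.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for a single, non-streaming generation call."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(model: str, prompt: str) -> str:
    """Send the prompt to the locally running model and return its answer."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires Ollama running with the model pulled):
# answer = ask_local_llm("llama3", "Summarize lecture 3 in two sentences.")
```

Because the call targets localhost, no quiz content ever crosses the network boundary of your machine.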
Zero API Costs
No need for paid Gemini or OpenAI API keys. The Local Runtime uses your own hardware to generate responses, enabling unlimited solving with no token metering.
RAG System (Knowledge Base)
The Local Runtime includes a custom Retrieval-Augmented Generation (RAG) knowledge base. Drop texts and course documents into the app, and the AI will search your lecture material to construct accurate, course-specific answers.
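The retrieval step of a RAG pipeline can be sketched in miniature: rank course snippets by similarity to the question, then feed the best matches to the model as context. This toy version uses bag-of-words cosine similarity purely for illustration; the actual Knowledge Base would use a real embedding model:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a term-frequency vector over lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k course snippets most similar to the question."""
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Photosynthesis converts light energy into chemical energy.",
    "The mitochondria is the powerhouse of the cell.",
    "Newton's second law relates force, mass and acceleration.",
]
# The top-ranked snippets are then prepended to the LLM prompt as context.
context = retrieve("What does photosynthesis convert?", docs, k=1)
```

The design point is that the model only ever sees a handful of relevant passages rather than the whole corpus, which keeps prompts short and answers grounded in your own lecture material.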
Seamless Extension Integration
It acts as a backend bridge for your MoodleGPT 3 Chrome Extension. Just keep it running in the background. The extension will automatically detect it and route queries through Native Messaging.
Version 1.0.0 | ~95MB | Requires a 64-bit Windows system
How does it connect?
Install the application, open it, and leave it running. The Chrome extension uses Chrome's Native Messaging protocol to communicate securely with the local client, bypassing traditional web requests entirely.