
Ollama Studio

A chat interface with local LLMs using Ollama.

OllamaChat Screenshot

Table of Contents

  • Introduction
  • Features
  • Installation

Introduction

OllamaChat is a chat interface designed to work with local large language models (LLMs) through Ollama. The project aims to provide a robust chat application with local-LLM support, ensuring both privacy and performance.

Features

  • Local LLM support using Ollama
  • Easy setup and deployment with Django
  • Secure and private communication

Installation

Prerequisites

Before you begin, ensure you have met the following requirements:

  • Python 3.x
  • Django
  • Ollama

Setup

  1. Clone the Repository
    git clone https://github.com/traromal/OllamaChat.git
    cd OllamaChat
  2. Set Up a Virtual Environment
    python3 -m venv venv
    source venv/bin/activate
  3. Install Django and Other Dependencies
    pip install django
  4. Configure Django
    Navigate to the OllamaChat directory and apply the initial migrations:
    python manage.py migrate
  5. Run the Development Server
    python manage.py runserver
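The repository's backend code is not reproduced here, so as a rough illustration only, the sketch below shows how a Django view might send a prompt to Ollama's local HTTP API. The endpoint (http://localhost:11434/api/generate) is Ollama's default, and the model name `llama3` is just an example; both may differ in your setup:

```python
import json
import urllib.request

# Default Ollama endpoint; change if your daemon listens elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a stream
    of newline-delimited chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask_ollama(model: str, prompt: str, url: str = OLLAMA_URL) -> str:
    """Send the prompt to a locally running Ollama instance and
    return the generated text. Requires `ollama serve` to be running
    and the model to be pulled."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A Django view could call `ask_ollama("llama3", user_message)` and return the result as the chat reply; for a more responsive UI, set `"stream": True` and forward the chunks as they arrive.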


