This is a local Llama 3 deployment served with vLLM, maintained by the PCS lab.

Llama 3 on Your Local Computer

Run the Llama 3 model (8B or 70B) on your server or local computer.

Getting Started

Installation

  1. Clone the repository:
git clone https://github.com/IoTtalk/pcs_llama3.git
cd pcs_llama3
  2. Install the required dependencies:
pip install -r requirements.txt

Note that Python 3.12 is not supported; use an earlier Python 3 release.
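Since Python 3.12 is not supported, it can help to pin the interpreter explicitly when setting up the environment. The following is a hypothetical setup sketch, not part of the repository: it assumes python3.11 is installed (any 3.x release below 3.12 should also work), and the check_python helper is an illustration, not something the project provides.

```shell
# Hypothetical setup helper (assumption: python3.11 is available).
# Refuses Python 3.12, which this project does not support.

check_python() {
    # $1 is a "major.minor" version string, e.g. "3.11"
    minor="${1#3.}"
    [ "$minor" -lt 12 ]
}

PY=python3.11

# Only proceed if the interpreter exists and its minor version is supported.
if command -v "$PY" >/dev/null 2>&1 && check_python "3.11"; then
    "$PY" -m venv .venv            # isolated environment for the project
    . .venv/bin/activate
    pip install -r requirements.txt
fi
```

Using a dedicated virtual environment also keeps the vLLM dependency stack from conflicting with system packages.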

Usage

python llama3_local.py

If you plan to run the model long-term on a Linux server, it is recommended to launch the script inside a tmux session so it keeps running after you disconnect.
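A minimal sketch of the tmux workflow described above. The session name "llama3" is an arbitrary choice, not something the repository prescribes:

```shell
# Start a detached tmux session running the model script.
SESSION="llama3"
CMD="python llama3_local.py"
tmux new-session -d -s "$SESSION" "$CMD"

# Reattach later to check on the process:
#   tmux attach -t llama3
# Detach again without stopping it with Ctrl-b d.
```

The process survives SSH disconnects because it is owned by the tmux server rather than your login shell.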
