
Contributors Forks Stargazers Issues

## About The Project

This project aims to let users easily run a good LLM locally on their laptops and other machines.

All credit goes to the Llama 2 team, Luna AI, TheBloke, and the llama.cpp project authors.

This project has been archived in favour of ollama, which is simply far more convenient.

(back to top)

## Getting Started

### Installation

1. Clone this repository:
   `git clone https://github.com/beeemT/llama2-chat-aur.git`
2. Install with makepkg:
   `cd llama2-chat-aur && makepkg -sic`

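To sanity-check the result, pacman can show what was installed. This is a minimal sketch that assumes the built package is named `llama2-chat`; if the query comes up empty, check the `pkgname` in this repository's PKGBUILD.

```sh
# Show metadata for the installed package (package name assumed to be
# llama2-chat; check the PKGBUILD's pkgname if nothing is found).
pacman -Qi llama2-chat

# List the files the package installed, e.g. the launcher script and model files.
pacman -Ql llama2-chat
```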
(back to top)

## Usage

Run `llama2-chat` in the terminal of your choice.
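As a minimal sketch, a session is just the launcher started from a shell; the exact prompt, model location, and options depend on the wrapper script shipped by the package.

```sh
# Start an interactive chat session in any terminal.
# The launcher is expected to run the bundled Llama 2 based model via
# llama.cpp; exit with Ctrl+C or the wrapper's quit command.
llama2-chat
```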

(back to top)

## Acknowledgments

(back to top)