
Commit de402ba

Merge pull request #69 from meysamhadeli/docs/update-doc
docs: update doc
meysamhadeli authored Nov 16, 2024
2 parents a57911a + 4ec280b commit de402ba
Showing 1 changed file with 4 additions and 20 deletions.
24 changes: 4 additions & 20 deletions README.md
@@ -49,12 +49,12 @@ $env:API_KEY="your_api_key"
### 🔧 Configuration
`codai` requires a `config.yml` file in the `root of your working directory`, or you can use `environment variables` to set the configs below `globally`.
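For the environment-variable route, here is a minimal sketch; `API_KEY` is the variable shown earlier in this README, while overriding other config keys the same way is an assumption to check against codai's documentation.

```bash
# Set the API key globally for the current shell session (bash);
# the README's PowerShell form is: $env:API_KEY="your_api_key"
export API_KEY="your_api_key"
# Then invoke codai from the root of the project you want to analyze.
```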

By default, codai works with the `openai` provider, and the `config.yml` contains the following values:
The `config` file should look like the following example, based on your `AI provider`:

**config.yml - openai sample**
**config.yml**
```yml
ai_provider_config:
provider_name: "openai"
provider_name: "openai" # openai | ollama
chat_completion_url: "https://api.openai.com/v1/chat/completions"
chat_completion_model: "gpt-4o"
embedding_url: "https://api.openai.com/v1/embeddings" #(Optional, If you want use RAG.)
@@ -65,23 +65,7 @@ theme: "dracula"
rag: true #(Optional, if you want to use RAG.)
```
Also, to configure the `ollama` provider, the `config.yml` contains the following values:

**config.yml - ollama sample**

```yml
ai_provider_config:
provider_name: "ollama"
chat_completion_url: "http://localhost:11434/v1/chat/completions"
chat_completion_model: "llama3.1"
embedding_url: "http://localhost:11434/v1/embeddings" #(Optional, If you want use RAG.)
embedding_model: "all-minilm:l6-v2" #(Optional, If you want use RAG.)
temperature: 0.2
threshold: 0.3 #(Optional, If you want use RAG.)
theme: "dracula"
rag: true #(Optional, If you want use RAG.)
```

> Note: For the `ollama` provider, use the base URL `http://localhost:11434` for the chat and embeddings URLs, and use your `desired models` for the chat and embeddings models.
> Note: We use the standard integration of the [OpenAI APIs](https://platform.openai.com/docs/api-reference/introduction) and [Ollama APIs](https://github.com/ollama/ollama/blob/main/docs/api.md); you can find more details in the documentation of each API.
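As a hedged illustration of that shared request shape (a sketch, not taken from the codai source), the same OpenAI-style chat completions body can be sent to either provider's `chat_completion_url` from the configs above; only the host and the auth header differ:

```bash
# Minimal sketch: an OpenAI-compatible chat completions call against the local
# Ollama endpoint from the config above. For OpenAI, swap the URL for
# https://api.openai.com/v1/chat/completions and add: -H "Authorization: Bearer $API_KEY"
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "Summarize this repository."}]
      }'
```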

If you wish to customize your configuration, you can create your own `config.yml` file and place it in the `root directory` of `each project` you want to analyze with codai. If `no configuration` file is provided, codai will use the `default settings`.
