
Commit

Fix/echo (#58)
* feat: Update QLLM CLI to support running commands in chat

* fix(packages/qllm-cli/src): improve CLI config management and validation

* fix(packages/qllm-cli): update chat config from cli config

* fix(packages/qllm-cli/src/utils): remove unused type ConfigKey

* fix(packages/qllm-lib/src/providers/ollama): remove unused import

* fix(cli): update readme with minor formatting improvements

* fix(packages/qllm-cli): update readme with installation instructions

* update

* fix(.): (readme): enhance documentation with more details and examples

* fix(readme): update chapter structure and content

* chore: Update README with additional resources

* chore: Update README with additional resources

* fix(README.md): update README with new content and formatting

* fix(.): Improve README.md formatting and add acknowledgements section

* fix(packages/qllm-cli): Update README.md and chat-command.ts

* fix(packages/qllm-cli): improve configuration and fix several issues
raphaelmansuy committed Sep 5, 2024
1 parent ed93b06 commit a5e14a0
Showing 27 changed files with 642 additions and 475 deletions.
505 changes: 294 additions & 211 deletions README.md

Large diffs are not rendered by default.

11 changes: 11 additions & 0 deletions packages/qllm-cli/CHANGELOG.md
@@ -1,5 +1,16 @@
# qllm

## 2.9.0

### Minor Changes

- Improve configuration and several fixes

### Patch Changes

- Updated dependencies
- qllm-lib@3.6.0

## 2.8.0

### Minor Changes
104 changes: 57 additions & 47 deletions packages/qllm-cli/README.md
@@ -1,5 +1,13 @@
# QLLM: Quantalogic Large Language Model CLI & AI Toolbox 🚀

![npm version](https://img.shields.io/npm/v/qllm)
![Stars](https://img.shields.io/github/stars/quantalogic/qllm)
![Forks](https://img.shields.io/github/forks/quantalogic/qllm)

## Table of Contents

...

## Table of Contents

1. [Introduction](#1-introduction)
@@ -82,15 +90,17 @@ QLLM CLI boasts an impressive array of features designed to elevate your AI inte

To embark on your QLLM CLI journey, ensure you have Node.js (version 14 or higher) installed on your system. Then, execute the following command:

```bash
- **Install QLLM CLI globally:**

```
npm install -g qllm
```

This global installation makes the `qllm` command readily available in your terminal.

Verify the installation with:

```bash
```
qllm --version
```

@@ -104,7 +114,7 @@ Before diving into the world of AI interactions, configure QLLM CLI with your AP

Initiate the interactive configuration mode:

```bash
```
qllm configure
```

@@ -140,7 +150,7 @@ This command allows you to manage configuration settings for the QLLM CLI.

Examples:

```bash
```
qllm configure --set provider=openai
qllm configure --set model=gpt-4
```
@@ -149,7 +159,7 @@ qllm configure --set model=gpt-4

Display your current settings at any time:

```bash
```
qllm configure --list
```

@@ -167,13 +177,13 @@ QLLM CLI offers a variety of commands for interacting with LLMs. Here's an overv

QLLM CLI allows you to run templates directly. This is now the default behavior when no specific command is provided:

```bash
```
qllm <template_url>
```

For example:

```bash
```
qllm https://raw.githubusercontent.com/quantalogic/qllm/main/prompts/chain_of_tought_leader.yaml
```

@@ -192,39 +202,39 @@ The `run` command supports various options:

#### Using with Piped Input

```bash
```
echo "Explain quantum computing" | qllm ask
```

or

```bash
```
cat article.txt | qllm ask "Summarize this text"
```

#### Image Analysis

```bash
```
qllm ask "Describe this image" -i path/to/image.jpg
```

#### Streaming Responses

```bash
```
qllm ask "Write a short story about AI" -s
```

#### Saving Output to File

```bash
```
qllm ask "Explain the theory of relativity" -o relativity_explanation.txt
```

### Interactive Chat

Start an interactive chat session:

```bash
```
qllm chat
```

@@ -252,13 +262,13 @@ The `chat` command also supports options similar to the `ask` command for settin

View available providers:

```bash
```
qllm list providers
```

List models for a specific provider:

```bash
```
qllm list models openai
```

@@ -273,7 +283,7 @@ The `list models` command offers several options:

Manage your settings at any time:

```bash
```
qllm configure --set model gpt-4
qllm configure --get logLevel
qllm configure --list
@@ -287,7 +297,7 @@ QLLM CLI offers sophisticated features for power users:

Include images in your queries for visual analysis:

```bash
```
qllm ask "Describe this image" -i path/to/image.jpg
```

@@ -300,27 +310,27 @@ QLLM CLI supports multiple image input methods:

Use an image from your clipboard:

```bash
```
qllm ask "What's in this image?" --use-clipboard
```

Capture and use a screenshot:

```bash
```
qllm ask "Analyze this screenshot" --screenshot 0
```

Combine multiple image inputs:

```bash
```
qllm ask "Compare these images" -i image1.jpg -i image2.jpg --use-clipboard
```

### Streaming Responses

For long-form content, stream the output in real-time:

```bash
```
qllm ask "Write a short story about AI" -s
```

@@ -330,7 +340,7 @@ This feature allows you to see the AI's response as it's generated, providing a

Save the LLM's response directly to a file:

```bash
```
qllm ask "Explain the theory of relativity" -o relativity_explanation.txt
```

@@ -364,62 +374,62 @@ Each command supports various options. Use `qllm <command> --help` for detailed

Explore these example use cases for QLLM CLI:

1. Creative Writing Assistance:
1. **Creative Writing Assistance:**

```bash
```
qllm ask "Write a haiku about artificial intelligence"
```

2. Code Explanation:
2. **Code Explanation:**

```bash
```
qllm ask "Explain this Python code: [paste your code here]"
```

3. Image Analysis:
3. **Image Analysis:**

```bash
```
qllm ask "Describe the contents of this image" -i vacation_photo.jpg
```

4. Interactive Problem-Solving:
4. **Interactive Problem-Solving:**

```bash
```
qllm chat -p anthropic -m claude-2
```

5. Data Analysis:
5. **Data Analysis:**

```bash
```
qllm ask "Analyze this CSV data: [paste CSV here]" --max-tokens 500
```

6. Language Translation:
6. **Language Translation:**

```bash
```
qllm ask "Translate 'Hello, world!' to French, Spanish, and Japanese"
```

7. Document Summarization:
7. **Document Summarization:**

```bash
```
qllm ask "Summarize this article: [paste article text]" -o summary.txt
```

8. Character Creation:
8. **Character Creation:**

```bash
```
qllm ask "Create a detailed character profile for a sci-fi novel"
```

9. Recipe Generation:
9. **Recipe Generation:**

```bash
```
qllm ask "Create a recipe using chicken, spinach, and feta cheese"
```

10. Workout Planning:
```bash
10. **Workout Planning:**
```
qllm ask "Design a 30-minute HIIT workout routine"
```

@@ -429,7 +439,7 @@ If you encounter issues while using QLLM CLI, try these troubleshooting steps:

1. Verify your API keys are correctly configured:

```bash
```
qllm configure --list
```

@@ -439,7 +449,7 @@ If you encounter issues while using QLLM CLI, try these troubleshooting steps:

3. Update to the latest version of QLLM CLI:

```bash
```
npm update -g qllm
```

@@ -455,15 +465,15 @@ If problems persist, please open an issue on our GitHub repository with a detail

## 10. Contributing

We warmly welcome contributions to QLLM CLI! To contribute, please follow these steps:
We warmly welcome contributions to QLLM CLI! This project is licensed under the Apache License, Version 2.0. To contribute, please follow these steps:

1. Fork the repository on GitHub.
2. Clone your forked repository to your local machine.
3. Create a new branch for your feature or bug fix.
4. Make your changes, adhering to the existing code style and conventions.
5. Write tests for your changes if applicable.
6. Run the existing test suite to ensure your changes don't introduce regressions:
```bash
```
npm test
```
7. Commit your changes with a clear and descriptive commit message.
@@ -479,7 +489,7 @@ Please ensure your code adheres to our coding standards:

We use GitHub Actions for CI/CD, so make sure your changes pass all automated checks.

## 11. License
### License

This project is licensed under the Apache License, Version 2.0. You may obtain a copy of the License at

2 changes: 1 addition & 1 deletion packages/qllm-cli/package.json
@@ -1,6 +1,6 @@
{
"name": "qllm",
"version": "2.8.0",
"version": "2.9.0",
"description": "QLLM CLI: A versatile CLI tool for interacting with multiple AI/LLM providers. Features include chat sessions, one-time queries, image handling, and conversation management. Streamlines AI development with easy provider/model switching and configuration.",
"keywords": [
"ai",
1 change: 0 additions & 1 deletion packages/qllm-cli/src/chat/chat.ts
@@ -67,7 +67,6 @@ export class Chat {

private async promptUser(): Promise<void> {
try {

const input = await this.ioManager.getUserInput("You: ");

// Check if input is undefined (e.g., due to Ctrl+C)
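The `chat.ts` hunk above keeps the comment noting that the user-input call can resolve to `undefined` (e.g. when the prompt is aborted with Ctrl+C). A minimal sketch of that guard, using a hypothetical `handleInput` helper that is not taken from the repository:

```typescript
// Illustrative only: guard against an aborted prompt, where the input
// reader resolves to undefined instead of a string (e.g. on Ctrl+C).
function handleInput(input: string | undefined): string | null {
    if (input === undefined) {
        // Prompt was aborted; signal the caller to end the turn.
        return null;
    }
    // Normalize ordinary input before dispatching it.
    return input.trim();
}
```

The point of the sketch is that the undefined case is handled before any string methods are called, so an interrupt never throws.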
2 changes: 1 addition & 1 deletion packages/qllm-cli/src/chat/command-processor.ts
@@ -98,7 +98,7 @@ export class CommandProcessor {
provider: config.get("provider"),
maxTokens: config.get("maxTokens"),
temperature: config.get("temperature"),
stream: true,
noStream: false,
});

if (result && conversationId) {
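The `command-processor.ts` hunk replaces `stream: true` with `noStream: false`, which suggests the option was inverted from an opt-in stream flag to an opt-out one, keeping streaming as the default. A hedged sketch of that inversion (the `AskOptions` shape and `shouldStream` helper are illustrative, not from the repository):

```typescript
// Illustrative only: derive the effective streaming behaviour from an
// opt-out `noStream` option, so streaming remains the default.
interface AskOptions {
    noStream?: boolean;
}

function shouldStream(options: AskOptions): boolean {
    // Absent or false `noStream` both mean "stream the response".
    return !options.noStream;
}
```

Under this reading, `{ noStream: false }` in the diff is behaviourally equivalent to the old `{ stream: true }`, and callers that omit the flag still get streaming.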