Parakeet4Shell

🦜πŸͺΊ 🐚 Parakeet4Shell is a set of scripts made to simplify the development of small Bash generative AI applications with Ollama πŸ¦™.

Requirements

The example scripts assume a running Ollama instance (by default at http://localhost:11434) and the curl and jq command-line tools.
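
For example, you can check that Ollama is reachable and pull the tinyllama model used in the examples below:

# The Ollama root endpoint answers "Ollama is running" when the server is up
curl http://localhost:11434

# Download the model referenced by the example scripts
ollama pull tinyllama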

How to use

Add this at the beginning of your script:

. "./lib/parakeet.sh"

Let's have a look at the example folder.

Chat completion without streaming

#!/bin/bash
. "./lib/parakeet.sh"

OLLAMA_URL=${OLLAMA_URL:-http://localhost:11434}

MODEL="tinyllama"

# System instructions
read -r -d '' SYSTEM_CONTENT <<- EOM
You are an expert of the StarTrek universe. 
Your name is Seven of Nine.
Speak like a Borg.
EOM

# User message
read -r -d '' USER_CONTENT <<- EOM
Who is Jean-Luc Picard?
EOM

# Sanitize the prompts so they can be embedded safely in the JSON payload
SYSTEM_CONTENT=$(Sanitize "${SYSTEM_CONTENT}")
USER_CONTENT=$(Sanitize "${USER_CONTENT}")

# Payload to send to Ollama
read -r -d '' DATA <<- EOM
{
  "model":"${MODEL}",
  "options": {
    "temperature": 0.5,
    "repeat_last_n": 2
  },
  "messages": [
    {"role":"system", "content": "${SYSTEM_CONTENT}"},
    {"role":"user", "content": "${USER_CONTENT}"}
  ],
  "stream": false,
  "raw": false
}
EOM

# Send the request to Ollama and capture the JSON response
jsonResult=$(Chat "${OLLAMA_URL}" "${DATA}")

# Extract the assistant's reply (-r strips the surrounding JSON quotes)
messageContent=$(echo "${jsonResult}" | jq -r '.message.content')

echo "${messageContent}" 

Chat completion with streaming

#!/bin/bash
. "./lib/parakeet.sh"

OLLAMA_URL=${OLLAMA_URL:-http://localhost:11434}

MODEL="tinyllama"

# System instructions
read -r -d '' SYSTEM_CONTENT <<- EOM
You are an expert of the StarTrek universe. 
Your name is Seven of Nine.
Speak like a Borg.
EOM

# User message
read -r -d '' USER_CONTENT <<- EOM
Who is Jean-Luc Picard?
EOM

# Sanitize the prompts so they can be embedded safely in the JSON payload
SYSTEM_CONTENT=$(Sanitize "${SYSTEM_CONTENT}")
USER_CONTENT=$(Sanitize "${USER_CONTENT}")

# Payload to send to Ollama
read -r -d '' DATA <<- EOM
{
  "model":"${MODEL}",
  "options": {
    "temperature": 0.5,
    "repeat_last_n": 2
  },
  "messages": [
    {"role":"system", "content": "${SYSTEM_CONTENT}"},
    {"role":"user", "content": "${USER_CONTENT}"}
  ],
  "stream": true
}
EOM

# This function will be called for each chunk of the response
function onChunk() {
  chunk=$1
  data=$(echo "${chunk}" | jq -r '.message.content')
  echo -n "${data}"
}

# Stream the completion; onChunk is invoked for each chunk received
ChatStream "${OLLAMA_URL}" "${DATA}" onChunk
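
Each chunk is printed as soon as it arrives, so nothing keeps the full answer once the stream ends. One way to keep it (a sketch that only assumes ChatStream passes each JSON chunk to the callback as its first argument, as above) is to have the callback append every piece to a file:

# Callback that prints each chunk and also appends it to a file,
# so the complete answer is still available after streaming finishes.
function onChunkAndSave() {
  chunk=$1
  data=$(echo "${chunk}" | jq -r '.message.content')
  echo -n "${data}"
  echo -n "${data}" >> answer.txt
}

: > answer.txt   # start from an empty file
ChatStream "${OLLAMA_URL}" "${DATA}" onChunkAndSave
echo ""
echo "Full answer saved to answer.txt"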

Acknowledgments:

  • Thanks to Sylvain for the discussion on curl callbacks.
  • Thanks to Gemini for all the discussions on Bash.
