ollama-haskell is an unofficial Haskell binding for Ollama, similar to ollama-python.

This library allows you to interact with Ollama, a tool for running large language models (LLMs) locally, from within your Haskell projects.
```haskell
{-# LANGUAGE OverloadedStrings #-}

module Lib where

import Control.Monad (void)
import qualified Data.Text.IO as T
import Ollama (GenerateOps (..), defaultGenerateOps, generate)
import qualified Ollama

main :: IO ()
main = do
  -- Stream the model's reply to stdout as chunks arrive.
  void $
    generate
      defaultGenerateOps
        { modelName = "llama3.2"
        , prompt = "what is functional programming?"
        , stream = Just (T.putStr . Ollama.response_, pure ())
        }
```
```
ghci> import Lib
ghci> main
Whether Haskell is a "good" language depends on what you're looking for in a programming language and your personal preferences. Here are some points to consider:
**Pros:**
1. **Strongly typed**: Haskell's type system ensures that you catch errors early, which leads to fewer bugs and easier maintenance.
2. **Functional programming paradigm**: Haskell encourages declarative coding, making it easier to reason about code and write correct programs.
3. **Garbage collection**: Haskell handles memory management automatically, freeing you from worries about manual memory deallocation.
```
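If you want the full response as a value rather than printed output, one option is to accumulate the streamed chunks yourself. The sketch below is an assumption-laden variant of the example above: it reuses only the names shown there (`defaultGenerateOps`, `generate`, `modelName`, `prompt`, `stream`, `Ollama.response_`) and collects chunks into an `IORef` with standard library code.

```haskell
{-# LANGUAGE OverloadedStrings #-}

module Collect where

import Control.Monad (void)
import Data.IORef (modifyIORef', newIORef, readIORef)
import qualified Data.Text as T
import Ollama (GenerateOps (..), defaultGenerateOps, generate)
import qualified Ollama

-- Accumulate the streamed chunks into an IORef instead of printing
-- them, then return the complete reply as one Text value.
-- Assumes the "llama3.2" model has already been pulled locally.
collectResponse :: IO T.Text
collectResponse = do
  buf <- newIORef T.empty
  void $
    generate
      defaultGenerateOps
        { modelName = "llama3.2"
        , prompt = "what is functional programming?"
        , stream =
            Just
              ( \chunk -> modifyIORef' buf (<> Ollama.response_ chunk)
              , pure () -- nothing extra to do when the stream ends
              )
        }
  readIORef buf
```

This keeps the streaming callback shape from the example above; only the handler changes, from printing each chunk to appending it to a buffer.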
You can find practical examples demonstrating how to use the library in the `src/OllamaExamples.hs` file.
Make sure you have Ollama installed and running on your local machine. You can download it from https://ollama.com.
- Include the `ollama-haskell` package in your `.cabal` file:

  ```cabal
  build-depends:
      base >= 4.7 && < 5,
      ollama-haskell
  ```

- Import the `Ollama` module and start integrating with your local LLM.
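As a quick smoke test after installing, you can drive the library straight from `ghci`. This is a sketch that assumes the `llama3.2` model is pulled and the Ollama server is running; it uses only the names shown in the example above.

```haskell
-- In ghci:
-- ghci> :set -XOverloadedStrings
-- ghci> import Ollama (GenerateOps (..), defaultGenerateOps, generate)
-- ghci> import qualified Data.Text.IO as T
-- ghci> import qualified Ollama
-- ghci> generate defaultGenerateOps
--         { modelName = "llama3.2"
--         , prompt = "hello"
--         , stream = Just (T.putStr . Ollama.response_, pure ())
--         }
```

If the model streams a greeting back, the binding is wired up correctly and you can move on to embedding calls in your own modules.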
- Improve documentation.
- Add tests.
- Add examples.
- Add a CI/CD pipeline.
- Add an `options` parameter to `generate`.
Stay tuned for future updates and improvements!
This library is developed and maintained by Tushar. Feel free to reach out for any questions or suggestions!
Contributions are welcome! If you'd like to improve the library, please submit a pull request or open an issue. Whether it's fixing bugs, adding new features, or improving documentation, all contributions are greatly appreciated.