red-ollama

command module
v0.0.0-...-a240560
Published: Aug 3, 2024 License: MIT Imports: 5 Imported by: 0

README

Go Redis Ollama

tl;dr

git clone "https://github.com/davidhintelmann/red-ollama.git"
cd red-ollama
go build -o "llm.exe"
.\llm.exe

Use the -p flag to enter a prompt:

.\llm.exe -p "tell me a joke"

Use the -m flag to choose which model to use:

.\llm.exe -p "tell me a joke" -m "phi3"
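The two flags above can be wired up with Go's standard flag package. This is a minimal sketch, not the repo's actual code; the default model "phi3" is an assumption for illustration:

```go
package main

import (
	"flag"
	"fmt"
	"os"
)

// parseArgs handles the -p and -m flags described above.
// Defaulting the model to "phi3" is an assumption.
func parseArgs(args []string) (prompt, model string, err error) {
	fs := flag.NewFlagSet("llm", flag.ContinueOnError)
	p := fs.String("p", "", "prompt to send to the model")
	m := fs.String("m", "phi3", "model to use")
	if err := fs.Parse(args); err != nil {
		return "", "", err
	}
	return *p, *m, nil
}

func main() {
	prompt, model, err := parseArgs(os.Args[1:])
	if err != nil {
		os.Exit(2)
	}
	fmt.Printf("model=%s prompt=%q\n", model, prompt)
}
```

Using flag.NewFlagSet rather than the package-level flag.Parse keeps the parsing testable and avoids global state.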

Summary

This repo is an example of how one can use the Go programming language to send prompts to a locally hosted Ollama server. With Ollama, one can run prompts against an LLM or SLM hosted locally.

For example, you can download and serve:

  • Microsoft's Phi3 SLM
  • Meta's Llama3.1 LLM

Additionally, Redis is used to cache prompts along with their responses.

Before Using this Repo

Prerequisites:

  1. Download Go
  2. Install Ollama
  3. Install Redis on Windows

Cache Responses

A simple approach to using Redis is to cache each prompt along with its response; if a user enters the same prompt twice, the cached result is returned instead of querying the model again.

This was developed on Windows 11; one can use WSL 2 to install Redis on Windows.

Caution

This example only uses Redis Strings to cache data.

From the Redis docs:

Similar to byte arrays, Redis strings store sequences of bytes, including text, serialized objects, counter values, and binary arrays.

There are other types as well, for example:

  • Hash
  • List
  • Geospatial

If you install Redis Stack you can also store data as JSON.

LLMs often output their responses as JSON, and caching the data in the same format would be the ideal approach.

Documentation


There is no documentation for this package.

Directories

Path Synopsis
ollama package is for interacting with local Ollama server.
