NourybotGPT

A Twitch chat bot that interacts with Ollama. Work in progress.

Requirements:

  • Go
  • Ollama (ollama.com)

Build and run:

  1. Change the example values in the provided env.example and rename it to .env.
  2. Make sure Ollama is running on the host and reachable at localhost:11434, and that the model you specified in the .env file is already downloaded and ready to go. (You can check with e.g. ollama run wizard-vicuna-uncensored.)
  3. Run:
    • With Docker Compose (might need sudo in front if you haven't set up rootless Docker):
      • $ make up
    • Without docker:
      • $ make build && make run
  4. Join the Twitch channels you chose, type ()gpt <cool query>, and hopefully get a response.
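The ()gpt command in step 4 boils down to stripping a prefix from an incoming chat message and forwarding the rest as the query. A minimal sketch of that parsing (hypothetical helper name; the repo's commands.go may do this differently):

```go
package main

import (
	"fmt"
	"strings"
)

// parseGPTCommand reports whether a chat message invokes the ()gpt command,
// and if so returns the query text that should be sent to Ollama.
// Illustrative only; not the bot's actual command handler.
func parseGPTCommand(msg string) (string, bool) {
	const prefix = "()gpt "
	if !strings.HasPrefix(msg, prefix) {
		return "", false
	}
	query := strings.TrimSpace(strings.TrimPrefix(msg, prefix))
	if query == "" {
		// "()gpt" with no query: nothing to forward.
		return "", false
	}
	return query, true
}

func main() {
	q, ok := parseGPTCommand("()gpt tell me a joke")
	fmt.Println(q, ok)
}
```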