
Runpod serverless runner for ollama

How to use

Start a RunPod serverless endpoint with the Docker container svenbrnn/runpod-ollama:0.5.7. Set the MODEL_NAME environment variable to the name of a model from ollama.com to have it downloaded automatically on startup. A mounted volume, if present, is used automatically.
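Once the endpoint is deployed, it can be invoked over RunPod's serverless HTTP API. Below is a minimal sketch using only the Python standard library; the endpoint ID, API key, and the inner payload fields are placeholders — the worker's actual input schema is documented by the files in test_inputs.

```python
import json
import urllib.request

RUNPOD_API_BASE = "https://api.runpod.ai/v2"

def build_run_request(endpoint_id: str, api_key: str, payload: dict) -> urllib.request.Request:
    """Build a POST request for the RunPod serverless /run route.

    `payload` is wrapped in the top-level "input" key that RunPod
    serverless workers expect.
    """
    url = f"{RUNPOD_API_BASE}/{endpoint_id}/run"
    body = json.dumps({"input": payload}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example usage (requires a live endpoint; "prompt" is illustrative only):
# req = build_run_request("my-endpoint-id", "MY_RUNPOD_API_KEY",
#                         {"prompt": "Why is the sky blue?"})
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```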

Environment variables

| Variable Name | Description                       | Default Value |
|---------------|-----------------------------------|---------------|
| MODEL_NAME    | The name of the model to download | NULL          |

Test requests for runpod.io console

See the test_inputs directory for example test requests.
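For orientation, a RunPod console test request always wraps the worker payload in a top-level "input" key. The inner fields below are illustrative only; the files in the test_inputs directory are authoritative for this worker's schema.

```json
{
  "input": {
    "model": "llama3",
    "prompt": "Why is the sky blue?"
  }
}
```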

Streaming

Streaming is fully supported for OpenAI-compatible requests.
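As a sketch of consuming such a stream with the official openai Python client: the base_url below follows RunPod's usual OpenAI-compatibility convention, and the model name is a placeholder — both are assumptions to adjust for your endpoint.

```python
def make_client(endpoint_id: str, api_key: str):
    """Create an OpenAI client pointed at a RunPod serverless endpoint.

    Assumption: the endpoint exposes an OpenAI-compatible route at
    /openai/v1, per RunPod's usual convention.
    """
    from openai import OpenAI  # pip install openai
    return OpenAI(
        base_url=f"https://api.runpod.ai/v2/{endpoint_id}/openai/v1",
        api_key=api_key,
    )

def stream_deltas(stream):
    """Yield the text pieces from a streamed chat completion."""
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            yield delta

# Example usage (requires a live endpoint):
# client = make_client("my-endpoint-id", "MY_RUNPOD_API_KEY")
# stream = client.chat.completions.create(
#     model="llama3",  # whatever MODEL_NAME you configured
#     messages=[{"role": "user", "content": "Why is the sky blue?"}],
#     stream=True,
# )
# for piece in stream_deltas(stream):
#     print(piece, end="", flush=True)
```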

Preload model into the docker image

See the embed_model directory for instructions.

License

This project is licensed under the Creative Commons Attribution 4.0 International License. You are free to use, share, and adapt the material for any purpose, even commercially, under the following terms:

  • Attribution: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
  • Reference: You must reference the original repository at https://github.com/svenbrnn/runpod-worker-ollama.

For more details, see the license.
