introduction: simple, lightweight inference server containers
1: Llama server, a simple inference server for Docker based on llama-cpp-python
2: Stable Diffusion, including a basic server based on the diffusers library
3: vLLM package
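Both the llama-cpp-python server and vLLM can expose an OpenAI-compatible HTTP API, so a client only needs to build a standard JSON request body. The sketch below shows one way to construct such a completion request; the endpoint URL assumes the llama-cpp-python server is running on localhost port 8000 (its default), and the helper name is illustrative, not part of any library.

```python
import json

# Assumed endpoint: the llama-cpp-python server serves an
# OpenAI-compatible API on localhost:8000 by default.
ENDPOINT = "http://localhost:8000/v1/completions"

def build_completion_request(prompt: str,
                             max_tokens: int = 64,
                             temperature: float = 0.7) -> bytes:
    """Serialize a completion request body for an OpenAI-style API."""
    payload = {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }
    return json.dumps(payload).encode("utf-8")

body = build_completion_request("Hello, world")
print(json.loads(body)["prompt"])  # → Hello, world
```

The same request body works against a vLLM server started with its OpenAI-compatible frontend; only the host and port change.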