From f553ab9c4b117104fe3b27f50456d5a1158bc1c9 Mon Sep 17 00:00:00 2001
From: Kentaro Wada
Date: Thu, 1 Aug 2024 19:35:56 +0900
Subject: [PATCH] Update README.md

---
 README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index fed8ea5..9aed243 100644
--- a/README.md
+++ b/README.md
@@ -2,19 +2,19 @@ logo

Osam

- Get up and running with segment-anything models locally.
+ Get up and running with promptable vision models locally.




-*Osam* (/oʊˈsɑm/) is a tool to run open-source segment-anything models locally
+*Osam* (/oʊˈsɑm/) is a tool to run open-source promptable vision models locally
 (inspired by [Ollama](https://github.com/ollama/ollama)).

 *Osam* provides:

-- **Segment-Anything Models** - original SAM, EfficientSAM;
+- **Promptable Vision Models** - Segment Anything Model (SAM), EfficientSAM, YOLO-World;
 - **Local APIs** - CLI & Python & HTTP interface;
 - **Customization** - Host custom vision models.
@@ -44,7 +44,7 @@ To run with EfficientSAM:
 osam run efficientsam --image
 ```

-To run with YoloWorld:
+To run with YOLO-World:

 ```bash
 osam run yoloworld --image