
Marlin Oyster teeML changes #29

Open · wants to merge 3 commits into `main`
16 changes: 3 additions & 13 deletions README.md
@@ -136,21 +136,11 @@ We use **Cairo** for two proofs:

We used the LambdaClass CairoVM. Because of the current dependency mismatches between the prover and the runner, the Cairo prover and the Cairo runner had to be compiled separately.

### Using Giza for zkML
### teeML

ZkML is one of ZK's many use cases. It helps you assert that a prediction's result was obtained with the right model, trained on the right dataset, and fed with the right input.
The demo previously generated zk-proofs for the facial-feature classification predictions using [Cairo](https://starkware.co/cairo/). This was needed to predict whether the person is smiling and to prove it with a zk-proof. Although these computations ran server-side on the Cairo VM, they still meant long wait times for the demo's users.

**Giza** focuses on helping developers create a provable machine-learning model.

Here is the flow we followed:

1. We used a **simple classifier from the [XGBoost library](https://xgboost.readthedocs.io/en/stable/) in Python**, which Giza fully supports.
2. We serialized our model to JSON using the Giza SDK.
3. We used the Giza API to turn our model into a Cairo program.
4. We compiled the Cairo ML program into Sierra using `scarb`.
5. We executed our model in the Cairo VM we were using.

Deep learning models, especially CNNs, would typically be more appropriate for image recognition, but some primitives used by those are not yet supported. Larger models are also extremely hard to run in a Cairo VM because of their high memory requirements.
By using a [TEE](https://www.marlin.org/ai) to serve the classification predictions and their attestations, we not only reduced wait times for users but also preserved security for this particular use case.
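
Below is a minimal sketch of what the frontend's call to the enclave could look like. The endpoint path, request body, and response shape are assumptions for illustration; the actual Marlin Oyster service may expose a different API.

```ts
// Hypothetical response shape: the real Marlin Oyster service may differ.
interface TeePrediction {
  isSmiling: boolean;   // classification result computed inside the enclave
  attestation: string;  // enclave attestation binding the result to the TEE
}

// Send an image to the TEE and get back the prediction plus its attestation.
async function classifyInTee(teeUrl: string, imageBase64: string): Promise<TeePrediction> {
  const response = await fetch(`${teeUrl}/classify`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ image: imageBase64 }),
  });
  if (!response.ok) {
    throw new Error(`TEE request failed with status ${response.status}`);
  }
  return (await response.json()) as TeePrediction;
}
```

Compared with the zk flow, the expensive part is a single request/response round trip to the enclave, which is where the latency improvement comes from.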

## Thanks

6 changes: 6 additions & 0 deletions package-lock.json

Some generated files are not rendered by default.

30 changes: 30 additions & 0 deletions vibe-check-frontend/.gitignore
@@ -0,0 +1,30 @@
target
corelib
node_modules
pkg
.env

# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*

node_modules
dist
dist-ssr
*.local

# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
21 changes: 16 additions & 5 deletions vibe-check-frontend/README.md
@@ -1,15 +1,26 @@
# Vibe Check
# Hyle teeML Vibe Check

Frontend for the EthCC Vibe check demo.
Forked from Hyle Vibe check frontend [repo](https://github.com/Hyle-org/vibe-check/tree/main/vibe-check-frontend).

# Changes
## Changed zkML to teeML

The demo previously generated zk-proofs for the facial-feature classification predictions using [Cairo](https://starkware.co/cairo/). This was needed to predict whether the person is smiling and to prove it with a zk-proof. Although these computations ran server-side on the Cairo VM, they still meant long wait times for the demo's users.

By using a [TEE](https://www.marlin.org/ai) to serve the classification predictions and their attestations, we not only reduced wait times for users but also preserved security for this particular use case.

# Todo
- [ ] Make production-ready. Currently runnable in dev mode only.

### Development

Create a `.env` file before running the commands below. See `sample.env`.
```bash
bun install
bun run dev
npm install
npm run dev
```
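
As an illustration only: if the project reads its configuration through Vite, the TEE endpoint could be picked up from `.env` roughly like this (the actual variable names live in `sample.env`; `VITE_TEE_URL` here is a made-up placeholder):

```ts
// Hypothetical: the real variable name is whatever sample.env defines.
const teeUrl: string = import.meta.env.VITE_TEE_URL ?? "http://localhost:8080";

console.log(`Sending classification requests to ${teeUrl}`);
```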

### Setup

- [VS Code](https://code.visualstudio.com/) + [Vue - Official](https://marketplace.visualstudio.com/items?itemName=Vue.volar)
- [Prettier](https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode)
Binary file modified vibe-check-frontend/bun.lockb
Binary file not shown.