
[feature]: customise GPT2-124M model #108

Open
Aisuko opened this issue Jun 28, 2024 · 0 comments
Aisuko (Member) commented Jun 28, 2024

We use the GPT-2 series as the baseline model in our testing process.

So we will create a customised GPT-2 based on GPT-2-124M with Hugging Face transformers. This is the first step towards our customised language model. We want to fine-tune this model with RLHF on a specialised dataset, and we also want to create adapters based on it.
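A minimal sketch of what this first step could look like, assuming we start from the public `gpt2` (124M) checkpoint on the Hugging Face Hub. The specific overrides below (dropout values, pad token) are placeholders, since the exact customisations are not decided in this issue:

```python
# Sketch: load the 124M GPT-2 checkpoint and customise its config/tokenizer.
from transformers import GPT2Config, GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

# Start from the 124M configuration and override the parts we want to customise.
# The dropout values here are placeholders for experimentation.
config = GPT2Config.from_pretrained(
    "gpt2",
    resid_pdrop=0.1,
    embd_pdrop=0.1,
    attn_pdrop=0.1,
)

# Load the pretrained 124M weights with the customised config.
model = GPT2LMHeadModel.from_pretrained("gpt2", config=config)

# GPT-2 has no pad token by default; adding one is a common customisation
# before supervised fine-tuning or RLHF-style training.
tokenizer.add_special_tokens({"pad_token": "[PAD]"})
model.resize_token_embeddings(len(tokenizer))

# Sanity check: roughly 124M parameters.
print(sum(p.numel() for p in model.parameters()))
```

The RLHF fine-tuning and the adapters would be separate steps on top of this base model (for example via libraries such as TRL for RLHF and PEFT for LoRA-style adapters); they are not sketched here.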
