Issues: NVlabs/Sana
#102: Sana 2K pixels model is release. Generate 2K images within 4 ...
  by lawrence-cj, closed Dec 20, 2024, 4 comments
#161: How much VRAM does the original Sana need to complete training?
  opened Jan 24, 2025 by Pevernow

#156: [Good news] Training a 1.6B model on 24G VRAM through offload and additional support for AdamW8bit optimizer!
  labels: documentation
  opened Jan 19, 2025 by Pevernow

#154: Reproduction of the performance
  labels: Answered
  opened Jan 18, 2025 by cubeyoung

#153: I’m looking forward to when this model will be compatible with Ipadapter.
  labels: Answered
  opened Jan 17, 2025 by wave5fight

#150: Are there any plans to update Replicate with support for SANA LoRAs soon?
  opened Jan 15, 2025 by futureflix87

#149: How to run a single prompt with several pictures on multiple GPUs?
  labels: Answered
  opened Jan 15, 2025 by 13918763630

#148: How to generate a series of pictures.
  labels: Answered
  opened Jan 15, 2025 by 13918763630

#147: 600M-512px for comfyui not working
  labels: Answered
  opened Jan 15, 2025 by Pancat009

#141: ValueError: AutoPipeline can't find a pipeline linked to SanaPipeline for None
  labels: Answered, bug, fixed
  opened Jan 10, 2025 by RageshAntonyHM

#134: question about load sanapipeline
  labels: Answered
  opened Jan 8, 2025 by LearningHx

#129: [BUG] FSDP error: Could not find the transformer layer class SanaBlock in the model
  labels: Answered
  opened Jan 5, 2025 by Pevernow

#128: Can inference be done at FP8? for both 1K and 2K models
  labels: Answered
  opened Jan 5, 2025 by FurkanGozukara