---
library_name: diffusers
base_model:
- black-forest-labs/FLUX.1-Kontext-dev
tags:
- flux
- flux.1
- kontext
- flux dev
- lightning
- turbo
license: other
license_name: flux-1-dev-non-commercial-license
license_link: LICENSE.md
pipeline_tag: image-to-image
---
**Update 7/9/25:** This model is now quantized and implemented in [this example space](https://huggingface.co/spaces/LPX55/Kontext-Multi_Lightning_4bit-nf4/). Preliminary VRAM usage is around ~10GB, with faster inference. I will be experimenting with different weights and schedulers to find particularly well-performing combinations.
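A 4-bit NF4 load along these lines can be sketched with diffusers' bitsandbytes integration. This is a sketch, not the space's exact configuration: the `model_id` argument is a placeholder for this fused checkpoint, and the quantization settings are assumptions.

```python
def load_nf4_pipeline(model_id: str):
    """Load a fused Kontext checkpoint with a 4-bit NF4 transformer.

    Heavy dependencies are imported lazily so the sketch can be read
    (and its signature checked) without diffusers/bitsandbytes installed.
    """
    import torch
    from diffusers import (
        BitsAndBytesConfig,
        FluxKontextPipeline,
        FluxTransformer2DModel,
    )

    # Quantize only the transformer -- by far the largest component.
    quant_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    )
    transformer = FluxTransformer2DModel.from_pretrained(
        model_id,
        subfolder="transformer",
        quantization_config=quant_config,
        torch_dtype=torch.bfloat16,
    )
    pipe = FluxKontextPipeline.from_pretrained(
        model_id,
        transformer=transformer,
        torch_dtype=torch.bfloat16,
    )
    pipe.enable_model_cpu_offload()  # trims peak VRAM further
    return pipe
```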
# FLUX.1 Kontext-dev X LoRA Experimentation
Highly experimental; more details to come.
- 6-8 steps
- <s>Euler, SGM Uniform (Recommended, feel free to play around)</s> Now getting mixed results with these; feel free to experiment and share your findings.
## Model Details
Experimenting with FLUX.1-dev LoRAs and how they affect Kontext-dev. This model has acceleration LoRAs fused into the base weights.
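Since the acceleration LoRAs are already fused in, the checkpoint can be used like any Kontext model, just with the low step count recommended above. A minimal inference sketch with the standard diffusers `FluxKontextPipeline` (the `model_id` argument is a placeholder for this checkpoint; the guidance scale is a commonly used Kontext value, not a tested recommendation):

```python
NUM_STEPS = 8  # the fused acceleration LoRAs target the 6-8 step range


def edit_image(model_id: str, image_path: str, prompt: str):
    """Run one image-to-image edit with the fused checkpoint on CUDA."""
    import torch
    from diffusers import FluxKontextPipeline
    from diffusers.utils import load_image

    pipe = FluxKontextPipeline.from_pretrained(
        model_id, torch_dtype=torch.bfloat16
    ).to("cuda")
    image = load_image(image_path)
    result = pipe(
        image=image,
        prompt=prompt,
        num_inference_steps=NUM_STEPS,
        guidance_scale=2.5,  # assumed default; tune alongside the scheduler
    ).images[0]
    return result
```

Called as, e.g., `edit_image("<this-repo-id>", "input.png", "make it a watercolor")`.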
## License
This model falls under the [FLUX.1 \[dev\] Non-Commercial License](https://github.com/black-forest-labs/flux/blob/main/model_licenses/LICENSE-FLUX1-dev); please familiarize yourself with it before use.