v0.3.0 #722
Announced by carson-katri in Announcements
This update adds support for SDXL, .safetensors and .ckpt files, a new API for extending Dream Textures with new backends, and more.

Choose Your Installation
Several versions are available. Find the one that will work for you.
Windows

NVIDIA GPU: dream_textures-windows-cuda.7z
After downloading the archive, select "7-Zip" > "Extract Here" to get the dream_textures-windows-cuda.zip file.

AMD GPU
macOS
Apple Silicon (M1/M2)
Intel
Dream Textures is not currently available for Intel Macs.
Linux
Installation must be completed manually; see the manual installation instructions for details.
Blender Market
You can optionally purchase the add-on for a small fee on Blender Market.
Setup
After installing the appropriate add-on ZIP file, enable the add-on and expand its preferences. There you will find further instructions for setup.
See the setup guide for more detailed instructions.
What's New
Stable Diffusion XL
Use the model stabilityai/stable-diffusion-xl-base-1.0 for higher resolution 1024x1024 images. Include the refiner model stabilityai/stable-diffusion-xl-refiner-1.0 to improve the results even further.

.safetensors and .ckpt file support

Link individual files or entire folders of models. You can configure linked models from Dream Textures' preferences.
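Outside of Blender, the base-plus-refiner flow described under Stable Diffusion XL can be sketched with Hugging Face Diffusers. This is a general Diffusers example, not Dream Textures' internal code; the function name and the 0.8 handoff fraction are illustrative choices.

```python
# Hedged sketch: two-stage SDXL generation with Hugging Face Diffusers, using
# the base and refiner model IDs named above. Wrapped in a function so that
# importing this file downloads nothing; actually calling it requires
# `pip install diffusers torch` and a capable GPU.
def generate_sdxl(prompt: str, steps: int = 40, high_noise_frac: float = 0.8):
    import torch
    from diffusers import DiffusionPipeline

    base = DiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float16, variant="fp16", use_safetensors=True,
    ).to("cuda")
    refiner = DiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-refiner-1.0",
        text_encoder_2=base.text_encoder_2, vae=base.vae,
        torch_dtype=torch.float16, variant="fp16", use_safetensors=True,
    ).to("cuda")

    # The base model denoises the first ~80% of the steps, then hands its
    # latents to the refiner, which finishes the remaining steps.
    latents = base(
        prompt=prompt, num_inference_steps=steps,
        denoising_end=high_noise_frac, output_type="latent",
    ).images
    return refiner(
        prompt=prompt, num_inference_steps=steps,
        denoising_start=high_noise_frac, image=latents,
    ).images[0]
```

Calling `generate_sdxl("a weathered brick wall texture").save("wall.png")` would then produce a 1024x1024 image refined by both models.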
Public Backend API
A new API is available for extending Dream Textures. This can be used to add custom generation backends.
For example, a backend that connects to ComfyUI could be created as a standalone addon.
Dream Textures has built-in support for a HuggingFace Diffusers backend. If you are interested in contributing a new backend, this backend can be used as a reference.
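The actual backend base class lives in the Dream Textures source; as a language-level illustration only, a registry-style plugin pattern like the one a backend add-on uses could look like this. All names below are hypothetical and are not the real Dream Textures API.

```python
# Hypothetical sketch of a backend-registry pattern. These names are
# illustrative only and do NOT match the actual Dream Textures API.
from dataclasses import dataclass

BACKENDS = {}  # name -> backend class, populated at import time

def register_backend(name):
    """Decorator a standalone add-on could use to expose its backend."""
    def wrap(cls):
        BACKENDS[name] = cls
        return cls
    return wrap

@dataclass
class GenerationArgs:
    prompt: str
    width: int = 1024
    height: int = 1024

@register_backend("diffusers")
class DiffusersBackend:
    """Stand-in for the built-in HuggingFace Diffusers backend."""
    def generate(self, args: GenerationArgs) -> str:
        # A real backend would run a pipeline; return a placeholder here.
        return f"{args.width}x{args.height} image for {args.prompt!r}"

@register_backend("comfyui")
class ComfyUIBackend:
    """A separate add-on (e.g. a ComfyUI bridge) registers the same way."""
    def generate(self, args: GenerationArgs) -> str:
        return f"ComfyUI render of {args.prompt!r}"
```

The point of the pattern is that the host only depends on the registry, so a third-party add-on can contribute a backend without the host importing it explicitly.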
Dream Textures backend add-ons are kept in the community_backends folder. If you make a useful backend, feel free to open a PR adding it to this folder.

Full Changelog
- Check for refiner before unpacking tuple from load_model by @carson-katri in #716
- ControlNet AutoPipeline.from_pipe lookup workaround by @carson-katri in #720

Full Changelog: 0.2.0...0.3.0
This discussion was created from the release v0.3.0.