
home-improvement

Exterior design using stable-diffusion 🏡 → General install instructions.

References

  1. CompVis Stable Diffusion
    High-Resolution Image Synthesis with Latent Diffusion Models

  2. Basujindal fork, optimised to run on less VRAM
    Optimized Stable Diffusion (Sort of)

  3. ControlNet
    https://github.com/lllyasviel/ControlNet


🖼️→🖼️ img2img with custom images

pstring = "A fantasy english family home, dog in the foreground, fantasy, illustration, trending on artstation"
input_img = "../inputs/halle_at_home_2021_s.JPG"
outdir = "../outputs"  # assumed output directory, adjust to your setup

# Sweep img2img strength from 0.30 to 0.70 in steps of 0.05
strength = range(30, 75, 5)
for s in strength:
    !python optimizedSD/optimized_img2img.py --prompt "{pstring}" --init-img {input_img} --strength {s*0.01} --seed 200 --outdir {outdir}

home example

🎨→🖼️ controlnet concept generation

  • 🏠 Exterior design with Stable Diffusion and ControlNet, using canny-fp16 edge detection (a scripted sketch follows the example image).
prompt = "modern english front garden, with traditional lush green lawn and striking architectural design"

controlnet home example
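
The same canny-conditioned generation can be scripted with the diffusers ControlNet pipeline. A minimal sketch, assuming the lllyasviel/sd-controlnet-canny and runwayml/stable-diffusion-v1-5 checkpoints and a hypothetical input photo path:

import cv2
import numpy as np
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# Reduce the input photo to canny edges; the edge map becomes the control image
image = np.array(Image.open("../inputs/front_garden.jpg").convert("RGB"))  # hypothetical path
edges = cv2.Canny(image, 100, 200)
control_image = Image.fromarray(np.stack([edges] * 3, axis=-1))

# Attach a canny-conditioned ControlNet to a Stable Diffusion 1.5 pipeline
controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")

prompt = "modern english front garden, with traditional lush green lawn and striking architectural design"
result = pipe(prompt, image=control_image, num_inference_steps=30).images[0]
result.save("../outputs/controlnet_garden.png")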

  • Alternative edge control, using hed-fp16 (a preprocessing sketch follows the example image)
Steps: 30, Sampler: DPM++ SDE Karras, CFG scale: 7, Seed: 3669285758, Size: 512x512,
Model hash: bb6e6362d8, Model: chikmix_V1, ControlNet: "preprocessor: softedge_hed,
model: control_hed-fp16 [13fee50b]"

controlnet backyard example
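
The softedge_hed preprocessing above was run in the webui; a scripted equivalent is available through the controlnet_aux package. A rough sketch, assuming controlnet_aux is installed and the lllyasviel/Annotators weights (the backyard path is hypothetical):

from PIL import Image
from controlnet_aux import HEDdetector

# HED produces soft edges that preserve more shading detail than canny
hed = HEDdetector.from_pretrained("lllyasviel/Annotators")
control_image = hed(Image.open("../inputs/backyard.jpg"))  # hypothetical path
control_image.save("../outputs/backyard_hed.png")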


🎨→🖼️ Generation from scratch with MidJourney + controlnet

  1. Midjourney generation from prompt
line art drawing of top down landscape 
architectural plan of a classic english garden --s 1 --v 4 --q 2 --s 5000
  2. Stable Diffusion + ControlNet with canny-fp16 (a wiring sketch follows the example image)
landscape garden with flowers, professional photograph, accurate, intricate

midjourney example
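
Wiring the two steps together follows the same recipe as the canny sketch earlier: run edge detection over the MidJourney line art and feed it to the ControlNet pipeline. A sketch assuming the pipe from that earlier block and a hypothetical path for the MidJourney export:

import cv2
import numpy as np
from PIL import Image

# MidJourney output saved locally (hypothetical path), reduced to canny edges
plan = np.array(Image.open("../inputs/mj_garden_plan.png").convert("RGB"))
edges = cv2.Canny(plan, 100, 200)
control_image = Image.fromarray(np.stack([edges] * 3, axis=-1))

# Reuse the ControlNet canny pipeline defined in the earlier sketch
prompt = "landscape garden with flowers, professional photograph, accurate, intricate"
result = pipe(prompt, image=control_image, num_inference_steps=30).images[0]
result.save("../outputs/garden_from_plan.png")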




Stable Diffusion


🖼️→🖼️ img2img iterative improvements

Example from argaman123 🔗

  • Using the output of one generation as the input image for the next.
  • This iterative process can produce increasingly complex and customised images (a loop sketch follows the command below).

A distant futuristic city full of tall buildings inside a huge transparent glass dome, In the middle of a barren desert full of large dunes, Sun rays, Artstation, Dark sky full of stars with a shiny sun, Massive scale, Fog, Highly detailed, Cinematic, Colorful


img2img_given_example

img2img_given_example

!python optimizedSD/optimized_img2img.py --prompt "{pstring}" --init-img {input_img} --strength 0.8 --n_iter 2 --n_samples 3 --H 512 --W 512 --seed 12 --outdir {outdir} --ddim_steps 200
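
The feedback loop itself can be automated by pointing each run's --init-img at the previous run's output. A notebook-style sketch, assuming the optimizedSD script writes PNGs somewhere under --outdir (adjust the glob to where your outputs actually land):

from pathlib import Path

init_img = "../inputs/dome_city_sketch.png"  # hypothetical starting image
for i in range(4):
    outdir = f"../outputs/iter_{i}"
    !python optimizedSD/optimized_img2img.py --prompt "{pstring}" --init-img {init_img} --strength 0.6 --seed 12 --outdir {outdir} --ddim_steps 200
    # Pick the newest image from this round as the seed image for the next round
    init_img = str(sorted(Path(outdir).rglob("*.png"), key=lambda p: p.stat().st_mtime)[-1])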


🖼️→🖼️ img2img with strength variation

Using an input image to create unlimited variations (a seed-sweep sketch follows the example).

img2img example
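
One way to get those variations is to hold the prompt and strength fixed and sweep the seed. A small sketch in the same notebook style, assuming pstring, input_img and outdir from the first img2img block:

# Each seed gives a different variation of the same input image
for seed in range(100, 110):
    !python optimizedSD/optimized_img2img.py --prompt "{pstring}" --init-img {input_img} --strength 0.5 --seed {seed} --outdir {outdir}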



🖼️→🖼️ Inpainting with diffusers

Inpainting applies a layer mask to an area of interest, then runs img2img with a text prompt to generate new content inside the masked region.

Example: Adding a dragon to the castle (1) and then adding flaming rubble to the gate (2).

Inpainting_given_example

from torch import autocast  # PyTorch mixed-precision context

prompt = ("A fantasy castle with a dragon defending. Trending on artstation, "
          "precise lineart, award winning, divine")

# pipe, init_image and mask_image are set up beforehand (see the sketch below)
with autocast("cuda"):
    images = pipe(prompt=prompt, init_image=init_image, mask_image=mask_image, strength=0.7)["sample"]
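
For reference, a minimal sketch of how pipe, init_image and mask_image above could be set up, assuming an older diffusers release whose StableDiffusionInpaintPipeline takes init_image/mask_image and returns a "sample" key (newer releases renamed these arguments), with hypothetical image paths:

import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# Hypothetical paths: the base castle render and a white-on-black mask of the edit region
init_image = Image.open("../inputs/castle.png").convert("RGB").resize((512, 512))
mask_image = Image.open("../inputs/castle_mask.png").convert("RGB").resize((512, 512))

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    revision="fp16",
    torch_dtype=torch.float16,
).to("cuda")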


📱🖼️ Gradio WebUI by hlky

Gradio WebUI by hlky: https://github.com/sd-webui/stable-diffusion-webui

  • Clone the repo
  • Run webui.bat from Windows Explorer


Training Data Visualisations

LAION-Aesthetics v2 6+ on Datasette:

From this blog post and the accompanying Hacker News conversation. The tables can also be queried programmatically, as sketched after the links below.

  1. Top Artists
    https://laion-aesthetic.datasette.io/laion-aesthetic-6pls/artists?_sort_desc=image_counts

  2. Search by Artist
    https://laion-aesthetic.datasette.io/laion-aesthetic-6pls/images?_search=%22Thomas+Kinkade%22&_sort=rowid
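
Datasette also exposes each table as JSON by appending .json to the URL, so the same artist search can be scripted. A quick sketch using the requests package:

import requests

# Same query as link 2 above, but via the Datasette JSON API
url = "https://laion-aesthetic.datasette.io/laion-aesthetic-6pls/images.json"
params = {"_search": '"Thomas Kinkade"', "_sort": "rowid"}

data = requests.get(url, params=params).json()
print(f"{len(data['rows'])} rows on the first page")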


