from Hacker News

Web Stable Diffusion

by crowwork on 3/17/23, 1:37 AM with 41 comments

  • by nl on 3/17/23, 2:37 AM

    Note that (only?) Chrome Canary supports WebGPU, so this won't yet work in most people's browsers.

    They kindly provide instructions to run it (even on Apple M1).

  • by tormeh on 3/17/23, 2:36 AM

    Anyone interested in this might also be interested in WONNX: https://github.com/webonnx/wonnx

  • by rattt on 3/17/23, 6:26 AM

    Ohh wow that actually worked, that's awesome: https://i.imgur.com/4tYEphX.png

    Tested on an Intel Mac running macOS 12.5 with an AMD RX 580 8GB GPU; about 28 seconds for 20 steps, which is surprisingly fast. I did have to go to chrome://flags and enable "Unsafe WebGPU" even on Chrome Canary (113.0.5656.0) before it would work; otherwise I just got "no adapter" errors.
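
    For anyone hitting the same "no adapter" error: a minimal WebGPU availability check along these lines (plain browser API, nothing specific to this demo; it assumes the @webgpu/types declarations if compiled as TypeScript) shows whether the browser actually exposes a GPU adapter:

        // Minimal WebGPU feature check; not part of the Web Stable Diffusion code.
        // Logs "no adapter" when WebGPU is present but disabled or unsupported,
        // which matches the error described above.
        async function checkWebGPU(): Promise<void> {
          if (!("gpu" in navigator)) {
            console.log("navigator.gpu is missing: WebGPU not available");
            return;
          }
          const adapter = await navigator.gpu.requestAdapter();
          if (adapter === null) {
            console.log("no adapter: WebGPU is enabled but returned no GPU adapter");
            return;
          }
          const device = await adapter.requestDevice();
          console.log("WebGPU adapter and device acquired:", device.label);
        }

        checkWebGPU();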

  • by wood_spirit on 3/17/23, 6:02 AM

    This is a tangent, but I’ve been wondering… and perhaps HNers know?

    I use some basic libre CAD programs to plan my dream house project, and their renderings are pretty non-photorealistic.

    Are there any upscalers that can take an interior or exterior house render and make it look like something from Pinterest? Meaning the input is an image, not text?

  • by junrushao1994 on 3/17/23, 3:31 AM

    Is it possible to integrate this with [onnxruntime-web](https://onnxruntime.ai/docs/tutorials/web/)?
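
    For context, a rough sketch of what running a model with onnxruntime-web looks like in the browser (the model path, feed name, and tensor shape below are hypothetical placeholders, not taken from this project):

        // Sketch of browser-side inference with onnxruntime-web.
        // "./model.onnx", the "input" feed name and the [1, 3, 224, 224] shape
        // are placeholders; a real model defines its own names and shapes.
        import * as ort from "onnxruntime-web";

        async function run(): Promise<void> {
          const session = await ort.InferenceSession.create("./model.onnx", {
            executionProviders: ["wasm"],
          });
          const data = new Float32Array(1 * 3 * 224 * 224);
          const input = new ort.Tensor("float32", data, [1, 3, 224, 224]);
          const outputs = await session.run({ input });
          console.log(Object.keys(outputs));
        }

        run();
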
  • by gamblor956 on 3/17/23, 6:19 PM

    Confirmed working on Windows with an AMD RX 590 in Chrome Canary. About 23 seconds on average using DPM (20 steps), 55 seconds on average using PNDM (50 steps).

    I had issues compiling it on my own computer, but the demo version at https://mlc.ai/web-stable-diffusion/#text-to-image-generatio... works fine.

  • by kevinlinxc on 3/17/23, 2:28 AM

    Can someone ELI5 how machine learning compilation works? Is this site basically A1111's SD web UI with fewer bells and whistles but way less intensive?

  • by jaimex2 on 3/17/23, 7:46 AM

    Where's the 4GB model loaded from, and to where?

  • by jokethrowaway on 3/17/23, 11:52 AM

    Interesting choice!

    Before reading what they used, I assumed they would run tch-rs (libtorch bindings for Rust) on wgpu and ship it via wasm.

  • by bulbosaur123 on 3/17/23, 12:09 PM

    How does this compare to Automatic1111?

  • by onion2k on 3/17/23, 9:15 AM

    It works (very, very slowly) on my Intel MacBook. Very impressive indeed. WebGPU has a ton of potential.