from Hacker News

Show HN: I built an interactive cloth solver for Apple Vision Pro

by lukko on 5/31/24, 8:23 PM with 80 comments

A bit more context - the cloth sim is part of my app, Lungy (https://www.lungy.app). It's designed to be an active meditation / relaxation app, so you can play relaxing instruments in space and do immersive breathing exercises. The original Lungy is a breathing app for iOS that pairs real-time breathing with interactive visuals.

The cloth sim uses Verlet integration, running on a regular grid. For now, I have tried a couple of different cloth scenes - a sort of touch-reactive 'pad', where different parts of the cloth are mapped to different sounds, and a cloth that blows in sync with breathing. Collision detection is a little tricky with the deforming mesh, but it seems to work okay. Overall, it seems like a cool interaction to explore.
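For anyone curious what a Verlet cloth on a regular grid looks like, here is a minimal sketch of the general technique (position-based Verlet integration plus distance-constraint relaxation between grid neighbours). This is not Lungy's actual code - the grid size, timestep, gravity, and iteration count below are illustrative, written in Python for brevity:

```python
# Sketch of Verlet-integrated cloth on a regular grid (illustrative only).
# Each particle stores its current and previous position; structural
# constraints between grid neighbours are relaxed a few times per step.

GRID_W, GRID_H = 8, 8      # hypothetical grid dimensions
REST = 1.0                 # rest length between neighbouring particles
GRAVITY = (0.0, -9.8)
DT = 1.0 / 60.0
ITERATIONS = 4             # constraint-relaxation passes per step

def make_cloth():
    # Current and previous positions start equal (zero initial velocity).
    pos = [(x * REST, -y * REST) for y in range(GRID_H) for x in range(GRID_W)]
    return pos, list(pos)

def neighbours():
    # Structural links: each particle to its right and lower neighbour.
    links = []
    for y in range(GRID_H):
        for x in range(GRID_W):
            i = y * GRID_W + x
            if x + 1 < GRID_W: links.append((i, i + 1))
            if y + 1 < GRID_H: links.append((i, i + GRID_W))
    return links

def step(pos, prev, pinned, links):
    # Verlet integration: new = 2*pos - prev + a*dt^2
    for i, ((x, y), (px, py)) in enumerate(zip(pos, prev)):
        if i in pinned:
            continue
        nx = 2 * x - px + GRAVITY[0] * DT * DT
        ny = 2 * y - py + GRAVITY[1] * DT * DT
        prev[i], pos[i] = (x, y), (nx, ny)
    # Relax distance constraints so neighbours stay ~REST apart.
    for _ in range(ITERATIONS):
        for i, j in links:
            (x1, y1), (x2, y2) = pos[i], pos[j]
            dx, dy = x2 - x1, y2 - y1
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = (d - REST) / d * 0.5  # split correction between ends
            if i not in pinned:
                pos[i] = (x1 + dx * corr, y1 + dy * corr)
            if j not in pinned:
                pos[j] = (x2 - dx * corr, y2 - dy * corr)
    return pos, prev

# Pin the top row so the cloth hangs, then advance a few frames.
pos, prev = make_cloth()
pinned = set(range(GRID_W))
links = neighbours()
for _ in range(120):
    pos, prev = step(pos, prev, pinned, links)
```

A nice property of Verlet here is that velocity is implicit in the difference between current and previous positions, so hard constraints (like the pinned row, or pushing a particle with your hand) can just overwrite positions without velocity bookkeeping.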

The cloth sim is live on the app store now (and free) - would love to hear feedback from anyone with a Vision Pro.

App Store (Vision Pro): https://apps.apple.com/app/id6470201263

Lungy, original for iOS - https://apps.apple.com/app/id1545223887

  • by v1sea on 6/2/24, 1:13 AM

    Really neat, are you running the cloth simulation on the CPU or GPU? How many elements are in the simulated cloth? Good luck on future AR projects!
  • by xyst on 6/1/24, 9:11 PM

    Is the choppy movement of the cloth because of the limitations of the device or something else?

    Can’t believe this is what a $4-5K piece of tech looks like. Wild.

  • by spaceman_2020 on 6/1/24, 11:34 PM

    Not to disparage OP or this product - which sure looks cool - but man, the idea of buying a $3500 device to simulate something I can do by opening my wardrobe seems…absurd.
  • by superamit on 6/1/24, 10:24 PM

    This is really cool! I save AVP for work but have been eager to find new meditation interfaces because it feels like AR has so much potential for this. Will try it out!
  • by throwaway115 on 6/1/24, 8:52 PM

    Congrats! What has it been like developing on the AVP?
  • by jncfhnb on 6/1/24, 7:50 PM

    It seems weird that the interface seems to encourage interacting with it from afar and has no indicators as to where your actions would affect it (like a highlight on a pickupable node). Is there some intuitive reason for that that’s not obvious in a video? Not a complaint, just curious.
  • by Falimonda on 6/1/24, 11:40 PM

    Are you able to have it interact with the environment?

    Say you slide it over so it's hovering over the edge of your sofa and then you turn gravity on. It seems like you're just a few steps away from that!

  • by actionfromafar on 6/1/24, 11:35 PM

    I imagined it would be a tool for designing clothes. It would then output cutting patterns or stitching instructions. That would be the solving part. :)
  • by wouldbecouldbe on 6/1/24, 9:28 PM

    I’ve been trying to figure out whether the Vision Pro suffers the same fate as other AR glasses, in that it can’t be used in the sun. Anyone know?
  • by kromokromo on 6/1/24, 8:38 PM

    From the title I thought you’d made an optical recognition system for sorting laundry, finding sock pairs, etc.

    Very cool sim, but kind of disappointed still. Will build this someday.

  • by hi-v-rocknroll on 6/1/24, 9:51 PM

    Willing to trade a barely-used Quest Pro for an AVP. ;@D
  • by localfirst on 6/1/24, 11:24 PM

    Guys, I’m interested in developing for Apple Vision Pro. What are some gotchas? Are you seeing success? What have you put on the store?
  • by mritchie712 on 6/2/24, 10:46 AM

    how do you track breathing for Lungy? Is it using the microphone?
  • by pavlov on 6/1/24, 9:37 PM

    I literally haven’t thought about the Vision Pro even a single time in months.

    Wild for such a hyped-up technology product. It just shipped and vanished.

  • by gizajob on 6/2/24, 2:58 PM

    Whatever happened to the Vision Pro? It seems quite strange for Apple to have flubbed its release so hard and to have also gotten away with it.