by arkadiyt on 9/8/24, 6:17 PM with 19 comments
by simple10 on 9/8/24, 7:55 PM
For context, I'm currently working on a HIPAA-compliant app that uses AI to collect medical background info, then connects the user with an actual human doctor. To get the app HIPAA certified, the code, infrastructure, and LLMs all need to be in scope, using enterprise accounts with signed BAAs (Business Associate Agreements) that isolate PII and medical data. This prevents the medical data from being used as training data for the LLM.
HIPAA is not a foolproof system, but it's a crucial piece in the trust puzzle. I wouldn't trust an AI medical app without HIPAA certification. The chance of data leaking out through the LLM or hacks is too high without HIPAA.
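[Editor's note: a minimal sketch of the isolation the parent comment describes. The field names and the redact_phi helper below are hypothetical illustrations, not the commenter's code; in practice the contractual isolation comes from the BAA-covered enterprise account, and redaction like this is just one layer on top.]

    # Sketch: keep direct identifiers out of anything sent to the LLM.
    # PHI_FIELDS and redact_phi() are hypothetical helpers for illustration.
    PHI_FIELDS = ("name", "dob", "ssn", "address", "phone")

    def redact_phi(record: dict) -> dict:
        # Strip direct identifiers before anything crosses the trust boundary.
        return {k: v for k, v in record.items() if k not in PHI_FIELDS}

    def build_prompt(record: dict) -> str:
        # Only the de-identified background reaches the model provider.
        clean = redact_phi(record)
        return f"Summarize this medical background for a physician: {clean}"

Re-identification would then happen only inside the BAA-covered boundary, so a provider-side leak or training-data slip exposes de-identified records rather than PHI.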
by jarule on 9/8/24, 8:18 PM
Self-identified woman submits her dick pic.
AI: Massive tumor detected. Seek immediate medical help.
by troupo on 9/8/24, 6:45 PM
by wizzwizz4 on 9/8/24, 6:51 PM
I wouldn't trust even a state-of-the-art "machine learning" classifier, and the app described in the article certainly isn't state-of-the-art.
by johnea on 9/8/24, 7:10 PM
But an image analysis application like this is exactly what the tech is good at.
I also strongly agree with the skepticism regarding the companies selling these products. I would want this used alongside a doctor's diagnosis, as an additional tool.
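[Editor's note: a hedged sketch of what "an additional tool" could look like: a small PyTorch image classifier whose output is a referral score routed to a clinician, never shown as a diagnosis. The checkpoint name and two-class setup are assumptions for illustration, not details from the article.]

    # Sketch: surface a screening score to a clinician, not a diagnosis.
    # "sti_screen.pt" and the 2-class head are hypothetical assumptions.
    import torch
    from torchvision import models, transforms
    from PIL import Image

    model = models.resnet18(num_classes=2)               # randomly initialized head
    model.load_state_dict(torch.load("sti_screen.pt"))   # hypothetical fine-tuned weights
    model.eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])

    def referral_score(path: str) -> float:
        batch = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            probs = torch.softmax(model(batch), dim=1)
        return probs[0, 1].item()  # probability of "refer to a doctor"

Framing the output as a referral score rather than a label is what makes it an additional tool: the threshold and the final call stay with the doctor.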
by yieldcrv on 9/8/24, 8:02 PM
But since visual analysis doesn't work for most STIs, and it's intended to be used by uneducated partners who believe a visual inspection would protect them: no.
by JohnFen on 9/9/24, 4:50 PM
by ARandomerDude on 9/8/24, 6:52 PM
by InfiniteRand on 9/8/24, 7:39 PM
by ein0p on 9/8/24, 7:02 PM
by throw310822 on 9/8/24, 9:15 PM
by echlebek on 9/8/24, 6:47 PM