by jerhewet on 3/17/25, 9:52 PM with 3 comments
by WuxiFingerHold on 3/18/25, 4:26 AM
That doesn't mean it's a fad or generally bad. It's a great way to implement ideas and then validate them. A dream for startups. It doesn't matter if founders can't even write their "hello world" when validating an idea.
The downside, as with all AI-generated code, comes during the lifecycle of the code: Who's responsible? Who knows what the code does? Who's going to maintain it beyond the AI? Is it secure? Reliable? I don't care about the next TikTok clone being written by AI, but what about software in the nuclear plants powering all that AI? Or in transportation, or medical devices?
by techpineapple on 3/17/25, 10:10 PM
by sherdil2022 on 3/18/25, 2:11 AM
For me, it is a better search engine - but we still have to do the hard work and due diligence - and the debugging.
Overall, we have to embrace LLMs - and whatever tech comes next - and ensure they are used for the betterment of humanity.
Tech has also always been used to subjugate others, create deepfakes, wage wars, kill, surveil, discriminate, and worse - and AI tech is going to be no different. We have to ensure that guardrails are put in place and the tech is not abused.