by caust1c on 1/6/25, 2:55 AM with 557 comments
by 015a on 1/6/25, 5:22 AM
One of the few examples I can think of however is Apple Maps. And it did get better; a lot better, some say better than Google Maps nowadays. So I generally do have hope for Apple Intelligence. At the end of the day, there are some disparate competing utilities in this class on the Samsung and Google phones, but no one is shipping something that is obviously game-changing and in first place; they all kinda suck, they're all tech demos, and it'll inevitably take many years to get this technology honed into something that is truly useful to consumers.
by viccis on 1/6/25, 8:20 PM
The problem with interpreting AI through that lens is that AI, as it is being used here, is not an extension of your mind. Plenty of other things are (organizers for example), but AI does not extend your thoughts. It replaces them. Its notification summary feature does not improve your ability to quickly digest lots of notification information, it replaces it with its own attempt, which, not being your own judgment, can and does easily err.
There are some uses of AI that do act more like a McLuhanesque medium. Some copilot applications, in which suggestions are presented that a user accepts and refines, are examples of this. But a lot of the uses of both image generation and LLM tools serve to limit what your mind does rather than expand it.
by mrcwinn on 1/6/25, 2:47 PM
It's also worth noting that Apple traditionally is not a first mover and looks for "inspiration" from smaller competitors. In this case, there is no comp to reference. There is no startup mobile OS innovating in integrated AI. That, and the supposedly rushed timetable, probably explains a lot.
by lo_fye on 1/6/25, 4:02 PM
FALSE. Apple defines a photo as a record of something that actually happened. iPhones take photos. They don't auto-swap a high-res moon in for the real one like Samsung phones do.
Clean Up (like crop) is just an editing feature, manually applied after a photo already exists, and using it effectively changes the image from a photo into an "edited image", the same way using Photoshop does.
Definitions of What a Photo Is:
Apple - "Here’s our view of what a photograph is. The way we like to think of it is that it’s a personal celebration of something that really, actually happened. Whether that’s a simple thing like a fancy cup of coffee that’s got some cool design on it, all the way through to my kid’s first steps, or my parents’ last breath. It’s something that really happened. It’s something that is a marker in my life, and it’s something that deserves to be celebrated." - John McCormack, VP of Camera Software Engineering @ Apple
Samsung - "Actually, there is no such thing as a real picture. As soon as you have sensors to capture something, you reproduce [what you’re seeing], and it doesn’t mean anything. There is no real picture. You can try to define a real picture by saying, ‘I took that picture’, but if you used AI to optimize the zoom, the autofocus, the scene — is it real? Or is it all filters? There is no real picture, full stop." - Patrick Chomet, Executive VP of Customer Experience @ Samsung
Google - "It’s about what you’re remembering. When you define a memory as that there is a fallibility to it: You could have a true and perfect representation of a moment that felt completely fake and completely wrong. What some of these edits do is help you create the moment that is the way you remember it, that’s authentic to your memory and to the greater context, but maybe isn’t authentic to a particular millisecond." - Isaac Reynolds, Product Manager for Pixel Cameras @ Google
Definitions via https://www.theverge.com/2024/9/23/24252231/lets-compare-app...
by voidfunc on 1/6/25, 8:25 PM
An Apple marketing executive is smiling somewhere. Brainwashed another one!
by twodave on 1/7/25, 3:24 AM
by officeplant on 1/6/25, 3:05 PM
by ripped_britches on 1/6/25, 4:47 AM
Really loved this article overall, but I have to super-disagree here. The core of ChatGPT is you can have a conversation with a computer program.
Take away saving history and you can still have a conversation with a computer program (see ephemeral chats).
Take away typing one word at a time and you still have a conversation with a computer program (see non-streaming API / batch API).
But major props for writing this live on a Twitch stream; benefit of the doubt there, my friend.
by deergomoo on 1/6/25, 8:18 PM
I can’t help but wonder if the reason “agentic” systems seem so appealing to people is because as an industry we’ve spent the past fifteen years making software harder to use.
by armada651 on 1/6/25, 2:45 PM
MacWrite was released 5 years after WordPerfect, which itself is predated by WordStar. I don't get why Apple fans have this obsession with pretending Apple invents these things.
Apple refines what others have attempted before, that's what they're good at. Part of the reason people are disappointed with Apple these days is because of this fantasy image of Apple as an inventor.
by infecto on 1/6/25, 2:56 PM
by AlexandrB on 1/6/25, 4:24 AM
by rylan-talerico on 1/7/25, 12:24 AM
In my view, Private Cloud Compute and Apple Intelligence, together with the ubiquity of Apple devices, position Apple as a leading candidate to realize the widespread, AI-enabled transformation of personal computing discussed in the post — with tasks requiring less energy and cognitive load than they do today for the general consumer.
by cratermoon on 1/6/25, 3:14 PM
> I want the data coming off of the sensor to be the data that makes up the image. I want to avoid as much processing as possible and I want the photo to be a reflection of reality as it is, not reality as it should have been. Sure, sometimes I'll do some color correction or cropping in post, but that doesn't change the content of the image, only its presentation.
First nit: the iPhone camera, and all digital cameras, are deeply influenced by computational photography techniques. What this means is that you essentially never get the raw pixel values, although there are exceptions. The image you get is already significantly manipulated.
Second nit: color correction, color in general, dynamic range, focus, depth of field, and more are all manipulations made by default, even long before digital cameras when film was king. There is no "correct" image version of what our eyes see, there is only pleasing to the photographer and the audience.
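The two nits above can be made concrete with a toy example. Even the most minimal raw-development pipeline applies black-level subtraction, per-channel white-balance gains, and a gamma curve before a viewable value exists, so "the data coming off the sensor" is never what you look at. All numbers and steps below are illustrative only, not any real camera's pipeline:

```python
BLACK_LEVEL = 64                              # sensor pedestal in raw counts (illustrative)
WB_GAINS = {"r": 2.0, "g": 1.0, "b": 1.6}     # per-channel white-balance gains (illustrative)
GAMMA = 1 / 2.2                               # display gamma

def develop(raw_value, channel, max_raw=4095):
    """Map one 12-bit raw sensor count to a display value in [0, 255]."""
    # Subtract the black level and normalize to [0, 1].
    linear = max(raw_value - BLACK_LEVEL, 0) / (max_raw - BLACK_LEVEL)
    # Apply the white-balance gain for this color channel, clipping highlights.
    balanced = min(linear * WB_GAINS[channel], 1.0)
    # Gamma-encode for display.
    return round(255 * balanced ** GAMMA)
```

Note that the same raw count renders differently per channel (`develop(1000, "r")` is brighter than `develop(1000, "g")` here), which is the point: every one of these constants is a manipulation someone chose.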
An example: the negative for Ansel Adams' well-known "Moonrise, Hernandez, New Mexico" looks, at first glance, like something a professional would trash for lacking detail.
Here's the contact print vs. the version most of us will probably recognize: https://images.squarespace-cdn.com/content/v1/5f5fe5ca8d6a35...
Here are four different versions Adams printed over the course of 3 decades: https://images.squarespace-cdn.com/content/v1/5f5fe5ca8d6a35...
I will mention, but won't get into, a topic that will surely bait HN commenters: Kodak designed and standardized its color film to represent Caucasian skin tones. It wasn't until chocolate and furniture makers complained that everything looked like the same gross mud in their expensively produced product catalogs that Kodak took a look at rendering dark brown/red/yellow tones more pleasingly. Notice I said "more pleasingly", not "correctly".
by r00fus on 1/6/25, 6:40 AM
by PaulHoule on 1/6/25, 3:37 PM
LLMs make different mistakes than I do, so I've thought about using one as a copy editor, but I've had terrible experiences with copy editors: I've hired more than one, when I was writing marketing copy, who injected more errors than they fixed. (A friend of mine wrote an article for The New York Times that got terribly mangled and barely made sense after the editors made it read like an NYT article.)
by ericd on 1/6/25, 11:22 PM
In the meantime, they’re shipping the best non-workstation computers, by far, to run models locally.
They don’t have to be the ones to implement all of this themselves, you can install ollama right now, and BoltAI can integrate those models into other parts of the OS. And Apple will watch, and Sherlock the best parts of what others do into the OS, and sell gobs of machines.
They haven’t squandered anything, the foundations are still there.
by hbn on 1/6/25, 4:28 PM
Parsing text for variables when it sees an equals sign and running basic calculations on them? I feel that could have been a novel feature 30 years ago.
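The mechanism the comment describes really is old hat: scan lines for `name = expression`, then evaluate the right-hand side against the variables seen so far. A minimal, hypothetical sketch (greatly simplified, and of course no relation to Apple's actual Math Notes implementation):

```python
import ast
import operator

# Only basic arithmetic operators are supported in this toy version.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def eval_expr(node, env):
    """Safely evaluate a tiny arithmetic AST against known variables."""
    if isinstance(node, ast.Expression):
        return eval_expr(node.body, env)
    if isinstance(node, ast.Constant):
        return node.value
    if isinstance(node, ast.Name):
        return env[node.id]
    if isinstance(node, ast.BinOp):
        return OPS[type(node.op)](eval_expr(node.left, env),
                                  eval_expr(node.right, env))
    raise ValueError("unsupported expression")

def run_notes(text):
    """Process note lines like 'total = price * qty', returning all variables."""
    env = {}
    for line in text.splitlines():
        if "=" in line:
            name, expr = line.split("=", 1)
            env[name.strip()] = eval_expr(ast.parse(expr.strip(), mode="eval"), env)
    return env

notes = """
price = 12.50
qty = 4
total = price * qty
"""
```

Calling `run_notes(notes)` yields `total == 50.0`, which is roughly the "enter a sum, get an answer" behavior search engines have offered for years.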
by adamc on 1/6/25, 4:03 PM
Most of the intro to this is credulous hooey. Macs weren't "bicycles for the mind" in some magic way that was different from PCs. What the early ones had was 1) a better and much more standardized interface, and 2) task switching that worked.
As for the AI tools, image generation might occasionally be useful for a D&D game, but otherwise nothing on offer at the moment has much value. And the value of image generation (for me) is pretty small.
by yalogin on 1/6/25, 2:24 PM
by KennyBlanken on 1/6/25, 3:28 PM
No, it's not. The sensor in an iPhone is AI/ML'd up the ass to hide all the noise because it has 1µm sensor wells.
A Panasonic video-oriented mirrorless micro 4/3rds (so not even anywhere near 35mm) like the GH5 is 3-4x that.
A Sony Alpha 7 III? six times the sensor well size.
I don't care how many megabits of video bandwidth you throw at it, or how fancy you think "raw" shooting is, or how fancy your sensor technology is; nobody these days has anything that is even close to 2x better than anyone else. The top sensors from all the major players are pushing the limits of physics, and have been for a long time.
No amount of AI/ML shit will give you depth of field and bokeh that looks as nice as a big sensor and a fast lens with nicely shaped aperture blades.
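The arithmetic behind the comment's size comparisons is simple: pixel pitch is roughly sensor width divided by horizontal resolution, and light gathered per pixel scales with the square of the pitch. A sketch using approximate public spec figures (sensor dimensions and resolutions below are assumptions for illustration):

```python
def pitch_um(sensor_width_mm, horizontal_pixels):
    """Approximate pixel pitch in micrometers: sensor width / resolution."""
    return sensor_width_mm * 1000 / horizontal_pixels

iphone = 1.0                      # ~1 um wells, as stated above
gh5 = pitch_um(17.3, 5184)        # Micro 4/3, ~20 MP  -> roughly 3.3 um
a7iii = pitch_um(35.9, 6000)      # full frame, ~24 MP -> roughly 6.0 um
```

The linear ratios match the "3-4x" and "~6x" figures in the comment; per-pixel light gathering goes as the square, so the full-frame well collects on the order of 36x the light of a ~1 um phone well.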
by gwern on 1/6/25, 10:01 PM
by gigel82 on 1/6/25, 4:27 PM
His first instinct was right. It seems impossible, because it is. Unless I can run the entirety of the "Private Cloud Compute" on my own hardware in my own firewalled network, I 100% believe that the pipeline is compromised; our data is siphoned off and sold to advertisers, especially now that they know they can do it and get less than a slap on the wrist: https://news.ycombinator.com/item?id=42578929
by aaroninsf on 1/6/25, 6:32 PM
This is sardonic: yes, Apple could have chosen different monopolies than it currently has, at different points, and had a different (maybe not better?) trajectory. Some of us are old enough to remember the antipathy towards Microsoft when it added a default Explorer to the Office suite.
But also, maybe our system fundamentally rewards the "wrong" things if one's definition of "right" includes things like innovation. Or maybe, the welfare of the commons and the common good.
by xivusr on 1/6/25, 6:08 PM
I’m hoping what they’ve built in infrastructure and custom chips is a step towards making personal LLMs highly available to non-technical people as well. I think this is where Apple has always shined - making things not just better, but accessible and grokable for normal people.
by asimpletune on 1/6/25, 5:03 PM
by nbzso on 1/7/25, 4:00 AM
Generative AI is a dud. ML has a lot of applications. I use my smartphone for calls, chat and banking apps. I use my iPad only for drawing, mail and some games.
For everything else, I have computers. With real OS.
by arkensaw on 1/6/25, 4:04 PM
Sorry, what?
Apart from the level of dream-detail recalled being highly dubious, quoting your own hallucination of Steve Jobs to help with your argument about generative AI being useless (and missing the irony) is downright weird.
Also Math Notes is basically the same thing search engines have been able to do for over a decade now. Enter a sum, get an answer.
by gdubs on 1/6/25, 4:26 PM
by BenFranklin100 on 1/6/25, 4:32 PM
Ultimately Apple’s strategy of a privacy focused AI will be a winner for a consumer device with access to sensitive personal information. It’s a question of whether they can pull it off technically.
by llm_nerd on 1/6/25, 3:44 PM
Having said that, I actually paid attention to the image playground criticism. Image playground is literally a playground. It is meant to make fun, low-effort images for friends and family, largely for social type interactions.
"It uses a placid corporate artstyle and communicates nothing." It's a hot taco holding a beer. What is it SUPPOSED to communicate? Looks like a pretty great image to me. But of course this piece was leading into the anti- angle, so suddenly it's "horrifying". I guess I didn't get the special training to understand what was wrong with a clearly lighthearted, fun image.
Similarly asinine, overly jaded complaints about the cartoonish, memoji-style portrait generation. I think the image is actually pretty hilarious. I actually used Image Playground to make my social media image, and I care not what this guy thinks about it, or that it is "soulless" (as if a cartoonish representation is supposed to be soulful?)
by archeantus on 1/7/25, 2:35 AM
What they released as Apple Intelligence wasn’t a well-planned, cohesive product as much as the only thing they could possibly do, given the timelines they were up against. Maybe they’ll catch up, but they’re definitely behind and it’s a shocking thing to behold.
by LeicaLatte on 1/6/25, 3:14 PM
But the demands of intelligence and the general trajectory mean that no amount of hardware - storage, RAM, or battery size - would be enough to generate the high-fidelity experiences or solutions that fans and customers have come to expect from the company.
by kennyloginz on 1/8/25, 7:17 AM
They are dedicating "AI" hardware to their current devices, and are building a framework for many different futures. Regarding software, they are building safety first, then MVPs to provide APIs to their chips.
So yah, maybe they aren't holding the holy grail at the moment. At least we won't be drinking from lead goblets.
by daft_pink on 1/6/25, 8:47 AM
They have set themselves up for a loser in the next year or two, because they can’t double their resources to catch back up to a normal release schedule.
by keepamovin on 1/7/25, 2:33 AM
by jacobsimon on 1/6/25, 3:17 PM
Didn’t realize how widespread that type of spam was until now. Why hasn’t someone implemented better spam detection at Apple like we have for email? It would be nice if they could classify texts as spam, promotions, etc and organize them the way Gmail does.
by LudwigNagasena on 1/6/25, 7:09 PM
Because that’s a laughably false premise.
by bentt on 1/7/25, 12:04 AM
by lenerdenator on 1/6/25, 3:40 PM
"Apple Intelligence" is less than a year old. Give it some time, for cripes' sake.
by jedberg on 1/6/25, 9:45 PM
Apple has done such a good job with marketing that my 10 year old thinks that AI stands for Apple Intelligence.
We live in the Bay Area so she's seen a bunch of billboards with that. I have to constantly remind her that that is not what it means in most cases.
by fijiaarone on 1/7/25, 3:38 AM
by choonway on 1/7/25, 2:11 AM
by fnordpiglet on 1/6/25, 6:05 PM
by henry_viii on 1/7/25, 5:24 PM
The Cursor editor literally has this.
by ghostly_s on 1/6/25, 3:52 PM
by tracerbulletx on 1/6/25, 4:24 PM
by est31 on 1/6/25, 2:59 PM
In the end, even if the features aren't perfect, they still raise the bar for competitors, so Apple is less in danger of being disrupted.
Also, there are plenty of AI-driven features that people do not talk about; those "just work", so you don't notice them as much.
by fredsted on 1/6/25, 3:56 PM
by jjkaczor on 1/6/25, 3:32 PM
Having used "smart devices" since the Apple Newton 2.0 days, followed by Windows Mobile, a very brief Android excursion (Motorola Milestone - early enough Android that I was often frustrated trying to copy/paste text between apps), then another side-pivot into Windows Phone for a while (mainly because the development was incredibly easy - and Microsoft gave me a free one), I have been in the iOS mobile phone ecosystem ever since the iPhone 6.
And - the software has gotten increasingly better over time - I wouldn't have (for me) a lot of content/subscribers on TikTok if iMovie on my phone did not exist - attempting to edit videos using OpenShot was taking forever (while I have DaVinci Resolve installed, it seems "daunting" for someone who doesn't want to be a professional videographer/editor).
But then I tried iMovie "Magic Movie" on my phone and ... "it just works". Still not great for long-form YouTube style content, but for quick things, slice-of-life videos - it does the job rather well.
... I expect that Apple will improve these AI offerings dramatically over the next couple of years as people upgrade their devices.
by musesum on 1/6/25, 4:43 AM
by zombiwoof on 1/6/25, 10:20 PM
by tehjoker on 1/7/25, 3:21 AM
by saagarjha on 1/6/25, 5:17 AM
I mean you can hack it the same way you would hack any other Darwin platform
by tim333 on 1/6/25, 5:05 PM
By which I think he means the AI stuff runs on your machine rather than in the cloud. For me that's not a holy grail at all, or even something I'm terribly interested in. I downloaded Apple Intelligence on the MacBook, found it quite meh, and am now seeing if there is a way to remove it, as it uses quite a few GB of memory. I can see that for someone wanting to use LLMs on confidential corporate data it would be important, but that's a specialist use case that I don't think Apple Intelligence is particularly good for.
by nordiczordic on 1/7/25, 2:29 AM
by djaouen on 1/6/25, 5:35 PM
by VectorLock on 1/6/25, 2:39 PM
by vachina on 1/6/25, 4:54 AM
Before the AI craze you could search Photos on iOS based on content and metadata. You could lift subjects off photos with a long tap and copy them, recognize faces and make montages based on inferred relationship with them. And all of this is done on-prem, on your device.
These are very subtle, nice features. Apple had to put a name on all of these, otherwise there would be no marketing material.
by eviks on 1/6/25, 5:15 PM
Given the lack of a single good supporting example (what, did PCs have no word processors that reacted to backspace keys?), it seems like these are fantasy bicycles...
And since no evidence is needed to believe, you can of course believe in Intelligence that can act better than your brain (what are "those pics from San Francisco", you've snapped a hundred there, which 5 would you like to post?)
And yet the disillusionment comes a bit faster than expected, why not give Apple a few more decades to iron out some kinks on such a revolutionary fantasy path?
by october8140 on 1/6/25, 4:12 AM
by aucisson_masque on 1/6/25, 11:01 PM
How is apple intelligence going to help me with that ?
by bitpush on 1/6/25, 3:34 AM
LLMs are built on data, and copious amounts of it. Apple has been on a decade-long marketing campaign to make data radioactive. It has now permeated the culture so much that Apple CANNOT build a proprietary, world-class AI product without compromising on their outspoken positions.
It is a losing battle because the more Apple wants to do it, the more users are gonna punish them, and meanwhile other companies (OpenAI, Anthropic) are gonna extract maximum value.
by ilrwbwrkhv on 1/6/25, 4:14 AM
So that means all of the big tech companies are going down: Facebook, Google, Apple.
Only Microsoft remains strong, though for how much longer remains to be seen.
A great time for startups.
by apricot13 on 1/6/25, 8:41 AM
They don't want to scare any part of their audience away from future uses of Apple intelligence. Their audience is tech and non-tech folk alike.
If the tech folk say it's safe and the non-tech folk get comfortable with the basic AI features then they're onto a winner.
How many people's parents/grandparents have iPhones because they're simpler for them to understand, but who are also scared of or don't understand this 'AI thing'? I think Apple have been quite savvy in introducing it slowly and are probably watching the metrics like a hawk!
I suspect image playground is so creepy in an attempt to mark the images as clearly AI generated when they get posted to social media?