by zerop on 12/13/24, 10:29 AM with 42 comments
by j4coh on 12/13/24, 11:43 AM
by brettcooke on 12/13/24, 12:27 PM
Curious what others think about forgoing design thinking in AI product development in favor of this more direct, concrete approach.
[1] https://www.deeplearning.ai/the-batch/concrete-ideas-make-st...
by SilverBirch on 12/13/24, 12:06 PM
I think the more interesting question for the PM is how are you going to make a differentiated product in the market if everything you're planning to build is trivial? If it's not trivial, maybe talk to an engineer or two.
by senko on 12/13/24, 11:42 AM
Table stakes for any product manager, not just AI-related ones.
by cadamsau on 12/13/24, 8:04 PM
Currently the generated prototype usually needs tweaks, and that's if it even works. But when it does work, it's like the model is reading your mind.
In the future, as models improve at coding, they will anticipate the tweaks that make sense, less of the prompt will need to be specified, and there'll be less polish work after you get the generated artifact, so you can work at an even higher level of abstraction and thought. Domain experts will be able to create even bigger, cooler things without spending years acquiring software engineering skills.
Assemblers and compilers came along very early in our industry's history. If you run the thought experiment that that's where we are with prompted software creation, it will be a wild and exciting future. More people creating more stuff means a tremendous number of amazing creations to enjoy.
by n4r9 on 12/13/24, 11:52 AM
* Specify the product as concretely as possible
* Use existing applications to test feasibility
* Get non-engineer user feedback on early prototypes
These all obviously apply to product management more generally, but Andrew gives some examples/ways in which they apply specifically to AI products. Still, I feel like they're talking more generally about complex/abstract software engineering rather than simply AI.
by egeozcan on 12/13/24, 12:21 PM
by esel2k on 12/13/24, 1:31 PM
Nothing new - we heard the same message with Figma, containerisation… you name it.
Having a good sense of what problem to solve, building rapport and trust with early customers, and being a fantastic leader and communicator have always been the most important skills. Thanks, nothing to see here…
by lifeisstillgood on 12/13/24, 11:52 AM
And this, I feel, is an example of the form evolving.
The point that AI could not learn from a vague mission statement (whereas most people today would think "wow, that's a good start to a two-year project") suggests that AI companies are, as Ng implies, "just" well-thought-out companies.
Sorry, I'm not making a lot of sense. What I think I mean is that one can write down a human sentence whose phase space of possible meanings is very large: the set of behaviours that meet the specification can be huge, and most projects are attempts to find a working output that meets it and that everyone understands.
But a *working* piece of software has a much more constrained phase space of possible behaviours: just getting it working (or even writing a set of tests it must pass) drastically reduces the possible behaviours, which clarifies intentions and focuses the discussion.
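A toy illustration of that point (my own example, not from the thread): a vague spec like "normalise user names" admits many behaviours, but even a couple of tests pin the intended one down sharply. The function name and behaviour here are hypothetical:

```python
def normalise(name: str) -> str:
    # One concrete behaviour out of the many the vague spec allows:
    # trim surrounding whitespace and lowercase.
    return name.strip().lower()

# Two tests already exclude most alternative interpretations
# (title-casing, preserving whitespace, stripping punctuation, ...).
assert normalise("  Alice ") == "alice"
assert normalise("BOB") == "bob"
```

Each test collapses a large region of the "phase space" of acceptable programs, which is exactly why a passing test suite makes intentions clearer than prose.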
by bananapub on 12/13/24, 12:05 PM
by hacker002 on 12/14/24, 2:24 PM
by gigatexal on 12/13/24, 12:07 PM
Given an epic with keywords, organize tasks into that epic, estimate the time, and then track whether it's on track or not.
Yeah, not a lot to PM work.
Ooh, also a 50/50 coin flipper for saying no to ad-hoc things.
There, that's an AI PM.
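Tongue in cheek, that "AI PM" could be sketched in a few lines. Everything here is hypothetical, just making the joke concrete:

```python
import random

def triage_into_epic(epic_keywords, tasks):
    """Assign to the epic any task whose title mentions one of its keywords."""
    return [t for t in tasks if any(k in t.lower() for k in epic_keywords)]

def pm_approves_adhoc_request():
    """The 50/50 coin flip for saying no to ad-hoc things."""
    return random.random() < 0.5
```

Usage: `triage_into_epic(["login"], ["Fix login bug", "Update docs"])` keeps only the login task; `pm_approves_adhoc_request()` is as rigorous as it sounds.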
by eichi on 12/13/24, 11:27 AM
by tsak on 12/13/24, 1:16 PM