from Hacker News

Mark Cuban on Why You Need to Study Artificial Intelligence

by applecore on 2/8/17, 3:44 PM with 130 comments

  • by minimaxir on 2/8/17, 4:24 PM

    The title is clickbait, but the problem with the religious fervor around machine learning/deep learning is that entrepreneurs/VCs keep arguing that deep learning is magic and can solve any problem if you just stack enough layers. (see also: https://www.reddit.com/r/ProgrammerHumor/comments/5si1f0/mac...)

    Meanwhile, statistical methods for non-image/text data with identifiable features can often work better than neural networks, but they are not as sexy. (Good discussion on HN about this: https://news.ycombinator.com/item?id=13563892)

  • by feral on 2/8/17, 5:16 PM

    Reading HN I worry that we're going to have the opposite problem - a glut of people will try and (badly?) learn ML and then realize there aren't enough ML jobs.

    I have a PhD and have held ML-engineer positions at a few different companies - I have good industry awareness.

    Most applied ML, for most companies, right now, is actually relatively simple models (hand-coded rules! logistic regression! You'd be shocked how common these are.) The bulk of the work is data cleaning, gathering, integration, deployment, productisation, reliability, avoiding pathological cases, special-casing, Product, UX. You do need ML specialists who understand the stuff, to make it all work and come together - but the ratio of ML specialists to the wider team is low. Maybe 1 or 2 specialists on a team of 10 for an ML-heavy product.
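
    For a sense of scale: a logistic-regression baseline of the kind described above is a screenful of plain NumPy. This is a minimal sketch on made-up toy data, not production code:

```python
import numpy as np

# Toy tabular data: two 2-D Gaussian clusters, one per class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(+1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Batch gradient descent on the logistic loss.
w, b = np.zeros(2), 0.0
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

accuracy = float(np.mean((sigmoid(X @ w + b) > 0.5) == y))
print(f"train accuracy: {accuracy:.2f}")
```

    Everything around a snippet like this - getting clean labeled data in, serving predictions reliably - is where the bulk of the team's time goes.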

    This is going to remain the case IMO. Yes, there will be small teams, in highly resourced organizations (GOOG, FB etc), academic research labs, or occasional hard-tech startups, who do new model development. Maybe if AI becomes huge, you'll see more traditional Fortune 500s spin up similar efforts.

    But there'll be a much wider set of people & businesses applying and tuning well-understood approaches, rather than doing new model development. And you just don't need as many ML specialists for that approach.

    Even with deep learning, the tooling will advance. I mean, even look at all the research papers describing applications at the moment - so many of them are using pre-trained models. Industry will be similar. Tooling will advance, and you'll be able to do increasingly more with off-the-shelf pieces.
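
    The pre-trained, off-the-shelf pattern boils down to: freeze an existing encoder and train only a small head on top. A self-contained sketch - here the "pre-trained" encoder is mocked with a fixed random projection, since the point is the workflow, not the model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a frozen pre-trained encoder. In practice this would be an
# off-the-shelf model (e.g. an ImageNet-trained CNN); a fixed random ReLU
# projection is enough to show the shape of the workflow.
W_frozen = rng.normal(size=(16, 64))

def encode(x):
    return np.maximum(x @ W_frozen, 0.0)  # frozen features, never updated

# Task-specific data: 16-dim inputs with a simple binary label.
X = rng.normal(size=(200, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only the small linear head on top is trained (least squares here).
F = encode(X)
head, *_ = np.linalg.lstsq(F, y * 2 - 1, rcond=None)
pred = (F @ head > 0).astype(float)
acc = float(np.mean(pred == y))
print("train accuracy:", acc)
```

    Swapping the mock encoder for a real pre-trained network changes nothing structural - which is why tooling can keep pushing this further off the shelf.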

    I think ML is absolutely going to have a big impact - I buy at least some of the hype. But should all developers, or even a substantial minority of developers, start learning ML as a career imperative? I don't think so.

    Finally, it takes serious time to learn this stuff. It's easy to dabble (and worthwhile doing - it's fun, and sometimes you can do powerful things using tools in a very black-box manner!). But actually thoroughly learning it takes time. It takes serious time to build statistical intuition, as just one example.

    We could easily end up with a great many career developers who have a specialization in ML, frustrated they never get to use it.

  • by deepnotderp on 2/8/17, 6:08 PM

    I'm going to go against the grain here and (gasp) not hate on deep learning. People should realize that although many older statistical methods and "traditional" machine learning methods such as LDA, SVMs and decision trees may be good enough for business tasks, they are not the cutting edge of AI research. I think people are forgetting exactly how difficult image classification and object detection were before the advent of deep learning. People hating on "stack more layers" forget that "stack more layers" is EXACTLY what improved ImageNet performance to such a massive extent. ResNets pushed the limits by figuring out how to stack more layers in a beneficial way.

    And let's take a look at AlphaGo, how would you do that with SVMs or decision trees? Just get over the fact that deep learning provides a level of "intuition" (Go's search space is famously greater than the estimated number of stars in the universe).

    I think that a part of the problem is that older ML PhDs are angry that deep learning is so easy (until the learning rate fails to provide convergence of course...) and would prefer that their preferred methods would still reign supreme.

    I'll end this wall of text on the note that OpenAI's Ian Goodfellow says all projects at OpenAI use deep learning right now, but they are not dogmatic and will consider other approaches if they work well. I think this is the path that should be taken. On the other hand, I also see a bright future for uniting traditional techniques with deep learning, such as attaching an SVM to a CNN or combining decision trees with CNNs, both of which have produced good results.

  • by evgen on 2/8/17, 4:14 PM

    I would be far more inclined to heed this advice if Cuban gave any indication of understanding ML as anything more than a magic black box; fairy dust to be sprinkled into every pitch deck to overcome any objection or solve difficult problems. The bandwagon is passing through, folks; jump on board with Mark or you will have fewer buzzwords with which to craft your deck...
  • by pjungwir on 2/8/17, 4:59 PM

    Sort of a content-free article, but the headline is an interesting bold claim that conjures a lot of thoughts:

    - I know enough machine learning to be dangerous, but I'm hardly ever asked to use it. I designed a Bayesian classifier for my own startup around 6 years ago, analyzing political donor networks. I've completed the Stanford ML course. Back in college I did a math minor, so I'm comfortable with linear algebra, calculus, etc. I'm pretty comfortable with statistics of both kinds. But my bread-and-butter is freelance web development . . . and I'm not really even sure how to find work doing more MLy things.

    - I've read over and over that the most time-consuming part of ML work is data collection & cleanup, and that matches my own experience. It is the same thing that killed so many data warehouse projects in the 90s. You don't need a Ph.D. to do it, but it is a tough and costly prerequisite. So it seems like you'll need non-ML programmers even for specifically ML projects.

    - In a similar vein, Google has written about the challenges of "operationalizing" machine learning projects.[1] Having a little experience collaborating with a team doing an ML project, where they did the ML engine and I did the user-facing application, I can say that many ML experts are not experts in building reliable, production-ready software.

    - Will there ever be a Wordpress of machine learning? If there is, the author will be rich, but you won't need a Ph.D. to operate it. But because ML requires hooks into your existing systems, I don't know if this will ever happen. What will happen I think is plugins to existing e-commerce systems for product recommendation or other off-the-shelf ML-powered features. These already exist, but I assume they will become more prevalent and powerful over time. In any case, the mainstreaming of ML for business will be inversely correlated with the expense to implement it, which suggests it will be easier and easier for non-expert developers to use (and misuse).

    EDIT: Added the (now-)third bullet point I forgot before.

    [1] https://research.google.com/pubs/pub43146.html
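
    The Bayesian classifier mentioned in the first bullet is the kind of model that fits on a page. A toy word-count naive Bayes with Laplace smoothing - the training data here is invented for illustration, not the author's actual donor-network system:

```python
import math
from collections import Counter, defaultdict

# Tiny labeled corpus; real systems would train on thousands of examples.
train = [
    ("buy cheap meds now", "spam"),
    ("cheap meds cheap deals", "spam"),
    ("meeting notes attached", "ham"),
    ("lunch meeting tomorrow", "ham"),
]

word_counts = defaultdict(Counter)
class_counts = Counter()
vocab = set()
for text, label in train:
    class_counts[label] += 1
    for w in text.split():
        word_counts[label][w] += 1
        vocab.add(w)

def classify(text):
    scores = {}
    for label in class_counts:
        # log P(label) + sum of log P(word | label), Laplace-smoothed.
        total = sum(word_counts[label].values())
        score = math.log(class_counts[label] / sum(class_counts.values()))
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("cheap meds"))        # -> spam
print(classify("meeting tomorrow"))  # -> ham
```

    As the bullets above say, the model itself is the easy part; assembling and cleaning the training data is where the time goes.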

  • by vidarh on 2/8/17, 4:37 PM

    One of the things I've realised is that the more I look around, the more I find opportunities where people "should" have seen the value of basic Bayesian models, simple clustering algorithms, and other simple mathematical/statistical methods 20 years ago, but didn't, and still don't. That has massively changed my perspective on how quickly the onslaught of machine learning will come.

    E.g. when I was reading up on genetic algorithms etc. 20 years ago, we also expected the "revolution" to be right around the corner, and that things like genetic programming would change the world in a few years' time. And while various of those methods found use in some places, most places that could have used at least some of the simpler ones still don't.

    In other words, I think talking about a 3-year timeline is crazy. It's getting more attention, sure, but there is so much low-hanging fruit that most developers could be busy for the next 20 years putting the most trivial algorithms in place all over, and we still wouldn't have picked off even the low-hanging fruit where the computational resources, algorithms and data to make a big impact were well within reach 20 years ago.

    This certainly means there is plenty of room for a lot of developers to do very cool stuff and build careers on machine learning today, but it also means most developers will not have to learn the state of the art - or anything near it - for a very long time.

    As a concrete example I give to people, consider all of the search boxes out there on various sites - product searches, location searches, site searches - that are straight keyword based searches that don't take into account any clickstream data to improve ranking. The proportion of search boxes I see that take advantage of the available data is vanishingly small, even though very basic analysis can improve the perceived relevance of the results massively.
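
    The search-box example needs nothing fancy - keyword matching re-ranked by historical click counts. A sketch with invented data (a real system would read the click counts from clickstream logs):

```python
from collections import Counter

# Tiny product catalog; titles stand in for a real search index.
products = {
    "p1": "red running shoes",
    "p2": "blue running shoes",
    "p3": "running shoe cleaner",
}

# Historical clicks for results shown on this query (made-up numbers).
clicks = Counter({"p2": 40, "p1": 10, "p3": 2})

def search(query):
    terms = query.split()
    matches = [pid for pid, title in products.items()
               if all(t in title for t in terms)]
    # A plain keyword search would return matches in arbitrary order;
    # sorting by click count is the "very basic analysis" step.
    return sorted(matches, key=lambda pid: -clicks[pid])

print(search("running shoes"))  # most-clicked match first
```

    That one `sorted` call is roughly the gap between the typical site search box and one that uses the data it already has.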

    We certainly will see more companies invest in proper machine learning as the payoff gets higher and difficulty in taking advantage of it drops. But we will also see a huge proportion of sites that could use it continue to ignore it for years to come.

    There are big business opportunities in finding ways of making a dent in that portion of the market, though, and so learning this stuff can certainly be well worth it on a personal level, but I don't believe in his timeline in terms of the overall market.

  • by anupshinde on 2/8/17, 5:14 PM

    Statements like these suggest that another AI Winter is coming (sooner than 3 years I guess)

    """He thinks even programming is vulnerable to being automated and reducing the number of available programming jobs."""

    I believed something similar could happen within 1-2 years of learning/writing AI programs (more than 12 years back). I believed it so much that it consumed most of my weekends as I took on the genetic programming approach. Yes! Computers can write programs - BUT try reading those. Eventually, after spending hours or days, you will be able to read those programs, and you might find a simple "hello-world" program represented by a complex mathematical equation. Good luck getting such a program fixed by humans. Now imagine a similar experience decoding deep-learning neural nets - though those are a black box from a programmer's perspective anyway.

    From a business/management perspective, the code is a black box anyway. When they get NNs that can generate the required software, they will replace the people-manager with an NN-manager (who is a programmer, btw!)
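
    To make the readability point concrete: here is a tiny evaluator for GP-style expression trees, plus a hypothetical "evolved" tree (invented for illustration) that is functionally just x + 1 but buried in exactly the kind of bloat described above:

```python
import operator

OPS = {"add": operator.add, "sub": operator.sub, "mul": operator.mul}

def evaluate(node, x):
    """Recursively evaluate a GP-style expression tree at input x."""
    if node == "x":
        return x
    if isinstance(node, (int, float)):
        return node
    op, left, right = node
    return OPS[op](evaluate(left, x), evaluate(right, x))

# "Evolved" bloat: (x * (2 - 1)) + ((x - x) * x + 1)  ==  x + 1.
evolved = ("add",
           ("mul", "x", ("sub", 2, 1)),
           ("add", ("mul", ("sub", "x", "x"), "x"), 1))

for x in range(5):
    assert evaluate(evolved, x) == x + 1
print("the evolved tree is just x + 1 in disguise")
```

    Real GP output is far worse than this hand-made example - and good luck asking a human to maintain it.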

  • by gremlinsinc on 2/8/17, 4:26 PM

    That's as strong a statement as one of Trump's: Learn Machine Learning or you'll be a Dinosaur in 3 years...

    Maybe if it was coming from Bill Gates, Mark Zuckerberg, or another tech titan with some actual coding experience and a deeper understanding of what ML even is. Cuban's a businessman, and most CEOs I know don't have a clue about the stacks that run their own company, let alone what's popular.

    That said, I do think ML will be important, but I develop ecommerce apps and the like in Laravel; unless I move into AI and neural nets I don't see needing to know a lot about ML (though I wouldn't mind moving in that direction as that space picks up) -- but there's still plenty of opportunity without it.

  • by marricks on 2/8/17, 5:51 PM

    Assuming we're all going to be deep learning programmers is quite foolhardy. I think what's really relevant to consider is that AI winters can and do happen[1]. I don't disagree that deep learning has done some amazing things; what I would say is that it does have limitations.

    What causes an AI winter is when an advance such as deep learning can be applied to new problems and leads to increased interest. While the new thing is really good at a subset of problems and impresses the public, of course it can't displace humans at everything and naturally has its limitations.

    So funding pours in, everyone gets hyped, and then those natural limits are (re)discovered and everyone gets all anti-AI research. Of course many people knew the limitations all along, but the dream is gone and so is a lot of funding until the next thing comes along.

    This is probably natural to a lot of fields but AI just seems more prone to these boom and bust cycles because it's really exciting stuff.

    [1] https://en.wikipedia.org/wiki/AI_winter

  • by itg on 2/8/17, 4:31 PM

    Good luck with that. Any place doing serious ML will require the person to have a PhD or have publications and presentations at conferences like NIPS/ICML. Even most CS grads with a bachelor's do not have the math background required unless they double majored in math or stats.

    This is more VCs/founders hyping up AI because they need more ML folks and want to drive down costs.

  • by bsaul on 2/8/17, 6:47 PM

    Honest question: once the techniques settle a bit and libraries are created, what will be needed, apart from knowing that machine learning algorithms are based on some kind of statistical inference, with a few settings here and there?

    I mean, we don't need a PhD in image compression to create a service that streams videos. We just use libraries. Same for everything in computer science: it always ends up packaged in some kind of reusable code or service, and only some specialists remain in the field to work on marginal improvements.

    Why would ML be any different ?

  • by msvan on 2/8/17, 4:14 PM

    Either he's right about machine learning, or this is exactly the kind of thing bubbles are made of.
  • by badthingfactory on 2/8/17, 4:48 PM

    I'll place this in the same folder as the articles claiming Wix will eventually replace web developers.
  • by brilliantcode on 2/8/17, 4:33 PM

    Normally I'd laugh off any Mark Cuban antics but he isn't wrong. AI is going to greatly reduce white collar jobs with economies of scale.

    The Luddites of the 19th century thought they would never be replaced and continued on their trajectory.

  • by mad44 on 2/8/17, 5:04 PM

    (Pre-apology: I am not trolling; please don't take my comment below as more than what I intended - another perspective on the strong reaction Cuban's comments incited.)

    Reading through the comments, I see that Cuban's statement upset and even angered several HN commenters. That is a strong emotional reaction.

    I am not saying it is the 5 stages of grief, but the first 3 fit: denial, anger, bargaining, depression and acceptance.

    Also, from Howard Aiken: "Don't worry about people stealing your ideas. If your ideas are any good, you'll have to ram them down people's throats."

  • by dkarapetyan on 2/8/17, 6:04 PM

    No thanks. Fundamentals and not hype is what makes one not a dinosaur.
  • by anotheryou on 2/8/17, 5:50 PM

    I think prosthetic knowledge will become deeper and more accurate in the long run. And if it scales we don't need many people building the general purpose AI.

    With this prosthetic knowledge we will have to learn much more what to ask and know how much the machine knows.

    One has to quickly grasp the abstraction that is one level too high or too detailed for the machine to find, and then find the separate answers of the level below to recombine them. You can't yet ask where to open a restaurant, but you can google for demographics and write a program to map ratios between income, foot-traffic density and restaurant density.
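
    That recombination step might look like this - a crude heuristic over per-neighborhood data you could assemble from public sources (every field name and number here is invented for illustration):

```python
# Hypothetical per-neighborhood stats; in practice scraped or queried
# from demographics datasets and map APIs.
neighborhoods = {
    "riverside": {"median_income": 65000, "foot_traffic": 900,  "restaurants": 30},
    "old_town":  {"median_income": 48000, "foot_traffic": 1500, "restaurants": 80},
    "docklands": {"median_income": 72000, "foot_traffic": 400,  "restaurants": 5},
}

def opportunity_score(d):
    # Crude ratio: spending power times demand, divided by competition.
    return (d["median_income"] * d["foot_traffic"]) / (d["restaurants"] + 1)

ranked = sorted(neighborhoods,
                key=lambda n: -opportunity_score(neighborhoods[n]))
print(ranked)  # best candidate first
```

    The human work is choosing the abstraction (which ratios to map); the machine just grinds through the data.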

    Once we can ask what and where the most profitable business to start in town is, we probably still won't get a step-by-step guide on how to do it, interior design included. Where the rubber meets the road there are still a lot of opportunities to decide on, and complex data we can grasp more easily than the machine.

  • by xs on 2/8/17, 6:08 PM

    Just as there's no such thing as a polymath anymore because the world has so many specialized skills, I think there won't be such a thing as a "full stack developer" in the near future because of the complexity of development. Some things that contribute to the complexity will be: ubiquitous controls, advanced AI, internet of things, augmented reality, machine learning, and new technologies we don't even have yet. We are in the golden age now, where a single person can sometimes create a better website or app than a whole development team at a Fortune 500 company. I think our Internet world will become so complex in the very near future that a single person simply won't be capable of handling it all and will have to specialize in only a portion of it.
  • by taytus on 2/8/17, 4:21 PM

    Sorry, I refuse to visit such a clickbait headline.
  • by owaislone on 2/8/17, 5:29 PM

    I think what he meant to say was that companies should learn it in 3 years or the competition will drive them out of business. I don't think he meant individuals will be dinosaurs, but startups that fail to take advantage of ML/AI.
  • by xamuel on 2/8/17, 6:34 PM

    Three years ago, Machine Learning on a resume meant: "Good candidate."

    Currently, it means: "Jumps on bandwagons, caution."

    In three years, it'll mean: "Brainless buzzwords, avoid."

  • by JustSomeNobody on 2/8/17, 5:55 PM

    I want a nickel every time someone says programming jobs will be automated in 5 years. This goes back to the beginning of programming and it hasn't happened.
  • by coldcode on 2/8/17, 4:46 PM

    Machine learning would imply the machine needs to learn it, not me. Knowing something about it and actually using it are quite different. While it might appear "everywhere", most programming is still not ML and probably won't be. When I started in 1981 I didn't know C yet; that did not make me a dinosaur then either.
  • by hnmot223 on 2/8/17, 8:53 PM

    "Mark also said that what happens in the next 5–10 years is just going to blow everybody away, especially in the field of automation. He thinks even programming is vulnerable to being automated and reducing the number of available programming jobs."

    He's talking out of his ass here. This won't be happening anytime soon (if ever)

  • by usgroup on 2/8/17, 6:53 PM

    I kind of read it as "learn it and you'll be a dinosaur for 3 years". It made me quite excited at the prospect. I started to practice growling and running at a curious forward angle, but on second reading I find myself disappointed.
  • by thomasahle on 2/8/17, 4:18 PM

    I wonder to what degree he means. Because if he means "really" learning it, this is akin to the "learn to program in a week" books. It takes much more than three years to learn well.
  • by onmobiletemp on 2/8/17, 7:29 PM

    This is so dumb. You aren't going to understand or get a job in machine learning unless you have at least a master's in the subject. It's extremely difficult and complex. I see tons of college students taking machine learning classes for fear of becoming a dinosaur, and none of them could get a machine learning job afterwards. Programming has always been pretty easy. The AI revolution won't be like the home computer revolution. It's going to be led by a relatively small group of academics, scientists and engineers working in prestigious research positions.
  • by id122015 on 2/8/17, 6:23 PM

    It doesn't matter how far technology gets; there are some -isms, and they are the real dinosaurs that we have to get over, and we are too small.
  • by bgdkbtv on 2/9/17, 8:05 AM

    Oh yeah? Does Mark Cuban know artificial intelligence himself or is he just asking people to study it and work for him? :)
  • by acd on 2/8/17, 10:12 PM

    I am a sysadmin/devops person - what machine learning tools and topics would you recommend learning for that field?
  • by mi100hael on 2/8/17, 4:59 PM

    That page has so much JS bloat it made my top-of-the-line MBP lag just scrolling.
  • by bluekite2000 on 2/8/17, 5:46 PM

    Does anyone know if there is a business need for humans to train/label data? I have been thinking of going to a place with cheap labor costs (perhaps Vietnam) and setting up an operation like this.
  • by cstuder on 2/8/17, 4:21 PM

    Ok, so where do I start?
  • by rocky1138 on 2/8/17, 4:21 PM

    Why don't we just learn it in 3 years, then?
  • by general_ai on 2/8/17, 7:30 PM

    No, you're not going to be a "dinosaur". 99% of extremely well-compensated software engineering jobs do not involve ML. Using top large companies as a proxy for what things will be like in the world at large 3 years from now, maybe one in 200-300 engineers does anything in any way related to ML there. And that's a generous estimate. You do need to know what it is, roughly, but there's no need to drop everything you're doing and switch careers.
  • by otikik on 2/8/17, 5:38 PM

    Yeah. I'm going to be a clickbaitsaurus.