by danm07 on 3/12/17, 7:24 PM with 3 comments
I don't understand the reason behind the OpenAI movement. If the benefits and fallout of AI development are anything like those of nuclear technology, shouldn't its distribution be handled in the image of the Nuclear Non-Proliferation Treaty (i.e., controlled distribution of enrichment technology with checks and balances)?
Or is the open-source sentiment born of necessity, because research output is difficult to control?
by deepnotderp on 3/12/17, 7:36 PM
For Google, I think their long-term play is the realization that Facebook will release equivalent research, so in the long run there's no benefit to keeping everything closed source. Rather, if they publish the best papers, they can claim the title of "best AI company," which is great branding for services like DeepMind Health and Google Cloud ML that would otherwise become commodities. They can also recruit the best talent on the strength of that perception. Also, DeepMind doesn't open-source jack; they only publish papers, no code.
Microsoft probably has a similar philosophy.
by Eridrus on 3/12/17, 8:28 PM
Apple has struggled to hire people since they don't publish much.
I think it's actually a minority of researchers who publish code; most implementations you'll find online are reimplementations based on reading the papers.
I also think that very few researchers share the anxiety about AI that famous people seem to have. I spent some time at an academic ML conference, and no one I talked to thought superintelligence was going to happen in the next few decades.
by byteforscher on 3/12/17, 7:57 PM