from Hacker News

DeepMind's new algorithm adds 'memory' to AI

by thekodols on 3/15/17, 12:08 PM with 1 comment

  • by choxi on 3/15/17, 2:15 PM

    I always assumed there was only one mechanism for memory in the brain, but it's interesting how multiple deep learning architectures have different mechanisms that could all be described as memory: RNNs and LSTMs can selectively carry state across a sequence of data, Neural Turing Machines basically have a RAM unit to store larger data structures, and now we have these networks that can remember "skills" across tasks trained in sequence (see the sketch after this comment).

    If we draw parallels to the human brain, these mechanisms look something like short-term memory, long-term memory, and a kind of "skills memory" (I'm not sure whether the latter has a proper academic term, but it's a common experience, e.g. you never forget how to ride a bike).

    Maybe one of the reasons human memory has been such an elusive concept is that it's actually many different, independent mechanisms instead of just one.
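
A minimal sketch of the "skills memory" idea, assuming the DeepMind result here is elastic weight consolidation (EWC): after learning task A, each weight gets an importance estimate (roughly the diagonal of the Fisher information), and training on task B adds a quadratic penalty that anchors the important weights near their task-A values. The names and numbers below (theta_A, fisher_diag, lam) are illustrative, not taken from the paper.

    import numpy as np

    def ewc_penalty(theta, theta_A, fisher_diag, lam=1000.0):
        """Quadratic penalty that keeps weights important for task A near
        their task-A values while the network trains on task B."""
        return 0.5 * lam * np.sum(fisher_diag * (theta - theta_A) ** 2)

    def task_B_objective(task_B_loss, theta, theta_A, fisher_diag, lam=1000.0):
        """Total loss during task B: the new task's loss plus the 'memory' term."""
        return task_B_loss + ewc_penalty(theta, theta_A, fisher_diag, lam)

    # Toy usage: the first and third weights mattered for task A (high
    # importance), so moving them is expensive; the second is nearly free.
    theta_A     = np.array([1.0, -0.5, 2.0])   # weights after task A
    fisher_diag = np.array([5.0,  0.01, 3.0])  # per-weight importance estimate
    theta       = np.array([1.1,  1.5, 2.05])  # candidate weights during task B
    print(ewc_penalty(theta, theta_A, fisher_diag))  # -> 48.75

Note that the large move in the unimportant second weight costs about as much as a tiny drift in the first, which is the sense in which the network "remembers" the old skill while staying free to learn the new one.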