by jessup on 8/16/15, 8:09 PM with 51 comments
by chestervonwinch on 8/17/15, 2:29 AM
> ... Many textbooks wrongly state that these limits apply only to networks with one or two layers, but it appears that those authors did not read or understand our book! For it is easy to show that virtually all our conclusions also apply to feedforward networks of any depth (with smaller, but still-exponential rates of coefficient-growth). Therefore, the popular rumor is wrong: that Back-Propagation remedies this, because no matter how fast such a machine can learn, it can't find solutions that don't exist. Another sign that technical standards in that field are too weak: I've seen no publications at all that report any patterns that order- or diameter-limited networks fail to learn, although such counterexamples are easy to make!
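The classic counterexample behind Minsky's point is XOR: no single-layer perceptron can represent it, because XOR is not linearly separable, so no amount of training speed can help. A minimal sketch of that failure (not from the thread; the zero initialization and 1000-epoch cap are illustrative choices):

    import numpy as np

    # XOR truth table: not linearly separable, so the classic
    # perceptron learning rule can never converge on it.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 1, 1, 0])

    w = np.zeros(2)  # weights (illustrative zero init)
    b = 0.0          # bias
    for epoch in range(1000):
        errors = 0
        for xi, target in zip(X, y):
            pred = int(w @ xi + b > 0)
            if pred != target:
                # standard perceptron update rule
                w += (target - pred) * xi
                b += (target - pred)
                errors += 1
        if errors == 0:
            print("converged at epoch", epoch)
            break
    else:
        print("never converged: XOR is not linearly separable")

Running this always hits the "never converged" branch: the updates cycle forever, which is exactly the "solutions that don't exist" situation Minsky describes, independent of how the weights are searched.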
by jordigh on 8/16/15, 8:48 PM
http://snarkmarket.com/blog/snarkives/societyculture/old_man...
by westoncb on 8/16/15, 8:49 PM
by kleer001 on 8/16/15, 11:21 PM
by rectangletangle on 8/17/15, 4:01 AM
by MichaelMoser123 on 8/17/15, 4:32 AM
https://www.youtube.com/watch?v=-pb3z2w9gDg&list=PLUl4u3cNGP...
by adamzerner on 8/17/15, 3:12 AM