by matthieurouif on 9/16/22, 8:50 PM with 14 comments
by maegul on 9/17/22, 2:29 AM
Is this a trend I’ve missed … some sort of post-D3 nadir of the datavis hype curve, where graphs are now a cringey thing that SEO'd clickbait articles or news pages do?
by mark_l_watson on 9/17/22, 2:38 PM
Federated, privacy-preserving learning, local models, etc. all help keep your private data on your devices. Good stuff.
by chmod775 on 9/17/22, 1:33 AM
Suppose I had one or two cameras attached to a computer and ran software that would detect which object I'm pointing at and name it — how much energy would that use?
The human brain would probably need around 0.5s - 1s to come up with an answer, consuming around 5 milliwatt hours of energy in that time.
How much power would the computer need to at least give it a fair shot compared to the human?
If we assume that a human is pretty close to the best theoretically achievable limit of overall usefulness vs. energy usage (while, unlike current AI, also being able to learn ad hoc, self-correct, and maintain itself), then "work per watt" may give us an idea of how advanced our current technology really is compared to what already exists, and how far we can still go.
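The 5 mWh figure above checks out as a back-of-the-envelope calculation, assuming the commonly cited estimate that the human brain draws roughly 20 W:

```python
# Sanity-check the comment's energy estimate.
# Assumption (not from the comment): the brain draws ~20 W continuously.
BRAIN_POWER_W = 20.0

def energy_mwh(power_w: float, seconds: float) -> float:
    """Energy in milliwatt-hours for a given power draw and duration."""
    return power_w * seconds / 3600.0 * 1000.0

print(energy_mwh(BRAIN_POWER_W, 0.5))  # ~2.8 mWh
print(energy_mwh(BRAIN_POWER_W, 1.0))  # ~5.6 mWh
```

At 0.5–1 s of "thinking time" that works out to roughly 3–6 mWh, consistent with the "around 5 milliwatt hours" in the comment.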