by gexos on 6/15/19, 1:03 PM with 48 comments
by crazygringo on 6/15/19, 5:16 PM
And obviously if you have your children speak to Alexa, it will record them too. Same with visitors in your home. It's not a secret... it's how it works.
Alexa does support voice profiles. But it's certainly not perfect -- voice identification can be hard even for people to do (depending on similarity), and there's a huge leap from distinguishing between pre-identified voices to determining whether a voice is one of those and not one of billions of other people. And there's certainly no magic way to analyze a voice and determine that it's 12 (not allowed) or 13 (allowed).
So I'm not exactly sure what this lawsuit intends to change?
The only thing I can imagine is perhaps to allow disabling voice recordings per voice profile, so you can create a voice profile of your child and then do that -- but intriguingly, Amazon says voice profiles can only be created for users 13+, so there's presumably a legal reason behind that.
But really, if you don't want your kids (or visitors) using Alexa, either tell them not to use it, or don't have one in the first place. I really can't see how this is up to anyone other than the parents?
by freehunter on 6/15/19, 2:32 PM
by idlewords on 6/15/19, 2:46 PM
by LMYahooTFY on 6/15/19, 2:43 PM
Presumably, the plaintiffs could at least have records of various advertising that uncannily targets subject matter related to what the children may have been consuming and saying, though I wonder how they could prove that the children didn't inadvertently activate Alexa and provide this data. Perhaps that is enough to subpoena Amazon?
It seems like the marketing industry will be at the front line of the battle over what age is old enough to give your own consent in many aspects of life, and I personally am standing on the other side of that line.
I hear that some Scandinavian regions have laws prohibiting advertisers from targeting children, and I wonder how tech companies will deal with such restrictions.
by tmp2846 on 6/15/19, 2:43 PM