from Hacker News

TV anchor says live on-air ‘Alexa, order me a dollhouse’

by danielharan on 1/7/17, 5:13 AM with 232 comments

  • by wonko1 on 1/7/17, 5:49 AM

    I don't really get the appeal of voice prompted ordering. Is there really a big enough market of people for whom this is more than a novelty?

    For many products, you want to do at least some research before ordering. For products you order regularly, reordering is usually only a couple of clicks/taps.

  • by smoyer on 1/7/17, 1:55 PM

    "Telly station CW-6 said the blunder ..."

    I don't think it's a blunder to say (almost) any sentence ... we do still have freedom of speech. To think that these viewers would expect that no one ever utters the phrase again is ridiculous. Next they'll expect there to be legislation prohibiting certain words on any broadcast. And as more voice-controlled devices are created, there are bound to be conflicts.

    If they'd written this article correctly, they'd have an auto-playing audio clip that says "Alexa, order me a sandwich". A child named "Ok Google" will be the next version of "Little Bobby Tables".

  • by c3534l on 1/7/17, 9:43 AM

    I don't understand why people are blaming the news report for this instead of Alexa.
  • by kimburgess on 1/7/17, 9:28 AM

    Toyota ran an ad campaign in Sweden in 2015 that targeted Siri: https://www.youtube.com/watch?v=NqZBVTMrgFA.

    In their current state, voice assistants appear to be wide open for abuse. You think auto-play ads on sites are bad now? Wait until they start auto-ordering for you too.

  • by Tharkun on 1/7/17, 12:00 PM

    If your device is stupid enough to order things when some random person on TV tells it to, then your device probably shouldn't be in people's homes.
  • by jasonkostempski on 1/7/17, 12:09 PM

    I've been waiting for this for years :) I figured it would be some shock jock saying, "Ok Google, show me pictures of child porn" over the radio, causing people to panic while driving and that would be the end of stupid, always-on, voice commands.
  • by dplgk on 1/7/17, 5:13 PM

    For the HN crowd, I'm surprised most comments here focus on ordering things via voice. There's a much more disturbing lesson here, and it's not far-fetched at all compared to most doomsday theories. Right now, Alexa et al. can control your home, and I'm sure it's one step away from summoning your Tesla. What comes after that? Then all you need is an IoT exploit deployed and an implanted Alexa time bomb that's triggered by a commonly uttered phrase. Without the exploit, you could probably get pretty far with a mass broadcast: "Alexa, have Tesla drive to Times Square."
  • by jayjay71 on 1/7/17, 7:24 AM

    30 Rock has a great scene making fun of this very scenario. It doesn't seem to be readily available online, although it's on the episode ¡Qué Sorpresa! for those interested. Then Forbes wrote an article explaining why this can never happen.

    http://www.forbes.com/sites/briancaulfield/2012/01/12/hed-wh...

    Edit: It is worth mentioning that Alexa being outside the TV does make a difference, since it's unlikely to have a directional microphone, as one built into the TV likely would.

  • by unpythonic on 1/7/17, 8:52 AM

    Given that everything must go through shipping & handling, which is typically measured in days, what is the benefit of ordering immediately? Surely these services could batch your requests through the day and push a confirmation request to the buyer(s).

    The risk of unintentional purchases seems much too high, and one doesn't lose much convenience with a quick confirmation.
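    That suggestion can be sketched as a pending-order queue: a voice command only queues an item, an explicit confirmation places it, and anything unconfirmed at the end of the day is dropped. All names here (OrderQueue, voice_order, confirm) are hypothetical, not any real Amazon API.

    ```python
    # Sketch of "batch requests and require confirmation before purchase".
    from dataclasses import dataclass, field

    @dataclass
    class OrderQueue:
        pending: list = field(default_factory=list)
        placed: list = field(default_factory=list)

        def voice_order(self, item: str) -> None:
            """A voice command only queues the item; nothing is bought yet."""
            self.pending.append(item)

        def confirm(self, item: str) -> bool:
            """Buyer explicitly approves a queued item (e.g. in the app)."""
            if item in self.pending:
                self.pending.remove(item)
                self.placed.append(item)
                return True
            return False

        def end_of_day(self) -> list:
            """Unconfirmed requests are dropped instead of shipped."""
            dropped, self.pending = self.pending, []
            return dropped
    ```

    Under this design the dollhouse broadcast would have filled shopping queues, not charged credit cards.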

  • by jcoffland on 1/7/17, 5:56 AM

    This could be mostly solved by allowing people to rename their digital assistants.
  • by tombrossman on 1/7/17, 3:09 PM

    Serious question, what's preventing someone from exploiting this for profit?

    For example, could you list a uniquely named item on Amazon (perhaps as a Marketplace seller) and charge high restocking fees? Then instead of just trolling people for a laugh, your business model would basically be collecting restocking fees.

  • by tetraodonpuffer on 1/7/17, 2:18 PM

    Couldn't the assistants (Google, Alexa, Siri) be improved to do voice fingerprinting for specific commands (like purchases, unlocking the front door, ...), where only certain voices are allowed to execute them?
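    The gating idea could look like the sketch below: sensitive commands run only if the speaker's voiceprint is close enough to an enrolled one. The "voiceprints" are plain feature vectors standing in for the embeddings a real speaker-verification model would produce; the prefixes and threshold are illustrative assumptions.

    ```python
    # Sketch: gate sensitive commands behind a speaker-similarity check.
    import math

    SENSITIVE_PREFIXES = ("order", "unlock", "pay")  # assumed policy
    THRESHOLD = 0.95  # cosine-similarity cutoff; a tuning assumption

    def cosine(u, v):
        """Cosine similarity between two feature vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm

    def allow(command, speaker_vec, enrolled_vecs):
        """Run sensitive commands only for voices near an enrolled print."""
        if not command.lower().startswith(SENSITIVE_PREFIXES):
            return True  # ordinary queries stay open to anyone
        return any(cosine(speaker_vec, e) >= THRESHOLD for e in enrolled_vecs)
    ```

    A TV anchor's voice would then fail the check for "order ..." while weather queries still work for everyone.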
  • by icameron on 1/7/17, 9:15 AM

    Do Alexa TV commercials wake up Alexa? Like the one that has "Alexa, call me an Uber"?
  • by cooper12 on 1/7/17, 7:04 AM

    If I remember correctly, doesn't Apple's Siri get trained to your voice so other people don't activate it? It only seems appropriate for Amazon to make sure only specific voices are associated with purchasing rights or else kids will just be ordering anything. There really should have been some sort of authentication method for sensitive things like this.
  • by StavrosK on 1/7/17, 12:42 PM

    Ah, censorship by robot, where we have to watch what we say in public for fear of triggering people's assistants.
  • by khrm on 1/7/17, 3:34 PM

    This blunder seems to be mild in comparison to https://www.youtube.com/watch?v=r5p0gqCIEa8
  • by 746F7475 on 1/7/17, 9:12 AM

    Can't wait until commercials just go: "Alexa, buy <whatever we are selling>".
  • by imsofuture on 1/7/17, 3:33 PM

    "ill-conceived TV spot"

    Not sure that was the ill-conceived bit.

  • by glitch003 on 1/7/17, 12:40 PM

    I wonder if it's possible to create a TV commercial that says "Alexa, order me a dollhouse" but doesn't trigger the Amazon Echo in the room, by doing something like playing ultrasound static at a louder volume than the "Alexa, order me..." audio, which would overload the Alexa microphone but be too high a frequency for your ear to hear.
  • by tzs on 1/7/17, 6:38 AM

    These devices should be using multiple microphones so that they can tell where sounds are coming from, and during setup they should be shown the location of the TV and exclude any commands coming from there.
  • by ChuckMcM on 1/7/17, 7:07 PM

    I always chuckle at these, and yes, voice-independent recognition is always going to have this challenge. I tried a half dozen different "activation" phrases for my Moto X and they were all triggered at odd times by non-activation sounds (like the movies).

    The really useful next step will be voice-independent language recognition with voice-dependent command recognition. That, and accent-independent language recognition. That is one of the, if not the, next billion-dollar acquisitions by one of the big players.

  • by JabavuAdams on 1/7/17, 12:21 PM

    Did any orders actually happen, or is this a fake outrage story?
  • by nkkollaw on 1/7/17, 8:15 AM

    Is it just me, or is the subheading "Story on accidental order begets story on accidental order begets accidental order"?
  • by orblivion on 1/7/17, 8:09 AM

    So this is the new taboo? We have to watch what we say on tv and radio because a lot of people decide to buy this product?
  • by JoeAltmaier on 1/8/17, 5:29 PM

    Confused. Alexa doesn't order anything unless you do it manually; it just creates a shopping list. So somebody said "Alexa, order me a dollhouse" and the viewers' shopping lists got something scribbled on them. Nobody bought anything.
  • by mrob on 1/7/17, 5:35 PM

    This is a technical problem and can be solved by technical improvements. The listening device needs enough microphones to locate the exact position of the sound source. If it never moves then it's not a human and it should be ignored.
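    A toy version of that idea: with two microphones, a sound's direction can be estimated from the time difference of arrival (TDOA), the lag at which the two channels correlate best, and a source whose lag never changes between utterances (a TV speaker bolted to a wall) could be ignored. The signals below are synthetic pulses; real devices would run this on sampled audio.

    ```python
    # Sketch: estimate inter-microphone lag via cross-correlation, then
    # flag sources that never move as non-human.

    def cross_corr(a, b, lag):
        """Correlation of a[i] against b[i + lag] over the valid overlap."""
        if lag >= 0:
            pairs = zip(a, b[lag:])
        else:
            pairs = zip(a[-lag:], b)
        return sum(x * y for x, y in pairs)

    def best_lag(a, b, max_lag):
        """Delay (in samples) between the two channels at peak correlation."""
        return max(range(-max_lag, max_lag + 1),
                   key=lambda lag: cross_corr(a, b, lag))

    def looks_stationary(lags):
        """Identical lag on every utterance: source never moved, so it is
        probably a loudspeaker rather than a person walking around."""
        return len(set(lags)) == 1
    ```

    Real arrays use more microphones and generalized cross-correlation, but the stationary-source heuristic is the same.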
  • by mikerg87 on 1/7/17, 3:22 PM

    This is so 2014. I remember when Xbox users with Kinect had this happen.

    http://www.bbc.com/news/technology-27827545

  • by synicalx on 1/9/17, 5:50 AM

    Just quietly, I'm looking forward to a new form of wardriving where you hang out the window of your car with a megaphone and ask Alexa to buy expensive jewellery or sex toys.
  • by awqrre on 1/8/17, 9:10 PM

    "Alexa, order me a new tin foil hat"...

    by the way, how does Alexa know that you said "Alexa" without sending anything to the cloud, if it needs the cloud to decode any speech?
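    The usual answer is that no full decoding happens on-device: a tiny always-listening model spots only the wake word locally, and audio is streamed to the cloud only after a detection. A minimal sketch of that split, where "frames" are stand-in transcript strings rather than real audio and the substring matcher stands in for a small neural keyword spotter:

    ```python
    # Sketch: on-device wake-word spotting; cloud sees audio only after it.
    WAKE_WORD = "alexa"

    def on_device_spotter(audio_frame: str) -> bool:
        """Cheap local check that runs on every frame; no network involved."""
        return WAKE_WORD in audio_frame.lower()

    def process_stream(frames):
        """Return the frames that would be uploaded for cloud decoding."""
        uploaded = []
        awake = False
        for frame in frames:
            if awake:
                uploaded.append(frame)   # speech after the wake word goes up
                awake = False            # toy model: one frame per command
            elif on_device_spotter(frame):
                awake = True             # wake word heard locally
        return uploaded
    ```

    Everything before the wake word stays on the device, which is why the device can work without uploading a continuous stream.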

  • by alxndr on 1/8/17, 10:44 PM

    Amazon is currently running a TV ad which has three people saying "Alexa..." commands; I wonder why that ad isn't also causing false positives?
  • by crooked-v on 1/7/17, 7:57 PM

    This makes me wonder how long until some musician or band does a Superbowl ad with "Alexa/Siri/Cortana, play _____", followed by silence.
  • by mherdeg on 1/7/17, 5:06 PM

    I wonder whether fewer people in the US will be naming their children "Alexa" or "Siri" in the coming years.
  • by dlss on 1/7/17, 3:37 PM

    Sounds like someone needs to either (a) get a patch out, or (b) pass a law before alexa-aware advertisements become a thing.
  • by dkarapetyan on 1/7/17, 6:11 PM

    And now we enter the era of cybernetic capitalism and runaway purchasing loops.
  • by sogen on 1/7/17, 4:13 PM

    This reminded me of the Halloween III tv commercial...
  • by phaed on 1/7/17, 1:30 PM

    Viral purchasing?
  • by peterwwillis on 1/7/17, 6:27 PM

    What a time to be alive.
  • by kahrkunne on 1/7/17, 4:27 PM

    Seems like a critical flaw in voice-control. Imagine a prankster driving through town with speakers blasting "Alexa, order me 20 dollhouses".