from Hacker News

First look at Apple/Google contact tracing framework

by dmvaldman on 4/10/20, 9:05 PM with 109 comments

  • by est31 on 4/11/20, 12:33 AM

    Note that years ago, Moxie studied a similar problem: how to let users know whether their contacts use Signal without uploading their whole address books the way e.g. WhatsApp does [0]. It's similar because in both instances you want to "match" users in some fashion via a centralized service while preserving their privacy.

    He ruled out downloads of megabytes of data (something the Google/Apple proposal would imply) and couldn't find a good solution beyond trusting Intel's SGX technology, arguably not really a good solution, but better than not adopting it at all [1].

    There is a computation/download/privacy tradeoff here. You can increase the interval of the daily keys to weeks, which gives you less to download, but the devices then have to compute more hashes to check whether they have been in contact with other devices. You can increase the 10-minute rolling interval to an hour, which means less privacy and more trackability, but also less computation.
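
    A minimal sketch of that tradeoff (assuming identifiers are re-derived from each published daily key with something HMAC-like; the derivation below is a stand-in, not the spec's actual key schedule): the shorter the rolling interval, the more identifiers a phone must derive and compare per downloaded key.

      import hmac, hashlib

      SECONDS_PER_DAY = 24 * 60 * 60

      def rolling_identifiers(daily_key: bytes, interval_minutes: int):
          # One pseudo-identifier per interval; HMAC-SHA256 truncated to 16 bytes
          # stands in for whatever derivation the spec actually uses.
          n = SECONDS_PER_DAY // (interval_minutes * 60)
          return [hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
                  for i in range(n)]

      # 10-minute intervals -> 144 derivations per downloaded key per day;
      # 60-minute intervals -> 24, i.e. ~6x less work but coarser, more linkable IDs.
      for minutes in (10, 60):
          ids = rolling_identifiers(b"\x00" * 16, minutes)
          print(f"{minutes:>2} min interval: {len(ids)} identifiers per key per day")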

    My guess as to why Google/Apple didn't introduce rough location (like US state or county) into the system is that they wanted to prevent journalists from seizing on that detail and sensationalizing it into something it isn't (Google/Apple grabbing your data). Both companies operate the most popular maps apps on the planet, as well as OS-level location services that phone home constantly, so they are already in possession of that data.

    [0]: https://signal.org/blog/contact-discovery/

    [1]: https://signal.org/blog/private-contact-discovery/

  • by hn_throwaway_99 on 4/11/20, 1:36 AM

    Regardless of the technical issues with this, I think the "prank" issue Moxie brings up is much more serious. We've already seen the phenomenon of "Zoom bombing"; I can imagine "tracer bombing" would be a much more serious issue. The only way I could see this working is if, when you enter a positive result, you have to enter some sort of secret key from the testing authority, but that's not really tenable given that a lot of (most?) testing these days is done by private providers.
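
    A minimal sketch of that idea, with hypothetical names and an HMAC construction that is not part of the Apple/Google proposal: the testing authority mints a short one-time code, and the reporting server re-checks it before accepting an upload. A real scheme would also have to prevent code reuse and, as noted, get private providers on board.

      import hmac, hashlib, secrets

      AUTHORITY_SECRET = secrets.token_bytes(32)  # held by the health authority / server

      def issue_code(test_date: str) -> str:
          # Authority side: mint a one-time verification code for a positive test.
          nonce = secrets.token_hex(4)
          tag = hmac.new(AUTHORITY_SECRET, f"{test_date}:{nonce}".encode(),
                         hashlib.sha256).hexdigest()[:8]
          return f"{test_date}:{nonce}:{tag}"

      def verify_code(code: str) -> bool:
          # Server side: recompute the tag before accepting uploaded daily keys.
          try:
              test_date, nonce, tag = code.split(":")
          except ValueError:
              return False
          expected = hmac.new(AUTHORITY_SECRET, f"{test_date}:{nonce}".encode(),
                              hashlib.sha256).hexdigest()[:8]
          return hmac.compare_digest(tag, expected)
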
  • by krcz on 4/11/20, 1:59 AM

    > So first obvious caveat is that this is "private" (or at least not worse than BTLE), until the moment you test positive.
    > At that point all of your BTLE mac addrs over the previous period become linkable.

    Linkable over a period of 14 days, or even only within a single day: each day means a new key, so linking across days could only be attempted on the basis of behavioral correlations.

    What would anyone do with such data? Micro-analysis of customer behavior? It won't be possible to use it for future customer profiling, as the history can't be matched to identifiers after the infection. This data is practically worthless.

  • by olliej on 4/11/20, 5:48 AM

    Let's just answer these

    * Use stationary beacons to track someone’s travel path

    Doesn't work because there's no externally visible correlation between reported identifiers until after the user chooses to report their test result.

    * Increased hit rate of stationary / marketing beacons

    Doesn't work because those depend on the broadcast identifiers staying coherent, and the identifiers roll every 10 minutes or so. Presumably you'd ensure that any rolling of the Bluetooth MAC also rolls the reported identifier.

    * Leakage of information when someone isn’t sick

    The requests for data simply tell you that someone is using the app, which you can already tell if they're using the app.

    The system can encourage someone to get tested; if your app wants to tell people to get tested, then fair play to that app (though good luck with that in the US).

    * Fraud resistance

    Not a privacy/tracking concern, though I'm sure the developers will have to do something to limit spam/DoS.

  • by antpls on 4/11/20, 6:53 AM

    Again, this solution _cannot_ work, and it _threatens_ a permanent loss of privacy.

    This is the government and the adtech companies sleeping in the same bed, with no opposing power in the balance.

    1) The "solution" is created by a monopoly of 2 american private corporations.

    2) It can only work reliably if everyone carries an (Apple or Android) phone at all times and consents to share data.

    3) You are not necessarily infected if you pass an infected person in the street at 5 meters. This will produce too many false positives and give people fuzzy information.

    4) It doesn't help people who are infected and _dying_

    It just _doesn't make sense_. To me, it looks like electronic voting, but worse: no one besides experts can understand how it works.

    Today it is being reviewed, but eventually the app will be forgotten and updated in the background with "new features" for adtech.

    We are forgetting what we are fighting: a biological virus. All effort should go toward understanding the biological machinery of the virus and its hosts, in order to _cure_ the disease. We should be 3D printing ventilators, analysing DNA sequences, building nanorobots and synthesising new molecules.

  • by Reelin on 4/11/20, 2:50 AM

    Is there an official document somewhere?

    Also, how does it compare to DP-3T? (https://github.com/DP-3T/documents) (https://ncase.me/contact-tracing/)

    Edit: Apple's preliminary specification was linked in another HN comment. (https://covid19-static.cdn-apple.com/applications/covid19/cu...)

  • by pferde on 4/11/20, 1:14 PM

    What is it with people making long, split-up Twitter threads like this? They're cumbersome and hard to read. Be an adult: write and publish an article on your blog.

    It feels weird having to criticize Marlinspike about this, but stupid practices are stupid no matter how prestigious the person doing them is.

  • by femto113 on 4/11/20, 1:58 AM

    The system doesn't need to ship every key to every phone; much more compact structures like Bloom filters could be used instead. If we assume about 1,000 positives per day, each uploading 14 days of keys at 4 keys per hour, that's a bit over 1 million keys per day. A Bloom filter with a false-positive rate of 1/1000 could store that in a couple of megabytes. The phone downloads the filter each day, checks its observed keys against it, and only needs to download the actual keys if there's a potential match.
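
    A minimal sketch of that sizing math plus a toy filter (illustrative only; a real deployment would use a vetted Bloom filter library):

      import hashlib, math

      n = 1000 * 14 * 24 * 4     # positives/day * days of keys * hours * keys/hour ~= 1.34M
      p = 1 / 1000               # target false-positive rate
      m = math.ceil(-n * math.log(p) / math.log(2) ** 2)  # optimal number of bits
      k = round((m / n) * math.log(2))                    # optimal number of hashes
      print(f"~{m / 8 / 1e6:.1f} MB filter, {k} hash functions")  # ~2.4 MB, 10 hashes

      class Bloom:
          def __init__(self, m: int, k: int):
              self.m, self.k, self.bits = m, k, bytearray((m + 7) // 8)

          def _positions(self, item: bytes):
              # k bit positions per item, derived from salted SHA-256 digests
              for i in range(self.k):
                  h = hashlib.sha256(bytes([i]) + item).digest()
                  yield int.from_bytes(h[:8], "big") % self.m

          def add(self, item: bytes):
              for pos in self._positions(item):
                  self.bits[pos // 8] |= 1 << (pos % 8)

          def maybe_contains(self, item: bytes) -> bool:
              # False positives possible (rate ~p), false negatives impossible.
              return all(self.bits[pos // 8] & (1 << (pos % 8))
                         for pos in self._positions(item))

      bf = Bloom(m, k)
      bf.add(b"\x01" * 16)                    # a published 16-byte key
      assert bf.maybe_contains(b"\x01" * 16)  # phone checks an observed key locally
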
  • by zeckalpha on 4/11/20, 2:26 AM

    > Published keys are 16 bytes, one for each day. If moderate numbers of smartphone users are infected in any given week, that's 100s of MBs for all phones to DL.

    Seems like a use case for Bloom filters or k-anonymity.
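
    A minimal sketch of the k-anonymity variant (in the spirit of hash-prefix range queries; the bucket width here is purely illustrative): the phone reveals only a short hash prefix of each key it wants to check, downloads that bucket, and finishes the comparison locally.

      import hashlib

      def bucket_of(key: bytes) -> int:
          # First 16 bits of the hash pick the bucket; width is a size/privacy knob.
          return int.from_bytes(hashlib.sha256(key).digest()[:2], "big")

      def server_bucket(published_keys, prefix: int):
          # Server side: return every published key whose hash falls in the bucket.
          return [k for k in published_keys if bucket_of(k) == prefix]

      def client_check(observed_key: bytes, fetch_bucket) -> bool:
          # Client side: fetch one bucket, then compare exact keys locally.
          return observed_key in fetch_bucket(bucket_of(observed_key))

      published = [bytes([i]) * 16 for i in range(3)]  # stand-in positive keys
      print(client_check(bytes([1]) * 16,
                         lambda prefix: server_bucket(published, prefix)))  # True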

  • by daenz on 4/11/20, 1:24 AM

    An important question here is: will this framework go away once the pandemic is over? Something tells me it won't.
  • by severine on 4/10/20, 11:16 PM

  • by grumple on 4/11/20, 11:55 AM

    Yikes, this is prep for Big Brother-style guilt by association. I wouldn't want to test positive for anything the state can track (radical ideas? you're now a positive in this system). Opt out.
  • by themark on 4/11/20, 2:30 AM

    Seems like a lot of processing. I wonder how much battery life will be affected.
  • by kome on 4/11/20, 12:05 PM

    that's the new electronic voting: making easy stuff more complicated and dangerous...

    the problem is not a technological problem, it's a political problem.

  • by bobowzki on 4/11/20, 11:59 AM

    Goodbye last shred of privacy.

    "The road to hell is paved with good intentions" is an expression that comes to mind.

  • by redis_mlc on 4/11/20, 12:35 AM

    Can somebody address the issue that we have almost no testing ability in the US?
  • by Zenbit_UX on 4/10/20, 11:25 PM

    No clue who/what a Moxie is (presumably some guy), and it makes this thread's title seem even more absurd.

    The OP feeling like we all need to know what Moxie thinks about this reminds me of this [Chappelle's Show skit](https://www.youtube.com/watch?v=Mo-ddYhXAZc) about getting Ja Rule's hot take on current events.

  • by mc32 on 4/11/20, 12:15 AM

    Of course Google promises [1]:

    “adhering to our stringent privacy protocols and protecting people's privacy. No personally identifiable information, such as an individual's location, contacts or movement, will be made available at any point.”

    [1] https://turnto10.com/news/local/privacy-advocates-raise-conc...

  • by howmayiannoyyou on 4/11/20, 3:35 AM

    Finally a decent use case for blockchain, and nobody is paying attention. It seems to make a lot more sense to reconcile location and proximity data via a shared, user-controlled, anonymous ledger.
  • by Uhhrrr on 4/10/20, 11:23 PM

    A modest proposal: since almost everyone is going to get this and a much smaller percentage is vulnerable, perhaps we should just use this system to track those who choose to register as vulnerable.