from Hacker News

GPUs for Google Cloud Platform

by hurrycane on 11/15/16, 8:11 PM with 94 comments

  • by dkobran on 11/15/16, 11:20 PM

    Kudos to Google for making moves here. Having spent the last year+ tackling GPUs in the datacenter, I'm super curious how custom sizing works. It's a huge technical feat to get eight GPUs running (let alone in a virtual environment), but the real challenge is making sure the blocks/puzzle pieces all fit together so there's no idle hardware sitting around. There's a reason why Amazon's G/P instances require that you double the RAM/CPU when you double the GPUs (see the sketch after this comment). Another example is Digital Ocean's linear scale-up of instance types. In any case, we'll have to see what the pricing comes out to.

    Shameless plug: if you want raw access to a GPU in the cloud today, shoot me an email at daniel at paperspace.com. We have people doing everything from image analysis to genomics to a whole lot of ML/AI.
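
    A rough sketch of the fixed-ratio pattern described above. The numbers are illustrative, loosely modeled on AWS's published p2 shapes; treat them as approximate, not exact figures:

        # Illustrative sketch: fixed-ratio instance families keep vCPU/RAM
        # proportional to GPU count so no slice of a host is stranded.
        # Ratios below are approximations, not published specs.
        VCPUS_PER_GPU = 4
        RAM_GIB_PER_GPU = 61

        def shape(gpus):
            # Doubling the GPUs doubles everything else: on a fixed-ratio
            # family you can't rent 8 GPUs with only 4 vCPUs.
            return {'gpus': gpus,
                    'vcpus': gpus * VCPUS_PER_GPU,
                    'ram_gib': gpus * RAM_GIB_PER_GPU}

        for n in (1, 2, 4, 8):
            print(shape(n))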

  • by timdorr on 11/15/16, 9:54 PM

    > Google Cloud GPUs give you the flexibility to mix and match infrastructure. You’ll be able to attach up to 8 GPU dies to any non-shared-core machine...

    Wow, that's impressive. One thing I've loved about GCE has been the custom sizing. This takes it even further, so we don't have to buy what we don't need.

    Looking forward to seeing the pricing on this. Looks like they're going to heavily compete with AWS on this stuff.
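
    As a concrete illustration of the mix-and-match claim, here is a minimal sketch of creating a custom-shaped instance with attached GPUs through the Compute Engine v1 API's Python client. Field names follow the GCE instance resource; the accelerator type name, project, and zone are assumptions, since Google hadn't published the final details at announcement time:

        # Minimal sketch, assuming the GCE v1 API and a placeholder
        # accelerator type; not a confirmed example of the launch API.
        from googleapiclient import discovery

        compute = discovery.build('compute', 'v1')
        project, zone = 'my-project', 'us-east1-d'  # hypothetical

        body = {
            'name': 'gpu-box',
            # Custom machine type: 8 vCPUs / 32 GB RAM, chosen
            # independently of the GPU count.
            'machineType': 'zones/%s/machineTypes/custom-8-32768' % zone,
            # GPUs attach as guest accelerators, up to 8 per instance.
            'guestAccelerators': [{
                'acceleratorType':
                    'zones/%s/acceleratorTypes/nvidia-tesla-k80' % zone,
                'acceleratorCount': 4,
            }],
            # GPU instances cannot live-migrate, so host maintenance
            # must terminate them.
            'scheduling': {'onHostMaintenance': 'TERMINATE'},
            'disks': [{
                'boot': True,
                'autoDelete': True,
                'initializeParams': {
                    'sourceImage':
                        'projects/debian-cloud/global/images/family/debian-8',
                },
            }],
            'networkInterfaces': [{'network': 'global/networks/default'}],
        }
        compute.instances().insert(
            project=project, zone=zone, body=body).execute()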

  • by matt_wulfeck on 11/15/16, 10:00 PM

    One of the HUGE advantages of GCE/AWS is that they will gobble up 100% of the unused resources for their own computation. Nothing is wasted, and the machines basically pay for themselves.

    Compare this with something like Oracle, which simply can't consume its unused resources in order to discount the hardware effectively. They can't beat GCE/AWS at the cloud game until this changes.

  • by slizard on 11/15/16, 11:44 PM

    Kudos to Google, and I'm happy to see that, at least in principle, AMD is still an option.

    I wonder what kind of device driver GCE uses with AMD. The new ROCm?

    What about Power8 + NVLink hardware? Does anybody know if the current NVIDIA GPUs, in particular the P100s, are all on x86?

  • by boxerab on 11/15/16, 10:18 PM

    Very, very happy to finally see AMD GPUs in the cloud.
  • by eudoxus on 11/15/16, 10:21 PM

    This is amazing!!! First cloud provider to have the P100! Amazing opportunities ahead with compute power like that.
  • by fulafel on 11/16/16, 12:34 AM

    What's the assurance like regarding security against other concurrent users on the same hardware? Historically, multi-tenancy with GPUs has been quite iffy, with not much security research around it, even though IOMMUs theoretically exist.
  • by kozikow on 11/16/16, 7:20 AM

    Now it would be great if Kubernetes on GKE worked nicely with GPUs. It's still in the works: https://github.com/kubernetes/kubernetes/blob/master/docs/pr... .
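
    For context, GPU scheduling in Kubernetes was alpha-only around this time, requested through an alpha.kubernetes.io/nvidia-gpu resource limit. A minimal sketch with the official Python client (the pod name, image tag, and cluster setup are illustrative assumptions):

        # Minimal sketch, assuming the alpha GPU resource name; requires
        # a cluster whose nodes actually expose NVIDIA GPUs.
        from kubernetes import client, config

        config.load_kube_config()
        pod = client.V1Pod(
            metadata=client.V1ObjectMeta(name='cuda-job'),
            spec=client.V1PodSpec(
                restart_policy='Never',
                containers=[client.V1Container(
                    name='trainer',
                    image='nvidia/cuda:8.0-runtime',  # illustrative image
                    # Request one GPU via the alpha resource.
                    resources=client.V1ResourceRequirements(
                        limits={'alpha.kubernetes.io/nvidia-gpu': '1'}),
                )],
            ),
        )
        client.CoreV1Api().create_namespaced_pod(
            namespace='default', body=pod)
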
  • by otto_ortega on 11/16/16, 4:42 AM

    Awesome news! The Tesla P100 is a monster, this will push ML development to new heights.
  • by AlexCoventry on 11/15/16, 10:11 PM

    Is there any public access to the TPUs?
  • by kesor on 11/15/16, 11:47 PM

    This happened some months ago ... how does it compare? Can anyone in the know pitch in with a short comparison?

    https://aws.amazon.com/about-aws/whats-new/2016/09/introduci...

  • by dylanz on 11/16/16, 4:57 AM

    There are a lot of excited posts here about this announcement! For someone who doesn't use GPUs in everyday life, can someone explain why this is great, and maybe touch on the current landscape around GPU usage and costs?
  • by alecco on 11/16/16, 12:29 AM

    Is it possible to have a non-shared machine? Is it virtualized anyway?
  • by shaklee3 on 11/16/16, 4:39 AM

    Does anyone know what the cost will be for these? AWS is quite high for the K80.
  • by nojvek on 11/16/16, 5:54 AM

    Nvidia got a massive bump in share price. I was quite sad because I sold all my shares after the post-election drop. I think this announcement might have caused the huge spike. I could have made 10% in one day.
  • by n00b101 on 11/15/16, 9:40 PM

    Great news!

  • by jaspervdmeer on 11/16/16, 10:39 AM

    Mine ALL the bitcoins
  • by largote on 11/15/16, 10:19 PM

    I wonder what kinds of cores will be available and whether that will be visible. Optimizing your code for a particular GPU architecture can have massive performance differences, much more so than for CPUs.
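
    One illustration of why the exact architecture matters: tuned kernel launch parameters are usually keyed off the device's compute capability. A minimal sketch with PyCUDA (the query calls are real; the tuning table is made up for illustration):

        # Minimal sketch: query the device's compute capability and pick
        # per-architecture parameters. The table below is illustrative.
        import pycuda.driver as cuda

        cuda.init()
        major, minor = cuda.Device(0).compute_capability()

        # Hypothetical per-generation tile sizes for some matmul kernel.
        TUNED_BLOCK = {
            (3, 7): (16, 16),  # Kepler, e.g. K80
            (5, 2): (32, 8),   # Maxwell
            (6, 0): (32, 16),  # Pascal, e.g. P100
        }
        block = TUNED_BLOCK.get((major, minor), (16, 16))
        print('compute capability %d.%d -> block %s' % (major, minor, block))
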
  • by kesor on 11/15/16, 11:42 PM

    Am I the only one annoyed that their "announcement" talks about something that will happen in the future?

    What kind of asshole move is this? Why not just say "here, you can use it now, good luck"?