from Hacker News

Amazon SageMaker – Build, train, and deploy machine learning models at scale

by irs on 11/29/17, 5:33 PM with 9 comments

  • by gk1 on 11/29/17, 5:53 PM

    This blog post also includes screenshots: https://aws.amazon.com/blogs/aws/sagemaker/
  • by michaelbarton on 11/29/17, 10:58 PM

    If anyone from AWS is in this forum, could you comment on whether the custom Docker training in SageMaker can also be used for general optimisation of any dockerised objective function, e.g. Bayesian hyperparameter tuning?

    In the blog post example there is this Python code:

        def train(
            channel_input_dirs, hyperparameters, output_data_dir,
            model_dir, num_gpus, hosts, current_host):
    
    Would I also write some kind of similar function for scoring the result of the training?

    To provide some context, I work in bioinformatics, where some of our algorithms have hundreds of parameters. This is not ML where we want to classify or predict, but rather to optimise the parameters of a given objective function. If SageMaker allows general optimisation in an AWS Lambda-like way, that would be very useful.
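
    For illustration, here is a minimal sketch of how such a generic objective might be wrapped in the entry-point signature quoted from the blog post. The my_objective function and the idea of reporting the score via a JSON file in output_data_dir are assumptions for illustration, not documented SageMaker behaviour:

        import json
        import os


        def my_objective(params):
            """Hypothetical stand-in for a bioinformatics objective function.

            In practice this would run the dockerised algorithm with the
            given parameters and return a single score to be optimised.
            """
            return sum(v ** 2 for v in params.values())


        def train(channel_input_dirs, hyperparameters, output_data_dir,
                  model_dir, num_gpus, hosts, current_host):
            # Treat the hyperparameters dict as one candidate point in the
            # search space rather than as ML training settings.
            params = {k: float(v) for k, v in hyperparameters.items()}

            score = my_objective(params)

            # Write the score where an external optimiser (or the caller)
            # can collect it; the exact reporting mechanism is an
            # assumption here.
            os.makedirs(output_data_dir, exist_ok=True)
            with open(os.path.join(output_data_dir, "score.json"), "w") as fh:
                json.dump({"params": params, "score": score}, fh)

            # Nothing model-like to save for pure optimisation, but
            # returning the parameters keeps the entry point shaped like
            # the blog example.
            return params

    An external driver could then launch one training job per candidate parameter set and read back score.json to guide the search; whether SageMaker itself can drive that loop is exactly the question above.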

  • by kernel_sanders on 11/29/17, 6:27 PM

    SageMaker

    {1G}

    Human Druid - Sage

    {G}, Tap: Create 0/1 Plant Token named Seed of Knowledge

    Sacrifice {X} Plants: Look at the top X cards of opponent's library

    1/1

  • by xtracto on 11/29/17, 7:21 PM

    I wonder how this compares to offerings like DataRobot and the like.