by irs on 11/29/17, 5:33 PM with 9 comments
by gk1 on 11/29/17, 5:53 PM
by michaelbarton on 11/29/17, 10:58 PM
In the blog post example there is this python code:
def train(channel_input_dirs, hyperparameters, output_data_dir,
          model_dir, num_gpus, hosts, current_host):
Would I also write some kind of similar function for scoring the result of the training?

To provide some context, I work in bioinformatics, where some of our algorithms have hundreds of parameters. This is not ML where we want to classify or predict, but rather to optimise the parameters for a given objective function. If SageMaker allows general optimisation in an AWS Lambda-like way, that would be very useful.
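A rough sketch of what I have in mind, reusing the train() signature from the blog post but driving a general-purpose optimiser instead of an ML fit. The objective function, the n_params hyperparameter, and the output filename below are placeholders for illustration, not anything SageMaker prescribes:

import json
import os

import numpy as np
from scipy.optimize import minimize  # generic optimiser, not ML-specific


def objective(params):
    """Placeholder standing in for a bioinformatics scoring routine
    with many tunable parameters; here just a toy quadratic."""
    return float(np.sum((params - 1.0) ** 2))


def train(channel_input_dirs, hyperparameters, output_data_dir,
          model_dir, num_gpus, hosts, current_host):
    """Entrypoint with the signature from the blog post, used to
    optimise an objective function rather than train a classifier."""
    n_params = int(hyperparameters.get('n_params', 100))
    x0 = np.zeros(n_params)

    result = minimize(objective, x0, method='L-BFGS-B')

    # Persist the optimised parameters so they come out as the
    # "model" artifact.
    with open(os.path.join(model_dir, 'best_params.json'), 'w') as fh:
        json.dump({'params': result.x.tolist(),
                   'score': float(result.fun)}, fh)

    return result.x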
by kernel_sanders on 11/29/17, 6:27 PM
{1G}
Human Druid - Sage
{G}, Tap: Create 0/1 Plant Token named Seed of Knowledge
Sacrifice {X} Plants: Look at the top X cards of opponent's library
1/1
by xtracto on 11/29/17, 7:21 PM