from Hacker News

Microsoft open sources the inference engine in its Windows ML platform

by anirudhgarg on 12/4/18, 6:58 PM with 1 comment

  • by whitten on 12/4/18, 11:26 PM

    Apparently, the Open Neural Network Exchange (ONNX) Runtime is an API that lets you run models locally instead of on another machine.

    I didn't see any details about the inference engine, so I assume this is a neural-net AI application programming interface rather than a symbolic AI inference engine.
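
    For context, a minimal sketch of what "running a model locally" looks like with the onnxruntime Python package; the model path and the 1x3x224x224 input shape below are placeholders (typical of an image model), not details from the article:

      import numpy as np
      import onnxruntime as ort

      # Load an ONNX model from disk; "model.onnx" is a placeholder path.
      session = ort.InferenceSession("model.onnx")

      # Introspect the model's declared input name.
      inp = session.get_inputs()[0]

      # Dummy input tensor; shape assumes a typical image-classification model.
      x = np.random.rand(1, 3, 224, 224).astype(np.float32)

      # Run inference on this machine -- no network call involved.
      outputs = session.run(None, {inp.name: x})
      print(outputs[0].shape)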