We've started putting together some TensorFlow bindings for OCaml. It's very much a work in progress, so there are some rough edges: some operators are not supported yet, the API is likely to change, and you may even get some segfaults. However, it can already be used to train a convolutional neural network to recognize digits with 98% accuracy.
Here are a couple of highlights:
- The code for most operators is automatically generated from the TensorFlow protobuf specs.
- The supported optimizers are gradient descent, momentum, and Adam.
- The gradient backpropagation graph is generated within the OCaml wrapper. The gradients have to be registered manually for each operator; this has already been done for ~30 operators.
- The API should be pretty straightforward; as an example, here is a simple linear classifier for the MNIST dataset (a rough sketch follows the link below):
https://github.com/LaurentMazare/tensorflow-ocaml/blob/master/examples/mnist/mnist_linear.ml
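To give a rough idea of what the code looks like, here is a short sketch along the lines of that example: a softmax regression on MNIST trained with the gradient descent optimizer. The exact names and signatures below (Mnist_helper.read_files, Ops.placeholder, Var.f, Optimizers.gradient_descent_minimizer, Session.run, ...) are only illustrative and may not match the repo exactly; the linked mnist_linear.ml is the authoritative version.

(* Illustrative sketch of a softmax linear classifier on MNIST.
   The module/function names below are meant to convey the shape of
   the API and may not match the repository exactly. *)
open Tensorflow

let image_dim = 28 * 28
let label_count = 10

let () =
  let mnist = Mnist_helper.read_files () in
  (* Placeholders for a batch of flattened images and one-hot labels. *)
  let xs = Ops.placeholder ~type_:Float [ -1; image_dim ] in
  let ys = Ops.placeholder ~type_:Float [ -1; label_count ] in
  (* Model parameters, initialized to zero. *)
  let w = Var.f [ image_dim; label_count ] 0. in
  let b = Var.f [ label_count ] 0. in
  (* Softmax of an affine map gives the per-class probabilities. *)
  let ys_ = Ops.(softmax ((xs *^ w) + b)) in
  (* Cross-entropy loss minimized by plain gradient descent; the
     backpropagation graph is built on the OCaml side. *)
  let cross_entropy = Ops.(neg (reduce_mean (ys * log ys_))) in
  let gd =
    Optimizers.gradient_descent_minimizer ~learning_rate:0.5 cross_entropy
  in
  for _step = 1 to 500 do
    Session.run
      ~inputs:[ xs, mnist.train_images; ys, mnist.train_labels ]
      ~targets:gd
      Session.Output.empty
  done

Switching to the momentum or Adam optimizer mentioned above should just be a matter of replacing the Optimizers call.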

Some installation instructions can be found on the main page of the project.
https://github.com/LaurentMazare/tensorflow-ocaml
Installation is quite tricky for now, as one has to compile or extract libtensorflow.so, but we're hoping to improve this soon.

Any feedback would be very welcome.

(this has been cross-posted to the tensorflow-discuss group)

On Thu, Mar 10, 2016 at 7:58 PM, Jesper Louis Andersen <jesper.louis.andersen@gmail.com> wrote:

On Thu, Mar 10, 2016 at 6:43 PM, Milo Davis <davis.mil@husky.neu.edu> wrote:
Okay.  I'll see what needs to be done to wrap the C++ code.  Google recommends using SWIG.

If memory serves, you only need to support around 4-5 function calls in order to load graphs and use them in sessions. The graph itself can be built in Python, and the interface looks much like a tensored variant of Jane Street's Incremental library.

I'd definitely start by getting that part working first, and then add the remaining functions if need be. But I wouldn't make that a priority. If you do succeed, however, I need a better language than Python for some TF work :)


--
J.