Haskell bindings for TensorFlow

The tensorflow-haskell package provides Haskell bindings to TensorFlow.

This is not an official Google product.

Documentation

https://tensorflow.github.io/haskell/haddock/

TensorFlow.Core is a good place to start.
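
For a quick feel for the API, here is a minimal sketch that builds two constant vectors, adds them, and fetches the result back into Haskell. It is only a sketch: it assumes the Fetchable instance that returns a Data.Vector, and exact module contents may differ slightly between versions.

import qualified Data.Vector as V

import qualified TensorFlow.Core as TF
import qualified TensorFlow.GenOps.Core as TF
import qualified TensorFlow.Ops as TF

main :: IO ()
main = do
    -- Build a graph with two constant vectors, run it in a session,
    -- and fetch the elementwise sum.
    result <- TF.runSession $ do
        let a = TF.vector [1, 2, 3 :: Float]
            b = TF.vector [4, 5, 6 :: Float]
        TF.run (a `TF.add` b)
    print (result :: V.Vector Float)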

Examples

Neural network model for the MNIST dataset: see the code in the tensorflow-mnist package.

Toy example of a linear regression model (the full code is shown below):

import Control.Monad (replicateM, replicateM_, zipWithM)
import System.Random (randomIO)
import Test.HUnit (assertBool)

import qualified TensorFlow.Core as TF
import qualified TensorFlow.GenOps.Core as TF
import qualified TensorFlow.Gradient as TF
import qualified TensorFlow.Ops as TF

main :: IO ()
main = do
    -- Generate data where `y = x*3 + 8`.
    xData <- replicateM 100 randomIO
    let yData = [x*3 + 8 | x <- xData]
    -- Fit linear regression model.
    (w, b) <- fit xData yData
    assertBool "w == 3" (abs (3 - w) < 0.001)
    assertBool "b == 8" (abs (8 - b) < 0.001)

fit :: [Float] -> [Float] -> IO (Float, Float)
fit xData yData = TF.runSession $ do
    -- Create tensorflow constants for x and y.
    let x = TF.vector xData
        y = TF.vector yData
    -- Create scalar variables for slope and intercept.
    w <- TF.initializedVariable 0
    b <- TF.initializedVariable 0
    -- Define the loss function.
    let yHat = (x `TF.mul` w) `TF.add` b
        loss = TF.square (yHat `TF.sub` y)
    -- Optimize with gradient descent.
    trainStep <- gradientDescent 0.001 loss [w, b]
    replicateM_ 1000 (TF.run trainStep)
    -- Return the learned parameters.
    (TF.Scalar w', TF.Scalar b') <- TF.run (w, b)
    return (w', b')

gradientDescent :: Float
                -> TF.Tensor TF.Value Float
                -> [TF.Tensor TF.Ref Float]
                -> TF.Session TF.ControlNode
gradientDescent alpha loss params = do
    let applyGrad param grad =
            TF.assign param (param `TF.sub` (TF.scalar alpha `TF.mul` grad))
    TF.group =<< zipWithM applyGrad params =<< TF.gradients loss params
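
The fetched TF.Scalar values unwrap to plain Floats. As a follow-on sketch, the learned parameters could be used to evaluate the model on new inputs. The predict helper below is hypothetical (not part of the library) and assumes that fetching a vector-valued tensor yields a Data.Vector:

import qualified Data.Vector as V

import qualified TensorFlow.Core as TF
import qualified TensorFlow.GenOps.Core as TF
import qualified TensorFlow.Ops as TF

-- Evaluate y = x*w + b for new inputs, using the parameters
-- returned by `fit` above.
predict :: Float -> Float -> [Float] -> IO (V.Vector Float)
predict w b xs = TF.runSession $ do
    let x = TF.vector xs
        yHat = (x `TF.mul` TF.scalar w) `TF.add` TF.scalar b
    TF.run yHat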

Installation Instructions

Build with Docker on Linux

As an expedient, we use Docker for building. Once you have Docker working, the following commands compile and run the tests.

git clone --recursive https://github.com/tensorflow/haskell.git tensorflow-haskell
cd tensorflow-haskell
IMAGE_NAME=tensorflow/haskell:v0
docker build -t $IMAGE_NAME docker
# TODO: move the setup step to the docker script.
stack --docker --docker-image=$IMAGE_NAME setup
stack --docker --docker-image=$IMAGE_NAME test

There is also a demo application:

cd tensorflow-mnist
stack --docker --docker-image=$IMAGE_NAME build --exec Main

Build on Mac OS X

Run the install_osx_dependencies.sh script in the tools/ directory. The script installs dependencies via Homebrew and then downloads and installs the TensorFlow library on your machine under /usr/local.

After running the script to install system dependencies, build the project with stack:

stack test