![Build Status](https://storage.googleapis.com/tensorflow-haskell-kokoro-build-badges/github.png)
The tensorflow-haskell package provides Haskell bindings to
[TensorFlow](https://www.tensorflow.org/).
This is not an official Google product.
# Documentation
https://tensorflow.github.io/haskell/haddock/
[TensorFlow.Core](https://tensorflow.github.io/haskell/haddock/tensorflow-0.1.0.2/TensorFlow-Core.html)
is a good place to start.
# Examples
Neural network model for the MNIST dataset: [code](tensorflow-mnist/app/Main.hs)
Toy example of a linear regression model
([full code](tensorflow-ops/tests/RegressionTest.hs)):
```haskell
import Control.Monad (replicateM, replicateM_)
import System.Random (randomIO)
import Test.HUnit (assertBool)

import qualified TensorFlow.Core as TF
import qualified TensorFlow.GenOps.Core as TF
import qualified TensorFlow.Minimize as TF
import qualified TensorFlow.Ops as TF hiding (initializedVariable)
import qualified TensorFlow.Variable as TF

main :: IO ()
main = do
    -- Generate data where `y = x*3 + 8`.
    xData <- replicateM 100 randomIO
    let yData = [x*3 + 8 | x <- xData]
    -- Fit linear regression model.
    (w, b) <- fit xData yData
    assertBool "w == 3" (abs (3 - w) < 0.001)
    assertBool "b == 8" (abs (8 - b) < 0.001)

fit :: [Float] -> [Float] -> IO (Float, Float)
fit xData yData = TF.runSession $ do
    -- Create tensorflow constants for x and y.
    let x = TF.vector xData
        y = TF.vector yData
    -- Create scalar variables for slope and intercept.
    w <- TF.initializedVariable 0
    b <- TF.initializedVariable 0
    -- Define the loss function.
    let yHat = (x `TF.mul` TF.readValue w) `TF.add` TF.readValue b
        loss = TF.square (yHat `TF.sub` y)
    -- Optimize with gradient descent.
    trainStep <- TF.minimizeWith (TF.gradientDescent 0.001) loss [w, b]
    replicateM_ 1000 (TF.run trainStep)
    -- Return the learned parameters.
    (TF.Scalar w', TF.Scalar b') <- TF.run (TF.readValue w, TF.readValue b)
    return (w', b')
```
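Because the training data above is generated from an exact line, the same parameters can also be recovered in closed form, which is a handy sanity check. The following is a minimal plain-Haskell sketch with no TensorFlow dependency; the `fitClosedForm` helper is a hypothetical name introduced here for illustration, not part of these bindings:

```haskell
-- Closed-form simple linear regression:
--   w = cov(x, y) / var(x),  b = mean y - w * mean x
fitClosedForm :: [Double] -> [Double] -> (Double, Double)
fitClosedForm xs ys = (w, b)
  where
    n  = fromIntegral (length xs)
    mx = sum xs / n
    my = sum ys / n
    -- Slope: covariance of x and y divided by variance of x.
    w  = sum (zipWith (\x y -> (x - mx) * (y - my)) xs ys)
       / sum (map (\x -> (x - mx) * (x - mx)) xs)
    -- Intercept follows from the means.
    b  = my - w * mx

main :: IO ()
main = do
  let xs = [0, 0.5 .. 10] :: [Double]
      ys = [x * 3 + 8 | x <- xs]
  -- Recovers w ~ 3 and b ~ 8 up to floating-point error.
  print (fitClosedForm xs ys)
```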
# Installation Instructions
Note: building this repository with `stack` requires version `1.4.0` or newer.
Check your stack version with `stack --version` in a terminal.
## Build with Docker on Linux
As an expedient we use [docker](https://www.docker.com/) for building. Once you
have docker working, the following commands will compile and run the tests.
    git clone --recursive https://github.com/tensorflow/haskell.git tensorflow-haskell
    cd tensorflow-haskell
    IMAGE_NAME=tensorflow/haskell:v0
    docker build -t $IMAGE_NAME docker
    # TODO: move the setup step to the docker script.
    stack --docker --docker-image=$IMAGE_NAME setup
    stack --docker --docker-image=$IMAGE_NAME test
There is also a demo application:
    cd tensorflow-mnist
    stack --docker --docker-image=$IMAGE_NAME build --exec Main
### Docker GPU support
If you want to use the GPU, run:

    IMAGE_NAME=tensorflow/haskell:1.3.0-gpu
    docker build -t $IMAGE_NAME docker/gpu
We need stack to use `nvidia-docker` via a `docker` wrapper script. This will shadow the normal `docker` command.

    ln -s `pwd`/tools/nvidia-docker-wrapper.sh <somewhere in your path>/docker
    stack --docker --docker-image=$IMAGE_NAME setup
    stack --docker --docker-image=$IMAGE_NAME test
## Build on macOS
Run the [install_macos_dependencies.sh](./tools/install_macos_dependencies.sh)
script in the `tools/` directory. The script installs dependencies
via [Homebrew](https://brew.sh/) and then downloads and installs the TensorFlow
library on your machine under `/usr/local`.
After running the script to install system dependencies, build the project with stack:
    stack test
## Build on NixOS
The `shell.nix` provides an environment containing the necessary
dependencies. To build, run:
    $ stack --nix build

or alternatively you can run

    $ nix-shell
to enter the environment and build the project. Note that this is an emulation
of a common Linux environment rather than a full-featured Nix package expression;
no exportable Nix package will appear, but local development is possible.
# Related Projects
## Statically validated tensor shapes
https://github.com/helq/tensorflow-haskell-deptyped is experimenting with using dependent types to statically validate tensor shapes. It may be merged with this repository in the future.
Example:
```haskell
{-# LANGUAGE DataKinds, ScopedTypeVariables #-}

import Data.Maybe (fromJust)
import Data.Vector.Sized (Vector, fromList)
import TensorFlow.DepTyped

test :: IO (Vector 8 Float)
test = runSession $ do
  (x :: Placeholder "x" '[4,3] Float) <- placeholder

  let elems1 = fromJust $ fromList [1,2,3,4,1,2]
      elems2 = fromJust $ fromList [5,6,7,8]
      (w :: Tensor '[3,2] '[] Build Float) = constant elems1
      (b :: Tensor '[4,1] '[] Build Float) = constant elems2
      y = (x `matMul` w) `add` b -- y shape: [4,2] (b shape is [4,1], but `add` broadcasts it to [4,2])

  let (inputX :: TensorData "x" '[4,3] Float) =
        encodeTensorData . fromJust $ fromList [1,2,3,4,1,0,7,9,5,3,5,4]

  runWithFeeds (feed x inputX :~~ NilFeedList) y

main :: IO ()
main = test >>= print
```
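The core idea above, carrying tensor shapes in the types so that mismatches fail at compile time, can be sketched with nothing but `base`'s `GHC.TypeLits`. The `Matrix` type, `matMul`, and `dims` below are hypothetical names introduced for illustration only; they are not part of tensorflow-haskell-deptyped:

```haskell
{-# LANGUAGE DataKinds, KindSignatures, ScopedTypeVariables #-}

import Data.List (transpose)
import Data.Proxy (Proxy (..))
import GHC.TypeLits (KnownNat, Nat, natVal)

-- A matrix carrying its dimensions as phantom type-level naturals.
-- (The payload itself is unchecked here; this is only a sketch.)
newtype Matrix (m :: Nat) (n :: Nat) = Matrix [[Double]]

-- Multiplication only type-checks when the inner dimensions agree:
-- a Matrix 2 3 times a Matrix 3 2 is fine, times a Matrix 4 2 is a type error.
matMul :: Matrix m k -> Matrix k n -> Matrix m n
matMul (Matrix a) (Matrix b) =
  Matrix [[sum (zipWith (*) row col) | col <- transpose b] | row <- a]

-- Recover the static dimensions at runtime.
dims :: forall m n. (KnownNat m, KnownNat n) => Matrix m n -> (Integer, Integer)
dims _ = (natVal (Proxy :: Proxy m), natVal (Proxy :: Proxy n))

main :: IO ()
main = do
  let a = Matrix [[1,2,3],[4,5,6]] :: Matrix 2 3
      b = Matrix [[1,0],[0,1],[1,1]] :: Matrix 3 2
  print (dims (a `matMul` b)) -- prints (2,2)
```

Libraries like tensorflow-haskell-deptyped take this much further (broadcasting, named placeholders, feed lists), but the phantom-dimension trick is the common foundation.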
# License
This project is licensed under the terms of the [Apache 2.0 license](LICENSE).