<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"><html xmlns="http://www.w3.org/1999/xhtml"><head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /><title>TensorFlow.EmbeddingOps</title><link href="ocean.css" rel="stylesheet" type="text/css" title="Ocean" /><script src="haddock-util.js" type="text/javascript"></script><script type="text/javascript">//<![CDATA[
window.onload = function () {pageLoad();setSynopsis("mini_TensorFlow-EmbeddingOps.html");};
</script></head><body><div id="package-header"><ul class="links" id="page-menu"><li><a href="index.html">Contents</a></li><li><a href="doc-index.html">Index</a></li></ul><p class="caption">tensorflow-ops-0.1.0.0: Friendly layer around TensorFlow bindings.</p></div>
<div id="content"><div id="module-header"><table class="info"><tr><th>Safe Haskell</th><td>None</td></tr><tr><th>Language</th><td>Haskell2010</td></tr></table><p class="caption">TensorFlow.EmbeddingOps</p></div>
<div id="description"><p class="caption">Description</p><div class="doc"><p>Parallel lookups on the list of tensors.</p></div></div>
<div id="synopsis"><p id="control.syn" class="caption expander" onclick="toggleSection('syn')">Synopsis</p><ul id="section.syn" class="hide" onclick="toggleSection('syn')"><li class="src short"><a href="#v:embeddingLookup">embeddingLookup</a> :: <span class="keyword">forall</span> a b v. (<a href="../tensorflow-0.1.0.0/TensorFlow-Types.html#t:TensorType">TensorType</a> a, <a href="../tensorflow-0.1.0.0/TensorFlow-Types.html#t:OneOf">OneOf</a> `[<a href="../base-4.8.2.0/Data-Int.html#t:Int64">Int64</a>, <a href="../base-4.8.2.0/Data-Int.html#t:Int32">Int32</a>]` b, <a href="../base-4.8.2.0/Prelude.html#t:Num">Num</a> b) => [<a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a> v a] -> <a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a> <a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Value">Value</a> b -> <a href="../tensorflow-0.1.0.0/TensorFlow-Build.html#t:Build">Build</a> (<a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a> <a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Value">Value</a> a)</li></ul></div>
<div id="interface"><h1>Documentation</h1><div class="top"><p class="src"><a name="v:embeddingLookup" class="def">embeddingLookup</a></p><div class="subs arguments"><p class="caption">Arguments</p><table><tr><td class="src">:: (<a href="../tensorflow-0.1.0.0/TensorFlow-Types.html#t:TensorType">TensorType</a> a, <a href="../tensorflow-0.1.0.0/TensorFlow-Types.html#t:OneOf">OneOf</a> `[<a href="../base-4.8.2.0/Data-Int.html#t:Int64">Int64</a>, <a href="../base-4.8.2.0/Data-Int.html#t:Int32">Int32</a>]` b, <a href="../base-4.8.2.0/Prelude.html#t:Num">Num</a> b)</td><td class="doc empty"> </td></tr><tr><td class="src">=> [<a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a> v a]</td><td class="doc"><p>A list of tensors which can be concatenated along
dimension 0. Each <code><a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a></code> must be appropriately
sized for the <code><a href="../base-4.8.2.0/Prelude.html#v:mod">mod</a></code> partition strategy.</p></td></tr><tr><td class="src">-> <a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a> <a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Value">Value</a> b</td><td class="doc"><p>A <code><a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a></code> with type <code>int32</code> or <code>int64</code>
containing the ids to be looked up in <code>params</code>.
The ids are required to be flat (rank 1) on entry and have
fewer than 2^31 entries.</p></td></tr><tr><td class="src">-> <a href="../tensorflow-0.1.0.0/TensorFlow-Build.html#t:Build">Build</a> (<a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Tensor">Tensor</a> <a href="../tensorflow-0.1.0.0/TensorFlow-Tensor.html#t:Value">Value</a> a)</td><td class="doc"><p>A dense tensor with shape <code>shape(ids) + shape(params)[1:]</code>.</p></td></tr></table></div><div class="doc"><p>Looks up <code>ids</code> in a list of embedding tensors.</p><p>This function is used to perform parallel lookups on the list of
tensors in <code>params</code>. It is a generalization of <code><a href="TF.html#v:gather">gather</a></code>, where
<code>params</code> is interpreted as a partition of a larger embedding
tensor.</p><p>The partition strategy is "mod": each id is assigned to partition
<code>p = id % len(params)</code>. For instance,
13 ids are split across 5 partitions as:
<code>[[0, 5, 10], [1, 6, 11], [2, 7, 12], [3, 8], [4, 9]]</code>.</p><p>The results of the lookup are concatenated into a dense
tensor. The returned tensor has shape <code>shape(ids) + shape(params)[1:]</code>.</p>
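<p>A minimal usage sketch, assuming the <code>constant</code> helper from <code>TensorFlow.Ops</code> and the <code>build</code>, <code>run</code> and <code>runSession</code> helpers re-exported by <code>TensorFlow.Core</code>; the shard contents and ids below are made-up example values. A six-row, width-2 embedding table is split across two shards with the "mod" strategy and two ids are looked up:</p><pre>
import           Data.Int (Int32)
import qualified Data.Vector as V
import qualified TensorFlow.Core as TF
import qualified TensorFlow.Ops as TF
import           TensorFlow.EmbeddingOps (embeddingLookup)

main :: IO ()
main = do
    result &lt;- TF.runSession $ do
        -- With the "mod" strategy, shard p holds the rows whose id satisfies
        -- id `mod` numShards == p, in increasing id order: rows 0, 2 and 4
        -- live in shard0, rows 1, 3 and 5 in shard1.
        let shard0 = TF.constant (TF.Shape [3, 2]) [0, 0, 2, 2, 4, 4 :: Float]
            shard1 = TF.constant (TF.Shape [3, 2]) [1, 1, 3, 3, 5, 5 :: Float]
            ids    = TF.constant (TF.Shape [2])    [0, 3 :: Int32]
        embeddings &lt;- TF.build (embeddingLookup [shard0, shard1] ids)
        TF.run embeddings
    -- Rows 0 and 3 come back, flattened to [0, 0, 3, 3]; the fetched tensor
    -- has shape `shape(ids) + shape(params)[1:]` = [2, 2].
    print (result :: V.Vector Float)
</pre><p>Because <code>embeddingLookup</code> runs in the <code>Build</code> monad and returns an ordinary <code>Tensor</code> <code>Value</code> <code>a</code>, the looked-up embeddings can be fed into further ops before the graph is run.</p>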
</div></div></div></div><div id="footer"><p>Produced by <a href="http://www.haskell.org/haddock/">Haddock</a> version 2.16.1</p></div></body></html>