path: root/doc/nerv_layer.md
author    cloudygoose <cloudygooseg@gmail.com>    2015-06-12 13:06:27 +0800
committer cloudygoose <cloudygooseg@gmail.com>    2015-06-21 10:25:03 +0800
commit    839d938df0d83ec311c5d1299923c667adff6a87 (patch)
tree      5e774230b9a9fd1c99a3f0a0dff0a776ec628d2f /doc/nerv_layer.md
parent    a55769787d1b3ec2d1db519cd5efb3b5b2e75404 (diff)
git rebase
Squashed commit messages:
* doc change
* doc change
* added nerv.Matrix:randomize()
* doc change for DAGLayer
* bug fix in nerv.Matrix:random()
* doc change
Diffstat (limited to 'doc/nerv_layer.md')
-rw-r--r--  doc/nerv_layer.md  13
1 file changed, 11 insertions, 2 deletions
diff --git a/doc/nerv_layer.md b/doc/nerv_layer.md
index ac6480c..de2fb12 100644
--- a/doc/nerv_layer.md
+++ b/doc/nerv_layer.md
@@ -15,7 +15,7 @@ __nerv.Layer__ is the base class and most of its methods are abstract.
* __nerv.BiasLayer__ inherits __nerv.Layer__, both `#dim_in` and `#dim_out` are 1.
* `BiasParam bias` The bias parameter.
* __nerv.SigmoidLayer__ inherits __nerv.Layer__, both `#dim_in` and `#dim_out` are 1.
-* __nerv.SoftmaxCELayer__ inherits __nerv.Layer__, `#dim_in` is 2 and `#dim_out` is 0. `input[1]` is the input to the softmax layer, `input[2]` is the reference distribution.
+* __nerv.SoftmaxCELayer__ inherits __nerv.Layer__, `#dim_in` is 2 and `#dim_out` is -1 (optional). `input[1]` is the input to the softmax layer, `input[2]` is the reference distribution. In its `propagate(input, output)` method, if `output[1] ~= nil`, the cross\_entropy value will be output.
* `float total_ce` Records the accumulated cross entropy value.
* `int total_frams` Records how many frames have passed.
* `bool compressed` The reference distribution can be a one-hot format. This feature is enabled by `layer_conf.compressed`.
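The optional-output behavior described in the changed line above can be illustrated with a minimal pure-Lua sketch. This `SoftmaxCELayer` is a hypothetical stand-in written for this note, not the actual NERV implementation (which operates on `nerv.Matrix` objects rather than plain tables):

```lua
-- Hypothetical stand-in for nerv.SoftmaxCELayer; illustrates only the
-- "write CE out when an output buffer is given" behavior.
local SoftmaxCELayer = {}
SoftmaxCELayer.__index = SoftmaxCELayer

function SoftmaxCELayer.new()
    return setmetatable({total_ce = 0.0, total_frames = 0}, SoftmaxCELayer)
end

-- input[1]: scores fed to the softmax, input[2]: reference distribution
function SoftmaxCELayer:propagate(input, output)
    -- softmax over input[1]
    local exps, sum = {}, 0.0
    for i, v in ipairs(input[1]) do
        exps[i] = math.exp(v)
        sum = sum + exps[i]
    end
    -- cross entropy against the reference distribution input[2]
    local ce = 0.0
    for i, p in ipairs(input[2]) do
        if p > 0 then
            ce = ce - p * math.log(exps[i] / sum)
        end
    end
    self.total_ce = self.total_ce + ce
    self.total_frames = self.total_frames + 1
    -- the CE value is only written out when an output buffer is supplied
    if output ~= nil and output[1] ~= nil then
        output[1][1] = ce
    end
end

local layer = SoftmaxCELayer.new()
local out = {{}}
layer:propagate({{1.0, 2.0, 3.0}, {0.0, 0.0, 1.0}}, out)
print(string.format("%.4f", out[1][1]))  -- prints 0.4076
```

Calling `propagate` with `output = {}` (no `output[1]`) still accumulates `total_ce` and `total_frames`; only the per-call CE write-out is skipped.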
@@ -43,6 +43,15 @@ Check whether `#self.dim_in == len_in` and `#self.dim_out == len_out`, if violat
Abstract method.
The layer should return a list containing its parameters.
+####nerv.Layer.get\_dim(self)####
+* Returns:
+ `dim_in`: __table__.
+ `dim_out`: __table__.
+* Parameters:
+ `self`: __nerv.Layer__.
+* Description:
+ Returns `self.dim_in, self.dim_out`.
+
##Examples##
* a basic example using __Nerv__ layers for linear classification.
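The `get_dim` accessor documented in the hunk above amounts to returning the layer's two dimension tables. A minimal sketch, using a hypothetical plain-Lua `Layer` table (not the actual NERV class, and the dimensions 429/2048 are made up for illustration):

```lua
-- Hypothetical minimal Layer illustrating get_dim; not the real NERV class.
local Layer = {}
Layer.__index = Layer

function Layer.new(dim_in, dim_out)
    return setmetatable({dim_in = dim_in, dim_out = dim_out}, Layer)
end

-- Returns self.dim_in, self.dim_out, both tables of dimensions
function Layer:get_dim()
    return self.dim_in, self.dim_out
end

local l = Layer.new({429}, {2048})
local din, dout = l:get_dim()
print(din[1], dout[1])  -- prints 429	2048
```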
@@ -168,4 +177,4 @@ for l = 0, 10, 1 do
end
end
--[[end training]]--
-``` \ No newline at end of file
+```