author	cloudygoose <cloudygooseg@gmail.com>	2015-06-08 16:25:36 +0800
committer	cloudygoose <cloudygooseg@gmail.com>	2015-06-08 16:25:36 +0800
commit	f6786b0e7c71437a100c88377b96f832acb8125d (patch)
tree	714c67188f000b47144b8bdfa5e3312f13f6a8a9 /doc/nerv_layer.md
parent	155b0c0803f5f7cd3f8780273f6b0bdfbaed5970 (diff)
doc change
Diffstat (limited to 'doc/nerv_layer.md')
-rw-r--r--	doc/nerv_layer.md	33
1 file changed, 33 insertions, 0 deletions
diff --git a/doc/nerv_layer.md b/doc/nerv_layer.md
new file mode 100644
index 0000000..587bf24
--- /dev/null
+++ b/doc/nerv_layer.md
@@ -0,0 +1,33 @@
+#The Nerv Layer Package#
+Part of the [Nerv](../README.md) toolkit.
+
+##Description##
+__nerv.Layer__ is the base class and most of its methods are abstract.
+###Class hierarchy and their members###
+* __nerv.AffineLayer__ inherits __nerv.Layer__.
+ * `MatrixParam ltp` The linear transform parameter.
+ * `BiasParam bp` The bias parameter.
+ * `table dim_in` should be of length 1.
+ * `table dim_out` should be of length 1.
+
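+For illustration, constructing such an affine layer might look like the sketch below. The parameter objects and `gconf` are assumed to be prepared elsewhere; the constructor call follows the `__init` signature given under Methods:
+
+```lua
+-- hypothetical sketch: building an affine layer; the parameter objects
+-- and their construction are assumptions, not specified by this document
+local layer_conf = {
+    ltp = ltp_param,     -- a MatrixParam prepared elsewhere (assumption)
+    bp = bp_param,       -- a BiasParam prepared elsewhere (assumption)
+    dim_in = {429},      -- length-1 list, as required above
+    dim_out = {2048},    -- length-1 list, as required above
+}
+local affine = nerv.AffineLayer("affine0", gconf, layer_conf)
+affine:init()
+```
+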
+##Methods##
+* __void Layer.\_\_init(Layer self, string id, table global_conf, table layer_conf)__
+Abstract method.
+The constructing method should assign `id` to `self.id`, `global_conf` to `self.gconf`, `layer_conf.dim_in` to `self.dim_in`, and `layer_conf.dim_out` to `self.dim_out`. `dim_in` and `dim_out` are lists specifying the dimensions of the inputs and outputs. `layer_conf` also carries the parameters of the layer, which should be properly saved as well.
+* __void Layer.init(Layer self)__
+Abstract method.
+Initialization method, in this method the layer should do some self-checking and allocate space for intermediate results.
+* __void Layer.update(Layer self, table bp_err, table input, table output)__
+Abstract method.
+`bp_err[i]` should be the error on `output[i]`. In this method the parameters of `self` are updated.
+* __void Layer.propagate(Layer self, table input, table output)__
+Abstract method.
+Given `input` and the current parameters, propagate and store the result in `output`.
+* __void Layer.back_propagate(Layer self, Matrix next_bp_err, Matrix bp_err, Matrix input, Matrix output)__
+Abstract method.
+Given `bp_err`, the error on `output`, compute the error on `input` and store it in `next_bp_err`.
+
+* __void Layer.check_dim_len(int len_in, int len_out)__
+Check whether `#self.dim_in == len_in` and `#self.dim_out == len_out`; if either check fails, an error is raised.
+* __void Layer.get_params(Layer self)__
+Abstract method.
+
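+##Example##
+The contracts above can be sketched as a minimal subclass. This is an illustration only, not the actual Nerv implementation: the class helper `nerv.class`, the matrix-method names (`mul`, `add_row`), and the `lrate` field of `global_conf` are assumptions, not part of this document.
+
+```lua
+local MyAffineLayer = nerv.class("MyAffineLayer", "nerv.Layer")  -- hypothetical class helper
+
+function MyAffineLayer:__init(id, global_conf, layer_conf)
+    self.id = id
+    self.gconf = global_conf
+    self.dim_in = layer_conf.dim_in
+    self.dim_out = layer_conf.dim_out
+    self.ltp = layer_conf.ltp          -- save the parameters carried in layer_conf
+    self.bp = layer_conf.bp
+    self:check_dim_len(1, 1)           -- an affine layer has one input and one output
+end
+
+function MyAffineLayer:check_dim_len(len_in, len_out)
+    -- raise an error on a dimension-length mismatch, as required above
+    if #self.dim_in ~= len_in or #self.dim_out ~= len_out then
+        error(string.format("layer %s: dimension length mismatch", self.id))
+    end
+end
+
+function MyAffineLayer:propagate(input, output)
+    -- output[1] := input[1] * ltp, then add the bias to every row
+    output[1]:mul(input[1], self.ltp.trans, 1.0, 0.0, 'N', 'N')  -- hypothetical mul signature
+    output[1]:add_row(self.bp.trans, 1.0)                        -- hypothetical helper
+end
+
+function MyAffineLayer:update(bp_err, input, output)
+    local lrate = self.gconf.lrate     -- assumed learning-rate field
+    -- gradient step: ltp := ltp - lrate * input[1]^T * bp_err[1]
+    self.ltp.trans:mul(input[1], bp_err[1], -lrate, 1.0, 'T', 'N')
+end
+```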