From 18990c8d90ad8e57fed2e5fd4d4acd4af491f880 Mon Sep 17 00:00:00 2001
From: TianxingHe
Date: Mon, 7 Mar 2016 11:23:13 +0800
Subject: Update nerv_matrix.md

Doc change about the softmax operation.
---
 nerv/doc/nerv_matrix.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/nerv/doc/nerv_matrix.md b/nerv/doc/nerv_matrix.md
index 8ae97f9..3782eb3 100644
--- a/nerv/doc/nerv_matrix.md
+++ b/nerv/doc/nerv_matrix.md
@@ -83,8 +83,8 @@ Fill the content of __Matrix__ `self` to be `value`.
 Set the element of __Matrix__ `self` to be elementwise-sigmoid of `ma`.
 * __void Matrix.sigmoid_grad(Matrix self, Matrix err, Matrix output)__
 Set the element of __Matrix__ `self`, to be `self[i][j]=err[i][j]*output[i][j]*(1-output[i][j])`. This function is used to propagate sigmoid layer error.
-* __void Matrix.softmax(Matrix self, Matrix a)__
-Calculate a row-by-row softmax of __Matrix__ `a` and save the result in `self`.
+* __Matrix Matrix.softmax(Matrix self, Matrix a)__
+Calculate a row-by-row softmax of __Matrix__ `a` and save the result in `self`. Returns a new `self.nrow*1` index matrix that stores the index of the maximum value of each row.
 * __void Matrix.mul_elem(Matrix self, Matrix ma, Matrix mb)__
 Calculate element-wise multiplication of __Matrix__ `ma` and `mb`, store the result in `self`.
 * __void Matrix.log_elem(Matrix self, Matrix ma)__
-- 
cgit v1.2.3-70-g09d2
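
A minimal Lua usage sketch of the updated softmax contract documented above. Only the
Matrix.softmax signature and its return value come from the patch; the constructor name
nerv.CuMatrixFloat, the matrix dimensions, and the m[i][j] element access shown here are
assumptions for illustration and may differ from the actual NERV API.

    -- assumed: a 2x3 GPU matrix class constructed as nerv.CuMatrixFloat(nrow, ncol)
    local a   = nerv.CuMatrixFloat(2, 3)   -- input rows to be normalized
    local out = nerv.CuMatrixFloat(2, 3)   -- destination for the row-by-row softmax
    -- (fill `a` with values here; element access m[i][j] is assumed to be 0-based)

    -- softmax of each row of `a` is written into `out`; per the doc change, the call
    -- now also returns a new nrow*1 index matrix holding each row's argmax column
    local idx = out:softmax(a)
    -- e.g. idx[0][0] would then be the column index of the maximum value in row 0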