author    TianxingHe <[email protected]>    2016-03-07 11:23:13 +0800
committer TianxingHe <[email protected]>    2016-03-07 11:23:13 +0800
commit    18990c8d90ad8e57fed2e5fd4d4acd4af491f880 (patch)
tree      5412bbc767182d2f4d9bb71f636a6395cae4f090 /nerv
parent    155132e122eca83942f49bb6a95c9dcf2bae8a81 (diff)
Update nerv_matrix.md
Doc change about the softmax operation.
Diffstat (limited to 'nerv')
-rw-r--r--    nerv/doc/nerv_matrix.md    4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/nerv/doc/nerv_matrix.md b/nerv/doc/nerv_matrix.md
index 8ae97f9..3782eb3 100644
--- a/nerv/doc/nerv_matrix.md
+++ b/nerv/doc/nerv_matrix.md
@@ -83,8 +83,8 @@ Fill the content of __Matrix__ `self` to be `value`.
 Set the element of __Matrix__ `self` to be elementwise-sigmoid of `ma`.
 * __void Matrix.sigmoid_grad(Matrix self, Matrix err, Matrix output)__
 Set the element of __Matrix__ `self`, to be `self[i][j]=err[i][j]*output[i][j]*(1-output[i][j])`. This function is used to propagate sigmoid layer error.
-* __void Matrix.softmax(Matrix self, Matrix a)__
-Calculate a row-by-row softmax of __Matrix__ `a` and save the result in `self`.
+* __Matrix Matrix.softmax(Matrix self, Matrix a)__
+Calculate a row-by-row softmax of __Matrix__ `a` and save the result in `self`. Returns a new `self.nrow*1` index matrix that stores the index of the maximum value of each row.
 * __void Matrix.mul_elem(Matrix self, Matrix ma, Matrix mb)__
 Calculate element-wise multiplication of __Matrix__ `ma` and `mb`, store the result in `self`.
 * __void Matrix.log_elem(Matrix self, Matrix ma)__
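The updated doc line says `Matrix.softmax` writes the row-by-row softmax into `self` and additionally returns an `nrow*1` index matrix holding each row's argmax. A minimal NumPy sketch of that behavior (the function name and shapes here are illustrative, not NERV's actual Lua/C API):

```python
import numpy as np

def softmax_rows(a):
    """Row-by-row softmax; also return each row's argmax as an nrow x 1
    index matrix, mirroring what the updated nerv doc describes."""
    # Subtract the row maximum before exponentiating for numerical stability.
    shifted = a - a.max(axis=1, keepdims=True)
    e = np.exp(shifted)
    out = e / e.sum(axis=1, keepdims=True)   # each row sums to 1
    max_idx = a.argmax(axis=1).reshape(-1, 1)  # nrow x 1 index matrix
    return out, max_idx

a = np.array([[1.0, 3.0, 2.0],
              [0.0, 0.0, 5.0]])
out, idx = softmax_rows(a)
```

Returning the per-row argmax alongside the softmax is convenient in classification code, since the predicted label can be read off without a second pass over the output.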