path: root/nerv/examples/lmptb
Commit message (Author, Date)
...
* applying dropout on lstm.h before combinerL seems bad, PPL only 95; trying another way (like in the paper) of dropout for lstm.h after combinerL (txh18, 2015-12-03)
* added two layers to lstmlm_ptb, todo: test it (txh18, 2015-12-03)
* added al_sen_start stat for lmseqreader (txh18, 2015-12-03)
* moved tnn to main nerv dir and added it to Makefile (txh18, 2015-12-03)
* ... (txh18, 2015-12-03)
* small bug fix in tnn for se_mode, todo: test it (txh18, 2015-12-02)
* added se_mode for lmseqreader, todo: check it (txh18, 2015-12-02)
* function name change in LMTrainer (txh18, 2015-12-02)
* added dropout_t layer (txh18, 2015-12-02)
* changed thres_mask function of matrix to a more standard API (txh18, 2015-12-02)
* got PPL 115 for ptb on h300lr1bat10wc1e-4 (txh18, 2015-12-01)
* bug fix for lstm_t layer, t not included in propagate! (txh18, 2015-11-30)
* small opt for initing tnn: clip_t (txh18, 2015-11-30)
* added outputGate for lstm_t (txh18, 2015-11-30)
* added clip_t for tnn (txh18, 2015-11-27)
* lstm_tnn can be run, todo: testing (txh18, 2015-11-27)
* still working... (txh18, 2015-11-26)
* working on lstm (txh18, 2015-11-26)
* changed auto-generating params, won't save in global_conf.param (txh18, 2015-11-25)
* let affine support multiple inputs (txh18, 2015-11-24)
* added wcost for biasparam in lm_trainer (txh18, 2015-11-24)
* still working on dagL_T (txh18, 2015-11-24)
* completed layerdag_t, now testing... (txh18, 2015-11-23)
* completed gate_fff layer (txh18, 2015-11-23)
* implementing GateFFF layer (txh18, 2015-11-23)
* completed auto-generate params (txh18, 2015-11-20)
* working on automatic parameters for layers (txh18, 2015-11-20)
* changed work_dir setting (txh18, 2015-11-18)
* h300 and h400 worked well, log added (txh18, 2015-11-18)
* switch to kernel update (txh18, 2015-11-17)
* bug fix for select_linear layer-by-layer update (txh18, 2015-11-17)
* added atomicAdd for select_linear update; however, the result still seems unreproducible, so I changed the select_linear layer update back to line-by-line (txh18, 2015-11-17)
* added small opt: use mmatrix in lm_trainer and reader (txh18, 2015-11-17)
* coding style change (txh18, 2015-11-17)
* added LOG-tnn-h400 LOG (txh18, 2015-11-16)
* unified param updates, now direct_update is the same speed as undirect_update (txh18, 2015-11-16)
* ... (txh18, 2015-11-16)
* ... (txh18, 2015-11-16)
* used os.clock() for timer (txh18, 2015-11-16)
* fixed direct update, result not yet known (txh18, 2015-11-16)
* added timer (txh18, 2015-11-15)
* merge lr schedule change (txh18, 2015-11-15)
  Merge branch 'txh18/rnnlm' of github.com:Nerv-SJTU/nerv into txh18/rnnlm
  * got good PPL for H400... (cloudygoose, 2015-11-15) [on merged branch]
* added msr_sc set (txh18, 2015-11-15)
* small bug: lr_half (txh18, 2015-11-13)
* ... (txh18, 2015-11-13)
* added random seed (txh18, 2015-11-13)
* added loadstring (txh18, 2015-11-13)
* saving param file for every iter (txh18, 2015-11-13)
* change ppl_net to ppl_all (txh18, 2015-11-13)