RNN test

tool

  • LSTM/RNN training, with GPU and deep-network support [1]

paper

Steps

Process the dictionary and data
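
A minimal sketch of one common way to do this step, assuming a plain-text, whitespace-tokenized corpus; the file names (all.txt, train.txt, valid.txt, wordlist.txt), the 5000-line validation split and the 10000-word cutoff (matching nwords below) are illustrative assumptions, not taken from this page.

     # Hold out a small validation set; the rest is the RNNLM training text.
     head -n 5000 all.txt  > valid.txt
     tail -n +5001 all.txt > train.txt

     # Count word frequencies in the training text and keep the most
     # frequent 10000 words as the RNNLM vocabulary (nwords=10000).
     tr ' ' '\n' < train.txt | grep -v '^$' | sort | uniq -c | \
       sort -rn | head -n 10000 | awk '{print $2}' > wordlist.txt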

WSJ Test

wsj_data

  • Data
     • size: 200M, npdata
  • Parameters
     rand_seed=1
     nwords=10000 # This is how many words we're putting in the vocab of the RNNLM.
     hidden=320
     class=300 # Num-classes... should be somewhat larger than sqrt of nwords.
     direct=2000 # Number of weights that are used for "direct" connections, in millions.
     rnnlm_ver=rnnlm-0.3e # version of RNNLM to use
     threads=1 # for RNNLM-HS
     bptt=2 # length of BPTT unfolding in RNNLM
     bptt_block=20 # block size for truncated BPTT (error is back-propagated in blocks of this many words)
  • RNNLM training settings (a training-command sketch follows the table below)

Training environment
Set    hidden  class  direct  bptt  bptt_block  threads  direct-order  rand_seed  nwords  time (min)
set1   320     300    2000    2     20          1        4             1          10000   3380 (56 h)
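
A hedged sketch of the training command referenced above, assuming the rnnlm-0.3e toolkit (Mikolov's RNNLM) selected by rnnlm_ver; the flag names come from that toolkit, and train.txt, valid.txt and rnnlm.model are placeholder file names, not paths from this page.

     # Train an RNNLM with the set1 settings; the table above reports
     # roughly 56 hours of training time for the WSJ setup.
     rnnlm -train train.txt -valid valid.txt -rnnlm rnnlm.model \
           -hidden 320 -class 300 -direct 2000 -direct-order 4 \
           -bptt 2 -bptt-block 20 -rand-seed 1 -debug 2 -binary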

RNNLM Rescore

  • Acoustic Model
    • location: /nfs/disk/work/users/zhangzy/work/train_wsj_eng_new/data/train_si284
  • Test set
    • location: /nfs/disk/work/users/zhangzy/work/train_wsj_eng_new/dt/test_eval92
    • decode: /nfs/disk/work/users/zhangzy/work/train_wsj_eng_new/exp/tri4b_dnn_org/decode_eval92_tri4b_dnn_org
  • Result
    • baseline LM: 4.16%, with RNNLM rescoring: 3.47%
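
A hedged sketch of how a rescoring run like the one above is typically launched in a Kaldi WSJ-style setup; the script name, argument order, N-best depth (100), interpolation weight (0.5) and the data/lang_test and data/local/rnnlm directories follow the standard recipe and are assumptions, not values taken from this page.

     # N-best rescoring of the baseline decode with the trained RNNLM:
     # 0.5 is the RNNLM interpolation weight, 100 the N-best list depth.
     steps/rnnlmrescore.sh --N 100 0.5 \
       data/lang_test data/local/rnnlm data/test_eval92 \
       exp/tri4b_dnn_org/decode_eval92_tri4b_dnn_org \
       exp/tri4b_dnn_org/decode_eval92_tri4b_dnn_org_rnnlm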

Chinese data

prepare data

  • Available data
    • gigaword: /work2/xingchao/corpus/Chinese_corpus/gigaword
    • bing parallel corpus: /nfs/disk/work/users/xingchao/bing_dict
    • baidu:
    • sougou:
  • Data used (a preparation sketch follows after the table below)
    • a sample of gigaword, about 344M
    • dict: tencent11w
  • Training settings

Training environment
Set    hidden  class  direct  bptt  bptt_block  threads  direct-order  rand_seed  nwords  time (min)
set1   320     300    2000    2     20          1        4             1          10000   3380 (56 h)
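
As referenced above, a hedged sketch of how the ~344M Gigaword sample and the tencent11w dictionary could be turned into RNNLM training text; the file names (gigaword_all.txt, tencent11w.dict), the <unk> symbol, and the assumption that the text is already word-segmented are illustrative, not taken from this page.

     # Randomly sample roughly 344M of text from the (already segmented)
     # corpus; gigaword_all.txt is a hypothetical concatenation of the
     # files under the gigaword directory listed above.
     shuf gigaword_all.txt | head -c 344M > giga_sample.txt

     # Map every word that is not in the tencent11w dictionary to <unk>,
     # so the RNNLM vocabulary stays within the chosen word list.
     awk 'NR==FNR {v[$1]=1; next}
          {for (i=1; i<=NF; i++) if (!($i in v)) $i="<unk>"; print}' \
       tencent11w.dict giga_sample.txt > train.txt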