<?xml version="1.0"?>
<?xml-stylesheet type="text/css" href="http://index.cslt.org/mediawiki/skins/common/feed.css?303"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="zh-cn">
		<id>http://index.cslt.org/mediawiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Zhangjy</id>
		<title>cslt Wiki - User contributions [zh-cn]</title>
		<link rel="self" type="application/atom+xml" href="http://index.cslt.org/mediawiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Zhangjy"/>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E7%89%B9%E6%AE%8A:%E7%94%A8%E6%88%B7%E8%B4%A1%E7%8C%AE/Zhangjy"/>
		<updated>2026-04-10T14:46:28Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.23.3</generator>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-9-11</id>
		<title>NLP Status Report 2017-9-11</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-9-11"/>
				<updated>2017-09-11T07:19:37Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/8/14 |Jiyuan Zhang || ||   |- |Aodong LI ||  ||  |- |Shiyue Zhang ||   || |- |Sh...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
|| &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren &lt;br /&gt;
||  &lt;br /&gt;
||  &lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
&lt;br /&gt;
||  &lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28</id>
		<title>NLP Status Report 2017-8-28</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28"/>
				<updated>2017-08-28T06:35:51Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*code refactoring&lt;br /&gt;
*wrote a document [http://cslt.riit.tsinghua.edu.cn/mediawiki/index.php/%E6%96%87%E4%BB%B6:VvPoem.docx]&lt;br /&gt;
|| &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren &lt;br /&gt;
||  &lt;br /&gt;
* read the release documentation of other NMT toolkits&lt;br /&gt;
* cleaned up the code&lt;br /&gt;
* wrote the documentation &lt;br /&gt;
||  &lt;br /&gt;
* write the paper on our baseline system&lt;br /&gt;
* read the augmented NMT code&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
* processed data and ran the model;&lt;br /&gt;
* tested results.&lt;br /&gt;
checkpoint-100000 translation model&lt;br /&gt;
BLEU: 11.11&lt;br /&gt;
&lt;br /&gt;
*source:在秦者名错，与张仪争论,於是惠王使错将伐蜀，遂拔，因而守之。&lt;br /&gt;
*target:在秦国的名叫司马错，曾与张仪发生争论，秦惠王采纳了他的意见，于是司马错率军攻蜀国，攻取后，又让他做了蜀地郡守。&lt;br /&gt;
*trans：当时秦国的人都很欣赏他的建议，与张仪一起商议，所以吴王派使者率军攻打蜀地，一举攻，接着又下令守城 。&lt;br /&gt;
*source:神大用则竭，形大劳则敝，形神离则死 。 &lt;br /&gt;
*target:精神过度使用就会衰竭，形体过度劳累就会疲惫，神形分离就会死亡。 &lt;br /&gt;
*trans: 精神过度就可衰竭,身体过度劳累就会疲惫，地形也就会死。&lt;br /&gt;
*source:今天子接千岁之统，封泰山，而余不得从行，是命也夫，命也夫！&lt;br /&gt;
*target:现天子继承汉朝千年一统的大业，在泰山举行封禅典礼而我不能随行，这是命啊，是命啊！ &lt;br /&gt;
*trans: 现在天子可以继承帝位的成就爵位，爵位至泰山，而我却未能执行先帝的命运。&lt;br /&gt;
*1. Using Zizhitongjian data only (6,000 pairs), we get BLEU 6 at most.&lt;br /&gt;
*2. Using Zizhitongjian data only (12,000 pairs), we get BLEU 7 at most.&lt;br /&gt;
*3. Using Shiji and Zizhitongjian (430,000 pairs), we get BLEU about 9.&lt;br /&gt;
*4. Using Shiji and Zizhitongjian (430,000 pairs), with the ancient-language text split character by character, we get BLEU 11.11 at most.&lt;br /&gt;
*The main factor now is the data: the number of sentence pairs and their quality (the modern-language text includes context information).&lt;br /&gt;
||  &lt;br /&gt;
*plan to read the source code of the seq2seq model and learn TensorFlow;&lt;br /&gt;
*plan to read the paper &amp;quot;Automatic Long Sentence Segmentation for NMT&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|zhangshuai &lt;br /&gt;
||  &lt;br /&gt;
* learned the model source code&lt;br /&gt;
||  &lt;br /&gt;
* learn TensorFlow and the source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28</id>
		<title>NLP Status Report 2017-8-28</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28"/>
				<updated>2017-08-28T06:35:31Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*code refactoring&lt;br /&gt;
*wrote a document [http://cslt.riit.tsinghua.edu.cn/mediawiki/index.php/%E6%96%87%E4%BB%B6:VvPoem.docx]&lt;br /&gt;
|| &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren &lt;br /&gt;
||  &lt;br /&gt;
* read the release documentation of other NMT toolkits&lt;br /&gt;
* cleaned up the code&lt;br /&gt;
* wrote the documentation &lt;br /&gt;
||  &lt;br /&gt;
* write the paper on our baseline system&lt;br /&gt;
* read the augmented NMT code&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
* processed data and ran the model;&lt;br /&gt;
* tested results.&lt;br /&gt;
checkpoint-100000 translation model&lt;br /&gt;
BLEU: 11.11&lt;br /&gt;
&lt;br /&gt;
*source:在秦者名错，与张仪争论,於是惠王使错将伐蜀，遂拔，因而守之。&lt;br /&gt;
*target:在秦国的名叫司马错，曾与张仪发生争论，秦惠王采纳了他的意见，于是司马错率军攻蜀国，攻取后，又让他做了蜀地郡守。&lt;br /&gt;
*trans：当时秦国的人都很欣赏他的建议，与张仪一起商议，所以吴王派使者率军攻打蜀地，一举攻，接着又下令守城 。&lt;br /&gt;
*source:神大用则竭，形大劳则敝，形神离则死 。 &lt;br /&gt;
*target:精神过度使用就会衰竭，形体过度劳累就会疲惫，神形分离就会死亡。 &lt;br /&gt;
*trans: 精神过度就可衰竭,身体过度劳累就会疲惫，地形也就会死。&lt;br /&gt;
*source:今天子接千岁之统，封泰山，而余不得从行，是命也夫，命也夫！&lt;br /&gt;
*target:现天子继承汉朝千年一统的大业，在泰山举行封禅典礼而我不能随行，这是命啊，是命啊！ &lt;br /&gt;
*trans: 现在天子可以继承帝位的成就爵位，爵位至泰山，而我却未能执行先帝的命运。&lt;br /&gt;
*1. Using Zizhitongjian data only (6,000 pairs), we get BLEU 6 at most.&lt;br /&gt;
*2. Using Zizhitongjian data only (12,000 pairs), we get BLEU 7 at most.&lt;br /&gt;
*3. Using Shiji and Zizhitongjian (430,000 pairs), we get BLEU about 9.&lt;br /&gt;
*4. Using Shiji and Zizhitongjian (430,000 pairs), with the ancient-language text split character by character, we get BLEU 11.11 at most.&lt;br /&gt;
*The main factor now is the data: the number of sentence pairs and their quality (the modern-language text includes context information).&lt;br /&gt;
||  &lt;br /&gt;
*plan to read the source code of the seq2seq model and learn TensorFlow;&lt;br /&gt;
*plan to read the paper &amp;quot;Automatic Long Sentence Segmentation for NMT&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|zhangshuai &lt;br /&gt;
||  &lt;br /&gt;
* learned the model source code&lt;br /&gt;
||  &lt;br /&gt;
* learn TensorFlow and the source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:VvPoem.docx</id>
		<title>文件:VvPoem.docx</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:VvPoem.docx"/>
				<updated>2017-08-28T06:21:54Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28</id>
		<title>NLP Status Report 2017-8-28</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28"/>
				<updated>2017-08-28T06:18:48Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*code refactoring&lt;br /&gt;
*wrote a document&lt;br /&gt;
|| &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* organized all the experimental results (our baseline system, Moses, THUMT) &lt;br /&gt;
* trained and tested translation models (toolkit: THUMT)&lt;br /&gt;
* compared them with our system&lt;br /&gt;
||&lt;br /&gt;
* prepare to release the baseline system (TensorFlow 1.0 version)&lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
* processed data and ran the model;&lt;br /&gt;
* tested results.&lt;br /&gt;
checkpoint-100000 translation model&lt;br /&gt;
BLEU: 11.11&lt;br /&gt;
&lt;br /&gt;
*source:在秦者名错，与张仪争论,於是惠王使错将伐蜀，遂拔，因而守之。&lt;br /&gt;
*target:在秦国的名叫司马错，曾与张仪发生争论，秦惠王采纳了他的意见，于是司马错率军攻蜀国，攻取后，又让他做了蜀地郡守。&lt;br /&gt;
*trans：当时秦国的人都很欣赏他的建议，与张仪一起商议，所以吴王派使者率军攻打蜀地，一举攻，接着又下令守城 。&lt;br /&gt;
*source:神大用则竭，形大劳则敝，形神离则死 。 &lt;br /&gt;
*target:精神过度使用就会衰竭，形体过度劳累就会疲惫，神形分离就会死亡。 &lt;br /&gt;
*trans: 精神过度就可衰竭,身体过度劳累就会疲惫，地形也就会死。&lt;br /&gt;
*source:今天子接千岁之统，封泰山，而余不得从行，是命也夫，命也夫！&lt;br /&gt;
*target:现天子继承汉朝千年一统的大业，在泰山举行封禅典礼而我不能随行，这是命啊，是命啊！ &lt;br /&gt;
*trans: 现在天子可以继承帝位的成就爵位，爵位至泰山，而我却未能执行先帝的命运。&lt;br /&gt;
*1. Using Zizhitongjian data only (6,000 pairs), we get BLEU 6 at most.&lt;br /&gt;
*2. Using Zizhitongjian data only (12,000 pairs), we get BLEU 7 at most.&lt;br /&gt;
*3. Using Shiji and Zizhitongjian (430,000 pairs), we get BLEU about 9.&lt;br /&gt;
*4. Using Shiji and Zizhitongjian (430,000 pairs), with the ancient-language text split character by character, we get BLEU 11.11 at most.&lt;br /&gt;
*The main factor now is the data: the number of sentence pairs and their quality (the modern-language text includes context information).&lt;br /&gt;
||  &lt;br /&gt;
*plan to read the source code of the seq2seq model and learn TensorFlow;&lt;br /&gt;
*plan to read the paper &amp;quot;Automatic Long Sentence Segmentation for NMT&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|zhangshuai &lt;br /&gt;
||  &lt;br /&gt;
* learned the model source code&lt;br /&gt;
||  &lt;br /&gt;
* learn TensorFlow and the source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28</id>
		<title>NLP Status Report 2017-8-28</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-28"/>
				<updated>2017-08-28T06:13:55Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*did some code refactoring for the poem system &lt;br /&gt;
|| &lt;br /&gt;
*plan to complete the code refactoring for the poem system&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* organized all the experimental results (our baseline system, Moses, THUMT) &lt;br /&gt;
* trained and tested translation models (toolkit: THUMT)&lt;br /&gt;
* compared them with our system&lt;br /&gt;
||&lt;br /&gt;
* prepare to release the baseline system (TensorFlow 1.0 version)&lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
* processed data and ran the model;&lt;br /&gt;
* tested results.&lt;br /&gt;
checkpoint-100000 translation model&lt;br /&gt;
BLEU: 11.11&lt;br /&gt;
&lt;br /&gt;
*source:在秦者名错，与张仪争论,於是惠王使错将伐蜀，遂拔，因而守之。&lt;br /&gt;
*target:在秦国的名叫司马错，曾与张仪发生争论，秦惠王采纳了他的意见，于是司马错率军攻蜀国，攻取后，又让他做了蜀地郡守。&lt;br /&gt;
*trans：当时秦国的人都很欣赏他的建议，与张仪一起商议，所以吴王派使者率军攻打蜀地，一举攻，接着又下令守城 。&lt;br /&gt;
*source:神大用则竭，形大劳则敝，形神离则死 。 &lt;br /&gt;
*target:精神过度使用就会衰竭，形体过度劳累就会疲惫，神形分离就会死亡。 &lt;br /&gt;
*trans: 精神过度就可衰竭,身体过度劳累就会疲惫，地形也就会死。&lt;br /&gt;
*source:今天子接千岁之统，封泰山，而余不得从行，是命也夫，命也夫！&lt;br /&gt;
*target:现天子继承汉朝千年一统的大业，在泰山举行封禅典礼而我不能随行，这是命啊，是命啊！ &lt;br /&gt;
*trans: 现在天子可以继承帝位的成就爵位，爵位至泰山，而我却未能执行先帝的命运。&lt;br /&gt;
*1. Using Zizhitongjian data only (6,000 pairs), we get BLEU 6 at most.&lt;br /&gt;
*2. Using Zizhitongjian data only (12,000 pairs), we get BLEU 7 at most.&lt;br /&gt;
*3. Using Shiji and Zizhitongjian (430,000 pairs), we get BLEU about 9.&lt;br /&gt;
*4. Using Shiji and Zizhitongjian (430,000 pairs), with the ancient-language text split character by character, we get BLEU 11.11 at most.&lt;br /&gt;
*The main factor now is the data: the number of sentence pairs and their quality (the modern-language text includes context information).&lt;br /&gt;
||  &lt;br /&gt;
*plan to read the source code of the seq2seq model and learn TensorFlow;&lt;br /&gt;
*plan to read the paper &amp;quot;Automatic Long Sentence Segmentation for NMT&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
|zhangshuai &lt;br /&gt;
||  &lt;br /&gt;
* learned the model source code&lt;br /&gt;
||  &lt;br /&gt;
* learn TensorFlow and the source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-21</id>
		<title>NLP Status Report 2017-8-21</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-21"/>
				<updated>2017-08-21T05:13:42Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*did some code refactoring for the poem system &lt;br /&gt;
|| &lt;br /&gt;
*plan to complete the code refactoring for the poem system&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* organized all the experimental results (our baseline system, Moses, THUMT) &lt;br /&gt;
* trained translation models using THUMT &lt;br /&gt;
* tested the BLEU of these models&lt;br /&gt;
* compared them with our system&lt;br /&gt;
||&lt;br /&gt;
* prepare to release the baseline system (TensorFlow 1.0 version)&lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
* processed data and ran the model;&lt;br /&gt;
* tested results.&lt;br /&gt;
checkpoint-100000 translation model&lt;br /&gt;
BLEU: 11.11&lt;br /&gt;
&lt;br /&gt;
*source:在秦者名错，与张仪争论,於是惠王使错将伐蜀，遂拔，因而守之。&lt;br /&gt;
*target:在秦国的名叫司马错，曾与张仪发生争论，秦惠王采纳了他的意见，于是司马错率军攻蜀国，攻取后，又让他做了蜀地郡守。&lt;br /&gt;
*trans：当时秦国的人都很欣赏他的建议，与张仪一起商议，所以吴王派使者率军攻打蜀地，一举攻，接着又下令守城 。&lt;br /&gt;
*source:神大用则竭，形大劳则敝，形神离则死 。 &lt;br /&gt;
*target:精神过度使用就会衰竭，形体过度劳累就会疲惫，神形分离就会死亡。 &lt;br /&gt;
*trans: 精神过度就可衰竭,身体过度劳累就会疲惫，地形也就会死。&lt;br /&gt;
*source:今天子接千岁之统，封泰山，而余不得从行，是命也夫，命也夫！&lt;br /&gt;
*target:现天子继承汉朝千年一统的大业，在泰山举行封禅典礼而我不能随行，这是命啊，是命啊！ &lt;br /&gt;
*trans: 现在天子可以继承帝位的成就爵位，爵位至泰山，而我却未能执行先帝的命运。&lt;br /&gt;
&lt;br /&gt;
||  &lt;br /&gt;
* read the source code of the seq2seq model;&lt;br /&gt;
*1. Using Zizhitongjian data only (6,000 pairs), we get BLEU 6 at most.&lt;br /&gt;
*2. Using Zizhitongjian data only (12,000 pairs), we get BLEU 7 at most.&lt;br /&gt;
*3. Using Shiji and Zizhitongjian (430,000 pairs), we get BLEU about 9.&lt;br /&gt;
*4. Using Shiji and Zizhitongjian (430,000 pairs), with the ancient-language text split character by character, we get BLEU 11.11 at most.&lt;br /&gt;
*The main factor now is the data (the number of sentence pairs and their quality, since the modern-language text includes context information).&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-21</id>
		<title>NLP Status Report 2017-8-21</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-21"/>
				<updated>2017-08-21T05:04:48Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*did code refactoring for the poem system &lt;br /&gt;
|| &lt;br /&gt;
*plan to complete the code refactoring for the poem system&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* organized the results of the experiment&lt;br /&gt;
* learned how to use THUMT and how it works&lt;br /&gt;
||&lt;br /&gt;
* train translation models using THUMT&lt;br /&gt;
* test the BLEU of these models&lt;br /&gt;
* compare them with our system&lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
* processed data and ran the model;&lt;br /&gt;
* tested results.&lt;br /&gt;
checkpoint-100000 translation model&lt;br /&gt;
BLEU: 11.11&lt;br /&gt;
&lt;br /&gt;
*source:在秦者名错，与张仪争论,於是惠王使错将伐蜀，遂拔，因而守之。&lt;br /&gt;
*target:在秦国的名叫司马错，曾与张仪发生争论，秦惠王采纳了他的意见，于是司马错率军攻蜀国，攻取后，又让他做了蜀地郡守。&lt;br /&gt;
*trans：当时秦国的人都很欣赏他的建议，与张仪一起商议，所以吴王派使者率军攻打蜀地，一举攻，接着又下令守城 。&lt;br /&gt;
*source:神大用则竭，形大劳则敝，形神离则死 。 &lt;br /&gt;
*target:精神过度使用就会衰竭，形体过度劳累就会疲惫，神形分离就会死亡。 &lt;br /&gt;
*trans: 精神过度就可衰竭,身体过度劳累就会疲惫，地形也就会死。&lt;br /&gt;
*source:今天子接千岁之统，封泰山，而余不得从行，是命也夫，命也夫！&lt;br /&gt;
*target:现天子继承汉朝千年一统的大业，在泰山举行封禅典礼而我不能随行，这是命啊，是命啊！ &lt;br /&gt;
*trans: 现在天子可以继承帝位的成就爵位，爵位至泰山，而我却未能执行先帝的命运。&lt;br /&gt;
&lt;br /&gt;
||  &lt;br /&gt;
* read the source code of the seq2seq model;&lt;br /&gt;
*1. Using Zizhitongjian data only (6,000 pairs), we get BLEU 6 at most.&lt;br /&gt;
*2. Using Zizhitongjian data only (12,000 pairs), we get BLEU 7 at most.&lt;br /&gt;
*3. Using Shiji and Zizhitongjian (430,000 pairs), we get BLEU about 9.&lt;br /&gt;
*4. Using Shiji and Zizhitongjian (430,000 pairs), with the ancient-language text split character by character, we get BLEU 11.11 at most.&lt;br /&gt;
*The main factor now is the data (the number of sentence pairs and their quality, since the modern-language text includes context information).&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-21</id>
		<title>NLP Status Report 2017-8-21</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-21"/>
				<updated>2017-08-21T05:03:10Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/8/14&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*polished the couplet model &lt;br /&gt;
|| &lt;br /&gt;
Code refactoring for the poem system&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* organized the results of the experiment&lt;br /&gt;
* learned how to use THUMT and how it works&lt;br /&gt;
||&lt;br /&gt;
* train translation models using THUMT&lt;br /&gt;
* test the BLEU of these models&lt;br /&gt;
* compare them with our system&lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
* processed data and ran the model;&lt;br /&gt;
* tested results.&lt;br /&gt;
checkpoint-100000 translation model&lt;br /&gt;
BLEU: 11.11&lt;br /&gt;
&lt;br /&gt;
*source:在秦者名错，与张仪争论,於是惠王使错将伐蜀，遂拔，因而守之。&lt;br /&gt;
*target:在秦国的名叫司马错，曾与张仪发生争论，秦惠王采纳了他的意见，于是司马错率军攻蜀国，攻取后，又让他做了蜀地郡守。&lt;br /&gt;
*trans：当时秦国的人都很欣赏他的建议，与张仪一起商议，所以吴王派使者率军攻打蜀地，一举攻，接着又下令守城 。&lt;br /&gt;
*source:神大用则竭，形大劳则敝，形神离则死 。 &lt;br /&gt;
*target:精神过度使用就会衰竭，形体过度劳累就会疲惫，神形分离就会死亡。 &lt;br /&gt;
*trans: 精神过度就可衰竭,身体过度劳累就会疲惫，地形也就会死。&lt;br /&gt;
*source:今天子接千岁之统，封泰山，而余不得从行，是命也夫，命也夫！&lt;br /&gt;
*target:现天子继承汉朝千年一统的大业，在泰山举行封禅典礼而我不能随行，这是命啊，是命啊！ &lt;br /&gt;
*trans: 现在天子可以继承帝位的成就爵位，爵位至泰山，而我却未能执行先帝的命运。&lt;br /&gt;
&lt;br /&gt;
||  &lt;br /&gt;
* read the source code of the seq2seq model;&lt;br /&gt;
*1. Using Zizhitongjian data only (6,000 pairs), we get BLEU 6 at most.&lt;br /&gt;
*2. Using Zizhitongjian data only (12,000 pairs), we get BLEU 7 at most.&lt;br /&gt;
*3. Using Shiji and Zizhitongjian (430,000 pairs), we get BLEU about 9.&lt;br /&gt;
*4. Using Shiji and Zizhitongjian (430,000 pairs), with the ancient-language text split character by character, we get BLEU 11.11 at most.&lt;br /&gt;
*The main factor now is the data (the number of sentence pairs and their quality, since the modern-language text includes context information).&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14</id>
		<title>NLP Status Report 2017-8-14</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14"/>
				<updated>2017-08-14T06:15:17Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*polished the couplet model &lt;br /&gt;
||&lt;br /&gt;
Code refactoring for the poem system&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14</id>
		<title>NLP Status Report 2017-8-14</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14"/>
				<updated>2017-08-14T06:14:14Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*polished the couplet model &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14</id>
		<title>NLP Status Report 2017-8-14</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14"/>
				<updated>2017-08-14T06:11:35Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*polished the couplet model &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14</id>
		<title>NLP Status Report 2017-8-14</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14"/>
				<updated>2017-08-14T06:10:56Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*polished the couplet model &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14</id>
		<title>NLP Status Report 2017-8-14</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-14"/>
				<updated>2017-08-14T06:10:28Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/7/3 |Jiyuan Zhang || *polished the couplet model  ||  |- |Aodong LI || || |- |Sh...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*polished the couplet model &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-7</id>
		<title>NLP Status Report 2017-8-7</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-7"/>
				<updated>2017-08-07T04:29:08Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*generated a streamer according to a couplet&lt;br /&gt;
*almost completed the task of filling in the blanks of a couplet&lt;br /&gt;
|| &lt;br /&gt;
*continue to perfect the couplet model&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
* Got 55,000+ English poems and 260,000+ lines after preprocessing&lt;br /&gt;
* Added phrase separators as the style indicator; every line has at least one separator&lt;br /&gt;
* Training loss didn't decrease very much, only from 440 to 50&lt;br /&gt;
* The translation quality deteriorated when the language model was added&lt;br /&gt;
||&lt;br /&gt;
* Try to use a larger language model to decrease the training loss&lt;br /&gt;
* Try to use character-based MT in English-Chinese translation&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* searched published papers for the performance (BLEU) of other models&lt;br /&gt;
  on the WMT2014 dataset, but found none.&lt;br /&gt;
* installed and built Moses on the server   &lt;br /&gt;
||&lt;br /&gt;
* train and test a statistical machine translation model &lt;br /&gt;
  toolkit: Moses&lt;br /&gt;
  datasets: WMT2014 en-de and en-fr&lt;br /&gt;
* collate experimental results; compare our baseline model with Moses &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
*processed documents. So far, Shiji has been split into 24,000 sentence pairs.&lt;br /&gt;
*Zizhitongjian has been split into 16,000 pairs.&lt;br /&gt;
||&lt;br /&gt;
*adjust the jieba source code to segment ancient-language text more accurately&lt;br /&gt;
*read model source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-7</id>
		<title>NLP Status Report 2017-8-7</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-8-7"/>
				<updated>2017-08-07T04:27:54Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/7/3 |Jiyuan Zhang || *made the poster for ACL [http://cslt.riit.tsinghua.edu.cn/...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*made the poster for ACL [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/9/95/Acl2017-poster.pdf]&lt;br /&gt;
*attempted to fix repeated words, but failed&lt;br /&gt;
*did some work on the n-gram model of the couplet&lt;br /&gt;
|| &lt;br /&gt;
*generate a streamer according to a couplet&lt;br /&gt;
*complete the task of filling in the blanks of a couplet&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
* Got 55,000+ English poems and 260,000+ lines after preprocessing&lt;br /&gt;
* Added phrase separators as the style indicator; every line has at least one separator&lt;br /&gt;
* Training loss didn't decrease very much, only from 440 to 50&lt;br /&gt;
* The translation quality deteriorated when the language model was added&lt;br /&gt;
||&lt;br /&gt;
* Try to use a larger language model to decrease the training loss&lt;br /&gt;
* Try to use character-based MT in English-Chinese translation&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* searched published papers for the performance (BLEU) of other models&lt;br /&gt;
  on the WMT2014 dataset, but found none.&lt;br /&gt;
* installed and built Moses on the server   &lt;br /&gt;
||&lt;br /&gt;
* train and test a statistical machine translation model &lt;br /&gt;
  toolkit: Moses&lt;br /&gt;
  datasets: WMT2014 en-de and en-fr&lt;br /&gt;
* collate experimental results; compare our baseline model with Moses &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
*processed documents; so far, Shiji has been split into 24,000 sentence pairs.&lt;br /&gt;
*Zizhitongjian has been split into 16,000 pairs.&lt;br /&gt;
||&lt;br /&gt;
*adjust the jieba source code to make its segmentation of ancient Chinese more accurate&lt;br /&gt;
*read model source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31</id>
		<title>NLP Status Report 2017-7-31</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31"/>
				<updated>2017-07-31T06:41:29Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*made the poster for ACL [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/9/95/Acl2017-poster.pdf]&lt;br /&gt;
*attempted to fix the repeated-word problem, but failed&lt;br /&gt;
*did some work on an n-gram model for the couplet&lt;br /&gt;
|| &lt;br /&gt;
*generate a stream according to a couplet&lt;br /&gt;
*complete the task of filling in the blanks of a couplet&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
* Got 55,000+ English poems and 260,000+ lines after preprocessing&lt;br /&gt;
* Added phrase separators as the style indicator; every line has at least one separator&lt;br /&gt;
* Training loss did not decrease enough, dropping only from 440 to 50&lt;br /&gt;
* Translation quality deteriorated when the language model was added&lt;br /&gt;
||&lt;br /&gt;
* Try to use a larger language model to decrease the training loss&lt;br /&gt;
* Try to use character-based MT in English-Chinese translation&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* looked for the performance (BLEU scores) of other models &lt;br /&gt;
  on the WMT2014 dataset in published papers, but found none&lt;br /&gt;
* installed and built Moses on the server   &lt;br /&gt;
||&lt;br /&gt;
* train a statistical machine translation model and test it &lt;br /&gt;
  toolkit: Moses&lt;br /&gt;
  datasets: WMT2014 en-de and en-fr&lt;br /&gt;
* collate experimental results and compare our baseline model with Moses &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
*processed documents; so far, Shiji has been split into 24,000 sentence pairs.&lt;br /&gt;
*Zizhitongjian has been split into 16,000 pairs.&lt;br /&gt;
||&lt;br /&gt;
*adjust the jieba source code to make its segmentation of ancient Chinese more accurate&lt;br /&gt;
*read model source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31</id>
		<title>NLP Status Report 2017-7-31</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31"/>
				<updated>2017-07-31T06:41:08Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*made the poster for ACL [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/9/95/Acl2017-poster.pdf]&lt;br /&gt;
*attempted to fix the repeated-word problem, but failed&lt;br /&gt;
*did some work on an n-gram model for the couplet&lt;br /&gt;
|| &lt;br /&gt;
*generate a stream according to a couplet&lt;br /&gt;
*complete the task of filling in the blanks of a couplet&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
* Got 55,000+ English poems and 260,000+ lines after preprocessing&lt;br /&gt;
* Added phrase separators as the style indicator; every line has at least one separator&lt;br /&gt;
* Training loss did not decrease enough, dropping only from 440 to 50&lt;br /&gt;
* Translation quality deteriorated when the language model was added&lt;br /&gt;
||&lt;br /&gt;
* Try to use a larger language model to decrease the training loss&lt;br /&gt;
* Try to use character-based MT in English-Chinese translation&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* looked for the performance (BLEU scores) of other models &lt;br /&gt;
  on the WMT2014 dataset in published papers, but found none&lt;br /&gt;
* installed and built Moses on the server   &lt;br /&gt;
||&lt;br /&gt;
* train a statistical machine translation model and test it &lt;br /&gt;
  toolkit: Moses&lt;br /&gt;
  datasets: WMT2014 en-de and en-fr&lt;br /&gt;
* collate experimental results and compare our baseline model with Moses &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
*processed documents; so far, Shiji has been split into 24,000 sentence pairs.&lt;br /&gt;
*Zizhitongjian has been split into 16,000 pairs.&lt;br /&gt;
||&lt;br /&gt;
*adjust the jieba source code to make its segmentation of ancient Chinese more accurate&lt;br /&gt;
*read model source code&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Acl2017-poster.pdf</id>
		<title>文件:Acl2017-poster.pdf</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Acl2017-poster.pdf"/>
				<updated>2017-07-31T06:40:28Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31</id>
		<title>NLP Status Report 2017-7-31</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31"/>
				<updated>2017-07-31T04:38:52Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*made the poster for ACL&lt;br /&gt;
*attempted to fix the repeated-word problem, but failed&lt;br /&gt;
*did some work on an n-gram model for the couplet&lt;br /&gt;
|| &lt;br /&gt;
*generate a stream according to a couplet&lt;br /&gt;
*complete the task of filling in the blanks of a couplet&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* trained two baseline models using the WMT2014 en-fr dataset&lt;br /&gt;
  (still under training) &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* read some papers (Memory-augmented NMT and Memory-augmented Chinese-Uyghur Neural Machine Translation)   &lt;br /&gt;
||&lt;br /&gt;
* read memory-augmented-nmt code&lt;br /&gt;
* read papers about memory augmented NMT &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
*So far, Shiji has been split into 25,000 sentence pairs and Zizhitongjian into 20,000 pairs.&lt;br /&gt;
*&lt;br /&gt;
||&lt;br /&gt;
*process document&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31</id>
		<title>NLP Status Report 2017-7-31</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-31"/>
				<updated>2017-07-31T04:36:04Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*made the poster for ACL&lt;br /&gt;
*attempted to fix the repeated-word problem, but failed&lt;br /&gt;
*did some work on an n-gram model for the couplet&lt;br /&gt;
|| &lt;br /&gt;
*complete &lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* trained two baseline models using the WMT2014 en-fr dataset&lt;br /&gt;
  (still under training) &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* read some papers (Memory-augmented NMT and Memory-augmented Chinese-Uyghur Neural Machine Translation)   &lt;br /&gt;
||&lt;br /&gt;
* read memory-augmented-nmt code&lt;br /&gt;
* read papers about memory augmented NMT &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
|Jiayu Guo||&lt;br /&gt;
*So far, Shiji has been split into 25,000 sentence pairs and Zizhitongjian into 20,000 pairs.&lt;br /&gt;
*&lt;br /&gt;
||&lt;br /&gt;
*process document&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-24</id>
		<title>NLP Status Report 2017-7-24</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-24"/>
				<updated>2017-07-26T02:41:06Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/7/3 |Jiyuan Zhang || * ||  *make the poster for ACL *complete neural model for t...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*&lt;br /&gt;
|| &lt;br /&gt;
*make the poster for ACL&lt;br /&gt;
*complete the neural model for the couplet&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
*&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* found ways to tokenize the WMT2014 data &lt;br /&gt;
   rewrote prepare_data.py from moses-smt&lt;br /&gt;
   used the tokenizer of moses-smt&lt;br /&gt;
&lt;br /&gt;
*trained two versions of the code on the WMT2014 en-de and en-fr datasets&lt;br /&gt;
   tested the checkpoints of the en-de dataset&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* test the checkpoints of the en-fr dataset&lt;br /&gt;
* record the results and do analysis &lt;br /&gt;
* read papers about memory augmented NMT &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-17</id>
		<title>NLP Status Report 2017-7-17</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-17"/>
				<updated>2017-07-18T04:32:21Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*&lt;br /&gt;
|| &lt;br /&gt;
*generate a stream according to a couplet  &lt;br /&gt;
*try to complete the task of filling in the blanks of a couplet&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
*&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* found ways to tokenize the WMT2014 data &lt;br /&gt;
   rewrote prepare_data.py from moses-smt&lt;br /&gt;
   used the tokenizer of moses-smt&lt;br /&gt;
&lt;br /&gt;
*trained two versions of the code on the WMT2014 en-de and en-fr datasets&lt;br /&gt;
   tested the checkpoints of the en-de dataset&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* test the checkpoints of the en-fr dataset&lt;br /&gt;
* record the results and do analysis &lt;br /&gt;
* read papers about memory augmented NMT &lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-10</id>
		<title>NLP Status Report 2017-7-10</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-10"/>
				<updated>2017-07-10T04:48:29Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*reproduced the couplet model using Moses&lt;br /&gt;
|| &lt;br /&gt;
*continue to modify the couplet model&lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
* Tried seq2seq with and without attention for the style transfer (cross-domain) task, but it didn't work due to overfitting&lt;br /&gt;
  seq2seq with attention model: Chinese-to-English&lt;br /&gt;
  vanilla seq2seq model: English-to-English (Unsupervised)&lt;br /&gt;
* Read two style controlled papers in generative model field&lt;br /&gt;
* Trained seq2seq with style code model&lt;br /&gt;
||&lt;br /&gt;
* Understand the model and mechanism mentioned in the two related papers&lt;br /&gt;
* Figure out new ways to do style transfer task&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* read and ran the ViVi_NMT code &lt;br /&gt;
* read the TensorFlow API &lt;br /&gt;
* debugged ViVi_NMT and upgraded the code to TensorFlow 1.0 &lt;br /&gt;
* found the new version saves time, and has lower complexity and better BLEU than before &lt;br /&gt;
||&lt;br /&gt;
* test two versions of the code on small data sets (Chinese-English) and large data sets (Chinese-English) respectively&lt;br /&gt;
* test two versions of the code on WMT 2014 English-to-German parallel dataset and WMT 2014 English-French dataset respectively&lt;br /&gt;
* record experimental results&lt;br /&gt;
* read papers and try to improve the BLEU score a little&lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-10</id>
		<title>NLP Status Report 2017-7-10</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-7-10"/>
				<updated>2017-07-10T04:41:28Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/7/3 |Jiyuan Zhang || ||  |- |Aodong LI || * Tried seq2seq with or without attent...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/7/3&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
|| &lt;br /&gt;
|-&lt;br /&gt;
|Aodong LI ||&lt;br /&gt;
* Tried seq2seq with and without attention for the style transfer (cross-domain) task, but it didn't work due to overfitting&lt;br /&gt;
  seq2seq with attention model: Chinese-to-English&lt;br /&gt;
  vanilla seq2seq model: English-to-English (Unsupervised)&lt;br /&gt;
* Read two style controlled papers in generative model field&lt;br /&gt;
* Trained seq2seq with style code model&lt;br /&gt;
||&lt;br /&gt;
* Understand the model and mechanism mentioned in the two related papers&lt;br /&gt;
* Figure out new ways to do style transfer task&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shipan Ren ||&lt;br /&gt;
* read and ran the ViVi_NMT code &lt;br /&gt;
* read the TensorFlow API &lt;br /&gt;
* debugged ViVi_NMT and upgraded the code to TensorFlow 1.0 &lt;br /&gt;
* found the new version saves time, and has lower complexity and better BLEU than before &lt;br /&gt;
||&lt;br /&gt;
* test two versions of the code on small data sets (Chinese-English) and large data sets (Chinese-English) respectively&lt;br /&gt;
* test two versions of the code on WMT 2014 English-to-German parallel dataset and WMT 2014 English-French dataset respectively&lt;br /&gt;
* record experimental results&lt;br /&gt;
* read papers and try to improve the BLEU score a little&lt;br /&gt;
|-&lt;br /&gt;
    &lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/Weekly_meeting</id>
		<title>Weekly meeting</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/Weekly_meeting"/>
				<updated>2017-06-26T09:41:23Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Location: FIT-1-304&lt;br /&gt;
*Time: Monday, 7:00 PM&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! Date !! Speaker!! Title !! Materials !! On duty&lt;br /&gt;
|-&lt;br /&gt;
| 2012/08/27  ||Dong Wang  || Heterogeneous Convolutive Non-negative Sparse Coding ||[[媒体文件:Heterogeneous_convolutive_non-negative_sparse_coding.pdf|slides]] [http://homepages.inf.ed.ac.uk/v1dwang2/public/pdf/inerspeech2012-hetero.pdf paper] ||&lt;br /&gt;
|-&lt;br /&gt;
|2012/09/03  ||NO Meeting|| || ||&lt;br /&gt;
|-&lt;br /&gt;
|2012/09/10  || NO Meeting|| || ||&lt;br /&gt;
|-&lt;br /&gt;
|2012/09/17  ||WALEED ABDULLA||Auditory Based Feature Vectors for Speech Recognition ||[[媒体文件:AuditoryBasedFeatureVectors.pdf|slides]]||范淼&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;2&amp;quot;|2012/09/24  ||刘超|| N-gram FST indexing for Spoken Term Detection || [[媒体文件:120924-N_gram_FST_indexing_for_Spoken_Term_Detection-LC-0.pdf|slides]] ||尹聪&lt;br /&gt;
|-&lt;br /&gt;
|范淼||Micro-blogging, Wikipedia, Folksonomy, What's Next? ||[[媒体文件:120924-Micro-blogging, Wikipedia, Folksonomy, What's Next-FM--01-FM-.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| 2012/10/08 ||NO Meeting|| || ||&lt;br /&gt;
|-&lt;br /&gt;
| 2012/10/15  ||NO Meeting|| || ||&lt;br /&gt;
|-&lt;br /&gt;
|2012/10/22||Wu Xiaojun||speaker recognition in CSLT ||[[媒体文件:VPR_in_CSLT.pdf|slides]]||卡尔&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/10/29  ||王军||An overview of Automatic Speaker Diarization Systems || [[媒体文件:121027-Speaker Diarization-WJ.pdf|slides]] ||别凡虎&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/11/05  ||别凡虎||Experiments on Emotional Speaker Recognition||[[媒体文件:121104-Experiments_on_Emotional_Speaker_Recognition-BFH.pdf|slides]] ||刘超&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/11/12  ||唐国瑜||Statistical Word Sense Improves Document Clustering ||[[媒体文件:121112_Statistical_Word_Sense_Improves_Document_Clustering_TGY.pdf‎ |slides]]||邱晗&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/11/19  ||张陈昊||TDSR with Long-term Features Based on Functional Data Analysis||[[媒体文件:121118-ISCSLP-FDA_SR-ZCH.pdf|slides]] ||王俊俊&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/11/26  ||王琳琳||Time-Varying Speaker Recognition: An Introduction||[[媒体文件:121126-Time_Varying_Speaker_Recognition_I-Wll.pdf‎|slides]] ||龚宬&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/12/03  ||No meeting|| || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/12/10  ||No meeting|| || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2012/12/17  ||No meeting|| || ||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/01/07  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
|2013/01/07  ||王军||DF-MAP-based Speaker Model Training||[[媒体文件:130107-基于DFMAP的说话人模型训练方法-WJ.pdf|slides]] ||唐国瑜&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/01/14  ||王东|| Computing in CSLT ||[[媒体文件:Computing_in_CSLT.pdf|slides]] ||王琳琳&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/03/04  ||王军||Sequential Adaptive Learning for Speaker Verification ||[[媒体文件:130301-Sequential adaptive learning for speaker verification-WJ.pdf|slides]] ||别凡虎&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/03/11  || Du Jinle|| VAD stuff || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/03/18  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/03/25  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/04/01  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/04/08  || 张陈昊|| A Fishervoice based Feature Fusion Method for SUSR ||[[媒体文件:130408-FisherVoice-ZCH.pdf|slides]] ||谢仲达&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/04/15  ||龚宬|| An Exploration on Influence Factors of VAD's Performance in Speaker Recognition ||[[媒体文件:130415-An_Exploration_on_Influence_Factors_of_VAD-GC.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/04/22  ||王俊俊 || Understanding the Query: THCIB and THUIS at NTCIR-10 Intent Task ||[[媒体文件:130422-Understanding_the_Query-WJJ.pdf|slides‎]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/04/29  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/05/06  ||别凡虎 ||MLLR on Emotional Speaker Recognition ||[[媒体文件:130506-MLLR on Emotional Speaker Recognition-BFH.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/05/13  ||刘超 || The Use of Deep Neural Network for Speech Recognition || [[媒体文件:130513-the_use_of_dnn_for_asr-lc.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/05/20  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/05/27  ||王琳琳|| Research on Time-Varying Robustness in Speaker Recognition || [[媒体文件:130527-TVSV-Wll.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/06/03  ||王俊俊|| Research and Implementation of a Chinese Search-Result Clustering System || [[媒体文件:130601-毕业答辩-02-WJJ.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/06/10  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/06/17  ||范淼 || Relation Extraction ||[[媒体文件:130617-relation_extraction-fm.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/06/24  ||唐国瑜 || Incorporating Statistical Word Senses in Topic Model  ||[[媒体文件:130624_Incorporating Statistical Word Senses in Topic Model_TGY.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/07/01  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/07/08  ||  || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/07/15  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/09/09  ||王东 || Research Frontier in Speech Technology||[[媒体文件:Research Frontier in Speech Technology.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/09/16  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/09/23  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/09/30  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/10/07  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/10/14  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/10/21  ||范淼 ||Transduction Classification with Matrix Completion (talk in Chinese)||[[媒体文件: Transduction_Classifiction_with_Matrix_Completion.pdf‎|slides]] [http://pages.cs.wisc.edu/~jerryzhu/pub/mc4ssl_FINAL.pdf paper]|| 李蓝天&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/10/28  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/11/04  || 王军 || i-vector-based Inter-session Compensation and Scoring Methods (survey) || [[媒体文件:131104-ivecto下intersession补偿及打分方法--01-WJ-.pdf‎|slides]]||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/11/11  ||张陈昊 ||Introduction to PLDA and Its Application in Speaker Recognition ||[[媒体文件:PLDA.pdf|slides]] || 唐国瑜&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/11/18  ||别凡虎 ||Introduction to i-vector Theory (discussion)||[[媒体文件:131118-i-vector_and_GMM-UBM-BFH.pdf|slides]]  ||王军&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/11/25  ||刘超 || Pruning Neural Networks by Optimal Brain Damage (survey)||[[媒体文件:131125-OBD-LC-01.pdf|slides]] ||范淼&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/12/02  ||范淼 ||Distant Supervision for Relation Extraction with Matrix Completion (talk in English)||[[媒体文件:131202-DRMC-FM-01.pdf|slides]] || 李蓝天&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/12/09  || Dong Wang|| Introduction to the HMM-based speech synthesis||[http://hts.sp.nitech.ac.jp/archives/2.2/HTS_Slides.zip slides] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/12/16  ||张陈昊 ||Introduction to Basic Units in Speech Research ||[[媒体文件:131215-Phonology-ZCH.pdf|slides]]  ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/12/23  || Dong Wang|| Introduction to the HMM-based speech synthesis (2)||[http://hts.sp.nitech.ac.jp/archives/2.2/HTS_Slides.zip slides] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/12/23  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2013/12/30  ||刘荣 || continuous space language model||[[媒体文件:Cslm-cslt.pdf|slides]]  ||刘超&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/01/06  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/01/13  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/01/20  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/02/24  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/03/03  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/03/10  ||范淼|| Distant Supervision for Information Extraction (talk in English)|| || 李蓝天&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/03/17  ||唐国瑜 || Topic Models Incorporating Statistical Word Senses || [[媒体文件:TMISWS_For_CICLing2014.pdf|slides]]||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/03/24  ||孟祥涛 || Noisy training for Deep Neural Networks|| ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/03/31  ||范淼|| Translating Embeddings for Modeling Multi-relational Data （中文报告） || [https://www.hds.utc.fr/everest/lib/exe/fetch.php?id=en%3Atranse&amp;amp;cache=cache&amp;amp;media=en:cr_paper_nips13.pdf paper]||李蓝天&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/04/07  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/04/14  || Wang Jun|| I-vector and PLDA in depth ||[[媒体文件:131104-ivector-microsoft-wj.pdf|slides]]  ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/04/21  || 邱晗||Normalization of Chinese Event Sentence Patterns ||[[媒体文件:140421-汉语事件句式规范化-QH.pdf‎|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/04/28  || 唐国瑜|| Some papers in CICLing2014 ||[[媒体文件:Some_papers_in_CICling2014.pdf|slides]]  ||刘超&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/05/05  || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/05/12  || 卡尔|| paper introduction || [[媒体文件:Acoustic Factor Analysis.pdf|slides]] || 邱晗&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;2&amp;quot;|2014/05/19  || 邱晗|| CCG Derivation-Tree Reconstruction for Chinese Event Sentences ||[[媒体文件:140519-CCG_reConstruction.pdf‎|slides]]|| 卡尔&lt;br /&gt;
|-&lt;br /&gt;
|Liu Chao|| master proposal: sparse and deep neural networks || [[媒体文件:140519-proposal-LC-01.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;| || Liu Chao|| 2nd master proposal: sparse and deep neural networks|| ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/06/16  || 别凡虎 || Truncated Wave based VPR and Some Recent Work || [[媒体文件:140614-Truncated_Speech_based_VPR.pdf‎|slides]]‎ || 别凡虎&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/06/23  || 别凡虎 || Block-wise training for I-vector || [[媒体文件:140623-Block-wise training for I-vector.pdf‎|slides]]‎ || 别凡虎&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;| 2014/07/07||王军 ||Discriminative Scoring for Speaker Recognition Based on I-vectors || [[媒体文件:140707-work_report.pdf|slides]]|| 王军&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;| 2014/09/01|| || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/09/09 ||别凡虎 ||Research on Truncated Wave based VPR||[[媒体文件:140909-Truncated Speech based VPR.pdf|slides]] || 别凡虎&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;| 2014/09/15|| || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/09/22  || Miao Fan|| Large-scale Entity Relation Extraction based on Low-dimensional Representations (talk in Chinese, PhD thesis proposal)&lt;br /&gt;
||[[媒体文件:基于低维表示的大规模实体关系挖掘技术.pdf‎|slides]] || Lantian Li&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;| 2014/09/29 || || || ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/10/13  || Miao Fan|| The Frontier of Knowledge Embedding (talk in English)|| [[媒体文件:The_Frontier_of_Knowledge_Embedding.pdf‎|slides]]|| Lantian Li&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/10/20  || || || || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/10/27  || Li Yi || Phonemes, Features, and Syllables: Converting Onset and Rime Inventories to Consonants and Vowels||[[媒体文件:Lanzhou Phonemes, Features, and Syllables- fianl.pdf|paper]] [[媒体文件:Syllables and phonemes - 20141027.pdf|slides]]|| &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/11/3   || 米吉提|| Automatic Speech Recognition of Agglutinative Language based on Lexicon Optimization||[[媒体文件:Mijit-slides-清华大学-2014-11-3.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/11/10  || || || || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/11/17  ||Dong Wang || Highly restricted keyword spotting for Uyghur using sparse analysis|| [[媒体文件:Highly Restricted Keyword Selection Based on Sparse Analysis.pdf|slides]]|| &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/11/24  || || || || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/12/1  ||ZhongDa Xie ||Incorporating Fine-Grained Ontological Relations in Medical Document Ranking || [[媒体文件:Fine-grained_relations.pdf|slides]]|| Lantian Li &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/12/8  || || || || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/12/15  || 唐国瑜 || Research on Key Technologies for Cross-lingual Topic Analysis ||[[媒体文件:141205-答辩-TGY.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/12/22  || || || || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2014/12/29  || Askar || Language Mismatch in Speaker Recognition System||[[媒体文件:141229--askar.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/1/5  ||Lantian Li || Deep Neural Networks for Speaker Recognition || [[媒体文件:150104_Deep_Neural_Networks_for_Speaker_Recognition_LLT.pdf|slides]]|| &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/1/12  || || || || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/1/19  || Dong Wang || Machine Learning Paradigms for Speech Recognition||[[媒体文件:Machine Learning Paradigms for Speech Recognition.pdf|slides]]  [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6423821 paper] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/1/26  || Chen Guorong || Information Transmission and Distribution on Web ||[[媒体文件:An_introduction_of_complex_network1.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot; |2015/3/9 || Dong Wang || Joint Deep Learning || [[媒体文件:Joint Deep Learning.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/3/16  || Dongxu Zhang || Knowledge learning from text data and knowledge bases || [[媒体文件:Joint Deep Learning.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/4/13  || Xuewei Zhang || Lasso-based Reverberation Suppression In Automatic Speech Recognition || [[媒体文件:Lasso-based Reverberation Suppression In Automatic Speech Recognition.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/5/11  || Dong Wang ||ASR and SID Research Frontier ||[[媒体文件:ASR and SID Research Frontier.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/11/23  || Zhiyuan Tang|| CTC learning|| [[媒体文件:CTC.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/11/30  || Mengyuan Zhao|| CNN-based music removal|| [[媒体文件:Music Removal by Convolutional Denoising.pdf | slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/12/3  || Zhiyuan Tang|| Networks of Memory|| [[媒体文件:Memory_net.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/12/7  || Yiqiao Pan|| Document Classification with Spherical Word Vectors||[[媒体文件:Document Classification with Spherical Word Vectors.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/12/14  || Dong Wang || Transfer Learning for Speech and Language Processing ||[[媒体文件:Transfer_Learning_for_Speech_and_Language_Processing.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/12/21  || Qixin Wang || Attention for poem generation ||[[媒体文件:Ijcai 2016.pptx|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2015/12/28  || Lantian Li || Max-margin metric learning for speaker recognition || [[媒体文件:Max-margin-Metric-Learning.pdf|slides]]|| &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/1/4  || Zhiyong Zhang || Parallel training, MPE and natural gradient||[[媒体文件:20160104_张之勇_Large-scale Parallel Training in Speech Recognition.pdf|slides]]||  &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/1/18  || Dongxu Zhang || Memoryless Document Vector ||[[媒体文件:Memoryless_document_vector.pdf|slides]]|| &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/3/14  || Zhiyuan Tang|| Oral presentation for &amp;quot;vMF-SNE: Embedding for Spherical Data&amp;quot;|| [[媒体文件:embedding.pdf|slides]] ||  &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/3/28  || Tianyi Luo || Review for Neural QA || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/2/29/CSLT_Weekly_Report--20160328.pdf slides] ||  &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/4/11  || Rong Liu || Recommendation in Youku || [http://cslt.riit.tsinghua.edu.cn/mediawiki/index.php/%E6%96%87%E4%BB%B6:Cslt%E5%AE%9E%E9%AA%8C%E5%AE%A4%E4%BA%A4%E6%B5%81.pptx slides] ||  &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/5/09 || Miao Fan || Learning contextual embeddings of knowledge base with entity descriptions.|| [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/9/9c/Techreport_CSLT_2016_M.F..pdf slides]  || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/5/16 || Yang Wang || Research on conversation thread detection. || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/b/bb/%E6%B1%AA%E6%B4%8B-%E6%AF%95%E8%AE%BE-CSLT.pdf slides]  || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/5/20 || Yang Wang &amp;amp;  Maoning Wang || Research on portfolio selection. || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/8/89/%E6%B1%AA%E6%B4%8B-%E9%87%91%E8%9E%8D%E7%AC%AC%E4%B8%80%E6%AC%A1%E5%88%86%E4%BA%AB.pdf slides1]  [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/b/bb/%E6%B1%87%E6%8A%A5_%E8%B5%84%E4%BA%A7%E7%BB%84%E5%90%88%E4%B8%AD%E5%87%A0%E4%B8%AA%E8%AF%84%E4%BB%B7%E6%8C%87%E6%A0%87%E7%9A%84%E8%A7%A3%E9%87%8A.pdf slides2]|| &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/5/20  || Zhiyuan Tang || ICASSP 2016 summary || [[媒体文件:Note icassp16.pdf|slides]] ||&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/5/23 || Dong Wang || graphical model and neural model || [[媒体文件:Graphic Model and Neural Model.pdf|slides]] [[媒体文件:Generative-Pdf.rar|papers]]  || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/8/02 || Zhiyuan Tang || Visualizing, Measuring and Understanding Neural Networks: A Brief Survey|| [[媒体文件:Nn analysis.pdf|slides]] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/8/03 || Yang Wang || Neural networks and genetic programming for financial forecasting || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/7/79/GeneticNN.pdf slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/11/05 || Yang Wang || Reinforcement Learning Models and Simulations || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/c/ca/RRL_and_sim.pdf slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/11/08 || April Pu || SOFTWARE DEVELOPMENT METHODOLOGIES || [http://wangd.cslt.org/talks/pdf/april_software.pptx slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/11/12 || Yang Wang || Generative Adversarial Nets || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/c/c9/Generative_adversarial_network.pdf slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/11/22 || Zhiyuan Tang || INTERSPEECH 2016 summary || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/6/65/Interspeech16_review.pdf slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2016/11/30 || Dong Wang || Deep and sparse learning in speech and language: an overview || [http://wangd.cslt.org/talks/pdf/bics2016.pptx slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2017/2/17 || Yang Wang || Review of &amp;quot;Understanding deep learning requires rethinking generalization&amp;quot; || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/3/3b/Review_understanding_deep_learning_requires_rethinking_generalization.pdf slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2017/6/5 || Dong Wang || Deep speech factorization || [http://wangd.cslt.org/talks/pdf/Deep-Speech-Factorization.pdf slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2017/6/8 || Shiyue Zhang || Convolutional Sequence to Sequence Learning  || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/f/f3/Conv_seq2seq.pptx slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2017/6/12 || Shiyue Zhang || Memory-augmented Neural Machine Translation || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/3/36/Memory-augmented_Neural_Machine_Translation_.pptx slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2017/6/21 || Shiyue Zhang || Attention Is All You Need  || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/6/68/Attention_is_all_you_need.pptx slides] || &lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;1&amp;quot;|2017/6/26 || Jiyuan Zhang || Chinese poem generation using neural model  || [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/5/50/Flexible_and_Creative_Chinese_Poetry_Generation_Using_Neural_Memory_.pptx slides] || &lt;br /&gt;
|-&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Flexible_and_Creative_Chinese_Poetry_Generation_Using_Neural_Memory_.pptx</id>
		<title>文件:Flexible and Creative Chinese Poetry Generation Using Neural Memory .pptx</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Flexible_and_Creative_Chinese_Poetry_Generation_Using_Neural_Memory_.pptx"/>
				<updated>2017-06-26T09:36:31Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Memory-augmented_Neural_Machine_Translation_.pptx</id>
		<title>文件:Memory-augmented Neural Machine Translation .pptx</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Memory-augmented_Neural_Machine_Translation_.pptx"/>
				<updated>2017-06-25T13:38:27Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：Zhangjy上传“文件:Memory-augmented Neural Machine Translation .pptx”的新版本&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Memory-augmented_Neural_Machine_Translation report&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Memory-augmented_Neural_Machine_Translation_.pptx</id>
		<title>文件:Memory-augmented Neural Machine Translation .pptx</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Memory-augmented_Neural_Machine_Translation_.pptx"/>
				<updated>2017-06-25T08:05:35Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：Zhangjy上传“文件:Memory-augmented Neural Machine Translation .pptx”的新版本&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Memory-augmented_Neural_Machine_Translation report&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Nlp_team_bi_monthly_report.pdf</id>
		<title>文件:Nlp team bi monthly report.pdf</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Nlp_team_bi_monthly_report.pdf"/>
				<updated>2017-05-09T04:52:29Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：Zhangjy上传“文件:Nlp team bi monthly report.pdf”的新版本&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language</id>
		<title>Bi-monthly-2017-04-language</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language"/>
				<updated>2017-05-09T04:39:23Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Team [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/a/a8/Nlp_team_bi_monthly_report.pdf]&lt;br /&gt;
&lt;br /&gt;
Jiyuan Zhang [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/e/e4/Bi-monthly_report_zhangjy.pdf]&lt;br /&gt;
&lt;br /&gt;
Shiyue Zhang &lt;br /&gt;
&lt;br /&gt;
Aodong Li&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language</id>
		<title>Bi-monthly-2017-04-language</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language"/>
				<updated>2017-05-09T04:39:04Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Team [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/a/a8/Nlp_team_bi_monthly_report.pdf]&lt;br /&gt;
&lt;br /&gt;
Jiyuan Zhang http://cslt.riit.tsinghua.edu.cn/mediawiki/images/e/e4/Bi-monthly_report_zhangjy.pdf&lt;br /&gt;
&lt;br /&gt;
Shiyue Zhang&lt;br /&gt;
&lt;br /&gt;
Aodong Li&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Bi-monthly_report_zhangjy.pdf</id>
		<title>文件:Bi-monthly report zhangjy.pdf</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Bi-monthly_report_zhangjy.pdf"/>
				<updated>2017-05-09T04:38:38Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language</id>
		<title>Bi-monthly-2017-04-language</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language"/>
				<updated>2017-05-09T04:38:21Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Team [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/a/a8/Nlp_team_bi_monthly_report.pdf]&lt;br /&gt;
&lt;br /&gt;
Jiyuan Zhang &lt;br /&gt;
&lt;br /&gt;
Shiyue Zhang&lt;br /&gt;
&lt;br /&gt;
Aodong Li&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Nlp_team_bi_monthly_report.pdf</id>
		<title>文件:Nlp team bi monthly report.pdf</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/%E6%96%87%E4%BB%B6:Nlp_team_bi_monthly_report.pdf"/>
				<updated>2017-05-09T04:37:48Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language</id>
		<title>Bi-monthly-2017-04-language</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language"/>
				<updated>2017-05-09T04:37:26Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Team &lt;br /&gt;
&lt;br /&gt;
Jiyuan Zhang &lt;br /&gt;
&lt;br /&gt;
Shiyue Zhang&lt;br /&gt;
&lt;br /&gt;
Aodong Li&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language</id>
		<title>Bi-monthly-2017-04-language</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/Bi-monthly-2017-04-language"/>
				<updated>2017-05-09T04:36:59Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“Team   Jiyuan Zhang   Shiyue Zhang   Aodong Li”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Team &lt;br /&gt;
&lt;br /&gt;
Jiyuan Zhang &lt;br /&gt;
&lt;br /&gt;
Shiyue Zhang&lt;br /&gt;
&lt;br /&gt;
 Aodong Li&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-24</id>
		<title>NLP Status Report 2017-4-24</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-24"/>
				<updated>2017-05-03T02:47:34Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/4/5&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*ran qx's model using different datasets&lt;br /&gt;
*read some papers&lt;br /&gt;
|| &lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-24</id>
		<title>NLP Status Report 2017-4-24</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-24"/>
				<updated>2017-05-03T02:47:19Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/4/5 |Yang Feng ||  || |- |Jiyuan Zhang || *run the qx's model using  different d...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/4/5&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*ran qx's model using different datasets&lt;br /&gt;
*read some papers&lt;br /&gt;
|| &lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-17</id>
		<title>NLP Status Report 2017-4-17</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-17"/>
				<updated>2017-05-03T02:45:05Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/4/5&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*run the ppg model using different datasets&lt;br /&gt;
*check the emnlp paper&lt;br /&gt;
|| &lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
||&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-17</id>
		<title>NLP Status Report 2017-4-17</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-17"/>
				<updated>2017-05-03T02:44:25Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/4/5 |Yang Feng || * Got the sampled 100w good data and ran Moses (BLEU: 30.6) *...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/4/5&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
* Got the sampled 100w good data and ran Moses (BLEU: 30.6)&lt;br /&gt;
* Reimplemented the idea of ACL (added some optimization to the previous code) and checked the performance in the following gradual steps: 1. use s_i-1 as the memory query; 2. use s_i-1+c_i as the memory query; 3. use y as the memory states for attention; 4. use y + smt_attentions * h as the memory states for attention.&lt;br /&gt;
* ran experiments for the above steps, but the loss was inf; I am looking into the reasons.&lt;br /&gt;
||&lt;br /&gt;
*do experiments and write the paper&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*run the ppg model using different datasets&lt;br /&gt;
*check the emnlp paper&lt;br /&gt;
|| &lt;br /&gt;
*improve the effect of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*revised the original oov model so that it can automatically detect oov words and translate them&lt;br /&gt;
*dealt first with the situation where the source word is oov but the target word is not&lt;br /&gt;
*it didn't predict correctly&lt;br /&gt;
||&lt;br /&gt;
*make the model work as we want&lt;br /&gt;
*deal with the situation where both the source and target words are oov, then the other situations&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* got a reasonable baseline on big zhen data&lt;br /&gt;
||&lt;br /&gt;
* implement mem model on this baseline, and test on big data&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-10</id>
		<title>NLP Status Report 2017-4-10</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-10"/>
				<updated>2017-05-03T02:41:45Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：以“{| class=&amp;quot;wikitable&amp;quot; !Date !! People !! Last Week !! This Week |- | rowspan=&amp;quot;6&amp;quot;|2017/4/5 |Yang Feng || * Got the sampled 100w good data and ran Moses (BLEU: 30.6) *...”为内容创建页面&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/4/5&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
* Got the sampled 100w good data and ran Moses (BLEU: 30.6)&lt;br /&gt;
* Reimplemented the idea of ACL (added some optimization to the previous code) and checked the performance in the following gradual steps: 1. use s_i-1 as the memory query; 2. use s_i-1+c_i as the memory query; 3. use y as the memory states for attention; 4. use y + smt_attentions * h as the memory states for attention.&lt;br /&gt;
* ran experiments for the above steps, but the loss was inf; I am looking into the reasons.&lt;br /&gt;
||&lt;br /&gt;
*do experiments and write the paper&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*converted the paper to the EMNLP style&lt;br /&gt;
*contacted the author of the ppg paper to get the code&lt;br /&gt;
|| &lt;br /&gt;
*improve the effect of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*revised the original oov model so that it can automatically detect oov words and translate them&lt;br /&gt;
*dealt first with the situation where the source word is oov but the target word is not&lt;br /&gt;
*it didn't predict correctly&lt;br /&gt;
||&lt;br /&gt;
*make the model work as we want&lt;br /&gt;
*deal with the situation where both the source and target words are oov, then the other situations&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* got a reasonable baseline on big zhen data&lt;br /&gt;
||&lt;br /&gt;
* implement mem model on this baseline, and test on big data&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-5</id>
		<title>NLP Status Report 2017-4-5</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-4-5"/>
				<updated>2017-04-05T02:04:09Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/27&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
*tested the baseline but could not get a reasonable result.&lt;br /&gt;
*debugged the baseline to try to reproduce the good result, but failed.&lt;br /&gt;
*fixed the nan problem in the alpha-gamma method, but the result is not good.&lt;br /&gt;
*changed the probability calculation for the alpha-gamma method, but the result is not good either.&lt;br /&gt;
*ran Moses for cwmt zh-en translation, but the training data is case-sensitive, so it needs to be rerun.&lt;br /&gt;
||&lt;br /&gt;
*rerun Moses for cwmt zh-en and cs-en&lt;br /&gt;
*decide whether to use tensorflow or theano&lt;br /&gt;
*run experiments based on the chosen platform&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
* I did keyword expansion on qx's model&lt;br /&gt;
*fixed some bugs&lt;br /&gt;
*read two papers&lt;br /&gt;
|| &lt;br /&gt;
*improve the effect of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*fixed the bug; it turned out to arise from unfamiliarity with the numpy.resize() function&lt;br /&gt;
*the demo model can deal with the oov problem (both the source and target words are oov)&lt;br /&gt;
||&lt;br /&gt;
*some paperwork for the graduation design&lt;br /&gt;
*run some experiments using theano on the old data set and the new zh2en data from lihang&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* got a reasonable baseline on big zhen data&lt;br /&gt;
||&lt;br /&gt;
* implement mem model on this baseline, and test on big data&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-20</id>
		<title>NLP Status Report 2017-3-20</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-20"/>
				<updated>2017-04-05T01:44:29Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/20&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
*went through the code and made different attempts, and managed to produce the good result (47--&amp;gt;50 on the small zh-en data set)&lt;br /&gt;
*wrote the cross-entropy method for the alpha-gamma method, but found it is different from the built-in method.&lt;br /&gt;
*changed to use the built-in soft-cross-entropy method and ran experiments&lt;br /&gt;
||&lt;br /&gt;
*get the result for alpha-gamma method&lt;br /&gt;
*run experiments on the big data&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*responded to ACL reviewers' questions&lt;br /&gt;
|| &lt;br /&gt;
*improve the effect of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*tried to fix the bug that prevents the prediction of the EOS symbol&lt;br /&gt;
*the output_projection matrix is out of order after the vector-copying process&lt;br /&gt;
||&lt;br /&gt;
*fix the bug&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* learned to use Google's new seq2seq code (gnmt)&lt;br /&gt;
* ran gnmt on en-de, small zh-en, cs-en, big zh-en&lt;br /&gt;
||&lt;br /&gt;
* run gnmt on new big zh-en&lt;br /&gt;
* try to find how to implement our model on gnmt&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-27</id>
		<title>NLP Status Report 2017-3-27</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-27"/>
				<updated>2017-04-05T01:43:38Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/27&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
*tested the baseline but could not get a reasonable result.&lt;br /&gt;
*debugged the baseline to try to reproduce the good result, but failed.&lt;br /&gt;
*fixed the nan problem in the alpha-gamma method, but the result is not good.&lt;br /&gt;
*changed the probability calculation for the alpha-gamma method, but the result is not good either.&lt;br /&gt;
*ran Moses for cwmt zh-en translation, but the training data is case-sensitive, so it needs to be rerun.&lt;br /&gt;
||&lt;br /&gt;
*rerun Moses for cwmt zh-en and cs-en&lt;br /&gt;
*decide whether to use tensorflow or theano&lt;br /&gt;
*run experiments based on the chosen platform&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
* I did nothing (I found my ACL score was borderline, so I was not in the mood to work)&lt;br /&gt;
|| &lt;br /&gt;
*improve the effect of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*fixed the bug; it turned out to arise from unfamiliarity with the numpy.resize() function&lt;br /&gt;
*the demo model can deal with the oov problem (both the source and target words are oov)&lt;br /&gt;
||&lt;br /&gt;
*some paperwork for the graduation design&lt;br /&gt;
*run some experiments using theano on the old data set and the new zh2en data from lihang&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* got good results from gnmt&lt;br /&gt;
* but haven't found a way to implement our model on gnmt&lt;br /&gt;
* trying to modify our code to make it work on big data&lt;br /&gt;
||&lt;br /&gt;
* go on trying to modify our code to make it work on big data&lt;br /&gt;
* go on looking into gnmt code&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-27</id>
		<title>NLP Status Report 2017-3-27</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-27"/>
				<updated>2017-04-05T01:42:51Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/27&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
*tested the baseline but could not get a reasonable result.&lt;br /&gt;
*debugged the baseline to try to reproduce the good result, but failed.&lt;br /&gt;
*fixed the nan problem in the alpha-gamma method, but the result is not good.&lt;br /&gt;
*changed the probability calculation for the alpha-gamma method, but the result is not good either.&lt;br /&gt;
*ran Moses for cwmt zh-en translation, but the training data is case-sensitive, so it needs to be rerun.&lt;br /&gt;
||&lt;br /&gt;
*rerun Moses for cwmt zh-en and cs-en&lt;br /&gt;
*decide whether to use tensorflow or theano&lt;br /&gt;
*run experiments based on the chosen platform&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
* I did nothing (I found my ACL score was borderline, so I was not in the mood to work)&lt;br /&gt;
|| &lt;br /&gt;
*improve the effect of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*fixed the bug; it turned out to arise from unfamiliarity with the numpy.resize() function&lt;br /&gt;
*the demo model can deal with the oov problem (both the source and target words are oov)&lt;br /&gt;
||&lt;br /&gt;
*some paperwork for the graduation design&lt;br /&gt;
*run some experiments using theano on the old data set and the new zh2en data from lihang&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* got good results from gnmt&lt;br /&gt;
* but haven't found a way to implement our model on gnmt&lt;br /&gt;
* trying to modify our code to make it work on big data&lt;br /&gt;
||&lt;br /&gt;
* go on trying to modify our code to make it work on big data&lt;br /&gt;
* go on looking into gnmt code&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-27</id>
		<title>NLP Status Report 2017-3-27</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-27"/>
				<updated>2017-04-05T01:39:52Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/27&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
*tested the baseline but could not get a reasonable result.&lt;br /&gt;
*debugged the baseline to try to reproduce the good result, but failed.&lt;br /&gt;
*fixed the nan problem in the alpha-gamma method, but the result is not good.&lt;br /&gt;
*changed the probability calculation for the alpha-gamma method, but the result is not good either.&lt;br /&gt;
*ran Moses for cwmt zh-en translation, but the training data is case-sensitive, so it needs to be rerun.&lt;br /&gt;
||&lt;br /&gt;
*rerun Moses for cwmt zh-en and cs-en&lt;br /&gt;
*decide whether to use TensorFlow or Theano&lt;br /&gt;
*run experiments on the chosen platform&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
* made no progress (my ACL review score came back borderline, so I was not in the mood to work)&lt;br /&gt;
|| &lt;br /&gt;
improve the performance of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* got good results from gnmt&lt;br /&gt;
* but haven't found the way to implement our model on gnmt&lt;br /&gt;
* trying to modify our code to make it work on big data&lt;br /&gt;
||&lt;br /&gt;
* go on trying to modify our code to make it work on big data&lt;br /&gt;
* go on looking into gnmt code&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-20</id>
		<title>NLP Status Report 2017-3-20</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-20"/>
				<updated>2017-04-05T01:34:02Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/20&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
*went through the code, made different attempts, and managed to reproduce the good result (47--&amp;gt;50 on the small zh-en data set)&lt;br /&gt;
*wrote a cross-entropy function for the alpha-gamma method, but found its results differ from the built-in method's.&lt;br /&gt;
*switched to the built-in soft-cross-entropy method and ran experiments&lt;br /&gt;
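One plausible source of the mismatch between a hand-written cross-entropy and the built-in one is numerical stability: built-in losses work in log space with the log-sum-exp trick instead of materializing the softmax. A sketch of the difference (function names invented for illustration; this is not the actual project code):

```python
import numpy as np

def naive_xent(logits, target):
    # softmax materialized directly, then log: exp() overflows for large logits
    p = np.exp(logits) / np.sum(np.exp(logits))
    return -np.log(p[target])

def stable_xent(logits, target):
    # log-sum-exp trick, the form built-in losses typically use
    m = np.max(logits)
    logz = m + np.log(np.sum(np.exp(logits - m)))
    return logz - logits[target]

logits = np.array([1000.0, 2.0, -5.0])
naive = naive_xent(logits, 0)    # nan: exp(1000) overflows to inf
stable = stable_xent(logits, 0)  # finite, correct loss
```

With moderate logits the two agree to rounding error, but once any logit is large the naive form degrades first in precision and then to nan, which can easily make a hand-rolled loss "different from the built-in method".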
||&lt;br /&gt;
*get the result for alpha-gamma method&lt;br /&gt;
*run experiments on the big data&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*responded to the ACL reviewers' suggestions&lt;br /&gt;
|| &lt;br /&gt;
*improve the performance of qx's model&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*tried to fix the bug that prevents prediction of the EOS symbol&lt;br /&gt;
*the output_projection matrix is out of order after the vector-copying step&lt;br /&gt;
||&lt;br /&gt;
*fix the bug&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* learned to use Google's new seq2seq code (gnmt)&lt;br /&gt;
* ran gnmt on en-de, small zh-en, cs-en, and big zh-en&lt;br /&gt;
||&lt;br /&gt;
* run gnmt on the new big zh-en data&lt;br /&gt;
* try to work out how to implement our model on gnmt&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-20</id>
		<title>NLP Status Report 2017-3-20</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-20"/>
				<updated>2017-04-05T01:31:41Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/20&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
*went through the code, made different attempts, and managed to reproduce the good result (47--&amp;gt;50 on the small zh-en data set)&lt;br /&gt;
*wrote a cross-entropy function for the alpha-gamma method, but found its results differ from the built-in method's.&lt;br /&gt;
*switched to the built-in soft-cross-entropy method and ran experiments&lt;br /&gt;
||&lt;br /&gt;
*get the result for alpha-gamma method&lt;br /&gt;
*run experiments on the big data&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*responded to the ACL reviewers' suggestions&lt;br /&gt;
|| &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
*tried to fix the bug that prevents prediction of the EOS symbol&lt;br /&gt;
*the output_projection matrix is out of order after the vector-copying step&lt;br /&gt;
||&lt;br /&gt;
*fix the bug&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
* learned to use Google's new seq2seq code (gnmt)&lt;br /&gt;
* ran gnmt on en-de, small zh-en, cs-en, and big zh-en&lt;br /&gt;
||&lt;br /&gt;
* run gnmt on the new big zh-en data&lt;br /&gt;
* try to work out how to implement our model on gnmt&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	<entry>
		<id>http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-13</id>
		<title>NLP Status Report 2017-3-13</title>
		<link rel="alternate" type="text/html" href="http://index.cslt.org/mediawiki/index.php/NLP_Status_Report_2017-3-13"/>
				<updated>2017-03-13T07:10:10Z</updated>
		
		<summary type="html">&lt;p&gt;Zhangjy：&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Date !! People !! Last Week !! This Week&lt;br /&gt;
|-&lt;br /&gt;
| rowspan=&amp;quot;6&amp;quot;|2017/3/13&lt;br /&gt;
|Yang Feng ||&lt;br /&gt;
* tested and analyzed the results on the cs-en data set (30.4 on the held-out training set and 7.3 on the dev set);&lt;br /&gt;
* added masks to the baseline (44.4 on cn-en);&lt;br /&gt;
* added encoder-masks and memory-masks to the alpha-gamma method and fixed the bugs; got an improvement of 0.5 against the masked baseline [[http://cslt.riit.tsinghua.edu.cn/mediawiki/images/b/b8/Nmt_mn_report_continue.pdf report]];&lt;br /&gt;
* to avoid doing softmax twice, rewrote the softmax_cross_entropy function myself (still training)&lt;br /&gt;
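The "softmax twice" issue above typically arises when the output layer computes softmax probabilities and the loss then applies softmax again internally. A fused loss takes the raw logits directly, computing loss and gradient in one pass. A minimal sketch of that fused form (illustrative only, not the rewritten CSLT function):

```python
import numpy as np

def softmax(z):
    # shift by max for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

def fused_softmax_xent(logits, target):
    # loss computed from raw logits via log-sum-exp, so softmax is
    # applied once (and never materialized twice)
    m = np.max(logits)
    logz = m + np.log(np.sum(np.exp(logits - m)))
    loss = logz - logits[target]
    # closed-form gradient: softmax(logits) - one_hot(target)
    grad = softmax(logits)
    grad[target] -= 1.0
    return loss, grad

logits = np.array([2.0, 1.0, 0.1])
loss, grad = fused_softmax_xent(logits, 0)
```

The closed-form gradient is the other payoff of fusing: no separate softmax backward pass is needed, which is why frameworks expose the combined op rather than softmax followed by a plain cross-entropy.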
||&lt;br /&gt;
* analyze and improve the alpha-gamma method.&lt;br /&gt;
|-&lt;br /&gt;
|Jiyuan Zhang ||&lt;br /&gt;
*finished reproducing the planning neural network&lt;br /&gt;
*chose the best attention_memory model for huilian and ran it on the big training set (about 370k) [http://cslt.riit.tsinghua.edu.cn/mediawiki/images/b/b9/Model_with_different_dataset.pdf  result]&lt;br /&gt;
&lt;br /&gt;
|| &lt;br /&gt;
*work on the keyword expansion model&lt;br /&gt;
*collect more poems from the Internet&lt;br /&gt;
*recruiting&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Andi Zhang ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Shiyue Zhang || &lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Peilun Xiao ||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Zhangjy</name></author>	</entry>

	</feed>