{| class="wikitable"
! People !! Last Week !! This Week !! Task Tracking (Deadline)
|-
| Yibo Liu
|
* Used another word vector and lyrics training set to train a model for lyrics generation. (Ongoing)
|
* Refactor the code and make it flexible for different baseline models.
|
|-
| Xiuqi Jiang
|
* Refocused on quatrain generation and thought about the drawbacks of the current model.
* Tried to weaken the attention mechanism between sentences.
|
* Add a VAE to the model and try generation instead of prediction.
|
|-
| Jiayao Wu
|
* Continued the sparse-node experiments using the mask method.
|
* Run through the first experiment.
|
|-
| Zhaodi Qi
|
* Finished the spoken language recognition section of the speech book.
* Compared TDNN model experiments on test sets of different durations.
* Researched model acceleration and model compression.
|
* Run experiments to make the LID model smaller and faster.
|
|-
| Jiawei Yu
|
* Finished the speech emotion recognition section of the speech book.
* Worked through the TensorFlow tutorial.
|
* Keep learning TensorFlow and run some experiments.
|
|-
| Yunqi Cai
|
* Ran through the BERT model.
* Studied the details of the BERT model.
|
|
|-
| Dan He
|
* Ran some comparative experiments on TT-decomposition.
|
* Summarize the results of the comparative experiments and complete the initial research on TT-decomposition.
|
|-
| Yang Zhang
|
|
* (I will finish all my exams on Friday night.)
|
|}