Difference between revisions of "Xingchao work"
From cslt Wiki
Revision as of 10:59, 14 October 2015 (Wed)
=Chaos Work=
==Date Line==
===Date 2015-10-09===
====Plan to do====
:1. Write the Matrix Factorization program.
:2. Write the DNN Max-Margin program with dropout.
:3. Run some test models on GPU machines.
<Written by 10-08>
====Done====
:2. Write the DNN Max-Margin program with dropout.
:3. Run some test models on GPU machines.
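The DNN Max-Margin program itself is not recorded in this log. As a hypothetical sketch only (all names and shapes are illustrative, and NumPy stands in for the Theano code actually used), a multiclass max-margin (hinge) loss combined with inverted dropout on a hidden layer might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(h, p_drop=0.5, train=True):
    """Inverted dropout: zero units with prob p_drop, rescale survivors."""
    if not train or p_drop == 0.0:
        return h
    mask = rng.random(h.shape) >= p_drop
    return h * mask / (1.0 - p_drop)

def max_margin_loss(scores, y, margin=1.0):
    """Multiclass hinge loss: mean over examples of sum_j max(0, margin + s_j - s_y), j != y."""
    idx = np.arange(len(y))
    correct = scores[idx, y][:, None]
    margins = np.maximum(0.0, margin + scores - correct)
    margins[idx, y] = 0.0  # do not penalize the true class
    return margins.sum(axis=1).mean()

# Tiny forward pass: one ReLU hidden layer with dropout, then a linear classifier.
X = rng.standard_normal((4, 8))          # 4 examples, 8 features
W1 = rng.standard_normal((8, 16)) * 0.1
W2 = rng.standard_normal((16, 3)) * 0.1  # 3 classes
y = np.array([0, 1, 2, 1])

h = np.maximum(0.0, X @ W1)
h = dropout(h, p_drop=0.5, train=True)
scores = h @ W2
print("hinge loss:", max_margin_loss(scores, y))
```

At test time `dropout(..., train=False)` is the identity, so no rescaling is needed at inference with this inverted-dropout formulation.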
===Date 2015-10-10===
====Plan to do====
:1. Write the Matrix Factorization program.
:2. Start testing different matrix factorization loss functions.
:3. Write binary vector code for the Theano model.
<Written by 10-09>
====Done====
:1. Write the Matrix Factorization program.
:2. Start testing different matrix factorization loss functions.
<Written by 10-12>
===Date 2015-10-12===
====Plan to do====
:1. Test different matrix factorization loss functions.
:2. Write binary vector code for the Theano model.
:3. Think about a method to merge different resources into one matrix factorization process.
====Done====
:1. Test different matrix factorization loss functions.
:2. Write binary vector code for the Theano model.
:3. Think about a method to merge different resources into one matrix factorization process.
<Written by 10-13>
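The specific loss functions tested are not listed in this log. As an illustrative sketch only (a NumPy stand-in, not the actual program), comparing a squared loss against a logistic loss for SGD-based matrix factorization over observed entries might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

def mf_sgd(R, mask, loss="squared", k=5, lr=0.05, reg=0.01, epochs=200):
    """Factorize R ~ P @ Q.T over observed entries (mask == 1) by SGD."""
    n, m = R.shape
    P = rng.standard_normal((n, k)) * 0.1
    Q = rng.standard_normal((m, k)) * 0.1
    rows, cols = np.nonzero(mask)
    for _ in range(epochs):
        for i, j in zip(rows, cols):
            pred = P[i] @ Q[j]
            if loss == "squared":
                g = -2.0 * (R[i, j] - pred)       # d/dpred of (r - pred)^2
            else:
                sig = 1.0 / (1.0 + np.exp(-pred))  # logistic loss, r in {0, 1}
                g = sig - R[i, j]
            # RHS is evaluated before assignment, so both updates use old values.
            P[i], Q[j] = (P[i] - lr * (g * Q[j] + reg * P[i]),
                          Q[j] - lr * (g * P[i] + reg * Q[j]))
    return P, Q

# Toy binary co-occurrence matrix, fully observed.
R = (rng.random((8, 10)) > 0.5).astype(float)
mask = np.ones_like(R)
P, Q = mf_sgd(R, mask, loss="squared")
err = np.abs(P @ Q.T - R)[mask == 1].mean()
print("mean abs reconstruction error:", err)
```

The `mask` argument is where merging different resources could plug in: each resource contributes its own set of observed cells (and possibly its own loss weight) to the same factorization.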
===Date 2015-10-13===
====Plan to do====
:1. Read papers on matrix factorization.
:2. Read papers on binary vectors.
====Done====
:1. Read papers on matrix factorization.
Paper list:
Neural Word Embedding as Implicit Matrix Factorization. Omer Levy;
:2. Read papers on binary vectors.
Paper list:
Learning to Hash for Indexing Big Data - A Survey. Jun Wang;
Learning Hash Codes with Listwise Supervision. Jun Wang;
Kernelized Locality-Sensitive Hashing for Scalable Image Search. Brian Kulis;
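As background for the hashing papers above: the basic scheme that kernelized LSH extends is sign-random-projection hashing, where each random hyperplane contributes one bit of a binary code and nearby vectors tend to agree on most bits. A minimal sketch (illustrative code, not from any of the cited papers):

```python
import numpy as np

rng = np.random.default_rng(2)

def hash_codes(X, W):
    """Sign-random-projection LSH: one bit per random hyperplane in W."""
    return (X @ W > 0).astype(np.uint8)

def hamming(a, b):
    """Number of differing bits between two binary codes."""
    return int(np.count_nonzero(a != b))

d, bits = 32, 16
W = rng.standard_normal((d, bits))        # random hyperplane normals

x = rng.standard_normal(d)
near = x + 0.05 * rng.standard_normal(d)  # small perturbation of x
far = rng.standard_normal(d)              # unrelated vector

cx = hash_codes(x[None, :], W)[0]
cn = hash_codes(near[None, :], W)[0]
cf = hash_codes(far[None, :], W)[0]
print("near:", hamming(cx, cn), "far:", hamming(cx, cf))
```

The probability that a single bit differs is proportional to the angle between the two vectors, which is why Hamming distance on these codes approximates angular distance and why they make cheap binary vectors for large-scale search.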
<Written by 10-14>
===Date 2015-10-14===
====Plan to do====
:1. Read papers on matrix factorization.
:2. Read papers on binary vectors.