Special session on BICS 2016: Deep and/or Sparse Neural Models for Speech and Language Processing
===Introduction===

Large-scale deep neural models, e.g., deep neural networks (DNNs) and recurrent neural networks (RNNs), have demonstrated significant success on various challenging tasks in speech and language processing (SLP), including, among others, speech recognition, speech synthesis, document classification, and question answering. This growing impact corroborates the neurobiological evidence for layer-wise deep processing in the human brain. On the other hand, sparse coding representations have achieved similar success in SLP, particularly in signal processing, demonstrating sparsity as another important neurobiological characteristic, one that may be responsible for the efficient functioning of the human neural system. A question of particular interest to both neuroscience and sparsity researchers concerns the interrelationship of these two key aspects, depth and sparsity: do they function independently, or are they intertwined?

Traditionally, deep learning and sparse coding have been studied by different research communities. This special session at BICS 2016 (http://bii.ia.ac.cn/bics-2016/index.html) aims to offer a timely opportunity for researchers in the two areas to share their complementary results and methods, and to jointly promote the development of new theories and methodologies for hybrid deep and sparsity-based models, particularly in the field of SLP.
===Scope===

The focus of this special session is on recent advances in hybrid deep and sparsity-based neural models, with a particular emphasis on SLP. It will provide a forum for scientists and researchers working on deep and sparse computing to learn from each other and to develop new methodologies for next-generation deep-sparse and sparse-deep models and applications. Target research topics include, but are not limited to, the following:

* Theories and methods for deep sparse or sparse deep models
* Theories and methods for hybrid deep neural models in SLP
* Theories and methods for hybrid sparse models in SLP
* Comparative studies of deep/sparse neural and Bayesian-based models
* Applications of deep and/or sparse models in SLP
 
===Important dates===

* Paper submission: July 20, 2016
* Acceptance notification: August 10, 2016
* Camera-ready due: September 10, 2016
* Special session date at BICS 2016: TBD (conference dates: November 28-30, 2016)
===Submission and publication===

* The special session uses the same submission system as the BICS 2016 main conference (http://bii.ia.ac.cn/bics-2016/index.html).
* Accepted papers will be published in the Springer LNAI series.
* Selected papers will be published in a special issue of the Cognitive Computation journal (http://link.springer.com/journal/12559).
 
===Organizers===

'''Dong Wang''' and '''Qiang Zhou'''
:*Center for Speech and Language Technology, Research Institute of Information Technology, Tsinghua University, China
::''Email: wangdong99@mails.tsinghua.edu.cn; zq-lxd@mail.tsinghua.edu.cn''

'''Amir Hussain'''
:*Cognitive Big Data Informatics Research Lab, Computing Science & Maths, School of Natural Science, University of Stirling, Scotland, UK
::''Email: ahu@cs.stir.ac.uk''
