2nd CfP (Deadline Extended): ICML-2014 Workshop on Knowledge-Powered Deep Learning for Text Mining

Event Notification Type: 
Call for Papers
Abbreviated Title: 
KPDLTM
Thursday, 26 June 2014
Country: 
China
City: 
Beijing
Contact: 
Bin Gao
Jiang Bian
Richard Socher
Scott Wen-tau Yih
Submission Deadline: 
Friday, 28 March 2014

ICML-2014 Workshop on Knowledge-Powered Deep Learning for Text Mining

Call for papers

In recent years, deep learning has been applied to various text mining and NLP tasks, where a common practice is to learn word embeddings. Because individual word tokens rarely expose meaningful relationships in their original symbolic form, word embeddings aim to map semantically related words to nearby positions in a latent space. Such representations of text are typically derived by applying existing neural network frameworks to large text corpora and have demonstrated their effectiveness in solving various text-related tasks.

However, human languages are governed both by syntactic regularities, as defined by morphology and grammar, and by semantic notions grounded in common-sense knowledge. Learning models solely from large text corpora, without recognizing this inherent structure of language, may therefore not be the most efficient strategy. Given the rich knowledge available in different forms, such as databases of world facts like Freebase and YAGO, linguistic resources like WordNet and FrameNet, or even implicit usage data such as click-through logs from search engines and social media, we believe deep learning frameworks can benefit substantially from leveraging these knowledge resources and thus further advance the state of the art in various text mining tasks.

In this workshop, our goal is to bring together researchers and practitioners in this area to review and share the latest research results and to discuss future directions. We solicit papers on all aspects of knowledge-powered deep learning for text mining, including, but not limited to:


  • Supervised and unsupervised text representation learning powered by knowledge
  • Metric learning and kernel learning for text mining
  • Dimensionality expansion and sparse modeling for text mining
  • Hierarchical and hybrid models for text mining
  • Knowledge base completion and machine reasoning with deep learning
  • Recurrent and recursive neural network models for text mining
  • Evaluation of the effectiveness of learned text representations
  • Optimization for text representation learning
  • Implementation issues, parallelization, software platforms, and tools

Submission Information

Authors should submit a paper of up to 4 pages in PDF format, using the ICML template (http://icml.cc/2014/icml2014stylefiles.zip). Reviewing will not be double-blind. Submissions must be made through https://cmt.research.microsoft.com/kpdltm2014.

Important Dates


  • March 28th, 2014: Submission deadline (extended)
  • April 18th, 2014: Notification of acceptance
  • May 2nd, 2014: Camera-ready deadline
  • June 25th, 2014: Workshop

Organizers


  • Bin Gao (Microsoft Research)
  • Jiang Bian (Microsoft Research)
  • Richard Socher (Stanford University)
  • Scott Wen-tau Yih (Microsoft Research)