Controllable Generative Modeling in Language and Vision Workshop (NeurIPS 2021)

Event Notification Type: 
Call for Papers
Abbreviated Title: 
CtrlGen2021
Location: 
Co-located with NeurIPS 2021
Monday, 13 December 2021
State: 
Virtual
Country: 
Virtual
City: 
Virtual
Contact: 
Steven Feng
Anusha Balakrishnan
Drew Hudson
Tatsunori Hashimoto
Dongyeop Kang
Varun Gangal
Joel Tetreault
Submission Deadline: 
Monday, 27 September 2021

Paper Submission Deadline: September 27, 2021
Demo Submission Deadline: October 29, 2021
Submission Link: https://cmt3.research.microsoft.com/CtrlGen2021/Submission/Index
===================================
Excited by generation, control, and disentanglement in either language or vision? Check out our controllable generation workshop, CtrlGen, taking place virtually at NeurIPS 2021 on December 13th!

We feature an exciting lineup of speakers, a live Q&A and panel session, interactive activities, and networking opportunities.

We also invite you to submit both papers and demonstrations (deadlines are September 27 and October 29, respectively). We welcome submissions on the following topics of interest and more (see further down for demonstration instructions):

Methodology and Algorithms:

  • New methods and algorithms for controllability.
  • Improvements of language and vision model architectures for controllability.
  • Novel loss functions, decoding methods, and prompt design methods for controllability.
  • Improved training and finetuning methods for controllability.
  • Investigation of post-editing modeling paradigms, including Plug and Play Language Models (PPLMs), for controllability.
  • Unsupervised models such as VAEs and self-supervised approaches for controllability.

Applications and Ethics:

  • Applications of controllability, including creative AI, machine co-creativity, entertainment, data augmentation (for text and vision), ethics (e.g., bias and toxicity reduction), enhanced training for self-driving vehicles, and improving conversational agents.
  • Some examples of creative AI and entertainment applications include controllable poetry, lyrics, music, image, and video generation.
  • Ethical issues and challenges related to controllable generation, including the risks and dangers of deepfakes and fake news, and methods to mitigate and combat them.
  • The introduction and exploration of new applications for controllability.
  • Exploration of particular controllability tasks, including:
      • Aspect-based summarization.
      • Semantic text exchange.
      • Syntactically-controlled paraphrase generation.
      • Controlling the sentiment and politeness of generated text.
      • Persona-based text generation.
      • Style-sensitive generation or style transfer (for text and vision).
      • Constrained generation tasks, including concept (or keyword) to text or image generation.
      • Image synthesis and scene representation in both 2D and 3D.
      • Cross-modal tasks such as controllable image or video captioning and generation from text.
  • We also welcome the introduction and exploration of new controllability tasks.

Evaluation and Benchmarks:

  • Establishment of standard and unified metrics and benchmark tasks to efficiently and effectively compare different methods for controllable generation.
  • The introduction and exploration of new evaluation methodologies for controllability.

Cross-Domain and Other Areas:

  • Work in domains relevant to controllability, such as interpretability, disentanglement, robustness, and representation learning.
  • Papers from other areas that tie in with controllability, such as neuroscience and cognitive science.

Position and Survey:

  • Position and survey papers, including those examining the problems and lacunae in current controllability formulations, neglected areas in controllability, and the unclear and non-standardized definition of controllability.

Submission Instructions
Papers should be submitted through our CMT submission portal and will go through double-blind review. Please make sure to select the “Papers” category when submitting. Submissions should be a single, fully anonymized .pdf file with up to 8 pages of content and unlimited references and appendices, following the NeurIPS style template. Supplementary material in the form of code and small data files can be submitted separately as a single .zip file; an option to upload supplementary material appears in the author console after the paper is submitted.

Accepted papers will be presented as posters and hosted on our workshop website. Note that the workshop is non-archival. While original submissions are preferred, we also welcome work currently under review, but we discourage papers that have already been accepted and published elsewhere, including at the NeurIPS main conference. We especially encourage submissions from those with diverse backgrounds, such as minority or underrepresented groups and junior researchers.

Submission Link: https://cmt3.research.microsoft.com/CtrlGen2021/Submission/Index

Important Dates
Paper Submission Deadline: September 27, 2021
Paper Acceptance Notification: October 22, 2021
Paper Camera-Ready Deadline: November 1, 2021

Note that the above deadlines are all 11:59 pm AoE (Anywhere on Earth).

Call for Demonstrations
We also invite submissions for demonstrations of controllable generation systems for both text and vision. Submission deadline: October 29, 2021.

We encourage demonstrations of all forms. This includes demos of research and academic work, as well as products, interesting and creative projects, and so forth. The main criteria for demonstrations will be how creative, well-presented, and attention-grabbing they are. Examples include the following:

  • Creative AI such as controllable poetry, music, image, and video generation models.
  • Style transfer for both text and vision.
  • Interactive chatbots and assistants that involve controllability.
  • Controllable language generation systems, e.g., those using GPT-2 or GPT-3.
  • Controllable multimodal systems such as image and video captioning or generation from text.
  • Controllable image and video/graphics enhancement systems.
  • Systems for controlling scenes/environments and applications for self-driving vehicles.
  • Controllability in the context of deepfakes and fake news, specifically methods to combat them.
  • And much, much more…

Demonstration Submission Instructions
Please record a brief (e.g., 3-5 minute) video showcasing and explaining your demo. Demonstrations should be submitted through our CMT submission portal as a single .zip file containing the recording. Please make sure to select the “Demos” category when submitting. Accepted demonstrations will be presented during our workshop and hosted on our workshop website.

Submission Link: https://cmt3.research.microsoft.com/CtrlGen2021/Submission/Index

Demonstration Important Dates
Demo Submission Deadline: October 29, 2021
Demo Acceptance Notification: November 19, 2021

Note that the above deadlines are all 11:59 pm AoE (Anywhere on Earth).