First Call for Participation

================================
*SEM Shared Task 2012
Resolving the Scope and Focus of Negation
http://www.clips.ua.ac.be/sem2012-st-neg/
================================

Resolving the Scope and Focus of Negation is the shared task of the *SEM 2012 Conference (http://ixa2.si.ehu.es/starsem/), which will take place in Montreal, Canada, June 7-8, 2012.

-----------------------
Scope and focus of negation
-----------------------

Negation is a pervasive and intricate linguistic phenomenon present in all languages (Horn 1989). Despite this fact, computational semanticists mostly ignore it; current proposals to represent the meaning of text either dismiss negation or only treat it in a superficial manner. This shared task tackles two key steps in order to obtain the meaning of negated statements: scope and focus detection. Regardless of the semantic representation one favors (predicate calculus, logic forms, binary semantic relations, etc.), these tasks are the basic building blocks to process the meaning of negated statements.

Scope of negation is the part of the meaning that is negated, and focus is the part of the scope that is most prominently negated (Huddleston and Pullum 2002). In example (1), the scope is enclosed in square brackets and the focus is marked between angle brackets:

(1) [John had] never [said <as much> before].

Scope marks all negated concepts. (1) is strictly true if no saying event took place, if John was not the one who said it, if as much was not the quantity said, or if before was not the time. Focus indicates the concepts actually intended to be negated and makes it possible to reveal implicit positive meaning. The implicit positive meaning of (1) is that John had said less before.

This shared task aims at detecting the scope and focus of negation.

-----
Tasks
-----

Two tasks and a pilot task are proposed:

Task 1: scope detection
-----------------------

For each negation, the cue and scope are marked, as well as the negated event, if any. Cues and scopes may be discontinuous. Example (2) shows an annotated sentence, where the scope is enclosed in square brackets, the cue is n't, and the negated event is marked between asterisks.

(2) [I do]n't [*know* what made me look up], but there was a face looking in at me through the lower pane.

Task 2: focus detection
-----------------------

Example (3) shows an annotated sentence, with the focus marked between angle brackets and semantic roles as provided by PropBank in curly brackets. Detecting focus is useful for revealing implicit positive meaning (in (3), a decision is expected in June).

(3) {A decision A1} is{n't M-NEG} expected {<until June> M-TMP}.

Pilot: detection of both scope and focus
-------------------------------------

The pilot task aims at detecting both scope and focus. The test set will be the same as for Task 1.

-----
Tracks
-----

All tasks will have a closed and an open track. In the closed track, participants can only use the data made available by the organization; in the open track, they can make use of any external resource or tool. For the closed track, the organization will provide the corpora processed with several levels of linguistic information.

--------
Datasets
--------

Two datasets are provided, one for Task 1, and another for Task 2.

[CD-SCO] for Task 1. This dataset includes two stories by Conan Doyle for training and development: The Hound of the Baskervilles and The Adventure of Wisteria Lodge. All occurrences of negation are annotated (1,056 out of 3,899 sentences), accounting for negation expressed by nouns, pronouns, verbs, adverbs, determiners, conjunctions and prepositions. For each negation, the cue and scope are marked, as well as the negated event, if any. Cues and scopes may be discontinuous. The annotation guidelines are published in Morante et al. (2011), Annotation of Negation Cues and their Scope. Guidelines v1.0, CLiPS Technical Report Series, and are available on-line. For testing, another story by Conan Doyle will be provided, [CD-SCO-TEST].

An example sentence is shown in (2) above.

Regarding copyright, the stories by Conan Doyle are in the public domain, so CD-SCO and CD-SCO-TEST will be freely available (original text and annotations).

[PB-FOC] for Task 2. In this dataset, focus of negation is annotated over the 3,993 sentences in the WSJ section of the Penn TreeBank that are marked with MNEG in PropBank. It accounts for verbal, analytical and clausal negation; the role most likely to correspond to the focus was selected as focus. Unlike [CD-SCO], all sentences in [PB-FOC] contain a negation. 80% of [PB-FOC] will be released as training/development set and the remaining 20% as test set.

An example sentence is shown in (3) above. More information about the dataset can be found in E. Blanco and D. Moldovan (2011) Semantic Representation of Negation Using Focus Detection, in Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (ACL-HLT 2011), Portland, OR, USA.

Regarding copyright, PB-FOC is built on top of the Penn TreeBank. The organization will release stand-alone annotations and participants will have to obtain the raw text from the LDC.

--------------
Important dates
--------------

January 16, 2012 - Registration open
February 5, 2012 - Train/development data available
March 5, 2012 - Test data available
March 15, 2012 - System outputs due
March 26, 2012 - Description paper due
April 23, 2012 - Notification of paper acceptance
May 4, 2012 - Camera ready deadline

----------
Data format
----------

Following previous shared tasks, all annotations will be provided in the CoNLL-2005 Shared Task format. Very briefly, each line corresponds to a token, each annotation (chunks, named entities, etc.) is provided in a column; empty lines indicate end of sentence.
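For illustration only, here is a minimal Python sketch for reading a file in this token-per-line format. The exact number and order of columns will be specified with the data release; the whitespace-separated layout assumed below simply follows the CoNLL-2005 convention described above.

  import sys

  def read_sentences(path):
      # Yield one sentence at a time as a list of token rows, where each
      # row is the list of whitespace-separated columns on one line.
      sentence = []
      with open(path, encoding="utf-8") as f:
          for line in f:
              line = line.strip()
              if not line:             # an empty line ends the sentence
                  if sentence:
                      yield sentence
                      sentence = []
              else:
                  sentence.append(line.split())
      if sentence:                     # file may lack a trailing empty line
          yield sentence

  if __name__ == "__main__":
      for n, sent in enumerate(read_sentences(sys.argv[1]), start=1):
          print("sentence", n, "has", len(sent), "tokens")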

---------
Evaluation
---------

Evaluation will be performed as follows:

Task 1: Scope detection

F-measure for predicting negation cues (perfect match).
F-measure for predicting both negation cues and scope. Evaluation will be carried out at the scope level.
F-measure for predicting negated events (perfect match).
Full evaluation: F-measure for negation cues, scope and negated events (perfect match).
Sentence evaluation: F-measure per sentence. A sentence is counted as correct if all its negation cues, scopes and negated events are predicted exactly.

Task 2: Focus detection

F-measure for predicting focus of negation (perfect match).

Pilot: Scope and focus detection

Full evaluation and sentence evaluation (same as for Task 1).
F-measure for detecting focus of negation (perfect match).
Joint evaluation: F-measure per sentence, where a sentence is counted as correct only if all negation cues, scopes, negated events and foci are predicted exactly.

The evaluation scripts will be provided with the training datasets.
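All of the measures above are standard balanced F-measures computed over exact matches. For reference, a minimal sketch of the computation (the evaluation scripts distributed with the training data, not this sketch, are the authoritative definition):

  def f_measure(tp, fp, fn):
      # tp: predicted items that exactly match a gold item
      # fp: predicted items with no exact match in the gold standard
      # fn: gold items that were not predicted
      precision = tp / (tp + fp) if tp + fp else 0.0
      recall = tp / (tp + fn) if tp + fn else 0.0
      if precision + recall == 0.0:
          return 0.0
      return 2 * precision * recall / (precision + recall)

  # For example, 80 correctly predicted cues, 10 spurious and 20 missed
  # give P = 0.889, R = 0.800 and F = 0.842.
  print(round(f_measure(80, 10, 20), 3))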

-----------
Registration
-----------

Participants should register by sending an e-mail to the organisers. The information required is: name and surname, affiliation, e-mail address, team members. The subject of the message should be: [*sem-st] Registration.

-----------
Submissions
-----------

Participants will be allowed to engage in any combination of tasks and submit a maximum of two runs per track. Submissions should be made by sending an e-mail to the organizers with a zip or tar.gz file. The compressed file should be named, and its contents organised into directories, according to the convention specified on the shared task website.

-----------------------
References and related work
-----------------------

E. Blanco and D. Moldovan. 2011. Semantic Representation of Negation Using Focus Detection. In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (ACL-HLT 2011), Portland, OR, USA.
I. Councill, R. McDonald, and L. Velikovich. 2010. What’s great and what’s not: learning to classify the scope of negation for improved sentiment analysis. In Proceedings of the Workshop on Negation and Speculation in Natural Language Processing, pages 51–59, Uppsala, Sweden. University of Antwerp.
L. R. Horn. 1989. A Natural History of Negation. University of Chicago Press, Chicago.
R. D. Huddleston and G. K. Pullum. 2002. The Cambridge Grammar of the English Language. CUP, Cambridge.
R. Morante and W. Daelemans. 2009. A metalearning approach to processing the scope of negation. In Proceedings of CoNLL 2009, pages 28–36, Boulder, Colorado.
R. Morante, S. Schrauwen, and W. Daelemans. 2011. Annotation of Negation Cues and their Scope. Guidelines v1.0. CLiPS Technical Report 3, CLiPS, Antwerp, Belgium, April.
M. Rooth. 1985. Association with Focus. Ph.D. thesis, University of Massachusetts, Amherst.
M. Rooth. 1992. A Theory of Focus Interpretation. Natural Language Semantics, 1:75–116.

-------------------
Organisation - contact
-------------------

Roser Morante, CLiPS-Computational Linguistics, University of Antwerp, Belgium.
roser.morante [at] ua.ac.be

Eduardo Blanco, Lymba Corporation, USA.
eduardo [at] lymba.com