|
{ |
|
"paper_id": "S12-1039", |
|
"header": { |
|
"generated_with": "S2ORC 1.0.0", |
|
"date_generated": "2023-01-19T15:24:23.201989Z" |
|
}, |
|
"title": "UConcordia: CLaC Negation Focus Detection at *Sem 2012", |
|
"authors": [ |
|
{ |
|
"first": "Sabine", |
|
"middle": [], |
|
"last": "Rosenberg", |
|
"suffix": "", |
|
"affiliation": { |
|
"laboratory": "CLaC Lab", |
|
"institution": "Concordia University", |
|
"location": { |
|
"addrLine": "1455 de Maisonneuve Blvd West", |
|
"postCode": "H3W 2B3", |
|
"settlement": "Montr\u00e9al", |
|
"region": "QC", |
|
"country": "Canada" |
|
} |
|
}, |
|
"email": "" |
|
}, |
|
{ |
|
"first": "Sabine", |
|
"middle": [], |
|
"last": "Bergler", |
|
"suffix": "", |
|
"affiliation": { |
|
"laboratory": "CLaC Lab", |
|
"institution": "Concordia University", |
|
"location": { |
|
"addrLine": "1455 de Maisonneuve Blvd West", |
|
"postCode": "H3W 2B3", |
|
"settlement": "Montr\u00e9al", |
|
"region": "QC", |
|
"country": "Canada" |
|
} |
|
}, |
|
"email": "[email protected]" |
|
} |
|
], |
|
"year": "", |
|
"venue": null, |
|
"identifiers": {}, |
|
"abstract": "Simply detecting negation cues is not sufficient to determine the semantics of negation, scope and focus must be taken into account. While scope detection has recently seen repeated attention, the linguistic notion of focus is only now being introduced into computational work. The *Sem2012 Shared Task is pioneering this effort by introducing a suitable dataset and annotation guidelines. CLaC's NegFocus system is a solid baseline approach to the task.", |
|
"pdf_parse": { |
|
"paper_id": "S12-1039", |
|
"_pdf_hash": "", |
|
"abstract": [ |
|
{ |
|
"text": "Simply detecting negation cues is not sufficient to determine the semantics of negation, scope and focus must be taken into account. While scope detection has recently seen repeated attention, the linguistic notion of focus is only now being introduced into computational work. The *Sem2012 Shared Task is pioneering this effort by introducing a suitable dataset and annotation guidelines. CLaC's NegFocus system is a solid baseline approach to the task.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Abstract", |
|
"sec_num": null |
|
} |
|
], |
|
"body_text": [ |
|
{ |
|
"text": "Negation has attracted the attention of the NLP community and we have seen an increased advance in sophistication of processing tools. In order to assess factual information as asserted or not, it is important to distinguish the difference between", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "(1) (a) Newt Gingrich Not Conceding Race After Losing Florida Primary (b) Newt Gingrich Conceding Race After Losing Florida Primary This distinction is important and cannot be properly inferred from the surrounding context, not conceding a race after losing is in fact contrary to expectation in the original headline (1a), and the constructed (1b) is more likely in isolation. Negation has been addressed as a task in itself, rather than as a component of other tasks in recent shared tasks and workshops. Detection of negation cues and negation scope at CoNLL (Farkas et al., 2010) , BioNLP (Kim et al., 2011) and the Negation and Speculation in NLP Workshop (Morante and Sporleder, 2010) laid the foundation for the *Sem 2012 Shared Task. While the scope detection has been extended to fictional text in this task, an important progression from the newspaper and biomedical genres, the newly defined Focus Detection for Negation introduces the important question: what is the intended opposition in (1a)? The negation trigger is not, the scope of the negation is the entire verb phrase, but which aspect of the verb phrase is underscored as being at variance with reality, that is, which of the following possible (for the sake of linguistic argument only) continuations is the more likely one:", |
|
"cite_spans": [ |
|
{ |
|
"start": 103, |
|
"end": 131, |
|
"text": "After Losing Florida Primary", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 562, |
|
"end": 583, |
|
"text": "(Farkas et al., 2010)", |
|
"ref_id": "BIBREF4" |
|
}, |
|
{ |
|
"start": 593, |
|
"end": 611, |
|
"text": "(Kim et al., 2011)", |
|
"ref_id": "BIBREF9" |
|
}, |
|
{ |
|
"start": 661, |
|
"end": 690, |
|
"text": "(Morante and Sporleder, 2010)", |
|
"ref_id": "BIBREF11" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "(2) i . . . , Santorum does.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "(\u00acN ewt Gingrich) ii . . . , Doubling Efforts (\u00acconcede) iii . . . , Demanding Recount (\u00acrace) iv . . . , Texas redistricting at fault (\u00acF lorida)", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "This notion of focus of negation is thus a pragmatic one, chosen by the author and encoded with various means. Usually, context is necessary to determine focus. Often, different possible interpretations of focus do not change the factual meaning of the overall text, but rather its coherence. In (1 a) the imagined possible contexts (2 ii) and (2 iii) closely correspond to a simple negation of (1 b), (2 i) and (2 iv) do not feel properly represented by simply negating (1 b). This level of interpretation is contentious among people and it is the hallmark of wellwritten, well-edited text to avoid unnecessary guesswork while at the same time avoiding unnecessary clarifying repetition. The potential for ambiguity is demonstrated by Example (3) from (Partee, 1993) , where it is questionable whether the speaker in fact has possession of the book in question.", |
|
"cite_spans": [ |
|
{ |
|
"start": 753, |
|
"end": 767, |
|
"text": "(Partee, 1993)", |
|
"ref_id": "BIBREF14" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "(3) I didn't get that book from Mary Here, if the focus is from Mary, it would be likely that the speaker has possion of the book, but received it some other way. If the focus is that book, the speaker does not have possession of it.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "It is important to note hat this notion of focus is not syntactically determined as shown in (3) (even though we use syntactic heuristics here to approximate it) but pragmatically and it correlates with pronunciation stress, as discussed in linguistics by (Han and Romero, 2001 ). More recently, focus negation has been identified as a special use (Poletto, 2008) . The difference of scope and focus of negation are elaborated by (Partee, 1993) , and have been used for computational use by (Blanco and Moldovan, 2011) .", |
|
"cite_spans": [ |
|
{ |
|
"start": 256, |
|
"end": 277, |
|
"text": "(Han and Romero, 2001", |
|
"ref_id": "BIBREF5" |
|
}, |
|
{ |
|
"start": 348, |
|
"end": 363, |
|
"text": "(Poletto, 2008)", |
|
"ref_id": "BIBREF15" |
|
}, |
|
{ |
|
"start": 430, |
|
"end": 444, |
|
"text": "(Partee, 1993)", |
|
"ref_id": "BIBREF14" |
|
}, |
|
{ |
|
"start": 491, |
|
"end": 518, |
|
"text": "(Blanco and Moldovan, 2011)", |
|
"ref_id": "BIBREF0" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "The *Sem 2012 Task 2 on Focus Detection builds on recent negation scope detection capabilities and introduces a gold standard to identify the focus item. Focus of negation is annotated over 3,993 sentences in the WSJ section of the Penn TreeBank marked with MNEG in PropBank. It accounts for verbal, analytical and clausal relation to a negation trigger; the role most likely to correspond to the focus was selected as focus. All sentences of the training data contain a negation. A sample annotation from the gold standard is given in (4), where PropBank semantic roles are labelled A1, M-NEG, and M-TMP and focus is underlined (until June).", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "(4) A decision A1 is n t M \u2212N EG expected until June M \u2212T M P", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Introduction", |
|
"sec_num": "1" |
|
}, |
|
{ |
|
"text": "A recent study in combining regular pattern extraction with parse information for enhanced indexing of radiology reports showed effective detection of negated noun phrases for that corpus (Huang and Lowe, 2007) . NegFinder (Mutalik et al., 2001 ) detects negated concepts in dictated medical documents with a simple set of corpus specific context-free rules, and they observe that in their corpus \"One of the words no, denies/denied, not, or without was present in 92.5 percent of all negations.\" Interestingly, several of their rules concern coordination (and, or) or prepositional phrase attachment patterns (of, for). NegEx (Chapman et al., 2001 ) is publicly available and maintained and updated with community-enhanced trigger lists (http://code.google.com/p/negex/ wiki/NegExTerms). NegEx \"locates trigger terms indicating a clinical condition is negated or possible and determines which text falls within the scope of the trigger terms.\" NegEx uses a simple regular expression algorithm with a small number of negation phrases and focuses on a wide variety of triggers but limits them to domain relevant ones. Consequently, the trigger terms and conditions are heavily stacked with biomedical domain specific terms.", |
|
"cite_spans": [ |
|
{ |
|
"start": 188, |
|
"end": 210, |
|
"text": "(Huang and Lowe, 2007)", |
|
"ref_id": "BIBREF6" |
|
}, |
|
{ |
|
"start": 223, |
|
"end": 244, |
|
"text": "(Mutalik et al., 2001", |
|
"ref_id": "BIBREF13" |
|
}, |
|
{ |
|
"start": 627, |
|
"end": 648, |
|
"text": "(Chapman et al., 2001", |
|
"ref_id": "BIBREF1" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Previous Work", |
|
"sec_num": "2" |
|
}, |
|
{ |
|
"text": "Outside the biomedical text community, sentiment and opinion analysis research features negation detection (Wilson, 2008) . Current gold standard annotations for explicit negation as well as related phenomena include TIMEBANK (Pustejovsky et al., 2003) , MPQA (Wiebe et al., 2005) , and Bio-Scope (Vincze et al., 2008) . (Wiegand et al., 2010 ) presents a flat feature combination approach of features of different granularity and analytic sophistication, since in opinion mining the boundary between negation and negative expressions is fluid.", |
|
"cite_spans": [ |
|
{ |
|
"start": 97, |
|
"end": 121, |
|
"text": "detection (Wilson, 2008)", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 217, |
|
"end": 252, |
|
"text": "TIMEBANK (Pustejovsky et al., 2003)", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 255, |
|
"end": 280, |
|
"text": "MPQA (Wiebe et al., 2005)", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 297, |
|
"end": 318, |
|
"text": "(Vincze et al., 2008)", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 321, |
|
"end": 342, |
|
"text": "(Wiegand et al., 2010", |
|
"ref_id": "BIBREF16" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Previous Work", |
|
"sec_num": "2" |
|
}, |
|
{ |
|
"text": "CLaC Labs' general, lightweight negation module is intended to be embedded in any processing pipeline. The heuristics-based system is composed of three modules for the GATE (Cunningham et al., 2011) environment: the first component detects and annotates explicit negation cues present in the corpus, the second component detects and annotates the syntactic scope of the detected instances of verbal negation, and the third component implements focus heuristics for negation. The first two steps were developed independently, drawing on data from MPQA (Wiebe et al., 2005) and TIME-BANK (Pustejovsky et al., 2003) with validation on Bio-Scope (Vincze et al., 2008) . The third step has been added based on data for the *Sem 2012 challenge and is intended to validate both, the first two \"preprocessing\" steps and the simple heuristic approximation of focus.", |
|
"cite_spans": [ |
|
{ |
|
"start": 173, |
|
"end": 198, |
|
"text": "(Cunningham et al., 2011)", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 546, |
|
"end": 571, |
|
"text": "MPQA (Wiebe et al., 2005)", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 576, |
|
"end": 612, |
|
"text": "TIME-BANK (Pustejovsky et al., 2003)", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 642, |
|
"end": 663, |
|
"text": "(Vincze et al., 2008)", |
|
"ref_id": null |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "CLaC's NegFocus", |
|
"sec_num": "3" |
|
}, |
|
{ |
|
"text": "Parser-based, our focus detection pipeline requires as input entire sentences. Therefore, the first step requires the extraction of each sentence utilizing the supplied token numbers and save them in the correct format. The system then performs standard preprocessing: sentence splitting, tokenization, parsing using the Stanford Parser (Klein and Manning, 2003; de Marneffe and Manning, 2006) and morphological preprocessing. Note that NegFocus does not use any PropBank annotations nor other provided training annotations, resulting in an independent, parserbased stand-alone module.", |
|
"cite_spans": [ |
|
{ |
|
"start": 337, |
|
"end": 362, |
|
"text": "(Klein and Manning, 2003;", |
|
"ref_id": "BIBREF10" |
|
}, |
|
{ |
|
"start": 363, |
|
"end": 393, |
|
"text": "de Marneffe and Manning, 2006)", |
|
"ref_id": "BIBREF3" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Data Preprocessing", |
|
"sec_num": "3.1" |
|
}, |
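
{

"text": "To make this step concrete, the following is a minimal Python sketch of the sentence extraction (our actual modules are GATE components; the function name and the column layout assumed here are illustrative, not the system's code):\n\ndef read_sentences(path):\n    # rebuild raw sentences from token-per-line challenge data, assuming\n    # hypothetical columns: document id, sentence number, token number, word\n    sents = {}\n    with open(path, encoding=\"utf-8\") as f:\n        for line in f:\n            cols = line.rstrip(\"\\n\").split(\"\\t\")\n            if len(cols) < 4:\n                continue  # skip blank separator lines\n            sents.setdefault((cols[0], int(cols[1])), []).append(cols[3])\n    return [\" \".join(toks) for _, toks in sorted(sents.items())]",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Data Preprocessing",

"sec_num": null

},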
|
{ |
|
"text": "The Focus Detection task only considers the explicit negation cues not, nor, never. The first step in Neg-Focus is thus to identify these triggers in the sentences using an explicit negation trigger word list.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Detection of Negation Triggers", |
|
"sec_num": "3.2" |
|
}, |
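
{

"text": "As a minimal illustrative sketch (the deployed component is a GATE module; the names below are hypothetical), trigger detection reduces to a word-list lookup over tokens:\n\nNEG_TRIGGERS = {\"not\", \"n't\", \"nor\", \"never\"}  # task cues plus the contracted form\n\ndef find_triggers(tokens):\n    # return the indices of explicit negation cues in a token list\n    return [i for i, t in enumerate(tokens) if t.lower() in NEG_TRIGGERS]\n\nprint(find_triggers(\"A decision is n't expected until June .\".split()))  # -> [3]",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Detection of Negation Triggers",

"sec_num": null

},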
|
{ |
|
"text": "The Focus Detection task only considers negation of verbs. Thus, NegFocus extracts the syntactic complement of the verb to form the negated verb phrase from the dependency graphs (de Marneffe and Manning, 2006). We annotate this as the syntactic scope of the negation. Note that while we use dependency graphs, our syntactic scope is based on the parse tree and differs from the notion of scope encoded in Bio-Scope (Vincze et al., 2008) and the related format used for the *Sem 2012 Negation Scope Annotation task, which represent in our opinion the pragmatic notion of scope for the logical negation operation. Syntactic scope detection is thus considered to be a basic stepping stone towards the pragmatic scope and since the Focus Detection task does not provide scope annotations, we use syntactic scope here to validate this principle.", |
|
"cite_spans": [ |
|
{ |
|
"start": 416, |
|
"end": 437, |
|
"text": "(Vincze et al., 2008)", |
|
"ref_id": null |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Syntactic Scope Detection", |
|
"sec_num": "3.3" |
|
}, |
|
{ |
|
"text": "Our heuristics are inspired by (Kilicoglu and Bergler, 2011) . In the majority of cases the dependency relation which identifies the syntactic scope is the neg relation. Traditionally, parse trees identify scope as lower or to the right of the trigger term, and our scope module assumes these grammatical constraints, yet includes the verb itself for the purposes of the shared task. Example 5, from the training dataset \"The Hound of the Baskervilles\" by Co-nan Doyle for the *Sem 2012 Negation Scope Annotation task, demonstrates our syntactic scope of the negation (underlined), in contrast with the gold standard scope annotation (in brackets). The gold annotation guidelines follow the proposal of Morante et al. (Morante et al., ", |
|
"cite_spans": [ |
|
{ |
|
"start": 31, |
|
"end": 60, |
|
"text": "(Kilicoglu and Bergler, 2011)", |
|
"ref_id": "BIBREF8" |
|
}, |
|
{ |
|
"start": 718, |
|
"end": 734, |
|
"text": "(Morante et al.,", |
|
"ref_id": null |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Syntactic Scope Detection", |
|
"sec_num": "3.3" |
|
}, |
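
{

"text": "A minimal sketch of the scope step in Python, assuming Stanford-style dependencies given as (head, relation, dependent) index triples (an illustrative approximation, not the GATE implementation):\n\ndef verbal_negation_scope(tokens, deps):\n    # for each neg relation, collect the negated verb's subtree and keep\n    # the verb plus the material to its right, per the parse-tree notion of scope\n    children = {}\n    for h, rel, d in deps:\n        children.setdefault(h, []).append(d)\n    spans = []\n    for h, rel, d in deps:\n        if rel == \"neg\":\n            stack, seen = [h], set()\n            while stack:\n                n = stack.pop()\n                if n not in seen:\n                    seen.add(n)\n                    stack.extend(children.get(n, []))\n            spans.append([tokens[i] for i in sorted(seen) if i >= h])\n    return spans\n\ntokens = \"A decision is n't expected until June .\".split()\ndeps = [(1, \"det\", 0), (4, \"nsubjpass\", 1), (4, \"auxpass\", 2), (4, \"neg\", 3), (4, \"prep\", 5), (5, \"pobj\", 6)]\nprint(verbal_negation_scope(tokens, deps))  # [['expected', 'until', 'June']]",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Syntactic Scope Detection",

"sec_num": null

},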
|
{ |
|
"text": "The third and final step for NegFocus is to annotate focus in sentences containing verbal negations. Using the verbal negation scope annotations of the previous step, four focus heuristics are invoked:", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Focus Heuristics", |
|
"sec_num": "3.4" |
|
}, |
|
{ |
|
"text": "The Baseline heuristic for this component is defined according to notions discussed in (Huddleston and Pullum, 2002) , where the last constituent in the verb phrase of a clause is commonly the default location to place the heaviest stress, which we here equate with the focus. Example (6) depicts an instance where both NegFocus results (underlined) and the gold focus annotation (in brackets) match exactly. The baseline heuristic achieves 47.4% recall and 49.4% precision on the training set and 47% recall and 49.7% precision on the test set. As pointed out in Section 3.3, focus is not always determined by scope (Partee, 1993) . The training data gave rise to three additional heuristics.", |
|
"cite_spans": [ |
|
{ |
|
"start": 87, |
|
"end": 116, |
|
"text": "(Huddleston and Pullum, 2002)", |
|
"ref_id": "BIBREF7" |
|
}, |
|
{ |
|
"start": 617, |
|
"end": 631, |
|
"text": "(Partee, 1993)", |
|
"ref_id": "BIBREF14" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Baseline", |
|
"sec_num": "3.4.1" |
|
}, |
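
{

"text": "A hedged sketch of the baseline in the same Python setting as above, where the subtree of the rightmost dependent of the negated verb approximates the last constituent of the verb phrase (illustrative only, not the system's code):\n\ndef baseline_focus(verb, tokens, deps):\n    # default focus: the subtree of the rightmost dependent of the negated verb\n    children = {}\n    for h, rel, d in deps:\n        children.setdefault(h, []).append(d)\n    right = [d for d in children.get(verb, []) if d > verb]\n    if not right:\n        return None\n    stack, span = [max(right)], set()\n    while stack:\n        n = stack.pop()\n        span.add(n)\n        stack.extend(children.get(n, []))\n    return [tokens[i] for i in sorted(span)]\n\ntokens = \"A decision is n't expected until June .\".split()\ndeps = [(1, \"det\", 0), (4, \"nsubjpass\", 1), (4, \"auxpass\", 2), (4, \"neg\", 3), (4, \"prep\", 5), (5, \"pobj\", 6)]\nprint(baseline_focus(4, tokens, deps))  # ['until', 'June'], matching Example (4)",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Baseline",

"sec_num": null

},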
|
{ |
|
"text": "When an adverb is directly preceding and connected through an advmod dependency relation to the negated verb, the adverb constituent is determined as the focus of the negation.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Adverb", |
|
"sec_num": "3.4.2" |
|
}, |
|
{ |
|
"text": "(7) Although it may not be [legally] obligated to sell the company if the buyout group can't revive its bid, it may have to explore alternatives if the buyers come back with a bid much lower than the group 's original $ 300-a-share proposal.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Adverb", |
|
"sec_num": "3.4.2" |
|
}, |
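
{

"text": "In the same sketch setting, the adverb heuristic is a single dependency check (illustrative, with hypothetical names):\n\ndef adverb_focus(verb, tokens, deps):\n    # the adverb is the focus when it immediately precedes the negated verb\n    # and is linked to it by an advmod relation\n    for h, rel, d in deps:\n        if h == verb and rel == \"advmod\" and d == verb - 1:\n            return [tokens[d]]\n    return None\n\ntokens = \"it may not be legally obligated to sell the company\".split()\nprint(adverb_focus(5, tokens, [(5, \"advmod\", 4), (5, \"neg\", 2)]))  # ['legally'], as in Example (7)",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Adverb",

"sec_num": null

},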
|
{ |
|
"text": "Passives are frequent in newspaper articles and passive constructions front what would otherwise be the verb complement. Thus the fronted material should be eligible for focus assignment. Passives are flagged through the nsubjpass dependency, and for cases where the negated verb participates in an nsubjpass relation and has no other complement, the nsubjpass is determined as the focus. 8[Billings] were n't disclosed.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Noun Subject Passive", |
|
"sec_num": "3.4.3" |
|
}, |
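
{

"text": "Continuing the sketch, the passive-subject heuristic can be approximated as follows (illustrative only):\n\ndef nsubjpass_focus(verb, tokens, deps):\n    # the passive subject is the focus when the negated verb is passive\n    # and has no other (rightward) complement\n    kids = [(rel, d) for h, rel, d in deps if h == verb]\n    if any(d > verb and rel != \"punct\" for rel, d in kids):\n        return None  # the verb has a complement; the heuristic does not apply\n    for rel, d in kids:\n        if rel == \"nsubjpass\":\n            return [tokens[d]]  # simplified to the subject head token\n    return None\n\ntokens = \"Billings were n't disclosed .\".split()\ndeps = [(3, \"nsubjpass\", 0), (3, \"auxpass\", 1), (3, \"neg\", 2), (3, \"punct\", 4)]\nprint(nsubjpass_focus(3, tokens, deps))  # ['Billings'], as in Example (8)",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Noun Subject Passive",

"sec_num": null

},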
|
{ |
|
"text": "The challenge data has cases where the negation cue itself is its own focus. These cases seem to be pragmatically determined. Error cases were reduced when determining the negation cue to be its own focus in two cases. The first case occurs when the negated verb has an empty complement (and is not a passive construction), as in Example 9. The second case occurs when the negated verb embeds a verb that we identify as an implicit negation. We have a list of implicit negation triggers largely compiled from MPQA (Wiebe et al., 2005) . Implicit negations are verbs that lexically encode a predicate and a negation, such as reject or fail. ", |
|
"cite_spans": [ |
|
{ |
|
"start": 509, |
|
"end": 534, |
|
"text": "MPQA (Wiebe et al., 2005)", |
|
"ref_id": null |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Negation Cue", |
|
"sec_num": "3.4.4" |
|
}, |
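
{

"text": "A sketch of the two cue-as-focus conditions (the implicit-negation list below is a tiny hypothetical stand-in for the MPQA-derived list, and the names are illustrative):\n\nIMPLICIT_NEGATIONS = {\"reject\", \"fail\", \"deny\", \"refuse\"}\n\ndef cue_focus(verb, cue, tokens, deps):\n    # the cue is its own focus when the negated verb has an empty complement\n    # (and is not passive), or when it embeds an implicit negation verb\n    kids = [(rel, d) for h, rel, d in deps if h == verb]\n    complements = [d for rel, d in kids if d > verb and rel != \"punct\"]\n    passive = any(rel == \"nsubjpass\" for rel, d in kids)\n    implicit = tokens[verb].lower() in IMPLICIT_NEGATIONS or any(\n        tokens[d].lower() in IMPLICIT_NEGATIONS\n        for rel, d in kids if rel in (\"xcomp\", \"ccomp\"))\n    if (not complements and not passive) or implicit:\n        return [tokens[cue]]\n    return None\n\ntokens = \"the new plan would n't work .\".split()\ndeps = [(5, \"nsubj\", 2), (5, \"aux\", 3), (5, \"neg\", 4), (5, \"punct\", 6)]\nprint(cue_focus(5, 4, tokens, deps))  # [\"n't\"]: empty complement, cue is its own focus",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Negation Cue",

"sec_num": null

},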
|
{ |
|
"text": "Ordering the heuristics impacts on recall. We place the most specific heuristics before the more general ones to avoid starvation effects. For example, the adverb heuristic followed by the noun subject passive heuristic achieved better results at the beginning, since they are more specific then the negation cue heuristic. Table 1 shows the performance of the heuristics of NegFocus on the test set and on the development set. We observe that the heuristics are stable across the two sets with a 60% accuracy on the test set. The worst performer is the baseline, which is very coarse for such a semantically sophisticated task: assuming that the last element of the negated verb phrase is the focus is truly a baseline. Our heuristics, albeit simplistic, are based on linguistically sound observations. The heuristic nature allows additional heuristics that are more tailored to a corpus or a task to be added without incurring unmanageable complexity, in fact each heuristic can be tested on the development set and can report on the test set to monitor its performance. The heuristics will also provide excellent features for statistical systems.", |
|
"cite_spans": [], |
|
"ref_spans": [ |
|
{ |
|
"start": 324, |
|
"end": 331, |
|
"text": "Table 1", |
|
"ref_id": null |
|
} |
|
], |
|
"eq_spans": [], |
|
"section": "Results", |
|
"sec_num": "4" |
|
}, |
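
{

"text": "The most-specific-first ordering can itself be made explicit in the sketch setting, assuming the earlier sketched heuristics are wrapped to a uniform (verb, cue, tokens, deps) signature (illustrative only):\n\ndef neg_focus(verb, cue, tokens, deps, heuristics):\n    # try heuristics most-specific-first; the first non-None answer wins,\n    # with the cue itself as a degenerate last resort\n    for heuristic in heuristics:\n        span = heuristic(verb, cue, tokens, deps)\n        if span is not None:\n            return span\n    return [tokens[cue]]\n\n# stand-ins for the adverb, nsubjpass, cue, and baseline sketches above:\ndemo = [lambda v, c, t, d: None, lambda v, c, t, d: [\"Billings\"]]\nprint(neg_focus(3, 2, \"Billings were n't disclosed .\".split(), [], demo))  # ['Billings']",

"cite_spans": [],

"ref_spans": [],

"eq_spans": [],

"section": "Results",

"sec_num": null

},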
|
{ |
|
"text": "We distinguish 11 classes of errors on the test set.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "The classes of errors depicted in Table ( 3) indicates that the classes of errors and their frequencies are consistent across the different data sets. The third error class in Table ( 3) is of particular inter- Similarly, the seventh error class in Table ( 3) contains focus annotations that are not contained in NegFocus negation scopes. Example (12) shows an error where the sentence begins with a prepositional phrase that is annotated as the gold focus.", |
|
"cite_spans": [], |
|
"ref_spans": [ |
|
{ |
|
"start": 34, |
|
"end": 41, |
|
"text": "Table (", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 176, |
|
"end": 183, |
|
"text": "Table (", |
|
"ref_id": null |
|
}, |
|
{ |
|
"start": 249, |
|
"end": 256, |
|
"text": "Table (", |
|
"ref_id": null |
|
} |
|
], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "(12) [On some days], the Nucor plant does n't produce anything.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "We disagree with the gold annotations on this and similar cases: the prepositional phrase on some days is not negated, it provides a temporal specification for the negated statement the Nucor plant produces something and in our opinion, the negation negates something, contrasting it with (13) [On some days], the Nucor plant does n't produce a lot.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "which allows for some production, which indicates to us that without context information, low focus is warranted here.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "NegFocus incorporates a focus heuristic for determining the passive noun subject constituent as the focus of the negation, however only in cases where the negated verb has an empty complement. The fourth error class contains errors in focus determination where this heuristic fails and where the passive subject is the gold focus despite the complement of the negated verb not being empty, requiring further analysis: NegFocus determines an adverb directly preceding the verb trigger as the focus of the negation, but, as described in the fifth error class, the gold focus annotations in a few cases determine adverbs to be the focus of the negation even when they don't directly precede the verb, but are linked by the advmod relation, as in Example (15). When we experimented with relaxing the adjacency constraint, re- The majority of NegFocus errors occur in the second error class. Table (4) further analyzes the second error class, where the gold annotation puts the negation trigger in the focus but NegFocus finds another focus (usually in the verb complement).", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "The gold standard annotations place the focus of the negation of verb v on the negation trigger if it cannot be inferred that an action v occurred (Blanco and Moldovan, 2011) . NegFocus will only make this assumption when the verb complement constituent is empty, otherwise the baseline focus heuristic will be triggered, as depicted in Example (16). Furthermore, the CLaC system will choose to trigger the subject passive focus heuristic in the case where the verb complement constituent is empty, and the passive noun subject is present. In contrast, the gold standard annotations do not necessarily follow this heuristic as seen in Example (17).", |
|
"cite_spans": [ |
|
{ |
|
"start": 147, |
|
"end": 174, |
|
"text": "(Blanco and Moldovan, 2011)", |
|
"ref_id": "BIBREF0" |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "(17) That is n't 51 %, and the claim is [n't] documented .", |
|
"cite_spans": [ |
|
{ |
|
"start": 40, |
|
"end": 45, |
|
"text": "[n't]", |
|
"ref_id": null |
|
} |
|
], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "Lastly, the gold focus annotations include focus spans which are discontinuous. NegFocus will only detect one continuous focus span within one instance of a verbal negation. The eleventh error class includes those cases where NegFocus matches one of the gold focus spans but not the other as seen in Example (18). These error cases show that more analysis of the data, but also of the very notion of focus, is necessary.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Error Analysis", |
|
"sec_num": "5" |
|
}, |
|
{ |
|
"text": "We conclude that this experiment confirmed the hypothesis that negation trigger detection, syntactic scope determination, and focus determination are usefully modelled as a pipeline of three simple modules that apply after standard text preprocessing and dependency parsing. Approximating focus from a principled, linguistic point of view proved to be a quick and robust exercise. Performance on development and test sets is nearly identical and in a range around 58% f-measure. While the annotation standards as well as our heuristics warrant revisiting, we believe that the value of the focus annotation will prove its value beyond negation. The challenge data provide a valuable resource in themselves, but we believe that their true value will be shown by using the derived notion of focus in downstream applications. For initial experiments, the simple NegFocus pipeline is a stable prototype.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Conclusion", |
|
"sec_num": "6" |
|
}, |
|
{ |
|
"text": "http://www.clips.ua.ac.be/sites/default/files/ctrs-n3.pdf", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "", |
|
"sec_num": null |
|
}, |
|
{ |
|
"text": "(11) In New York, [a spokesman for American Brands] would n't comment.", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "", |
|
"sec_num": null |
|
} |
|
], |
|
"back_matter": [], |
|
"bib_entries": { |
|
"BIBREF0": { |
|
"ref_id": "b0", |
|
"title": "Semantic representation of negation using focus detection", |
|
"authors": [ |
|
{ |
|
"first": "E", |
|
"middle": [], |
|
"last": "Blanco", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "D", |
|
"middle": [], |
|
"last": "Moldovan", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2011, |
|
"venue": "Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (ACL-HLT 2011", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "E. Blanco and D. Moldovan. 2011. Semantic represen- tation of negation using focus detection. In Proceed- ings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Tech- nologies (ACL-HLT 2011), Portland, OR.", |
|
"links": null |
|
}, |
|
"BIBREF1": { |
|
"ref_id": "b1", |
|
"title": "A simple algorithm for identifying negated findings and diseases in discharge summaries", |
|
"authors": [ |
|
{ |
|
"first": "W", |
|
"middle": [], |
|
"last": "Chapman", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "W", |
|
"middle": [], |
|
"last": "Bridewell", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "P", |
|
"middle": [], |
|
"last": "Hanbury", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "G", |
|
"middle": [ |
|
"F" |
|
], |
|
"last": "Cooper", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "B", |
|
"middle": [], |
|
"last": "Buchanan", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2001, |
|
"venue": "Journal of Biomedical Informatics", |
|
"volume": "34", |
|
"issue": "5", |
|
"pages": "301--310", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "W. Chapman, W. Bridewell, P. Hanbury, G.F. Cooper, and B. Buchanan. 2001. A simple algorithm for identi- fying negated findings and diseases in discharge sum- maries. Journal of Biomedical Informatics, 34(5):301- 310.", |
|
"links": null |
|
}, |
|
"BIBREF3": { |
|
"ref_id": "b3", |
|
"title": "Generating typed dependency parses from phrase structure parses", |
|
"authors": [ |
|
{ |
|
"first": "M", |
|
"middle": [], |
|
"last": "De Marneffe", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "C", |
|
"middle": [ |
|
"D" |
|
], |
|
"last": "Manning", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2006, |
|
"venue": "LREC", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "M. de Marneffe and C.D. Manning. 2006. Generating typed dependency parses from phrase structure parses. In LREC.", |
|
"links": null |
|
}, |
|
"BIBREF4": { |
|
"ref_id": "b4", |
|
"title": "The conll-2010 shared task: Learning to detect hedges and their scope in natural language text", |
|
"authors": [ |
|
{ |
|
"first": "R", |
|
"middle": [], |
|
"last": "Farkas", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "V", |
|
"middle": [], |
|
"last": "Vincze", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "G", |
|
"middle": [], |
|
"last": "M\u00f3ra", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "J", |
|
"middle": [], |
|
"last": "Csirik", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "G", |
|
"middle": [], |
|
"last": "Szarvas", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2010, |
|
"venue": "Proceedings of the Fourteenth Conference on Computational Natural Language Learning", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "R. Farkas, V. Vincze, G.M\u00f3ra, J. Csirik, and G.Szarvas. 2010. The conll-2010 shared task: Learning to detect hedges and their scope in natural language text. In Proceedings of the Fourteenth Conference on Compu- tational Natural Language Learning.", |
|
"links": null |
|
}, |
|
"BIBREF5": { |
|
"ref_id": "b5", |
|
"title": "Negation, focus and alternative questions", |
|
"authors": [ |
|
{ |
|
"first": "C-H", |
|
"middle": [], |
|
"last": "Han", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "M", |
|
"middle": [], |
|
"last": "Romero", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2001, |
|
"venue": "Proceedings of the West Coast Conference in Formal Linguistics XX", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "C-H. Han and M. Romero. 2001. Negation, focus and alternative questions. In K. Megerdoomian and L.A. Bar-el, editors, Proceedings of the West Coast Confer- ence in Formal Linguistics XX, Somerville, MA. Cas- cadilla Press.", |
|
"links": null |
|
}, |
|
"BIBREF6": { |
|
"ref_id": "b6", |
|
"title": "A novel hybrid approach to automated negation detection in clinical radiology reports", |
|
"authors": [ |
|
{ |
|
"first": "Y", |
|
"middle": [], |
|
"last": "Huang", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "H", |
|
"middle": [ |
|
"J" |
|
], |
|
"last": "Lowe", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2007, |
|
"venue": "Journal of the American Medical Informatics Association : JAMIA", |
|
"volume": "14", |
|
"issue": "3", |
|
"pages": "304--311", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "Y. Huang and H.J. Lowe. 2007. A novel hybrid approach to automated negation detection in clinical radiology reports. Journal of the American Medical Informatics Association : JAMIA, 14(3):304-311.", |
|
"links": null |
|
}, |
|
"BIBREF7": { |
|
"ref_id": "b7", |
|
"title": "The Cambridge grammar of the English language", |
|
"authors": [ |
|
{ |
|
"first": "R", |
|
"middle": [ |
|
"D" |
|
], |
|
"last": "Huddleston", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "G", |
|
"middle": [ |
|
"K" |
|
], |
|
"last": "Pullum", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2002, |
|
"venue": "", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "R.D. Huddleston and G.K. Pullum. 2002. The Cam- bridge grammar of the English language. Cambridge University Press, Cambridge, UK; New York.", |
|
"links": null |
|
}, |
|
"BIBREF8": { |
|
"ref_id": "b8", |
|
"title": "Effective bio-event extraction using trigger words and syntactic dependencies", |
|
"authors": [ |
|
{ |
|
"first": "H", |
|
"middle": [], |
|
"last": "Kilicoglu", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "S", |
|
"middle": [], |
|
"last": "Bergler", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2011, |
|
"venue": "Computational Intelligence", |
|
"volume": "27", |
|
"issue": "4", |
|
"pages": "583--609", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "H. Kilicoglu and S. Bergler. 2011. Effective bio-event extraction using trigger words and syntactic dependen- cies. Computational Intelligence, 27(4):583-609.", |
|
"links": null |
|
}, |
|
"BIBREF9": { |
|
"ref_id": "b9", |
|
"title": "Overview of genia event task in bionlp shared task", |
|
"authors": [ |
|
{ |
|
"first": "J.-D", |
|
"middle": [], |
|
"last": "Kim", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "Y", |
|
"middle": [], |
|
"last": "Wang", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "T", |
|
"middle": [], |
|
"last": "Takagi", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "A", |
|
"middle": [], |
|
"last": "Yonezawa", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2011, |
|
"venue": "Proceedings of BioNLP Shared Task 2011 Workshop at ACL-HLT", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "J.-D. Kim, Y. Wang, T. Takagi, and A. Yonezawa. 2011. Overview of genia event task in bionlp shared task 2011. In Proceedings of BioNLP Shared Task 2011 Workshop at ACL-HLT.", |
|
"links": null |
|
}, |
|
"BIBREF10": { |
|
"ref_id": "b10", |
|
"title": "Accurate unlexicalized parsing", |
|
"authors": [ |
|
{ |
|
"first": "D", |
|
"middle": [], |
|
"last": "Klein", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "C", |
|
"middle": [ |
|
"D" |
|
], |
|
"last": "Manning", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2003, |
|
"venue": "Proceedings of the 41st Annual Meeting of the Association for Computational Linguistics", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "D. Klein and C.D. Manning. 2003. Accurate unlexical- ized parsing. In Proceedings of the 41st Annual Meet- ing of the Association for Computational Linguistics.", |
|
"links": null |
|
}, |
|
"BIBREF11": { |
|
"ref_id": "b11", |
|
"title": "NeSp-NLP '10: Proceedings of the Workshop on Negation and Speculation in Natural Language Processing", |
|
"authors": [ |
|
{ |
|
"first": "R", |
|
"middle": [], |
|
"last": "Morante", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "C", |
|
"middle": [], |
|
"last": "Sporleder", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2010, |
|
"venue": "", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "R. Morante and C. Sporleder, editors. 2010. NeSp-NLP '10: Proceedings of the Workshop on Negation and Speculation in Natural Language Processing, Strouds- burg, PA, USA. Association for Computational Lin- guistics.", |
|
"links": null |
|
}, |
|
"BIBREF12": { |
|
"ref_id": "b12", |
|
"title": "Annotation of negation cues and their scope. guidelines v1.0", |
|
"authors": [ |
|
{ |
|
"first": "R", |
|
"middle": [], |
|
"last": "Morante", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "S", |
|
"middle": [], |
|
"last": "Schrauwen", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "W", |
|
"middle": [], |
|
"last": "Daelemans", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2011, |
|
"venue": "", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "R. Morante, S. Schrauwen, and W. Daelemans. 2011. Annotation of negation cues and their scope. guide- lines v1.0. Technical report, CLiPS, University of Antwerp.", |
|
"links": null |
|
}, |
|
"BIBREF13": { |
|
"ref_id": "b13", |
|
"title": "Use of general-purpose negation detection to augment concept indexing of medical documents: a quantitative study using the umls", |
|
"authors": [ |
|
{ |
|
"first": "P", |
|
"middle": [ |
|
"G" |
|
], |
|
"last": "Mutalik", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "A", |
|
"middle": [], |
|
"last": "Deshpande", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "P", |
|
"middle": [ |
|
"M" |
|
], |
|
"last": "Nadkarni", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2001, |
|
"venue": "Journal of the American Medical Informatics Association : JAMIA", |
|
"volume": "8", |
|
"issue": "6", |
|
"pages": "598--609", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "P. G. Mutalik, A. Deshpande, and P. M. Nadkarni. 2001. Use of general-purpose negation detection to augment concept indexing of medical documents: a quantitative study using the umls. Journal of the American Medi- cal Informatics Association : JAMIA, 8(6):598-609.", |
|
"links": null |
|
}, |
|
"BIBREF14": { |
|
"ref_id": "b14", |
|
"title": "On the \"scope of negation\" and polarity sensitivity", |
|
"authors": [ |
|
{ |
|
"first": "B", |
|
"middle": [], |
|
"last": "Partee", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 1993, |
|
"venue": "", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "B. Partee. 1993. On the \"scope of negation\" and po- larity sensitivity. In E. Hajicova, editor, Functional Approaches to Language Description.", |
|
"links": null |
|
}, |
|
"BIBREF15": { |
|
"ref_id": "b15", |
|
"title": "Annotating expressions of opinions and emotions in language. Language Resources and Evaluation", |
|
"authors": [ |
|
{ |
|
"first": "C", |
|
"middle": [], |
|
"last": "Poletto", |
|
"suffix": "" |
|
}
|
], |
|
"year": 2005, |
|
"venue": "", |
|
"volume": "18", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "C. Poletto. 2008. The syntax of focus negation. Univer- sity of Venice Working Papers in Linguistics, 18. J. Wiebe, T. Wilson, and C. Cardie. 2005. Annotating ex- pressions of opinions and emotions in language. Lan- guage Resources and Evaluation, 39(2-3).", |
|
"links": null |
|
}, |
|
"BIBREF16": { |
|
"ref_id": "b16", |
|
"title": "A survey on the role of negation in sentiment analysis", |
|
"authors": [ |
|
{ |
|
"first": "M", |
|
"middle": [], |
|
"last": "Wiegand", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "B", |
|
"middle": [], |
|
"last": "Roth", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "D", |
|
"middle": [], |
|
"last": "Klakow", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "A", |
|
"middle": [], |
|
"last": "Balahur", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "A", |
|
"middle": [], |
|
"last": "Montoyo", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2010, |
|
"venue": "Proceedings of the Workshop on Negation and Speculation in Natural Language Processing", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "M. Wiegand, B. Roth, D. Klakow, A. Balahur, and A. Montoyo. 2010. A survey on the role of negation in sentiment analysis. In Proceedings of the Workshop on Negation and Speculation in Natural Language Pro- cessing (NeSp-NLP 2010).", |
|
"links": null |
|
}, |
|
"BIBREF17": { |
|
"ref_id": "b17", |
|
"title": "Fine-Grained Subjectivity Analysis", |
|
"authors": [ |
|
{ |
|
"first": "", |
|
"middle": [], |
|
"last": "Th", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "", |
|
"middle": [], |
|
"last": "Wilson", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2008, |
|
"venue": "", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "Th. Wilson. 2008. Fine-Grained Subjectivity Analysis. Ph.D. thesis, University of Pittsburgh. Intelligent Sys- tems Program.", |
|
"links": null |
|
} |
|
}, |
|
"ref_entries": { |
|
"FIGREF0": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "NBC broadcast throughout the entire night and did not go off the air [until noon yesterday] ." |
|
}, |
|
"FIGREF1": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "Both said the new plan would [n't] work." |
|
}, |
|
"FIGREF2": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "Black activist Walter Sisulu said the African National Congress would [n't] reject violence as a way to pressure the South African government into concessions that might lead to negotiations over apartheid . . ." |
|
}, |
|
"FIGREF3": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "Figure 2: System Results" |
|
}, |
|
"FIGREF4": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "System Errors est to us, as it highlights the different interpretations of verbal negation scope. NegFocus will not include the noun subject in the syntactic negation scope, and therefore the noun subject constituent is never a focus candidate as required in Example (11)." |
|
}, |
|
"FIGREF5": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "To simplify the calculations , [commissions on the option and underlying stock] are n't included in the table." |
|
}, |
|
"FIGREF6": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "AMR declined to comment , and Mr.Trump did [n't] respond to requests for interviews." |
|
}, |
|
"FIGREF7": { |
|
"uris": null, |
|
"type_str": "figure", |
|
"num": null, |
|
"text": "[The payments] aren't expected [to have an impact on coming operating results], Linear added ." |
|
}, |
|
"TABREF0": { |
|
"type_str": "table", |
|
"num": null, |
|
"text": "2011) 1 . (5) [We did] not [drive up to the door] but got down near the gate of the avenue.", |
|
"content": "<table/>", |
|
"html": null |
|
}, |
|
"TABREF1": { |
|
"type_str": "table", |
|
"num": null, |
|
"text": "heuristiccorr. incorr. acc.", |
|
"content": "<table><tr><td>Test Set</td><td/><td/><td/></tr><tr><td>baseline</td><td>336</td><td colspan=\"2\">238 .59</td></tr><tr><td>adverb</td><td>26</td><td colspan=\"2\">4 .87</td></tr><tr><td>nsubjpass</td><td>10</td><td colspan=\"2\">8 .56</td></tr><tr><td>neg. cue</td><td>33</td><td colspan=\"2\">20 .62</td></tr><tr><td colspan=\"2\">Development Set</td><td/><td/></tr><tr><td>baseline</td><td>257</td><td>174</td><td>.6</td></tr><tr><td>adverb</td><td>15</td><td colspan=\"2\">6 .71</td></tr><tr><td>nsubjpass</td><td>10</td><td colspan=\"2\">6 .63</td></tr><tr><td>neg. cue</td><td>21</td><td colspan=\"2\">19 .53</td></tr><tr><td colspan=\"4\">Figure 1: Performance of NegFocus heuristics</td></tr><tr><td colspan=\"4\">The overall performance of the system is almost</td></tr><tr><td colspan=\"4\">balanced between precision and recall with an f-</td></tr><tr><td colspan=\"2\">measure of .58.</td><td/><td/></tr><tr><td/><td>Test Set</td><td/><td/></tr><tr><td colspan=\"3\">Precision 60.00 [405/675]</td><td/></tr><tr><td>Recall</td><td colspan=\"2\">56.88 [405/712]</td><td/></tr><tr><td>F-score</td><td>58.40</td><td/><td/></tr><tr><td colspan=\"2\">Development Set</td><td/><td/></tr><tr><td colspan=\"3\">Precision 59.65 [303/508]</td><td/></tr><tr><td>Recall</td><td colspan=\"2\">57.06 [303/531]</td><td/></tr><tr><td>F-score</td><td>58.33</td><td/><td/></tr></table>", |
|
"html": null |
|
} |
|
} |
|
} |
|
} |