Natural Language Engineering




Detecting errors in English article usage by non-native speakers


NA-RAE HAN a1, MARTIN CHODOROW a2 and CLAUDIA LEACOCK a3
a1 University of Pennsylvania, 619 Williams Hall, 36th & Spruce Street, Philadelphia, PA 19104, USA, and Educational Testing Service, Rosedale Rd., MS 13E, Princeton, NJ 08541, USA; e-mail: nrh@ling.upenn.edu
a2 Hunter College of the City University of New York, 695 Park Avenue, New York, NY 10021, USA; e-mail: mchodoro@hunter.cuny.edu
a3 Pearson Knowledge Technologies, 4940 Pearl East Circle, Boulder, CO 80301, USA; e-mail: cleacock@pearsonkt.com


Abstract

One of the most difficult challenges faced by non-native speakers of English is mastering the system of English articles. We trained a maximum entropy classifier to select among a/an, the, or zero article for noun phrases (NPs), based on a set of features extracted from the local context of each NP. When the classifier was trained on 6 million NPs, its performance on published text was about 83% correct. We then used the classifier to detect article errors in the TOEFL essays of native speakers of Chinese, Japanese, and Russian. These writers made such errors in about one out of every eight NPs, or almost once in every three sentences. The classifier's agreement with human annotators was 85% (kappa = 0.48) when it selected among a/an, the, or zero article. Agreement was 89% (kappa = 0.56) when it made a binary (yes/no) decision about whether the NP should have an article. Even with these levels of overall agreement, precision and recall in error detection were only 0.52 and 0.80, respectively. However, when the classifier was allowed to skip cases where its confidence was low, precision rose to 0.90, with 0.40 recall. Additional improvements in performance may require features that reflect general knowledge to handle phenomena such as indirect prior reference. In August 2005, the classifier was deployed as a component of Educational Testing Service's Criterion℠ Online Writing Evaluation Service.
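
To make the approach in the abstract concrete, the following sketch illustrates the general technique (not the authors' actual system): a maximum entropy classifier, realized here as multinomial logistic regression in scikit-learn, that chooses among a/an, the, or zero article from local-context features of an NP, with a confidence threshold that lets it skip low-confidence cases, trading recall for precision. The feature set, toy training data, and threshold value are illustrative assumptions only.

    # Minimal sketch of a maximum entropy article classifier with a
    # confidence threshold. Multinomial logistic regression is equivalent
    # to a maximum entropy model; feature names and data are assumptions.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression

    def np_features(head, head_pos, words_before, words_after):
        """Local-context features for one noun phrase (illustrative set)."""
        return {
            "head=" + head.lower(): 1,
            "head_pos=" + head_pos: 1,
            "prev_word=" + (words_before[-1].lower() if words_before else "<s>"): 1,
            "next_word=" + (words_after[0].lower() if words_after else "</s>"): 1,
            "head_is_plural": int(head_pos == "NNS"),
        }

    # Toy training examples: (features, correct article choice).
    train = [
        (np_features("dog", "NN", ["saw"], ["barking"]), "a/an"),
        (np_features("dog", "NN", ["fed", "the"], ["again"]), "the"),
        (np_features("dogs", "NNS", ["likes"], ["."]), "zero"),
        (np_features("idea", "NN", ["had"], ["about"]), "a/an"),
        (np_features("sun", "NN", ["under", "the"], ["."]), "the"),
        (np_features("water", "NN", ["drinks"], ["every"]), "zero"),
    ]

    vec = DictVectorizer()
    X = vec.fit_transform(f for f, _ in train)
    y = [label for _, label in train]

    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y)

    def predict_article(features, threshold=0.0):
        """Return the predicted article, or None when the classifier's
        confidence falls below the threshold (the 'skip' behaviour that
        raises precision at the cost of recall)."""
        probs = clf.predict_proba(vec.transform([features]))[0]
        best = probs.argmax()
        if probs[best] < threshold:
            return None
        return clf.classes_[best]

    print(predict_article(np_features("dog", "NN", ["saw"], ["barking"]), threshold=0.6))

In the system described above, error detection would then compare the classifier's high-confidence choice against the article the writer actually used; the threshold controls the precision/recall trade-off reported in the abstract.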

(Published Online May 22 2006)
(Received February 1 2006)