Publication

Document retrieval for question answering: a quantitative evaluation of text preprocessing

File: 2007 CIKM p125-carvalho.pdf (387.55 KB, Adobe PDF)

Abstract(s)

Question Answering (QA) has been an area of interest for researchers, motivated in part by the international QA evaluation forums, namely the Text REtrieval Conference (TREC) and, more recently, the Cross Language Evaluation Forum (CLEF) through QA@CLEF, which has included the Portuguese language since 2004. In these forums, a collection of written documents is provided, along with a set of questions to be answered by the participating systems. Each system is evaluated on its capacity to answer the questions as a whole, and relatively few published results focus on the performance of its individual components and their influence on the overall system performance. That is the case of the Information Retrieval (IR) component, which is broadly used in QA systems. Our work concentrates on the different options for preprocessing Portuguese text before feeding it to the IR component, evaluating their impact on IR performance in the specific context of QA, so that an informed choice can be made among them. From this work we conclude that the basic preprocessing techniques, case folding and removal of punctuation marks, offer a clear advantage. Of the other techniques considered, stop word removal enhanced the performance of the IR system, but stemming and lemmatization did not.
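The basic preprocessing steps the abstract singles out (case folding, punctuation removal, and optional stop word removal) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the tiny Portuguese stop-word list below is an assumption for demonstration, where a real system would use a full list.

```python
import string

# Illustrative (incomplete) Portuguese stop-word list -- an assumption,
# not the list used in the paper.
STOP_WORDS = {"a", "o", "de", "que", "e", "do", "da", "em", "um", "para"}

def preprocess(text, remove_stop_words=True):
    """Apply basic text preprocessing: case folding, removal of
    punctuation marks, and (optionally) stop word removal."""
    # Case folding
    text = text.lower()
    # Removal of punctuation marks
    text = text.translate(str.maketrans("", "", string.punctuation))
    tokens = text.split()
    if remove_stop_words:
        tokens = [t for t in tokens if t not in STOP_WORDS]
    return tokens

print(preprocess("Quem descobriu o caminho marítimo para a Índia?"))
# → ['quem', 'descobriu', 'caminho', 'marítimo', 'índia']
```

Stemming and lemmatization, the techniques the study found unhelpful for this task, would be applied as a further step on the resulting tokens.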

Keywords

Information retrieval; Question answering
