ASQA: Academia Sinica Question Answering System for NTCIR-5 CLQA
2005
Abstract
We propose a hybrid architecture for the NTCIR-5 CLQA C-C (Cross-Language Question Answering from Chinese to Chinese) Task. Our system, the Academia Sinica Question-Answering System (ASQA), outputs exact answers to six types of factoid questions: personal names, location names, organization names, artifacts, times, and numbers. The architecture of ASQA comprises four main components: Question Processing, Passage Retrieval, Answer Extraction, and Answer Ranking. ASQA successfully combines machine learning and knowledge-based approaches to answer Chinese factoid questions, achieving 37.5% and 44.5% Top-1 accuracy for correct and correct+unsupported answers, respectively.
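To make the four-stage architecture concrete, the sketch below shows how the components named in the abstract could be wired together as a pipeline. It is an illustrative outline only: every function name, signature, and placeholder body is an assumption made for exposition, not the actual ASQA implementation.

```python
# A minimal sketch of a four-stage factoid QA pipeline, following the component
# names in the abstract (Question Processing, Passage Retrieval, Answer
# Extraction, Answer Ranking). All functions and data structures here are
# hypothetical placeholders, not code from the real ASQA system.

# The six factoid answer types handled by ASQA.
ANSWER_TYPES = {"PERSON", "LOCATION", "ORGANIZATION", "ARTIFACT", "TIME", "NUMBER"}


def question_processing(question: str) -> dict:
    """Classify the question into one of the six answer types and pick
    query terms (placeholder: naive keyword split, fixed type)."""
    analysis = {
        "question": question,
        "answer_type": "PERSON",   # a real classifier would decide this
        "terms": question.split(),
    }
    assert analysis["answer_type"] in ANSWER_TYPES
    return analysis


def passage_retrieval(analysis: dict) -> list[str]:
    """Retrieve candidate passages for the query terms (a real system would
    query a full-text index; this stub returns a fixed list)."""
    return ["placeholder passage containing a candidate answer"]


def answer_extraction(analysis: dict, passages: list[str]) -> list[str]:
    """Extract candidate answers matching the expected type, e.g. with a
    named-entity recognizer (placeholder implementation)."""
    return ["candidate answer"]


def answer_ranking(analysis: dict, candidates: list[str]) -> list[str]:
    """Rank the candidates; a real system would score them with learned and
    knowledge-based features, here the input order is simply kept."""
    return candidates


def answer(question: str) -> str:
    """Run the four stages in sequence and return the top-ranked answer."""
    analysis = question_processing(question)
    passages = passage_retrieval(analysis)
    candidates = answer_extraction(analysis, passages)
    ranked = answer_ranking(analysis, candidates)
    return ranked[0] if ranked else ""


if __name__ == "__main__":
    print(answer("Who founded Academia Sinica?"))
```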