Proceedings of Human Language Technologies: The 11th Annual Conference of the North American Chapter of the Association for Computational Linguistics (HLT-NAACL 2010), Los Angeles, CA, USA, pp. 885-893, 2010.
Abstract. We present a new method that compresses sentences by removing words. In the first stage, it generates candidate compressions by removing branches from the source sentence’s dependency tree using a Maximum Entropy classifier. In the second stage, it selects the best of the candidate compressions using a Support Vector Machine Regression model. Experimental results show that our method achieves state-of-the-art performance without requiring any manually written rules.
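The two-stage pipeline described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the tree, word weights, and scoring function below are hypothetical stand-ins, with exhaustive subtree pruning in place of the Maximum Entropy classifier (stage 1) and a hand-set linear scorer in place of the SVM Regression model (stage 2).

```python
from itertools import combinations

def subtree(node, children):
    """All node indices in the subtree rooted at `node`."""
    nodes = {node}
    for child in children.get(node, []):
        nodes |= subtree(child, children)
    return nodes

def candidate_compressions(words, children, root, max_removed=2):
    """Stage 1 (simplified): generate candidates by pruning whole subtrees.
    Here every combination of up to `max_removed` non-root subtrees is
    removed; the paper instead uses a Maximum Entropy classifier to decide
    which branches to drop."""
    prunable = [i for i in range(len(words)) if i != root]
    candidates = set()
    for k in range(max_removed + 1):
        for combo in combinations(prunable, k):
            removed = set()
            for head in combo:
                removed |= subtree(head, children)
            candidates.add(tuple(w for i, w in enumerate(words)
                                 if i not in removed))
    return candidates

def best_compression(candidates, weights, length_penalty=1.0):
    """Stage 2 (simplified): score each candidate and keep the best.
    The hand-set word weights and length penalty stand in for the
    SVM Regression model's learned scoring function."""
    def score(cand):
        return sum(weights.get(w, 0.0) for w in cand) - length_penalty * len(cand)
    return max(candidates, key=score)

# Toy example: "old dogs quickly chase cats", root = "chase" (index 3).
words = ["old", "dogs", "quickly", "chase", "cats"]
children = {3: [1, 2, 4], 1: [0]}  # hypothetical dependency tree
weights = {"dogs": 2.0, "chase": 3.0, "cats": 2.0, "old": 0.5, "quickly": 0.5}

cands = candidate_compressions(words, children, root=3)
print(best_compression(cands, weights))  # → ('dogs', 'chase', 'cats')
```

Because only whole subtrees are removed, every candidate remains a connected dependency tree, which is what keeps the compressions grammatically plausible.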