1. Salon topic: Methods and applications of statistical language models
2. Time: Sunday, September 4, 2011, 2:00-5:00 p.m.
3. Venue: Tsinghua Science Park
4. Host: Dr. Hu Rile

This salon will try a reading-group format. When you register, please also tell us which paper, related to the salon's topic, you plan to present. It may be your own paper, a high-quality paper you have encountered in your research, or one of the papers in the attachment below. If you choose a paper from the attachment, please try to pick one that no one else has chosen yet; we will post daily updates on which papers have been taken at www.52nlp.cn.

Registration deadline: Thursday, September 1
To register, reply to cmt.salon@gmail.com
The Chinese Translation Technology Salon's Douban group: http://www.douban.com/group/304684/
The salon's QQ groups: 80266425 (full); 172478666 (NLP); 172478453 (CAT)

Attachment (paper list):
1) S. F. Chen and J. Goodman, “An empirical study of smoothing techniques for language modeling”, Technical Report TR-10-98, Computer Science Group, Harvard University, Aug. 1998.
2) A. Stolcke, "Entropy-based pruning of backoff language models", in Proceedings of the DARPA Broadcast News Transcription and Understanding Workshop, pp. 270-274, Lansdowne, VA, Feb. 1998. Morgan Kaufmann.
3) P. F. Brown, V. J. Della Pietra, P. V. deSouza, J. C. Lai, and R. L. Mercer, "Class-based n-gram models of natural language", Computational Linguistics, vol. 18, pp. 467-479, 1992.
4) Kuhn R, De Mori R. A cache-based natural language model for speech recognition. IEEE PAMI, 1990, 12(6), pp. 570-583.
5) Kuhn R, De Mori R. Corrections to "A cache-based natural language model for speech recognition". IEEE PAMI, 1992, 14, pp. 691-692.
6) Goodman J. A bit of progress in language modeling. Computer Speech and Language, 2001, 15(4), pp. 403-434.
7)R. Rosenfeld, “Two decades of statistical language modeling: Where do we go from here?”, Proceedings of the IEEE, vol. 88, 2000.
8)R. Rosenfeld, A Maximum Entropy Approach to adaptive statistical language modeling, Computer Speech and Language 1996, 10, pages: 187-228
9) Bengio Y, Ducharme R, Vincent P, Jauvin C. A neural probabilistic language model. Journal of Machine Learning Research, 2003(3), pp. 1137-1155.
10) Xu P, Chelba C, Jelinek F. A study on richer syntactic dependencies for structured language modeling. ACL 2002.
11) Chelba C, Jelinek F. Structured language modeling. Computer Speech and Language, 2000, 14(4), pp. 283-332.
12) Gao J F, Lin C Y. Introduction to the special issue on statistical language modeling. ACM Transactions on Asian Language Information Processing, 2004, 3(2), pp. 87-93.
13) Berger A L, Della Pietra S A, Della Pietra V J. A maximum entropy approach to natural language processing. Computational Linguistics, 1996, 22(1), pp. 1-36.
14) Clarkson P R, Robinson A J. Language model adaptation using mixtures and an exponentially decaying cache. ICASSP 1997, vol. 2, pp. 799-802.
15) Kneser R, Steinbiss V. On the dynamic adaptation of stochastic language models. ICASSP 1993.
16) Kneser R, Ney H. Improved backing-off for m-gram language modeling. ICASSP 1995, vol. 1, pp. 181-184.
17) Khudanpur S, Wu J. A maximum entropy language model integrating n-gram and topic dependencies for conversational speech recognition. ICASSP 1999, pp. 553-556.
18) Gildea D, Hofmann T. Topic-based language models using EM. Eurospeech 1999, pp. 2167-2170.
19) Brants Thorsten, Popat Ashok. Large language models in machine translation. In Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, pp. 858-867.
20) Deyi Xiong, Min Zhang, & Haizhou Li. Enhancing language models in statistical machine translation with backward n-grams and mutual information triggers. ACL-HLT 2011: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, Portland, Oregon, June 19-24, 2011; pp. 1288-1297.

21) (taken) Adam Pauls & Dan Klein. Faster and smaller n-gram language models. ACL-HLT 2011: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, Portland, Oregon, June 19-24, 2011; pp. 258-267.

22) (taken) Robert C. Moore & William Lewis. Intelligent selection of language model training data. ACL 2010: the 48th Annual Meeting of the Association for Computational Linguistics, Uppsala, Sweden, July 11-16, 2010: Proceedings of the Conference Short Papers; pp. 220-224.

Author: cmtsalon
