Ming-Wei Chang
Cited by
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
J Devlin, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1810.04805, 2018
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
REALM: Retrieval-Augmented Language Model Pre-Training
K Guu, K Lee, Z Tung, P Pasupat, MW Chang
Load forecasting using support vector machines: A study on EUNITE competition 2001
BJ Chen, MW Chang, CJ Lin
IEEE Transactions on Power Systems 19 (4), 1821-1830, 2004
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
arXiv preprint arXiv:2312.11805, 2023
Latent retrieval for weakly supervised open domain question answering
K Lee, MW Chang, K Toutanova
arXiv preprint arXiv:1906.00300, 2019
BoolQ: Exploring the surprising difficulty of natural yes/no questions
C Clark, K Lee, MW Chang, T Kwiatkowski, M Collins, K Toutanova
arXiv preprint arXiv:1905.10044, 2019
Well-read students learn better: On the importance of pre-training compact models
I Turc, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1908.08962, 2019
Semantic Parsing via Staged Query Graph Generation: Question Answering with Knowledge Base
W Yih, MW Chang, X He, J Gao
ACL, 2015
A knowledge-grounded neural conversation model
M Ghazvininejad, C Brockett, MW Chang, B Dolan, J Gao, W Yih, ...
Proceedings of the AAAI conference on artificial intelligence 32 (1), 2018
The value of semantic parse labeling for knowledge base question answering
W Yih, M Richardson, C Meek, MW Chang, J Suh
Proceedings of the 54th Annual Meeting of the Association for Computational …, 2016
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
J Devlin, MW Chang, K Lee, K Toutanova
arXiv preprint arXiv:1810.04805, 2019
Importance of Semantic Representation: Dataless Classification.
MW Chang, LA Ratinov, D Roth, V Srikumar
AAAI 2, 830-835, 2008
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
J Devlin, MW Chang, K Lee, K Toutanova
Association for Computational Linguistics, 4171-4186, 2019
Question answering using enhanced lexical semantic models
SW Yih, MW Chang, C Meek, A Pastusiak
Proceedings of the 51st Annual Meeting of the Association for Computational …, 2013
Driving semantic parsing from the world’s response
J Clarke, D Goldwasser, MW Chang, D Roth
Proceedings of the fourteenth conference on computational natural language …, 2010
Zero-shot entity linking by reading entity descriptions
L Logeswaran, MW Chang, K Lee, K Toutanova, J Devlin, H Lee
arXiv preprint arXiv:1906.07348, 2019
Large dual encoders are generalizable retrievers
J Ni, C Qu, J Lu, Z Dai, GH Ábrego, J Ma, VY Zhao, Y Luan, KB Hall, ...
arXiv preprint arXiv:2112.07899, 2021
Guiding semi-supervision with constraint-driven learning
MW Chang, L Ratinov, D Roth
Proceedings of the 45th annual meeting of the association of computational …, 2007
To link or not to link? a study on end-to-end tweet entity linking
S Guo, MW Chang, E Kiciman
Proceedings of the 2013 conference of the North American chapter of the …, 2013