Furu Wei
Partner Research Manager, Microsoft Research
Verified email at microsoft.com
BEiT: BERT pre-training of image transformers
H Bao, L Dong, S Piao, F Wei
arXiv preprint arXiv:2106.08254, 2021
Cited by 2802
Oscar: Object-Semantics Aligned Pre-training for Vision-Language Tasks
X Li, X Yin, C Li, P Zhang, X Hu, L Zhang, L Wang, H Hu, L Dong, F Wei, ...
Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, August 23 …, 2020
Cited by 2063
VL-BERT: Pre-training of generic visual-linguistic representations
W Su, X Zhu, Y Cao, B Li, L Lu, F Wei, J Dai
arXiv preprint arXiv:1908.08530, 2019
Cited by 1860
Swin Transformer V2: Scaling up capacity and resolution
Z Liu, H Hu, Y Lin, Z Yao, Z Xie, Y Wei, J Ning, Y Cao, Z Zhang, L Dong, ...
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2022
Cited by 1808
Unified language model pre-training for natural language understanding and generation
L Dong, N Yang, W Wang, F Wei, X Liu, Y Wang, J Gao, M Zhou, HW Hon
33rd Conference on Neural Information Processing Systems (NeurIPS 2019), 2019
Cited by 1791
Learning sentiment-specific word embedding for twitter sentiment classification
D Tang, F Wei, N Yang, M Zhou, T Liu, B Qin
Proceedings of the 52nd Annual Meeting of the Association for Computational …, 2014
Cited by 1621
WavLM: Large-scale self-supervised pre-training for full stack speech processing
S Chen, C Wang, Z Chen, Y Wu, S Liu, Z Chen, J Li, N Kanda, T Yoshioka, ...
IEEE Journal of Selected Topics in Signal Processing 16 (6), 1505-1518, 2022
Cited by 1576
Adaptive Recursive Neural Network for Target-dependent Twitter Sentiment Classification
L Dong, F Wei, C Tan, D Tang, M Zhou, K Xu
ACL, 2014
Cited by 1229
MiniLM: Deep self-attention distillation for task-agnostic compression of pre-trained transformers
W Wang, F Wei, L Dong, H Bao, N Yang, M Zhou
Advances in Neural Information Processing Systems 33, 5776-5788, 2020
Cited by 1118
Gated self-matching networks for reading comprehension and question answering
W Wang, N Yang, F Wei, B Chang, M Zhou
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by 850
HIBERT: Document Level Pre-training of Hierarchical Bidirectional Transformers for Document Summarization
X Zhang, F Wei, M Zhou
ACL, 2019
Cited by 817*
LayoutLM: Pre-training of text and layout for document image understanding
Y Xu, M Li, L Cui, S Huang, F Wei, M Zhou
Proceedings of the 26th ACM SIGKDD international conference on knowledge …, 2020
Cited by 770
Image as a foreign language: BEiT pretraining for all vision and vision-language tasks
W Wang, H Bao, L Dong, J Bjorck, Z Peng, Q Liu, K Aggarwal, ...
arXiv preprint arXiv:2208.10442, 2022
Cited by 701*
Recognizing named entities in tweets
X Liu, S Zhang, F Wei, M Zhou
Proceedings of the 49th annual meeting of the association for computational …, 2011
Cited by 642
Topic sentiment analysis in twitter: a graph-based hashtag sentiment classification approach
X Wang, F Wei, X Liu, M Zhou, M Zhang
Proceedings of the 20th ACM international conference on Information and …, 2011
Cited by 634
Question answering over freebase with multi-column convolutional neural networks
L Dong, F Wei, M Zhou, K Xu
Proceedings of the 53rd Annual Meeting of the Association for Computational …, 2015
Cited by 587
Neural codec language models are zero-shot text to speech synthesizers
C Wang, S Chen, Y Wu, Z Zhang, L Zhou, S Liu, Z Chen, Y Liu, H Wang, ...
arXiv preprint arXiv:2301.02111, 2023
Cited by 513
LayoutLMv2: Multi-modal pre-training for visually-rich document understanding
Y Xu, Y Xu, T Lv, L Cui, F Wei, G Wang, Y Lu, D Florencio, C Zhang, ...
arXiv preprint arXiv:2012.14740, 2020
Cited by 501
SuperAgent: A customer service chatbot for e-commerce websites
L Cui, S Huang, F Wei, C Tan, C Duan, M Zhou
Proceedings of ACL 2017, system demonstrations, 97-102, 2017
Cited by 486
Kosmos-2: Grounding multimodal large language models to the world
Z Peng, W Wang, L Dong, Y Hao, S Huang, S Ma, F Wei
arXiv preprint arXiv:2306.14824, 2023
Cited by 485
Articles 1–20