Joe Davison
Transformers: State-of-the-Art Natural Language Processing
Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020
Cited by 16175*, 2020
Datasets: A community library for natural language processing
Q Lhoest, AV Del Moral, Y Jernite, A Thakur, P Von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
Cited by 544*, 2021
Commonsense knowledge mining from pretrained models
J Davison, J Feldman, AM Rush
Proceedings of the 2019 Conference on Empirical Methods in Natural Language …, 2019
Cited by 381, 2019
DEvol: Automated deep neural network design via genetic programming
J Davison
www.github.com/joeddav/devol, 2017
Cited by 24*, 2017
Flexible and scalable deep learning with MMLSpark
M Hamilton, S Raghunathan, A Annavajhala, D Kirsanov, E Leon, ...
International Conference on Predictive Applications and APIs, 11-22, 2018
Cited by 17, 2018
MS2Mol: A transformer model for illuminating dark chemical space from mass spectra
T Butler, A Frandsen, R Lightheart, B Bargh, J Taylor, TJ Bollerman, ...
Cited by 14, 2023
Multi-scale sinusoidal embeddings enable learning on high resolution mass spectrometry data
G Voronov, R Lightheart, J Davison, CA Krettler, D Healey, T Butler
arXiv preprint arXiv:2207.02980, 2022
Cited by 12, 2022
Cross-population Variational Autoencoders
J Davison, KA Severson, S Ghosh
Proceedings of NeurIPS 2019 Workshop on Bayesian Deep Learning, 2019
Cited by 2, 2019
Articles 1–8