Neil Band

I'm a second-year PhD student in the Computer Science Department at Stanford University. My advisors are Tatsunori Hashimoto and Tengyu Ma. My work is supported by the NSF GRFP and Quad Fellowship.

I was previously a Master's by Research student in the Oxford Applied and Theoretical Machine Learning Group (OATML) at the University of Oxford, where I was advised by Yarin Gal and supported by a Rhodes Scholarship. Before that, I did my A.B. in Computer Science and Economics at Harvard College. I worked with Stratos Idreos in the Harvard Data Systems Lab on systems machine learning, and with Manolis Kellis in the MIT Computational Biology Group on machine learning for single-cell transcriptomics.

Google Scholar  /  Twitter  /  GitHub

Research

My long-term goal is to build frontier AI systems that safely and legibly advance scientific knowledge and our collective well-being. My current focus is on harnessing the internal knowledge of large models to make them more reliable and interpretable.

Conference Papers
Benchmarking Bayesian Deep Learning on Diabetic Retinopathy Detection Tasks
Neil Band*, Tim G. J. Rudner*, Qixuan Feng, Angelos Filos, Zachary Nado,
Michael W. Dusenberry, Ghassen Jerfel, Dustin Tran, Yarin Gal.
NeurIPS Datasets and Benchmarks Track, 2021
NeurIPS Workshop on Distribution Shifts, 2021 (Spotlight, 4.6% of accepted papers)
Symposium on Machine Learning for Health (ML4H) Extended Abstract Track, 2021
NeurIPS Workshop on Bayesian Deep Learning, 2021

paper / poster / code / bibtex

An easy-to-use, expert-guided, open-source suite of diabetic retinopathy detection benchmarking tasks for Bayesian deep learning.

Shifts: A Dataset of Real Distributional Shift Across Multiple Large-Scale Tasks
Andrey Malinin, Neil Band, German Chesnokov, Yarin Gal, Mark J.F. Gales, Alexey Noskov, Andrey Ploskonosov, Liudmila Prokhorenkova, Ivan Provilkov, Vatsal Raina, Vyas Raina, Mariya Shmatova, Panos Tigas, Boris Yangel.
NeurIPS Datasets and Benchmarks Track, 2021

arXiv / NeurIPS 2021 competition / code / bibtex / OATML blog / Yandex Research blog

Three industry-scale tasks for evaluating robustness and uncertainty quality under distribution shift, including the largest vehicle motion prediction dataset to date.

Self-Attention Between Datapoints: Going Beyond Individual Input-Output Pairs in Deep Learning
Jannik Kossen*, Neil Band*, Clare Lyle, Aidan N. Gomez, Tom Rainforth, Yarin Gal.
NeurIPS, 2021

arXiv / poster / code / community video

A novel deep learning architecture that takes the entire dataset as input and learns to reason about relationships between datapoints using self-attention.
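To make the core idea concrete, here is a minimal sketch (my own illustration, not the paper's implementation): standard scaled dot-product self-attention applied across the datapoint axis of a dataset, so each datapoint can attend to every other datapoint. The projection matrices and shapes are hypothetical and chosen only for illustration.

```python
# Minimal sketch: self-attention between datapoints (rows), not between features.
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_between_datapoints(X, Wq, Wk, Wv):
    """X: (n_datapoints, d) embeddings of the whole dataset.
    Wq, Wk, Wv: hypothetical projection matrices for illustration."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # (n, d) each
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # (n, n): datapoint-to-datapoint
    return softmax(scores, axis=-1) @ V         # each datapoint mixes information from the others

rng = np.random.default_rng(0)
n, d = 8, 16
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
out = attention_between_datapoints(X, Wq, Wk, Wv)  # shape (8, 16)
```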

Conference Workshop Papers
Plex: Towards Reliability using Pretrained Large Model Extensions
Dustin Tran, Jeremiah Liu, Michael W. Dusenberry, Du Phan, Mark Collier, Jie Ren, Kehang Han, Zi Wang, Zelda Mariet, Huiyi Hu, Neil Band, Tim G. J. Rudner, Karan Singhal, Zachary Nado, Joost van Amersfoort, Andreas Kirsch, Rodolphe Jenatton, Nithum Thain, Honglin Yuan, Kelly Buchanan, Kevin Murphy, D. Sculley, Yarin Gal, Zoubin Ghahramani, Jasper Snoek, Balaji Lakshminarayanan.
ICML Pre-training Workshop, 2022 (Contributed Talk, 5.9% of accepted papers)
Bay Area Machine Learning Symposium, 2022 (Spotlight Talk)
ICML Principles of Distribution Shift Workshop, 2022

paper / code / talk / bibtex / Google AI blog

A suite of tasks assessing model reliability in both vision and language, and Plex, a large pretrained model that improves the state of the art across these tasks.

Uncertainty Baselines: Benchmarks for Uncertainty & Robustness in Deep Learning
Zachary Nado, Neil Band*, Mark Collier, Josip Djolonga, Michael W. Dusenberry, Sebastian Farquhar, Angelos Filos, Marton Havasi, Rodolphe Jenatton, Ghassen Jerfel, Jeremiah Liu, Zelda Mariet, Jeremy Nixon, Shreyas Padhy, Jie Ren, Tim G. J. Rudner, Yeming Wen, Florian Wenzel, Kevin Murphy, D. Sculley, Balaji Lakshminarayanan, Jasper Snoek, Yarin Gal, Dustin Tran. (*Not placed alphabetically.)
NeurIPS Workshop on Bayesian Deep Learning, 2021

arXiv / code / Google AI blog

A high-quality repository of many datasets, baselines, and techniques relevant to the Bayesian deep learning literature.

Conference Short Papers
MemFlow: Memory-Aware Distributed Deep Learning
Neil Band.
SIGMOD (Two-Page Extended Abstract), 2020 (Second Place, SIGMOD Student Research Competition)

paper / poster / bibtex

An optimization framework for distributed deep learning that jointly considers memory usage and computation time when searching for a parallelization strategy.
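As a toy illustration of the general idea (not MemFlow's actual cost model or search procedure), the sketch below scores hypothetical candidate parallelization strategies by estimated step time while rejecting any whose estimated per-device memory exceeds a budget; all names and numbers are invented for illustration.

```python
# Toy illustration: pick the fastest parallelization strategy that fits in memory.
from dataclasses import dataclass

@dataclass
class Strategy:                      # hypothetical candidate parallelization plan
    name: str
    est_step_time_ms: float          # estimated computation + communication time
    est_peak_memory_gb: float        # estimated per-device peak memory

def pick_strategy(candidates, memory_budget_gb):
    feasible = [s for s in candidates if s.est_peak_memory_gb <= memory_budget_gb]
    if not feasible:
        raise ValueError("no strategy fits in the per-device memory budget")
    return min(feasible, key=lambda s: s.est_step_time_ms)

candidates = [
    Strategy("data-parallel",     est_step_time_ms=120.0, est_peak_memory_gb=18.0),
    Strategy("model-parallel",    est_step_time_ms=150.0, est_peak_memory_gb=9.0),
    Strategy("pipeline-parallel", est_step_time_ms=135.0, est_peak_memory_gb=11.0),
]
best = pick_strategy(candidates, memory_budget_gb=16.0)  # -> "pipeline-parallel"
```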


Based on Jon Barron's website (source code here).