CV/Resume
Michael Hanna
(+1) 872-356-8659 | m.w.hanna@uva.nl | hannamw.github.io
Education
University of Amsterdam, Institute for Logic, Language, and Computation (September 2022 – September 2026, expected)
- PhD, Computational Linguistics
Charles University (September 2022); GPA: 1 (=A: excellent, with honors)
- MS, Computer Science (computational linguistics track)
University of Trento (July 2022); GPA: 110/110 (with honors)
- MS, Cognitive Science (language and multimodal interaction track)
University of Chicago (June 2020); GPA: 3.96
- BS with Honors, Computer Science (specialization: machine learning); GPA: 3.95
- BA with Honors, Linguistics; GPA: 3.96
Publications
- Michael Hanna, Ollie Liu, and Alexandre Variengien. 2023. How does GPT-2 compute greater-than?: Interpreting mathematical abilities in a pre-trained language model. To appear at the Thirty-seventh Conference on Neural Information Processing Systems.
- Michael Hanna, Roberto Zamparelli, and David Mareček. 2023. The Functional Relevance of Probed Information: A Case Study. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 835–848, Dubrovnik, Croatia. Association for Computational Linguistics.
- Michael Hanna, Federico Pedeni, Alessandro Suglia, Alberto Testoni, and Raffaella Bernardi. 2022. ACT-Thor: A Controlled Benchmark for Embodied Action Understanding in Simulated Environments. In Proceedings of the 29th International Conference on Computational Linguistics, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
- Michael Hanna and Ondřej Bojar. 2021. A Fine-Grained Investigation of BERTScore. In Proceedings of the Sixth Conference on Machine Translation, Punta Cana, Dominican Republic (Online). Association for Computational Linguistics.
- Michael Hanna and David Mareček. 2021. Investigating BERT’s Knowledge of Hypernymy through Prompting. In Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Experience
Research Resident, Redwood Research
January 2023
- Used mechanistic interpretability techniques to understand GPT-2’s behavior, as part of the REMIX program.
December 2021 – Ongoing, Prague
- Researching the effect of reference translation quality on the reliability of machine translation metrics.
March 2021 – August 2021, Prague
- Used prompting to probe BERT for knowledge of hypernyms of common words.
- Conducted empirical experiments comparing BERT’s hypernym discovery performance to existing systems’.
Research Assistant, University of Chicago Dept. of Linguistics
January 2020 - June 2020, Chicago
- Used unsupervised clustering to test whether ELMo embeddings of polysemous words formed distinct clusters in the embedding space; this could allow for unsupervised learning of word senses.
- Used zero-shot probing tasks to explore the relationship between BERT’s (masked) language modeling pre-training and its high performance on downstream tasks.
Summer 2019, Boston
- Wrote monitors in Python to track and plot trends in data and send alerts when anomalies were detected; also wrote Dockerfiles for easy deployment to Kubernetes.
- Used pandas, matplotlib, and scikit-learn to analyze and quantify the accuracy of geospatial data from various sources, and ultimately decide which data provider to use.
Board Member, Board Manager (2019), Splash! Chicago
Sept. 2016 – Jun. 2020, Chicago
- Led Splash! Chicago, a volunteer student group that organizes large (100-student) educational events where high school students can learn from university students. Taught classes for Splash! Chicago in linguistics.
Grader, University of Chicago Dept. of Computer Science
Fall 2018 - July 2020, Chicago
- Graded the following courses: Intro to CS, Computer Systems, Computer Architecture, Time Series Analysis and Stochastic Processes
Feb. 2018 - June 2018, Chicago
- Fixed and refactored Scratch projects that teach elementary and middle school students about math and computer science.
- Developed new Scratch projects to meet specific STEM learning goals.
Languages
Programming / Markup
Human
Selected Coursework
- Natural Language Processing, Computational Linguistics, Grounded Language Processing
- Machine Learning, ML Theory
- Deep Learning, DL Systems, Deep Reinforcement Learning
- Psycholinguistics, Cognitive Science, Semantics, Syntax
Honors
Standardized Exams
- GRE (2019): 168 V / 169 Q / 6.0 AW