Some of the projects and explorations I’m most proud of.

Research

Currently working in the Isola lab with Minyoung Huh, Tongzhou Wang, and Brian Cheung on in-context learning and inner optimization in transformers.

The Quantization Model of Neural Scaling. Eric J. Michaud, Ziming Liu, Uzay Girit, Max Tegmark. NeurIPS, 2023. Fun work on modeling emergence at scale as a product of structure in the task distribution of language. I personally worked on a method for clustering inputs by the mechanisms they elicit in the model, which surfaces semantically meaningful clusters in natural language data.
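To give a flavor of that clustering idea, here is a minimal sketch, not the paper's actual code: it uses per-example loss gradients as a rough proxy for the mechanism an input elicits, then spectrally clusters their cosine similarities. The model, loss, data, and cluster count below are all placeholders.

```python
# Minimal sketch (not the paper's code): cluster inputs by the gradient they
# induce in a model, as a proxy for the "mechanism" each input elicits.
import torch
from sklearn.cluster import SpectralClustering

def per_example_gradient(model, loss_fn, x, y):
    """Flattened gradient of the loss on a single example w.r.t. all parameters."""
    model.zero_grad()
    loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
    grads = torch.autograd.grad(loss, [p for p in model.parameters() if p.requires_grad])
    return torch.cat([g.flatten() for g in grads])

def cluster_by_mechanism(model, loss_fn, xs, ys, n_clusters=10):
    # One gradient vector per example, L2-normalized so only direction matters.
    gs = torch.stack([per_example_gradient(model, loss_fn, x, y) for x, y in zip(xs, ys)])
    gs = gs / gs.norm(dim=1, keepdim=True)
    # Cosine similarity between gradients; clip negatives so it is a valid affinity.
    affinity = (gs @ gs.T).clamp(min=0).cpu().numpy()
    labels = SpectralClustering(n_clusters=n_clusters, affinity="precomputed").fit_predict(affinity)
    return labels  # examples sharing a label plausibly rely on a shared mechanism
```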

In the past, I did some exploratory work on safe neurosymbolic RL algorithms inspired by DreamCoder, and I have another ongoing project on interpreting adversarial examples in vision.

Software