I am a postdoctoral scholar at the University of Washington, working with Rachel Lin and Jamie Morgenstern. I obtained my PhD from the Courant Institute of Mathematical Sciences at New York University under the supervision of Joan Bruna and Oded Regev.
Contact info: mjsong32 ät cs dot washington dot edu
I am interested in theoretical computer science and machine learning. My recent focus is on establishing algorithmic fairness using tools from theoretical computer science. I am also interested in the computational complexity of statistical inference.
(α-β) denotes alphabetical ordering, * denotes equal contribution.
Cryptographic Hardness of Score Estimation. Min Jae Song. Advances in Neural Information Processing Systems (NeurIPS), 2024. [arxiv]
Learning Single-Index Models with Shallow Neural Networks. Alberto Bietti, Joan Bruna, Clayton Sanford, Min Jae Song (α-β). Advances in Neural Information Processing Systems (NeurIPS), 2022. [arxiv]
Lattice-Based Methods Surpass Sum-of-Squares in Clustering. Ilias Zadik, Min Jae Song, Alexander S. Wein, Joan Bruna. Proceedings of the Conference on Learning Theory (COLT), 2022. [arxiv, video]
On the Cryptographic Hardness of Learning Single Periodic Neurons. Min Jae Song*, Ilias Zadik*, Joan Bruna. Advances in Neural Information Processing Systems (NeurIPS), 2021. [arxiv, video]
Continuous LWE. Joan Bruna, Oded Regev, Min Jae Song, Yi Tang (α-β). ACM Symposium on Theory of Computing (STOC), 2021. [arxiv, video]
Self-Supervised Motion Retargeting with Safety Guarantee. Sungjoon Choi, Min Jae Song, Hyemin Ahn, Joohyung Kim. IEEE International Conference on Robotics and Automation (ICRA), 2021. [arxiv, video]
Evaluating Representations by the Complexity of Learning Low-Loss Predictors. William F. Whitney, Min Jae Song, David Brandfonbrener, Jaan Altosaar, Kyunghyun Cho. ICLR Neural Compression Workshop, 2021. [arxiv, code, blog]
Hardness of Approximate Nearest Neighbor Search under L-Infinity. Young Kun Ko, Min Jae Song (α-β). arXiv preprint, 2020. [arxiv]