Yin Tat Lee
Journal Articles
- 2024
- [j12]Manru Zong, Yin Tat Lee, Man-Chung Yue:
Short-step methods are not strongly polynomial-time. Math. Program. 207(1): 733-746 (2024)
- 2022
- [j11]Yin Tat Lee, Santosh S. Vempala:
Geodesic Walks in Polytopes. SIAM J. Comput. 51(2): 17-400 (2022)
- 2021
- [j10]Michael B. Cohen, Yin Tat Lee, Zhao Song:
Solving Linear Programs in the Current Matrix Multiplication Time. J. ACM 68(1): 3:1-3:39 (2021)
- [j9]Sébastien Bubeck, Ronen Eldan, Yin Tat Lee:
Kernel-based Methods for Bandit Convex Optimization. J. ACM 68(4): 25:1-25:35 (2021)
- [j8]Yin Tat Lee, Man-Chung Yue:
Universal Barrier Is n-Self-Concordant. Math. Oper. Res. 46(3): 1129-1148 (2021)
- [j7]Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee:
Metrical Task Systems on Trees via Mirror Descent and Unfair Gluing. SIAM J. Comput. 50(3): 909-923 (2021)
- 2020
- [j6]Yin Tat Lee, Marcin Pilipczuk, David P. Woodruff:
Introduction to the Special Issue on SODA'18. ACM Trans. Algorithms 16(1): 1:1-1:2 (2020)
- 2019
- [j5]Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié:
Optimal Convergence Rates for Convex Distributed Optimization in Networks. J. Mach. Learn. Res. 20: 159:1-159:31 (2019)
- 2018
- [j4]Yin Tat Lee, He Sun:
Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time. SIAM J. Comput. 47(6): 2315-2336 (2018)
- 2017
- [j3]Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford:
Single Pass Spectral Sparsification in Dynamic Streams. SIAM J. Comput. 46(1): 456-477 (2017)
- [j2]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee:
Improved Cheeger's Inequality and Analysis of Local Graph Partitioning using Vertex Expansion and Expansion Profile. SIAM J. Comput. 46(3): 890-910 (2017)
- 2016
- [j1]Yin Tat Lee, Ka Chun Lam, Lok Ming Lui:
Landmark-Matching Transformation with Large Deformation Via n-dimensional Quasi-conformal Maps. J. Sci. Comput. 67(3): 926-954 (2016)
Conference and Workshop Papers
- 2024
- [c85]Chulin Xie, Zinan Lin, Arturs Backurs, Sivakanth Gopi, Da Yu, Huseyin A. Inan, Harsha Nori, Haotian Jiang, Huishuai Zhang, Yin Tat Lee, Bo Li, Sergey Yekhanin:
Differentially Private Synthetic Data via Foundation Model APIs 2: Text. ICML 2024
- [c84]Haotian Jiang, Yin Tat Lee, Zhao Song, Lichen Zhang:
Convex Minimization with Integer Minima in Õ(n⁴) Time. SODA 2024: 3659-3684
- [c83]Mehrdad Ghadiri, Yin Tat Lee, Swati Padmanabhan, William Swartworth, David P. Woodruff, Guanghao Ye:
Improving the Bit Complexity of Communication for Distributed Convex Optimization. STOC 2024: 1130-1140
- 2023
- [c82]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler. COLT 2023: 2399-2439
- [c81]Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala:
Condition-number-independent Convergence Rate of Riemannian Hamiltonian Monte Carlo with Numerical Integrators. COLT 2023: 4504-4569
- [c80]Reid Pryzant, Dan Iter, Jerry Li, Yin Tat Lee, Chenguang Zhu, Michael Zeng:
Automatic Prompt Optimization with "Gradient Descent" and Beam Search. EMNLP 2023: 7957-7968
- [c79]Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian:
ReSQueing Parallel and Private Stochastic Convex Optimization. FOCS 2023: 2031-2058
- [c78]Jiyan He, Xuechen Li, Da Yu, Huishuai Zhang, Janardhan Kulkarni, Yin Tat Lee, Arturs Backurs, Nenghai Yu, Jiang Bian:
Exploring the Limits of Differentially Private Deep Learning with Group-wise Clipping. ICLR 2023
- [c77]Kwangjun Ahn, Sébastien Bubeck, Sinho Chewi, Yin Tat Lee, Felipe Suarez, Yi Zhang:
Learning threshold neurons via edge of stability. NeurIPS 2023
- [c76]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Private Convex Optimization in General Norms. SODA 2023: 5068-5089
- [c75]Sophie Huiberts, Yin Tat Lee, Xinzhi Zhang:
Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method. STOC 2023: 1904-1917
- 2022
- [c74]Sivakanth Gopi, Yin Tat Lee, Daogao Liu:
Private Convex Optimization via Exponential Mechanism. COLT 2022: 1948-1989
- [c73]Yin Tat Lee, Santosh S. Vempala:
The Manifold Joys of Sampling (Invited Talk). ICALP 2022: 4:1-4:20
- [c72]Da Yu, Saurabh Naik, Arturs Backurs, Sivakanth Gopi, Huseyin A. Inan, Gautam Kamath, Janardhan Kulkarni, Yin Tat Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang:
Differentially Private Fine-tuning of Language Models. ICLR 2022
- [c71]Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye:
A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions. NeurIPS 2022
- [c70]Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye:
Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. NeurIPS 2022
- [c69]Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala:
Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. NeurIPS 2022
- [c68]Xuechen Li, Daogao Liu, Tatsunori B. Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Yin Tat Lee, Abhradeep Guha Thakurta:
When Does Differentially Private Learning Not Suffer in High Dimensions? NeurIPS 2022
- [c67]Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye:
Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. SODA 2022: 124-153
- [c66]Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford:
Computing Lewis Weights to High Precision. SODA 2022: 2723-2742
- [c65]Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford:
Faster maxflow via improved dynamic spectral vertex sparsifiers. STOC 2022: 543-556
- 2021
- [c64]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Structured Logconcave Sampling with a Restricted Gaussian Oracle. COLT 2021: 2993-3050
- [c63]Janardhan Kulkarni, Yin Tat Lee, Daogao Liu:
Private Non-smooth ERM and SCO in Subquadratic Steps. NeurIPS 2021: 4053-4064
- [c62]Sivakanth Gopi, Yin Tat Lee, Lukas Wutschitz:
Numerical Composition of Differential Privacy. NeurIPS 2021: 11631-11642
- [c61]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions. NeurIPS 2021: 18812-18824
- [c60]Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, Uthaipon Tantipongpipat:
Fast and Memory Efficient Differentially Private-SGD via JL Projections. NeurIPS 2021: 19680-19691
- [c59]Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang:
Minimum cost flows, MDPs, and ℓ₁-regression in nearly linear time for dense instances. STOC 2021: 859-869
- [c58]He Jia, Aditi Laddha, Yin Tat Lee, Santosh S. Vempala:
Reducing isotropy and volume to KLS: an o*(n³ψ²) volume algorithm. STOC 2021: 961-974
- [c57]Sally Dong, Yin Tat Lee, Guanghao Ye:
A nearly-linear time algorithm for linear programs with small treewidth: a multiscale representation of robust central path. STOC 2021: 1784-1797
- 2020
- [c56]Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford:
Leverage Score Sampling for Faster Accelerated Regression and ERM. ALT 2020: 22-47
- [c55]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. COLT 2020: 2565-2597
- [c54]Yin Tat Lee, Swati Padmanabhan:
An $\widetilde{\mathcal{O}}(m/\varepsilon^{3.5})$-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. COLT 2020: 3069-3119
- [c53]Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song:
A Faster Interior Point Method for Semidefinite Programming. FOCS 2020: 910-918
- [c52]Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang:
Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. FOCS 2020: 919-930
- [c51]Yin Tat Lee:
Convex Optimization and Dynamic Data Structure (Invited Talk). FSTTCS 2020: 3:1-3:1
- [c50]Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer:
Network size and size of the weights in memorization with two-layers neural networks. NeurIPS 2020
- [c49]Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian:
Acceleration with a Ball Optimization Oracle. NeurIPS 2020
- [c48]Marek Eliás, Michael Kapralov, Janardhan Kulkarni, Yin Tat Lee:
Differentially Private Release of Synthetic Graphs. SODA 2020: 560-578
- [c47]Sébastien Bubeck, Bo'az Klartag, Yin Tat Lee, Yuanzhi Li, Mark Sellke:
Chasing Nested Convex Bodies Nearly Optimally. SODA 2020: 1496-1508
- [c46]Sally Dong, Yin Tat Lee, Kent Quanrud:
Computing Circle Packing Representations of Planar Graphs. SODA 2020: 2860-2875
- [c45]Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song:
Solving tall dense linear programs in nearly linear time. STOC 2020: 775-788
- [c44]Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian:
Positive semidefinite programming: mixed, parallel, and width-independent. STOC 2020: 789-802
- [c43]Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong:
An improved cutting plane method for convex optimization, convex-concave games, and its applications. STOC 2020: 944-953
- [c42]Aditi Laddha, Yin Tat Lee, Santosh S. Vempala:
Strong self-concordance and sampling. STOC 2020: 1212-1222
- 2019
- [c41]Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford:
Near-optimal method for highly smooth convex optimization. COLT 2019: 492-507
- [c40]Michael B. Cohen, Ben Cousins, Yin Tat Lee, Xin Yang:
A near-optimal algorithm for approximating the John Ellipsoid. COLT 2019: 849-873
- [c39]Alexander V. Gasnikov, Pavel E. Dvurechensky, Eduard Gorbunov, Evgeniya A. Vorontsova, Daniil Selikhanovych, César A. Uribe, Bo Jiang, Haoyue Wang, Shuzhong Zhang, Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford:
Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives. COLT 2019: 1392-1393
- [c38]Yin Tat Lee, Zhao Song, Qiuyi Zhang:
Solving Empirical Risk Minimization in the Current Matrix Multiplication Time. COLT 2019: 2140-2157
- [c37]Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sahil Singla, Sam Chiu-wai Wong:
Faster Matroid Intersection. FOCS 2019: 1146-1168
- [c36]Sébastien Bubeck, Yin Tat Lee, Eric Price, Ilya P. Razenshteyn:
Adversarial examples from computational constraints. ICML 2019: 831-840
- [c35]Ruoqi Shen, Yin Tat Lee:
The Randomized Midpoint Method for Log-Concave Sampling. NeurIPS 2019: 2098-2109
- [c34]Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford:
Complexity of Highly Parallel Non-Smooth Convex Optimization. NeurIPS 2019: 13900-13909
- [c33]Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee:
Metrical task systems on trees via mirror descent and unfair gluing. SODA 2019: 89-97
- [c32]C. J. Argue, Sébastien Bubeck, Michael B. Cohen, Anupam Gupta, Yin Tat Lee:
A Nearly-Linear Bound for Chasing Nested Convex Bodies. SODA 2019: 117-122
- [c31]Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke:
Competitively chasing convex bodies. STOC 2019: 861-868
- [c30]Michael B. Cohen, Yin Tat Lee, Zhao Song:
Solving linear programs in the current matrix multiplication time. STOC 2019: 938-942
- 2018
- [c29]Yin Tat Lee, Aaron Sidford, Santosh S. Vempala:
Efficient Convex Optimization with Membership Oracles. COLT 2018: 1292-1294
- [c28]Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Laurent Massoulié, Yin Tat Lee:
Optimal Algorithms for Non-Smooth Distributed Optimization in Networks. NeurIPS 2018: 2745-2754
- [c27]Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, James R. Lee, Aleksander Madry:
k-server via multiscale entropic regularization. STOC 2018: 3-16
- [c26]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Akshay Ramachandran:
The Paulsen problem, continuous operator scaling, and smoothed analysis. STOC 2018: 182-189
- [c25]Ankit Garg, Yin Tat Lee, Zhao Song, Nikhil Srivastava:
A matrix expander Chernoff bound. STOC 2018: 1102-1114
- [c24]Yin Tat Lee, Santosh S. Vempala:
Convergence rate of riemannian Hamiltonian Monte Carlo and faster polytope volume computation. STOC 2018: 1115-1121
- [c23]Yin Tat Lee, Santosh S. Vempala:
Stochastic localization + Stieltjes barrier = tight bound for log-Sobolev. STOC 2018: 1122-1129
- [c22]Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, Yuanzhi Li:
An homotopy method for ℓp regression provably beyond self-concordance and in input-sparsity time. STOC 2018: 1130-1137
- 2017
- [c21]Yin Tat Lee, Santosh Srinivas Vempala:
Eldan's Stochastic Localization and the KLS Hyperplane Conjecture: An Improved Lower Bound for Expansion. FOCS 2017: 998-1007
- [c20]Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié:
Optimal Algorithms for Smooth and Strongly Convex Distributed Optimization in Networks. ICML 2017: 3027-3036
- [c19]Sébastien Bubeck, Yin Tat Lee, Ronen Eldan:
Kernel-based methods for bandit convex optimization. STOC 2017: 72-85
- [c18]Yin Tat Lee, He Sun:
An SDP-based algorithm for linear-sized spectral sparsification. STOC 2017: 678-687
- [c17]Yin Tat Lee, Santosh S. Vempala:
Geodesic walks in polytopes. STOC 2017: 927-940
- [c16]Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong:
Subquadratic submodular function minimization. STOC 2017: 1220-1231
- 2016
- [c15]Sébastien Bubeck, Yin Tat Lee:
Black-box Optimization with a Politician. ICML 2016: 1624-1631
- [c14]Zeyuan Allen Zhu, Yin Tat Lee, Lorenzo Orecchia:
Using Optimization to Obtain a Width-Independent, Parallel, Simpler, and Faster Positive SDP Solver. SODA 2016: 1824-1831
- [c13]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee:
Improved Cheeger's Inequality and Analysis of Local Graph Partitioning using Vertex Expansion and Expansion Profile. SODA 2016: 1848-1861
- [c12]Michael B. Cohen, Yin Tat Lee, Gary L. Miller, Jakub Pachocki, Aaron Sidford:
Geometric median in nearly linear time. STOC 2016: 9-21
- [c11]Rasmus Kyng, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Daniel A. Spielman:
Sparsified Cholesky and multigrid solvers for connection laplacians. STOC 2016: 842-850
- 2015
- [c10]Yin Tat Lee, Aaron Sidford:
Efficient Inverse Maintenance and Faster Algorithms for Linear Programming. FOCS 2015: 230-249
- [c9]Yin Tat Lee, He Sun:
Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time. FOCS 2015: 250-269
- [c8]Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong:
A Faster Cutting Plane Method and its Implications for Combinatorial and Convex Optimization. FOCS 2015: 1049-1065
- [c7]Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, Aaron Sidford:
Uniform Sampling for Matrix Approximation. ITCS 2015: 181-190
- 2014
- [c6]Yin Tat Lee, Aaron Sidford:
Path Finding Methods for Linear Programming: Solving Linear Programs in Õ(√rank) Iterations and Faster Algorithms for Maximum Flow. FOCS 2014: 424-433
- [c5]Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford:
Single Pass Spectral Sparsification in Dynamic Streams. FOCS 2014: 561-570
- [c4]Jonathan A. Kelner, Yin Tat Lee, Lorenzo Orecchia, Aaron Sidford:
An Almost-Linear-Time Algorithm for Approximate Max Flow in Undirected Graphs, and its Multicommodity Generalizations. SODA 2014: 217-226
- 2013
- [c3]Yin Tat Lee, Aaron Sidford:
Efficient Accelerated Coordinate Descent Methods and Faster Algorithms for Solving Linear Systems. FOCS 2013: 147-156
- [c2]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Shayan Oveis Gharan, Luca Trevisan:
Improved Cheeger's inequality: analysis of spectral partitioning algorithms through higher order spectral gap. STOC 2013: 11-20
- [c1]Yin Tat Lee, Satish Rao, Nikhil Srivastava:
A new approach to computing maximum flows using electrical flows. STOC 2013: 755-764
Informal and Other Publications
- 2024
- [i93]Chulin Xie, Zinan Lin, Arturs Backurs, Sivakanth Gopi, Da Yu, Huseyin A. Inan, Harsha Nori, Haotian Jiang, Huishuai Zhang, Yin Tat Lee, Bo Li, Sergey Yekhanin:
Differentially Private Synthetic Data via Foundation Model APIs 2: Text. CoRR abs/2403.01749 (2024)
- [i92]Mehrdad Ghadiri, Yin Tat Lee, Swati Padmanabhan, William Swartworth, David P. Woodruff, Guanghao Ye:
Improving the Bit Complexity of Communication for Distributed Convex Optimization. CoRR abs/2403.19146 (2024)
- [i91]Marah I Abdin, Sam Ade Jacobs, Ammar Ahmad Awan, Jyoti Aneja, Ahmed Awadallah, Hany Awadalla, Nguyen Bach, Amit Bahree, Arash Bakhtiari, Harkirat S. Behl, Alon Benhaim, Misha Bilenko, Johan Bjorck, Sébastien Bubeck, Martin Cai, Caio César Teodoro Mendes, Weizhu Chen, Vishrav Chaudhary, Parul Chopra, Allie Del Giorno, Gustavo de Rosa, Matthew Dixon, Ronen Eldan, Dan Iter, Amit Garg, Abhishek Goswami, Suriya Gunasekar, Emman Haider, Junheng Hao, Russell J. Hewett, Jamie Huynh, Mojan Javaheripi, Xin Jin, Piero Kauffmann, Nikos Karampatziakis, Dongwoo Kim, Mahoud Khademi, Lev Kurilenko, James R. Lee, Yin Tat Lee, Yuanzhi Li, Chen Liang, Weishung Liu, Eric Lin, Zeqi Lin, Piyush Madan, Arindam Mitra, Hardik Modi, Anh Nguyen, Brandon Norick, Barun Patra, Daniel Perez-Becker, Thomas Portet, Reid Pryzant, Heyang Qin, Marko Radmilac, Corby Rosset, Sambudha Roy, Olatunji Ruwase, Olli Saarikivi, Amin Saied, Adil Salim, Michael Santacroce, Shital Shah, Ning Shang, Hiteshi Sharma, Xia Song, Masahiro Tanaka, Xin Wang, Rachel Ward, Guanhua Wang, Philipp Witte, Michael Wyatt, Can Xu, Jiahang Xu, Sonali Yadav, Fan Yang, Ziyi Yang, Donghan Yu, Chengruidong Zhang, Cyril Zhang, Jianwen Zhang, Li Lyna Zhang, Yi Zhang, Yue Zhang, Yunan Zhang, Xiren Zhou:
Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone. CoRR abs/2404.14219 (2024)
- 2023
- [i90]Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian:
ReSQueing Parallel and Private Stochastic Convex Optimization. CoRR abs/2301.00457 (2023)
- [i89]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler. CoRR abs/2302.06085 (2023)
- [i88]Yangsibo Huang, Daogao Liu, Zexuan Zhong, Weijia Shi, Yin Tat Lee:
kNN-Adapter: Efficient Domain Adaptation for Black-Box Language Models. CoRR abs/2302.10879 (2023)
- [i87]Sébastien Bubeck, Varun Chandrasekaran, Ronen Eldan, Johannes Gehrke, Eric Horvitz, Ece Kamar, Peter Lee, Yin Tat Lee, Yuanzhi Li, Scott M. Lundberg, Harsha Nori, Hamid Palangi, Marco Túlio Ribeiro, Yi Zhang:
Sparks of Artificial General Intelligence: Early experiments with GPT-4. CoRR abs/2303.12712 (2023)
- [i86]Haotian Jiang, Yin Tat Lee, Zhao Song, Lichen Zhang:
Convex Minimization with Integer Minima in Õ(n⁴) Time. CoRR abs/2304.03426 (2023)
- [i85]Reid Pryzant, Dan Iter, Jerry Li, Yin Tat Lee, Chenguang Zhu, Michael Zeng:
Automatic Prompt Optimization with "Gradient Descent" and Beam Search. CoRR abs/2305.03495 (2023)
- [i84]Yiran Wu, Feiran Jia, Shaokun Zhang, Hangyu Li, Erkang Zhu, Yue Wang, Yin Tat Lee, Richard Peng, Qingyun Wu, Chi Wang:
An Empirical Study on Challenging Math Problem Solving with GPT-4. CoRR abs/2306.01337 (2023)
- [i83]Suriya Gunasekar, Yi Zhang, Jyoti Aneja, Caio César Teodoro Mendes, Allie Del Giorno, Sivakanth Gopi, Mojan Javaheripi, Piero Kauffmann, Gustavo de Rosa, Olli Saarikivi, Adil Salim, Shital Shah, Harkirat Singh Behl, Xin Wang, Sébastien Bubeck, Ronen Eldan, Adam Tauman Kalai, Yin Tat Lee, Yuanzhi Li:
Textbooks Are All You Need. CoRR abs/2306.11644 (2023)
- [i82]Yuanzhi Li, Sébastien Bubeck, Ronen Eldan, Allie Del Giorno, Suriya Gunasekar, Yin Tat Lee:
Textbooks Are All You Need II: phi-1.5 technical report. CoRR abs/2309.05463 (2023)
- [i81]Ruoqi Shen, Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Yuanzhi Li, Yi Zhang:
Positional Description Matters for Transformers Arithmetic. CoRR abs/2311.14737 (2023)
- [i80]Harsha Nori, Yin Tat Lee, Sheng Zhang, Dean Carignan, Richard Edgar, Nicolò Fusi, Nicholas King, Jonathan Larson, Yuanzhi Li, Weishung Liu, Renqian Luo, Scott Mayer McKinney, Robert Osazuwa Ness, Hoifung Poon, Tao Qin, Naoto Usuyama, Chris White, Eric Horvitz:
Can Generalist Foundation Models Outcompete Special-Purpose Tuning? Case Study in Medicine. CoRR abs/2311.16452 (2023)
- 2022
- [i79]Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala:
Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. CoRR abs/2202.01908 (2022)
- [i78]Sivakanth Gopi, Yin Tat Lee, Daogao Liu:
Private Convex Optimization via Exponential Mechanism. CoRR abs/2203.00263 (2022)
- [i77]Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye:
Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. CoRR abs/2205.01562 (2022)
- [i76]Xuechen Li, Daogao Liu, Tatsunori Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Yin Tat Lee, Abhradeep Guha Thakurta:
When Does Differentially Private Learning Not Suffer in High Dimensions? CoRR abs/2207.00160 (2022)
- [i75]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Private Convex Optimization in General Norms. CoRR abs/2207.08347 (2022)
- [i74]Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye:
Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. CoRR abs/2208.03811 (2022)
- [i73]Arun Jambulapati, Yin Tat Lee, Santosh S. Vempala:
A Slightly Improved Bound for the KLS Constant. CoRR abs/2208.11644 (2022)
- [i72]