


Yin Tat Lee
2020 – today
- 2023
- [c75] Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian: Private Convex Optimization in General Norms. SODA 2023: 5068-5089
- [c74] Sophie Huiberts, Yin Tat Lee, Xinzhi Zhang: Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method. STOC 2023: 1904-1917
- [i85] Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian: ReSQueing Parallel and Private Stochastic Convex Optimization. CoRR abs/2301.00457 (2023)
- [i84] Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian: Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler. CoRR abs/2302.06085 (2023)
- [i83] Yangsibo Huang, Daogao Liu, Zexuan Zhong, Weijia Shi, Yin Tat Lee: kNN-Adapter: Efficient Domain Adaptation for Black-Box Language Models. CoRR abs/2302.10879 (2023)
- [i82] Sébastien Bubeck, Varun Chandrasekaran, Ronen Eldan, Johannes Gehrke, Eric Horvitz, Ece Kamar, Peter Lee, Yin Tat Lee, Yuanzhi Li, Scott M. Lundberg, Harsha Nori, Hamid Palangi, Marco Túlio Ribeiro, Yi Zhang: Sparks of Artificial General Intelligence: Early experiments with GPT-4. CoRR abs/2303.12712 (2023)
- [i81] Haotian Jiang, Yin Tat Lee, Zhao Song, Lichen Zhang: Convex Minimization with Integer Minima in Õ(n^4) Time. CoRR abs/2304.03426 (2023)
- [i80] Reid Pryzant, Dan Iter, Jerry Li, Yin Tat Lee, Chenguang Zhu, Michael Zeng: Automatic Prompt Optimization with "Gradient Descent" and Beam Search. CoRR abs/2305.03495 (2023)
- 2022
- [j11] Yin Tat Lee, Santosh S. Vempala: Geodesic Walks in Polytopes. SIAM J. Comput. 51(2): 17-400 (2022)
- [c73] Sivakanth Gopi, Yin Tat Lee, Daogao Liu: Private Convex Optimization via Exponential Mechanism. COLT 2022: 1948-1989
- [c72] Yin Tat Lee, Santosh S. Vempala: The Manifold Joys of Sampling (Invited Talk). ICALP 2022: 4:1-4:20
- [c71] Da Yu, Saurabh Naik, Arturs Backurs, Sivakanth Gopi, Huseyin A. Inan, Gautam Kamath, Janardhan Kulkarni, Yin Tat Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang: Differentially Private Fine-tuning of Language Models. ICLR 2022
- [c70] Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye: A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions. NeurIPS 2022
- [c69] Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye: Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. NeurIPS 2022
- [c68] Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala: Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. NeurIPS 2022
- [c67] Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye: Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. SODA 2022: 124-153
- [c66] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. SODA 2022: 2723-2742
- [c65] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster maxflow via improved dynamic spectral vertex sparsifiers. STOC 2022: 543-556
- [i79] Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala: Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. CoRR abs/2202.01908 (2022)
- [i78] Sivakanth Gopi, Yin Tat Lee, Daogao Liu: Private Convex Optimization via Exponential Mechanism. CoRR abs/2203.00263 (2022)
- [i77] Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye: Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. CoRR abs/2205.01562 (2022)
- [i76] Xuechen Li, Daogao Liu, Tatsunori Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Yin Tat Lee, Abhradeep Guha Thakurta: When Does Differentially Private Learning Not Suffer in High Dimensions? CoRR abs/2207.00160 (2022)
- [i75] Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian: Private Convex Optimization in General Norms. CoRR abs/2207.08347 (2022)
- [i74] Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye: Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. CoRR abs/2208.03811 (2022)
- [i73] Arun Jambulapati, Yin Tat Lee, Santosh S. Vempala: A Slightly Improved Bound for the KLS Constant. CoRR abs/2208.11644 (2022)
- [i72] Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala: Condition-number-independent Convergence Rate of Riemannian Hamiltonian Monte Carlo with Numerical Integrators. CoRR abs/2210.07219 (2022)
- [i71] Sophie Huiberts, Yin Tat Lee, Xinzhi Zhang: Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method. CoRR abs/2211.11860 (2022)
- [i70] Jiyan He, Xuechen Li, Da Yu, Huishuai Zhang, Janardhan Kulkarni, Yin Tat Lee, Arturs Backurs, Nenghai Yu, Jiang Bian: Exploring the Limits of Differentially Private Deep Learning with Group-wise Clipping. CoRR abs/2212.01539 (2022)
- [i69] Kwangjun Ahn, Sébastien Bubeck, Sinho Chewi, Yin Tat Lee, Felipe Suarez, Yi Zhang: Learning threshold neurons via the "edge of stability". CoRR abs/2212.07469 (2022)
- 2021
- [j10] Michael B. Cohen, Yin Tat Lee, Zhao Song: Solving Linear Programs in the Current Matrix Multiplication Time. J. ACM 68(1): 3:1-3:39 (2021)
- [j9] Sébastien Bubeck, Ronen Eldan, Yin Tat Lee: Kernel-based Methods for Bandit Convex Optimization. J. ACM 68(4): 25:1-25:35 (2021)
- [j8] Yin Tat Lee, Man-Chung Yue: Universal Barrier Is n-Self-Concordant. Math. Oper. Res. 46(3): 1129-1148 (2021)
- [j7] Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee: Metrical Task Systems on Trees via Mirror Descent and Unfair Gluing. SIAM J. Comput. 50(3): 909-923 (2021)
- [c64] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Structured Logconcave Sampling with a Restricted Gaussian Oracle. COLT 2021: 2993-3050
- [c63] Janardhan Kulkarni, Yin Tat Lee, Daogao Liu: Private Non-smooth ERM and SCO in Subquadratic Steps. NeurIPS 2021: 4053-4064
- [c62] Sivakanth Gopi, Yin Tat Lee, Lukas Wutschitz: Numerical Composition of Differential Privacy. NeurIPS 2021: 11631-11642
- [c61] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions. NeurIPS 2021: 18812-18824
- [c60] Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, Uthaipon Tantipongpipat: Fast and Memory Efficient Differentially Private-SGD via JL Projections. NeurIPS 2021: 19680-19691
- [c59] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum cost flows, MDPs, and ℓ1-regression in nearly linear time for dense instances. STOC 2021: 859-869
- [c58] He Jia, Aditi Laddha, Yin Tat Lee, Santosh S. Vempala: Reducing isotropy and volume to KLS: an o*(n^3 ψ^2) volume algorithm. STOC 2021: 961-974
- [c57] Sally Dong, Yin Tat Lee, Guanghao Ye: A nearly-linear time algorithm for linear programs with small treewidth: a multiscale representation of robust central path. STOC 2021: 1784-1797
- [i68] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum Cost Flows, MDPs, and ℓ1-Regression in Nearly Linear Time for Dense Instances. CoRR abs/2101.05719 (2021)
- [i67] Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, Uthaipon Tantipongpipat: Fast and Memory Efficient Differentially Private-SGD via JL Projections. CoRR abs/2102.03013 (2021)
- [i66] Janardhan Kulkarni, Yin Tat Lee, Daogao Liu: Private Non-smooth Empirical Risk Minimization and Stochastic Convex Optimization in Subquadratic Steps. CoRR abs/2103.15352 (2021)
- [i65] Sivakanth Gopi, Yin Tat Lee, Lukas Wutschitz: Numerical Composition of Differential Privacy. CoRR abs/2106.02848 (2021)
- [i64] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions. CoRR abs/2106.05480 (2021)
- [i63] Yin Tat Lee, Santosh S. Vempala: Tutorial on the Robust Interior Point Method. CoRR abs/2108.04734 (2021)
- [i62] Da Yu, Saurabh Naik, Arturs Backurs, Sivakanth Gopi, Huseyin A. Inan, Gautam Kamath, Janardhan Kulkarni, Yin Tat Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang: Differentially Private Fine-tuning of Language Models. CoRR abs/2110.06500 (2021)
- [i61] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. CoRR abs/2110.15563 (2021)
- [i60] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster Maxflow via Improved Dynamic Spectral Vertex Sparsifiers. CoRR abs/2112.00722 (2021)
- 2020
- [j6] Yin Tat Lee, Marcin Pilipczuk, David P. Woodruff: Introduction to the Special Issue on SODA'18. ACM Trans. Algorithms 16(1): 1:1-1:2 (2020)
- [c56] Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford: Leverage Score Sampling for Faster Accelerated Regression and ERM. ALT 2020: 22-47
- [c55] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. COLT 2020: 2565-2597
- [c54] Yin Tat Lee, Swati Padmanabhan: An Õ(m/ε^3.5)-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. COLT 2020: 3069-3119
- [c53] Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song: A Faster Interior Point Method for Semidefinite Programming. FOCS 2020: 910-918
- [c52] Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. FOCS 2020: 919-930
- [c51] Yin Tat Lee: Convex Optimization and Dynamic Data Structure (Invited Talk). FSTTCS 2020: 3:1-3:1
- [c50] Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer: Network size and size of the weights in memorization with two-layers neural networks. NeurIPS 2020
- [c49] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. NeurIPS 2020
- [c48] Marek Eliás, Michael Kapralov, Janardhan Kulkarni, Yin Tat Lee: Differentially Private Release of Synthetic Graphs. SODA 2020: 560-578
- [c47] Sébastien Bubeck, Bo'az Klartag, Yin Tat Lee, Yuanzhi Li, Mark Sellke: Chasing Nested Convex Bodies Nearly Optimally. SODA 2020: 1496-1508
- [c46] Sally Dong, Yin Tat Lee, Kent Quanrud: Computing Circle Packing Representations of Planar Graphs. SODA 2020: 2860-2875
- [c45] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving tall dense linear programs in nearly linear time. STOC 2020: 775-788
- [c44] Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian: Positive semidefinite programming: mixed, parallel, and width-independent. STOC 2020: 789-802
- [c43] Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong: An improved cutting plane method for convex optimization, convex-concave games, and its applications. STOC 2020: 944-953
- [c42] Aditi Laddha, Yin Tat Lee, Santosh S. Vempala: Strong self-concordance and sampling. STOC 2020: 1212-1222
- [i59] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving Tall Dense Linear Programs in Nearly Linear Time. CoRR abs/2002.02304 (2020)
- [i58] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. CoRR abs/2002.04121 (2020)
- [i57] Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian: Positive Semidefinite Programming: Mixed, Parallel, and Width-Independent. CoRR abs/2002.04830 (2020)
- [i56] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. CoRR abs/2003.08078 (2020)
- [i55] Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong: An Improved Cutting Plane Method for Convex Optimization, Convex-Concave Games and its Applications. CoRR abs/2004.04250 (2020)
- [i54] Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer: Network size and weights size for memorization with two-layers neural networks. CoRR abs/2006.02855 (2020)
- [i53] Ruoqi Shen, Kevin Tian, Yin Tat Lee: Composite Logconcave Sampling with a Restricted Gaussian Oracle. CoRR abs/2006.05976 (2020)
- [i52] He Jia, Aditi Laddha, Yin Tat Lee, Santosh S. Vempala: Reducing Isotropy and Volume to KLS: An O(n^3 ψ^2) Volume Algorithm. CoRR abs/2008.02146 (2020)
- [i51] Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. CoRR abs/2009.01802 (2020)
- [i50] Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song: A Faster Interior Point Method for Semidefinite Programming. CoRR abs/2009.10217 (2020)
- [i49] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Structured Logconcave Sampling with a Restricted Gaussian Oracle. CoRR abs/2010.03106 (2020)
- [i48] Sally Dong, Yin Tat Lee, Guanghao Ye: A Nearly-Linear Time Algorithm for Linear Programs with Small Treewidth: A Multiscale Representation of Robust Central Path. CoRR abs/2011.05365 (2020)
2010 – 2019
- 2019
- [j5] Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié: Optimal Convergence Rates for Convex Distributed Optimization in Networks. J. Mach. Learn. Res. 20: 159:1-159:31 (2019)
- [c41] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Near-optimal method for highly smooth convex optimization. COLT 2019: 492-507
- [c40] Michael B. Cohen, Ben Cousins, Yin Tat Lee, Xin Yang: A near-optimal algorithm for approximating the John Ellipsoid. COLT 2019: 849-873
- [c39] Alexander V. Gasnikov, Pavel E. Dvurechensky, Eduard Gorbunov, Evgeniya A. Vorontsova, Daniil Selikhanovych, César A. Uribe, Bo Jiang, Haoyue Wang, Shuzhong Zhang, Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Near Optimal Methods for Minimizing Convex Functions with Lipschitz p-th Derivatives. COLT 2019: 1392-1393
- [c38] Yin Tat Lee, Zhao Song, Qiuyi Zhang: Solving Empirical Risk Minimization in the Current Matrix Multiplication Time. COLT 2019: 2140-2157
- [c37] Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sahil Singla, Sam Chiu-wai Wong: Faster Matroid Intersection. FOCS 2019: 1146-1168
- [c36] Sébastien Bubeck, Yin Tat Lee, Eric Price, Ilya P. Razenshteyn: Adversarial examples from computational constraints. ICML 2019: 831-840
- [c35] Ruoqi Shen, Yin Tat Lee: The Randomized Midpoint Method for Log-Concave Sampling. NeurIPS 2019: 2098-2109
- [c34] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Complexity of Highly Parallel Non-Smooth Convex Optimization. NeurIPS 2019: 13900-13909
- [c33] Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee: Metrical task systems on trees via mirror descent and unfair gluing. SODA 2019: 89-97
- [c32] C. J. Argue, Sébastien Bubeck, Michael B. Cohen, Anupam Gupta, Yin Tat Lee: A Nearly-Linear Bound for Chasing Nested Convex Bodies. SODA 2019: 117-122
- [c31] Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke: Competitively chasing convex bodies. STOC 2019: 861-868
- [c30] Michael B. Cohen, Yin Tat Lee, Zhao Song: Solving linear programs in the current matrix multiplication time. STOC 2019: 938-942
- [i47] Yin Tat Lee, Swati Padmanabhan: An Õ(m/ε^3.5)-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. CoRR abs/1903.01859 (2019)
- [i46] Yin Tat Lee, Zhao Song, Qiuyi Zhang: Solving Empirical Risk Minimization in the Current Matrix Multiplication Time. CoRR abs/1905.04447 (2019)
- [i45] Michael B. Cohen, Ben Cousins, Yin Tat Lee, Xin Yang: A near-optimal algorithm for approximating the John Ellipsoid. CoRR abs/1905.11580 (2019)
- [i44] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Complexity of Highly Parallel Non-Smooth Convex Optimization. CoRR abs/1906.10655 (2019)
- [i43] Ruoqi Shen, Yin Tat Lee: The Randomized Midpoint Method for Log-Concave Sampling. CoRR abs/1909.05503 (2019)
- [i42] Yin Tat Lee, Aaron Sidford: Solving Linear Programs with Sqrt(rank) Linear System Solves. CoRR abs/1910.08033 (2019)
- [i41] Sally Dong, Yin Tat Lee, Kent Quanrud: Computing Circle Packing Representations of Planar Graphs. CoRR abs/1911.00612 (2019)
- [i40] Aditi Laddha, Yin Tat Lee, Santosh S. Vempala: Strong Self-Concordance and Sampling. CoRR abs/1911.05656 (2019)
- [i39] Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sahil Singla, Sam Chiu-wai Wong: Faster Matroid Intersection. CoRR abs/1911.10765 (2019)
- 2018
- [j4] Yin Tat Lee, He Sun: Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time. SIAM J. Comput. 47(6): 2315-2336 (2018)
- [c29] Yin Tat Lee, Aaron Sidford, Santosh S. Vempala: Efficient Convex Optimization with Membership Oracles. COLT 2018: 1292-1294
- [c28] Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Laurent Massoulié, Yin Tat Lee: Optimal Algorithms for Non-Smooth Distributed Optimization in Networks. NeurIPS 2018: 2745-2754
- [c27] Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, James R. Lee, Aleksander Madry: k-server via multiscale entropic regularization. STOC 2018: 3-16
- [c26] Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Akshay Ramachandran: The Paulsen problem, continuous operator scaling, and smoothed analysis. STOC 2018: 182-189
- [c25] Ankit Garg, Yin Tat Lee, Zhao Song, Nikhil Srivastava: A matrix expander Chernoff bound. STOC 2018: 1102-1114
- [c24] Yin Tat Lee, Santosh S. Vempala: Convergence rate of Riemannian Hamiltonian Monte Carlo and faster polytope volume computation. STOC 2018: 1115-1121
- [c23] Yin Tat Lee, Santosh S. Vempala: Stochastic localization + Stieltjes barrier = tight bound for log-Sobolev. STOC 2018: 1122-1129
- [c22] Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, Yuanzhi Li: An homotopy method for lp regression provably beyond self-concordance and in input-sparsity time. STOC 2018: 1130-1137
- [i38] C. J. Argue, Sébastien Bubeck, Michael B. Cohen, Anupam Gupta, Yin Tat Lee: A Nearly-Linear Bound for Chasing Nested Convex Bodies. CoRR abs/1806.08865 (2018)
- [i37] Yin Tat Lee, Santosh S. Vempala: The Kannan-Lovász-Simonovits Conjecture. CoRR abs/1807.03465 (2018)
- [i36] Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee: Metrical task systems on trees via mirror descent and unfair gluing. CoRR abs/1807.04404 (2018)
- [i35] Michael B. Cohen, Yin Tat Lee, Zhao Song: Solving Linear Programs in the Current Matrix Multiplication Time. CoRR abs/1810.07896 (2018)
- [i34] Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke: Competitively Chasing Convex Bodies. CoRR abs/1811.00887 (2018)
- [i33] Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke: Chasing Nested Convex Bodies Nearly Optimally. CoRR abs/1811.00999 (2018)
- [i32] Sébastien Bubeck, Yin Tat Lee, Eric Price, Ilya P. Razenshteyn: Adversarial Examples from Cryptographic Pseudo-Random Generators. CoRR abs/1811.06418 (2018)
- [i31] Yin Tat Lee, Zhao Song, Santosh S. Vempala: Algorithmic Theory of ODEs and Sampling from Well-conditioned Logconcave Densities. CoRR abs/1812.06243 (2018)
- 2017
- [j3] Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford: Single Pass Spectral Sparsification in Dynamic Streams. SIAM J. Comput. 46(1): 456-477 (2017)
- [j2] Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee: Improved Cheeger's Inequality and Analysis of Local Graph Partitioning using Vertex Expansion and Expansion Profile. SIAM J. Comput. 46(3): 890-910 (2017)
- [c21] Yin Tat Lee, Santosh Srinivas Vempala: Eldan's Stochastic Localization and the KLS Hyperplane Conjecture: An Improved Lower Bound for Expansion. FOCS 2017: 998-1007
- [c20] Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié: Optimal Algorithms for Smooth and Strongly Convex Distributed Optimization in Networks. ICML 2017: 3027-3036
- [c19] Sébastien Bubeck, Yin Tat Lee, Ronen Eldan: Kernel-based methods for bandit convex optimization. STOC 2017: 72-85
- [c18] Yin Tat Lee, He Sun: An SDP-based algorithm for linear-sized spectral sparsification. STOC 2017: 678-687
- [c17] Yin Tat Lee, Santosh S. Vempala: Geodesic walks in polytopes. STOC 2017: 927-940
- [c16] Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong: Subquadratic submodular function minimization. STOC 2017: 1220-1231
- [i30] Yin Tat Lee, He Sun: An SDP-Based Algorithm for Linear-Sized Spectral Sparsification. CoRR abs/1702.08415 (2017)
- [i29]