


Manfred K. Warmuth
Person information

- affiliation: University of California, Santa Cruz, USA
2020 – today
- 2023
- [j67] Ehsan Amid, Rohan Anil, Christopher Fifty, Manfred K. Warmuth: Layerwise Bregman Representation Learning of Neural Networks with Applications to Knowledge Distillation. Trans. Mach. Learn. Res. 2023 (2023)
- 2022
- [j66] Jérémie Chalopin, Victor Chepoi, Shay Moran, Manfred K. Warmuth: Unlabeled sample compression schemes and corner peelings for ample and maximum classes. J. Comput. Syst. Sci. 127: 1-28 (2022)
- [c128] Ehsan Amid, Rohan Anil, Manfred K. Warmuth: LocoProp: Enhancing BackProp via Local Loss Optimization. AISTATS 2022: 9626-9642
- [i35] Ehsan Amid, Rohan Anil, Christopher Fifty, Manfred K. Warmuth: Step-size Adaptation Using Exponentiated Gradient Updates. CoRR abs/2202.00145 (2022)
- [i34] Ehsan Amid, Rohan Anil, Wojciech Kotlowski, Manfred K. Warmuth: Learning from Randomly Initialized Neural Network Features. CoRR abs/2202.06438 (2022)
- [i33] Ehsan Amid, Rohan Anil, Christopher Fifty, Manfred K. Warmuth: Layerwise Bregman Representation Learning with Applications to Knowledge Distillation. CoRR abs/2209.07080 (2022)
- [i32] Ehsan Amid, Richard Nock, Manfred K. Warmuth: Clustering above Exponential Families with Tempered Exponential Measures. CoRR abs/2211.02765 (2022)
- 2021
- [c127] Manfred K. Warmuth, Wojciech Kotlowski, Ehsan Amid: A case where a spindly two-layer linear network decisively outperforms any neural network with a fully connected input layer. ALT 2021: 1214-1236
- [i31] Negin Majidi, Ehsan Amid, Hossein Talebi, Manfred K. Warmuth: Exponentiated Gradient Reweighting for Robust Training Under Label Noise and Beyond. CoRR abs/2104.01493 (2021)
- [i30] Ehsan Amid, Rohan Anil, Manfred K. Warmuth: LocoProp: Enhancing BackProp via Local Loss Optimization. CoRR abs/2106.06199 (2021)
- 2020
- [c126] Ehsan Amid, Manfred K. Warmuth: An Implicit Form of Krasulina's k-PCA Update without the Orthonormality Constraint. AAAI 2020: 3179-3186
- [c125] Ehsan Amid, Manfred K. Warmuth: Winnowing with Gradient Descent. COLT 2020: 163-182
- [c124] Hossein Talebi, Ehsan Amid, Peyman Milanfar, Manfred K. Warmuth: Rank-Smoothed Pairwise Learning In Perceptual Quality Assessment. ICIP 2020: 3413-3417
- [c123] Ehsan Amid, Manfred K. Warmuth: Reparameterizing Mirror Descent as Gradient Descent. NeurIPS 2020
- [c122] Ehsan Amid, Manfred K. Warmuth: Divergence-Based Motivation for Online EM and Combining Hidden Variable Models. UAI 2020: 81-90
- [i29] Ehsan Amid, Manfred K. Warmuth: Interpolating Between Gradient Descent and Exponentiated Gradient Using Reparameterized Gradient Descent. CoRR abs/2002.10487 (2020)
- [i28] Manfred K. Warmuth, Wojciech Kotlowski, Ehsan Amid: A case where a spindly two-layer linear network whips any neural network with a fully connected input layer. CoRR abs/2010.08625 (2020)
- [i27] Hossein Talebi, Ehsan Amid, Peyman Milanfar, Manfred K. Warmuth: Rank-smoothed Pairwise Learning In Perceptual Quality Assessment. CoRR abs/2011.10893 (2020)
2010 – 2019
- 2019
- [j65] Atsuyoshi Nakamura, David P. Helmbold, Manfred K. Warmuth: Mistake bounds on the noise-free multi-armed bandit game. Inf. Comput. 269 (2019)
- [c121] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Correcting the bias in least squares regression with volume-rescaled sampling. AISTATS 2019: 944-953
- [c120] Ehsan Amid, Manfred K. Warmuth, Sriram Srinivasan: Two-temperature logistic regression based on the Tsallis divergence. AISTATS 2019: 2388-2396
- [c119] Corinna Cortes, Vitaly Kuznetsov, Mehryar Mohri, Holakou Rahmanian, Manfred K. Warmuth: Online Non-Additive Path Learning under Full and Partial Information. ALT 2019: 274-299
- [c118] Michal Derezinski, Kenneth L. Clarkson, Michael W. Mahoney, Manfred K. Warmuth: Minimax experimental design: Bridging the gap between statistical and worst-case approaches to least squares regression. COLT 2019: 1050-1069
- [c117] Jérémie Chalopin, Victor Chepoi, Shay Moran, Manfred K. Warmuth: Unlabeled Sample Compression Schemes and Corner Peelings for Ample and Maximum Classes. ICALP 2019: 34:1-34:15
- [c116] Michal Kempka, Wojciech Kotlowski, Manfred K. Warmuth: Adaptive Scale-Invariant Online Algorithms for Learning Linear Models. ICML 2019: 3321-3330
- [c115] Ehsan Amid, Manfred K. Warmuth, Rohan Anil, Tomer Koren: Robust Bi-Tempered Logistic Loss Based on Bregman Divergences. NeurIPS 2019: 14987-14996
- [i26] Michal Derezinski, Kenneth L. Clarkson, Michael W. Mahoney, Manfred K. Warmuth: Minimax experimental design: Bridging the gap between statistical and worst-case approaches to least squares regression. CoRR abs/1902.00995 (2019)
- [i25] Ehsan Amid, Manfred K. Warmuth: Divergence-Based Motivation for Online EM and Combining Hidden Variable Models. CoRR abs/1902.04107 (2019)
- [i24] Michal Kempka, Wojciech Kotlowski, Manfred K. Warmuth: Adaptive scale-invariant online algorithms for learning linear models. CoRR abs/1902.07528 (2019)
- [i23] Ehsan Amid, Manfred K. Warmuth, Rohan Anil, Tomer Koren: Robust Bi-Tempered Logistic Loss Based on Bregman Divergences. CoRR abs/1906.03361 (2019)
- [i22] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Unbiased estimators for random design regression. CoRR abs/1907.03411 (2019)
- [i21] Ehsan Amid, Manfred K. Warmuth: An Implicit Form of Krasulina's k-PCA Update without the Orthonormality Constraint. CoRR abs/1909.04803 (2019)
- [i20] Ehsan Amid, Manfred K. Warmuth: TriMap: Large-scale Dimensionality Reduction Using Triplets. CoRR abs/1910.00204 (2019)
- 2018
- [j64] Michal Derezinski, Manfred K. Warmuth: Reverse Iterative Volume Sampling for Linear Regression. J. Mach. Learn. Res. 19: 23:1-23:39 (2018)
- [c114] Michal Derezinski, Manfred K. Warmuth: Subsampling for Ridge Regression via Regularized Volume Sampling. AISTATS 2018: 716-725
- [c113] Michal Derezinski, Manfred K. Warmuth, Daniel J. Hsu: Leveraged volume sampling for linear regression. NeurIPS 2018: 2510-2519
- [i19] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Tail bounds for volume sampled linear regression. CoRR abs/1802.06749 (2018)
- [i18] Ehsan Amid, Manfred K. Warmuth: A more globally accurate dimensionality reduction method using triplets. CoRR abs/1803.00854 (2018)
- [i17] Sanjay Krishna Gouda, Salil Kanetkar, David Harrison, Manfred K. Warmuth: Speech Recognition: Keyword Spotting Through Image Recognition. CoRR abs/1803.03759 (2018)
- [i16] Corinna Cortes, Vitaly Kuznetsov, Mehryar Mohri, Holakou Rahmanian, Manfred K. Warmuth: Online Non-Additive Path Learning under Full and Partial Information. CoRR abs/1804.06518 (2018)
- [i15] Michal Derezinski, Manfred K. Warmuth: Reverse iterative volume sampling for linear regression. CoRR abs/1806.01969 (2018)
- [i14] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Correcting the bias in least squares regression with volume-rescaled sampling. CoRR abs/1810.02453 (2018)
- [i13] Jérémie Chalopin, Victor Chepoi, Shay Moran, Manfred K. Warmuth: Unlabeled sample compression schemes and corner peelings for ample and maximum classes. CoRR abs/1812.02099 (2018)
- 2017
- [c112] Holakou Rahmanian, Manfred K. Warmuth: Online Dynamic Programming. NIPS 2017: 2827-2837
- [c111] Michal Derezinski, Manfred K. Warmuth: Unbiased estimates for linear regression via volume sampling. NIPS 2017: 3084-3093
- [i12] Michal Derezinski, Manfred K. Warmuth: Unbiased estimates for linear regression via volume sampling. CoRR abs/1705.06908 (2017)
- [i11] Ehsan Amid, Manfred K. Warmuth: Two-temperature logistic regression based on the Tsallis divergence. CoRR abs/1705.07210 (2017)
- [i10] Holakou Rahmanian, S. V. N. Vishwanathan, Manfred K. Warmuth: Online Dynamic Programming. CoRR abs/1706.00834 (2017)
- [i9] Michal Derezinski, Manfred K. Warmuth: Subsampling for Ridge Regression via Regularized Volume Sampling. CoRR abs/1710.05110 (2017)
- 2016
- [j63] Jiazhong Nie, Wojciech Kotlowski, Manfred K. Warmuth: Online PCA with Optimal Regret. J. Mach. Learn. Res. 17: 173:1-173:49 (2016)
- [j62] Elad Hazan, Satyen Kale, Manfred K. Warmuth: Learning rotations with little regret. Mach. Learn. 104(1): 129-148 (2016)
- [c110] Shay Moran, Manfred K. Warmuth: Labeled Compression Schemes for Extremal Classes. ALT 2016: 34-49
- [c109] Atsuyoshi Nakamura, David P. Helmbold, Manfred K. Warmuth: Noise Free Multi-armed Bandit Game. LATA 2016: 412-423
- [i8] Ehsan Amid, Nikos Vlassis, Manfred K. Warmuth: t-Exponential Triplet Embedding. CoRR abs/1611.09957 (2016)
- 2015
- [c108] Peter L. Bartlett, Wouter M. Koolen, Alan Malek, Eiji Takimoto, Manfred K. Warmuth: Minimax Fixed-Design Linear Regression. COLT 2015: 226-239
- [c107] Corinna Cortes, Vitaly Kuznetsov, Mehryar Mohri, Manfred K. Warmuth: On-Line Learning Algorithms for Path Experts with Non-Additive Losses. COLT 2015: 424-447
- [c106] Wouter M. Koolen, Manfred K. Warmuth, Dmitry Adamskiy: Open Problem: Online Sabotaged Shortest Path. COLT 2015: 1764-1766
- [i7] Shay Moran, Manfred K. Warmuth: Labeled compression schemes for extremal classes. CoRR abs/1506.00165 (2015)
- [i6] Wojciech Kotlowski, Manfred K. Warmuth: PCA with Gaussian perturbations. CoRR abs/1506.04855 (2015)
- 2014
- [j61] Manfred K. Warmuth, Wouter M. Koolen, David P. Helmbold: Combining initial segments of lists. Theor. Comput. Sci. 519: 29-45 (2014)
- [j60] Manfred K. Warmuth, Wojciech Kotlowski, Shuisheng Zhou: Kernelization of matrix updates, when and how? Theor. Comput. Sci. 558: 159-178 (2014)
- [c105] Manfred K. Warmuth, Wouter M. Koolen: Open Problem: Shifting Experts on Easy Data. COLT 2014: 1295-1298
- [c104] Michal Derezinski, Manfred K. Warmuth: The limits of squared Euclidean distance regularization. NIPS 2014: 2807-2815
- [i5] Manfred K. Warmuth, Dima Kuzmin: A Bayesian Probability Calculus for Density Matrices. CoRR abs/1408.3100 (2014)
- 2013
- [c103] Jiazhong Nie, Wojciech Kotlowski, Manfred K. Warmuth: Online PCA with Optimal Regrets. ALT 2013: 98-112
- [c102] Wouter M. Koolen, Jiazhong Nie, Manfred K. Warmuth: Learning a set of directions. COLT 2013: 851-866
- [c101] Jiazhong Nie, Manfred K. Warmuth, S. V. N. Vishwanathan, Xinhua Zhang: Open Problem: Lower bounds for Boosting with Hadamard Matrices. COLT 2013: 1076-1079
- [i4] Katy S. Azoury, Manfred K. Warmuth: Relative Loss Bounds for On-line Density Estimation with the Exponential Family of Distributions. CoRR abs/1301.6677 (2013)
- [i3] Jiazhong Nie, Wojciech Kotlowski, Manfred K. Warmuth: On-line PCA with Optimal Regrets. CoRR abs/1306.3895 (2013)
- 2012
- [j59] Manfred K. Warmuth, Dima Kuzmin: Online variance minimization. Mach. Learn. 87(1): 1-32 (2012)
- [c100] Manfred K. Warmuth, Wojciech Kotlowski, Shuisheng Zhou: Kernelization of Matrix Updates, When and How? ALT 2012: 350-364
- [c99] Wouter M. Koolen, Dmitry Adamskiy, Manfred K. Warmuth: Putting Bayes to sleep. NIPS 2012: 135-143
- 2011
- [c98] Manfred K. Warmuth, Wouter M. Koolen, David P. Helmbold: Combining Initial Segments of Lists. ALT 2011: 219-233
- [c97] Wouter M. Koolen, Wojciech Kotlowski, Manfred K. Warmuth: Learning Eigenvectors for Free. NIPS 2011: 945-953
- [c96] Wojciech Kotlowski, Manfred K. Warmuth: Minimax Algorithm for Learning Rotations. COLT 2011: 821-824
- 2010
- [j58] Manfred K. Warmuth, Dima Kuzmin: Bayesian generalized probability calculus for density matrices. Mach. Learn. 78(1-2): 63-101 (2010)
- [c95] Manfred K. Warmuth: The Blessing and the Curse of the Multiplicative Updates. ALT 2010: 31
- [c94] Wouter M. Koolen, Manfred K. Warmuth, Jyrki Kivinen: Hedging Structured Concepts. COLT 2010: 93-105
- [c93] Elad Hazan, Satyen Kale, Manfred K. Warmuth: Learning Rotations with Little Regret. COLT 2010: 144-154
- [c92] Elad Hazan, Satyen Kale, Manfred K. Warmuth: On-line Variance Minimization in O(n²) per Trial? COLT 2010: 314-315
- [c91] Manfred K. Warmuth: The Blessing and the Curse of the Multiplicative Updates. Discovery Science 2010: 382
- [c90] Shuisheng Zhou, Manfred K. Warmuth, Yinli Dong, Feng Ye: New combination coefficients for AdaBoost algorithms. ICNC 2010: 3194-3198
- [c89] Jacob D. Abernethy, Manfred K. Warmuth: Repeated Games against Budgeted Adversaries. NIPS 2010: 1-9
2000 – 2009
- 2009
- [j57] David P. Helmbold, Manfred K. Warmuth: Learning Permutations with Exponential Weights. J. Mach. Learn. Res. 10: 1705-1736 (2009)
- [c88] Jacob D. Abernethy, Manfred K. Warmuth: Minimax Games with Bandits. COLT 2009
- [c87] Manfred K. Warmuth, S. V. N. Vishwanathan: Tutorial summary: Survey of boosting from an optimization perspective. ICML 2009: 15
- 2008
- [c86] Manfred K. Warmuth, Karen A. Glocer, S. V. N. Vishwanathan: Entropy Regularized LPBoost. ALT 2008: 256-271
- [c85] Jacob D. Abernethy, Manfred K. Warmuth, Joel Yellin: When Random Play is Optimal Against an Adversary. COLT 2008: 437-446
- [c84] Adam M. Smith, Manfred K. Warmuth: Learning Rotations. COLT 2008: 517
- 2007
- [j56] Dima Kuzmin, Manfred K. Warmuth: Unlabeled Compression Schemes for Maximum Classes. J. Mach. Learn. Res. 8: 2047-2081 (2007)
- [c83] David P. Helmbold, Manfred K. Warmuth: Learning Permutations with Exponential Weights. COLT 2007: 469-483
- [c82] Manfred K. Warmuth: When Is There a Free Matrix Lunch? COLT 2007: 630-632
- [c81] Dima Kuzmin, Manfred K. Warmuth: Online kernel PCA with entropic matrix updates. ICML 2007: 465-472
- [c80] Manfred K. Warmuth: Winnowing subspaces. ICML 2007: 999-1006
- [c79] Manfred K. Warmuth, Karen A. Glocer, Gunnar Rätsch: Boosting Algorithms for Maximizing the Soft Margin. NIPS 2007: 1585-1592
- 2006
- [j55] Jyrki Kivinen, Manfred K. Warmuth, Babak Hassibi: The p-norm generalization of the LMS algorithm for adaptive filtering. IEEE Trans. Signal Process. 54(5): 1782-1793 (2006)
- [c78] Manfred K. Warmuth, Dima Kuzmin: Online Variance Minimization. COLT 2006: 514-528
- [c77] Jacob D. Abernethy, John Langford, Manfred K. Warmuth: Continuous Experts and the Binning Algorithm. COLT 2006: 544-558
- [c76] Manfred K. Warmuth: Can Entropic Regularization Be Replaced by Squared Euclidean Distance Plus Additional Linear Constraints. COLT 2006: 653-654
- [c75] Manfred K. Warmuth, Jun Liao, Gunnar Rätsch: Totally corrective boosting algorithms that maximize the margin. ICML 2006: 1001-1008
- [c74] Manfred K. Warmuth, Dima Kuzmin: Randomized PCA Algorithms with Regret Bounds that are Logarithmic in the Dimension. NIPS 2006: 1481-1488
- [c73] Manfred K. Warmuth, Dima Kuzmin: A Bayesian Probability Calculus for Density Matrices. UAI 2006
- 2005
- [j54] Koji Tsuda, Gunnar Rätsch, Manfred K. Warmuth: Matrix Exponentiated Gradient Updates for On-line Learning and Bregman Projection. J. Mach. Learn. Res. 6: 995-1018 (2005)
- [j53] Gunnar Rätsch, Manfred K. Warmuth: Efficient Margin Maximizing with Boosting. J. Mach. Learn. Res. 6: 2131-2152 (2005)
- [c72] Manfred K. Warmuth, S. V. N. Vishwanathan: Leaving the Span. COLT 2005: 366-381
- [c71] Dima Kuzmin, Manfred K. Warmuth: Unlabeled Compression Schemes for Maximum Classes. COLT 2005: 591-605
- [c70] Dima Kuzmin, Manfred K. Warmuth: Optimum Follow the Leader Algorithm. COLT 2005: 684-686
- [c69] Manfred K. Warmuth: A Bayes Rule for Density Matrices. NIPS 2005: 1457-1464
- 2004
- [c68] Manfred K. Warmuth: The Optimal PAC Algorithm. COLT 2004: 641-642
- [c67] Koji Tsuda, Gunnar Rätsch, Manfred K. Warmuth: Matrix Exponential Gradient Updates for On-line Learning and Bregman Projection. NIPS 2004: 1425-1432
- 2003
- [j52] Manfred K. Warmuth, Jun Liao, Gunnar Rätsch, Michael Mathieson, Santosh Putta, Christian Lemmen: Active Learning with Support Vector Machines in the Drug Discovery Process. J. Chem. Inf. Comput. Sci. 43(2): 667-673 (2003)
- [j51] Eiji Takimoto, Manfred K. Warmuth: Path Kernels and Multiplicative Updates. J. Mach. Learn. Res. 4: 773-818 (2003)
- [j50] Jürgen Forster, Manfred K. Warmuth: Relative Loss Bounds for Temporal-Difference Learning. Mach. Learn. 51(1): 23-50 (2003)
- [c66] Manfred K. Warmuth: Compressing to VC Dimension Many Points. COLT 2003: 743-744
- [c65] Ashutosh Garg, Manfred K. Warmuth: Inline updates for HMMs. INTERSPEECH 2003
- [c64] Rita Singh, Manfred K. Warmuth, Bhiksha Raj, Paul Lamere: Classification with free energy at raised temperatures. INTERSPEECH 2003
- [c63] Kohei Hatano, Manfred K. Warmuth: Boosting versus Covering. NIPS 2003: 1109-1116
- [e4] Bernhard Schölkopf, Manfred K. Warmuth: Computational Learning Theory and Kernel Machines, 16th Annual Conference on Computational Learning Theory and 7th Kernel Workshop, COLT/Kernel 2003, Washington, DC, USA, August 24-27, 2003, Proceedings. Lecture Notes in Computer Science 2777, Springer 2003, ISBN 3-540-40720-0 [contents]
- 2002
- [j49] Jürgen Forster, Manfred K. Warmuth: Relative Expected Instantaneous Loss Bounds. J. Comput. Syst. Sci. 64(1): 76-102 (2002)
- [j48] Olivier Bousquet, Manfred K. Warmuth: Tracking a Small Set of Experts by Mixing Past Posteriors. J. Mach. Learn. Res. 3: 363-396 (2002)
- [j47] David P. Helmbold, Sandra Panizza, Manfred K. Warmuth: Direct and indirect algorithms for on-line learning of disjunctions. Theor. Comput. Sci. 284(1): 109-142 (2002)
- [j46] Eiji Takimoto, Manfred K. Warmuth: Predicting nearly as well as the best pruning of a planar decision graph. Theor. Comput. Sci. 288(2): 217-235 (2002)
- [c62] Eiji Takimoto, Manfred K. Warmuth: Path Kernels and Multiplicative Updates. COLT 2002: 74-89
- [c61] Gunnar Rätsch, Manfred K. Warmuth: Maximizing the Margin with Boosting. COLT 2002: 334-350
- [c60] Robert B. Gramacy, Manfred K. Warmuth, Scott A. Brandt, Ismail Ari: Adaptive Caching by Refetching. NIPS 2002: 1465-1472
- 2001
- [j45] Mark Herbster, Manfred K. Warmuth: Tracking the Best Linear Predictor. J. Mach. Learn. Res. 1: 281-309 (2001)
- [j44] Katy S. Azoury, Manfred K. Warmuth: Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions. Mach. Learn. 43(3): 211-246 (2001)
- [j43] Jyrki Kivinen, Manfred K. Warmuth: Relative Loss Bounds for Multidimensional Regression Problems. Mach. Learn. 45(3): 301-329 (2001)
- [c59] Olivier Bousquet, Manfred K. Warmuth: Tracking a Small Set of Experts by Mixing Past Posteriors. COLT/EuroCOLT 2001: 31-47
- [c58] Gunnar Rätsch, Sebastian Mika, Manfred K. Warmuth: On the Convergence of Leveraging. NIPS 2001: 487-494
- [c57] Manfred K. Warmuth, Gunnar Rätsch, Michael Mathieson, Jun Liao, Christian Lemmen: Active Learning in the Drug Discovery Process. NIPS 2001: 1449-1456
- 2000
- [c56] Eiji Takimoto, Manfred K. Warmuth: The Last-Step Minimax Algorithm. ALT 2000: 279-290
- [c55] Jürgen Forster, Manfred K. Warmuth: Relative Expected Instantaneous Loss Bounds. COLT 2000: 90-99
- [c54]