Search dblp
Full-text search
- case-insensitive prefix search: default
  e.g., sig matches "SIGIR" as well as "signal"
- exact word search: append dollar sign ($) to word
  e.g., graph$ matches "graph", but not "graphics"
- boolean and: separate words by space
  e.g., codd model
- boolean or: connect words by pipe symbol (|)
  e.g., graph|network
Update May 7, 2017: Please note that we had to disable the phrase search operator (.) and the boolean not operator (-) due to technical problems. For the time being, phrase search queries will yield regular prefix search results, and search terms preceded by a minus will be interpreted as regular (positive) search terms.
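The same operators can be exercised programmatically through dblp's public publication search endpoint (https://dblp.org/search/publ/api). The Python sketch below is a minimal illustration, assuming the JSON response nests hits under result → hits → hit; adjust the field names if the actual payload differs.

```python
# Minimal sketch: query dblp's publication search API with the operators above.
# The endpoint is dblp's documented search API; the JSON field layout used
# below is an assumption and may need adjusting.
import json
import urllib.parse
import urllib.request

def search_dblp(query: str, max_hits: int = 10) -> list:
    params = urllib.parse.urlencode({"q": query, "format": "json", "h": max_hits})
    url = f"https://dblp.org/search/publ/api?{params}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # Hits are assumed to live under result -> hits -> hit, each with an "info" record.
    return [hit["info"] for hit in data["result"]["hits"].get("hit", [])]

if __name__ == "__main__":
    # Boolean AND (space) combined with exact-word search ($).
    for info in search_dblp("universal$ transformers$", max_hits=5):
        print(info.get("year"), "-", info.get("title"))
```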
Author search results: no matches
Venue search results: no matches
Refine list (by author, venue, type, access, year): temporarily not available
Publication search results: found 48 matches
2024
- Nélida Mirabet-Herranz, Chiara Galdi, Jean-Luc Dugelay: One Embedding to Predict Them All: Visible and Thermal Universal Face Representations for Soft Biometric Estimation via Vision Transformers. CVPR Workshops 2024: 1500-1509
- Matthew Kowal, Achal Dave, Rares Ambrus, Adrien Gaidon, Konstantinos G. Derpanis, Pavel Tokmakov: Understanding Video Transformers via Universal Concept Discovery. CVPR 2024: 10946-10956
- Tokio Kajitsuka, Issei Sato: Are Transformers with One Layer Self-Attention Using Low-Rank Weight Matrices Universal Approximators? ICLR 2024
- Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo: Unified Training of Universal Time Series Forecasting Transformers. ICML 2024
- Matthew Kowal, Achal Dave, Rares Ambrus, Adrien Gaidon, Konstantinos G. Derpanis, Pavel Tokmakov: Understanding Video Transformers via Universal Concept Discovery. CoRR abs/2401.10831 (2024)
- Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo: Unified Training of Universal Time Series Forecasting Transformers. CoRR abs/2402.02592 (2024)
- Benedikt Alkin, Andreas Fürst, Simon Schmid, Lukas Gruber, Markus Holzleitner, Johannes Brandstetter: Universal Physics Transformers. CoRR abs/2402.12365 (2024)
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber, Christopher Potts, Christopher D. Manning: MoEUT: Mixture-of-Experts Universal Transformers. CoRR abs/2405.16039 (2024)
- Hang Zhou, Yuezhou Ma, Haixu Wu, Haowen Wang, Mingsheng Long: Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers. CoRR abs/2405.17527 (2024)
- Souradip Poddar, Youngmin Oh, Yao Lai, Hanqing Zhu, Bosun Hwang, David Z. Pan: INSIGHT: Universal Neural Simulator for Analog Circuits Harnessing Autoregressive Transformers. CoRR abs/2407.07346 (2024)
- Takashi Furuya, Maarten V. de Hoop, Gabriel Peyré: Transformers are Universal In-context Learners. CoRR abs/2408.01367 (2024)
- Emanuele Zappala, Maryam Bagherian: Universal Approximation of Operators with Transformers and Neural Integral Operators. CoRR abs/2409.00841 (2024)
- Michael E. Sander, Gabriel Peyré: Towards Understanding the Universality of Transformers for Next-Token Prediction. CoRR abs/2410.03011 (2024)
- Joseph Liu, Joshua Geddes, Ziyu Guo, Haomiao Jiang, Mahesh Kumar Nandwana: SmoothCache: A Universal Inference Acceleration Technique for Diffusion Transformers. CoRR abs/2411.10510 (2024)
- Jerry Yao-Chieh Hu, Wei-Po Wang, Ammar Gilani, Chenyang Li, Zhao Song, Han Liu: Fundamental Limits of Prompt Tuning Transformers: Universality, Capacity and Efficiency. CoRR abs/2411.16525 (2024)

2023
- Anastasis Kratsios, Valentin Debarnot, Ivan Dokmanic: Small Transformers Compute Universal Metric Embeddings. J. Mach. Learn. Res. 24: 170:1-170:48 (2023)
- Hongwei Yu, Jiansheng Chen, Huimin Ma, Cheng Yu, Xinlong Ding: Defending Against Universal Patch Attacks by Restricting Token Attention in Vision Transformers. ICASSP 2023: 1-5
- Silas Alberti, Niclas Dern, Laura Thesing, Gitta Kutyniok: Sumformer: Universal Approximation for Efficient Transformers. TAG-ML 2023: 72-86
- André Brasil Vieira Wyzykowski, Anil Kumar Jain: A Universal Latent Fingerprint Enhancer Using Transformers. CoRR abs/2306.00231 (2023)
- Silas Alberti, Niclas Dern, Laura Thesing, Gitta Kutyniok: Sumformer: Universal Approximation for Efficient Transformers. CoRR abs/2307.02301 (2023)
- Sourya Basu, Moulik Choraria, Lav R. Varshney: Transformers are Universal Predictors. CoRR abs/2307.07843 (2023)
- Tokio Kajitsuka, Issei Sato: Are Transformers with One Layer Self-Attention Using Low-Rank Weight Matrices Universal Approximators? CoRR abs/2307.14023 (2023)
- K. L. Navaneet, Soroush Abbasi Koohpayegani, Essam Sleiman, Hamed Pirsiavash: SlowFormer: Universal Adversarial Patch for Attack on Compute and Energy Efficiency of Inference Efficient Vision Transformers. CoRR abs/2310.02544 (2023)

2022
- Yuan Li: Learning both Expert and Universal Knowledge using Transformers. Duke University, Durham, NC, USA, 2022
- Kevin Lu, Aditya Grover, Pieter Abbeel, Igor Mordatch: Frozen Pretrained Transformers as Universal Computation Engines. AAAI 2022: 7628-7636
- Agrim Gupta, Linxi Fan, Surya Ganguli, Li Fei-Fei: MetaMorph: Learning Universal Controllers with Transformers. ICLR 2022
- Anastasis Kratsios, Behnoosh Zamanlooy, Tianlin Liu, Ivan Dokmanic: Universal Approximation Under Constraints is Possible with Transformers. ICLR 2022
- Yutian Chen, Xingyou Song, Chansoo Lee, Zi Wang, Richard Zhang, David Dohan, Kazuya Kawakami, Greg Kochanski, Arnaud Doucet, Marc'Aurelio Ranzato, Sagi Perel, Nando de Freitas: Towards Learning Universal Hyperparameter Optimizers with Transformers. NeurIPS 2022
- Rahul Goel, Modar Sulaiman, Kimia Noorbakhsh, Mahdi Sharifi, Rajesh Sharma, Pooyan Jamshidi, Kallol Roy: Pre-Trained Language Transformers are Universal Image Classifiers. CoRR abs/2201.10182 (2022)
- Agrim Gupta, Linxi Fan, Surya Ganguli, Li Fei-Fei: MetaMorph: Learning Universal Controllers with Transformers. CoRR abs/2203.11931 (2022)
18 more matches not shown.
manage site settings
To protect your privacy, all features that rely on external API calls from your browser are turned off by default. You need to opt in for them to become active. All settings here will be stored as cookies with your web browser. For more information see our F.A.Q.
Unpaywalled article links
Add open access links from unpaywall.org to the list of external document links (if available).
Privacy notice: By enabling the option above, your browser will contact the API of unpaywall.org to load hyperlinks to open access articles. Although we do not have any reason to believe that your call will be tracked, we do not have any control over how the remote server uses your data. So please proceed with care and consider checking the Unpaywall privacy policy.
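For reference, the request this option triggers can also be made directly. The sketch below asks the Unpaywall REST API for an open-access location of a single DOI; the /v2/{doi} route and the email parameter follow Unpaywall's documentation, but the response fields used here (best_oa_location, url_for_pdf) are assumptions, and the DOI in the usage comment is hypothetical.

```python
# Sketch: look up an open-access URL for one DOI via Unpaywall.
# Response field names ("best_oa_location", "url_for_pdf") are assumptions.
import json
import urllib.request

def unpaywall_oa_url(doi: str, email: str):
    # Unpaywall asks callers to identify themselves via an email parameter.
    url = f"https://api.unpaywall.org/v2/{doi}?email={email}"
    with urllib.request.urlopen(url) as resp:
        record = json.load(resp)
    best = record.get("best_oa_location") or {}
    return best.get("url_for_pdf") or best.get("url")

# Usage (hypothetical DOI):
# print(unpaywall_oa_url("10.1234/example-doi", "you@example.org"))
```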
Archived links via Wayback Machine
For web pages which are no longer available, try to retrieve content from the Wayback Machine of the Internet Archive (if available).
Privacy notice: By enabling the option above, your browser will contact the API of archive.org to check for archived content of web pages that are no longer available. Although we do not have any reason to believe that your call will be tracked, we do not have any control over how the remote server uses your data. So please proceed with care and consider checking the Internet Archive privacy policy.
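The archived-link lookup amounts to one request against the Internet Archive's availability endpoint. A minimal sketch, assuming the https://archive.org/wayback/available endpoint and an archived_snapshots → closest response shape:

```python
# Sketch: ask the Wayback Machine whether a snapshot of a dead URL exists.
# The "archived_snapshots" -> "closest" response layout is an assumption.
import json
import urllib.parse
import urllib.request

def wayback_snapshot(url: str):
    query = urllib.parse.urlencode({"url": url})
    api = f"https://archive.org/wayback/available?{query}"
    with urllib.request.urlopen(api) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None
```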
Reference lists
Add a list of references from crossref.org, opencitations.net, and semanticscholar.org to record detail pages.
load references from crossref.org and opencitations.net
Privacy notice: By enabling the option above, your browser will contact the APIs of crossref.org, opencitations.net, and semanticscholar.org to load article reference information. Although we do not have any reason to believe that your call will be tracked, we do not have any control over how the remote server uses your data. So please proceed with care and consider checking the Crossref privacy policy and the OpenCitations privacy policy, as well as the AI2 Privacy Policy covering Semantic Scholar.
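To illustrate what such a reference lookup involves, the sketch below pulls the reference list of a DOI from the Crossref REST API (api.crossref.org/works/{doi}); the per-reference fields used ("DOI", "unstructured") are assumptions and may vary between records.

```python
# Sketch: fetch a work's reference list from Crossref.
# The per-reference field names are assumptions.
import json
import urllib.request

def crossref_references(doi: str) -> list:
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as resp:
        message = json.load(resp)["message"]
    refs = message.get("reference", [])
    # Each entry may carry a resolved DOI and/or an unstructured citation string.
    return [r.get("DOI") or r.get("unstructured", "") for r in refs]
```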
Citation data
Add a list of citing articles from opencitations.net and semanticscholar.org to record detail pages.
load citations from opencitations.net
Privacy notice: By enabling the option above, your browser will contact the API of opencitations.net and semanticscholar.org to load citation information. Although we do not have any reason to believe that your call will be tracked, we do not have any control over how the remote server uses your data. So please proceed with care and consider checking the OpenCitations privacy policy as well as the AI2 Privacy Policy covering Semantic Scholar.
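The citation lookup is analogous. A sketch against the OpenCitations COCI index, assuming the v1 REST route /index/coci/api/v1/citations/{doi} and a "citing" field in each returned row:

```python
# Sketch: list DOIs of articles citing a given DOI via OpenCitations' COCI index.
# Endpoint path and the "citing" field are assumptions based on the v1 REST API.
import json
import urllib.request

def opencitations_citing_dois(doi: str) -> list:
    url = f"https://opencitations.net/index/coci/api/v1/citations/{doi}"
    with urllib.request.urlopen(url) as resp:
        rows = json.load(resp)
    # Each row describes one citation link; "citing" holds the citing DOI.
    return [row.get("citing", "") for row in rows]
```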
OpenAlex data
Load additional information about publications from OpenAlex.
Privacy notice: By enabling the option above, your browser will contact the API of openalex.org to load additional information. Although we do not have any reason to believe that your call will be tracked, we do not have any control over how the remote server uses your data. So please proceed with care and consider checking the information given by OpenAlex.
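An OpenAlex lookup is likewise a single GET against api.openalex.org. The sketch below fetches a work by DOI; the doi: lookup form and fields such as cited_by_count and open_access are assumptions based on OpenAlex's public documentation, and the DOI in the usage comment is hypothetical.

```python
# Sketch: load extra metadata for one publication from OpenAlex by DOI.
# The doi: lookup form and the field names below are assumptions.
import json
import urllib.request

def openalex_work(doi: str) -> dict:
    url = f"https://api.openalex.org/works/doi:{doi}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Usage (hypothetical DOI): citation count and open-access status.
# work = openalex_work("10.1234/example-doi")
# print(work.get("cited_by_count"), work.get("open_access", {}).get("is_oa"))
```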
retrieved on 2025-01-16 21:55 CET from data curated by the dblp team
all metadata released as open data under CC0 1.0 license