


Search dblp
Full-text search
Search query syntax
- case-insensitive prefix search (default): e.g., sig matches "SIGIR" as well as "signal"
- exact word search: append a dollar sign ($) to the word; e.g., graph$ matches "graph", but not "graphics"
- boolean and: separate words by a space; e.g., codd model
- boolean or: connect words by a pipe symbol (|); e.g., graph|network
Update May 7, 2017: Please note that we had to disable the phrase search operator (.) and the boolean not operator (-) due to technical problems. For the time being, phrase search queries will yield regular prefix search results, and search terms preceded by a minus will be interpreted as regular (positive) search terms.
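
The same query syntax can also be used programmatically. The sketch below (Python, using the requests package) runs this page's query against dblp's public publication search API at https://dblp.org/search/publ/api; the q, format, and h parameters are part of dblp's documented API, but the exact shape of the JSON response assumed here should be verified against the live service.

```python
# Minimal sketch: query dblp's publication search API with the web search
# syntax (prefix terms, $ for exact words, space = AND, | = OR).
import requests

resp = requests.get(
    "https://dblp.org/search/publ/api",
    params={
        "q": "attention is all you need",  # four AND-ed prefix terms
        "format": "json",                  # xml is also supported
        "h": 30,                           # maximum number of hits to return
    },
    timeout=10,
)
resp.raise_for_status()

# Assumed response layout: result -> hits -> hit[] -> info{authors, title, year}.
for hit in resp.json()["result"]["hits"].get("hit", []):
    info = hit["info"]
    authors = info.get("authors", {}).get("author", [])
    if isinstance(authors, dict):  # a single author arrives as a dict, not a list
        authors = [authors]
    names = ", ".join(a["text"] for a in authors)
    print(f"{names}: {info['title']} ({info.get('year', '?')})")
```

Changing q to, say, graph$|network reproduces the boolean-or example above.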
Author search results
no matches
Venue search results
no matches
Publication search results
found 30 matches
2021
- Xiaopeng Zhang, Haoyu Yang, Evangeline F. Y. Young: Attentional Transfer is All You Need: Technology-aware Layout Pattern Generation. DAC 2021: 169-174
- Mozhdeh Gheini, Xiang Ren, Jonathan May: Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation. EMNLP (1) 2021: 1754-1765
- Cem Subakan, Mirco Ravanelli, Samuele Cornell, Mirko Bronzi, Jianyuan Zhong: Attention Is All You Need In Speech Separation. ICASSP 2021: 21-25
- Pau Torras, Mohamed Ali Souibgui, Jialuo Chen, Alicia Fornés: A Transcription Is All You Need: Learning to Align Through Attention. ICDAR Workshops (1) 2021: 141-146
- Gedas Bertasius, Heng Wang, Lorenzo Torresani: Is Space-Time Attention All You Need for Video Understanding? ICML 2021: 813-824
- Yihe Dong, Jean-Baptiste Cordonnier, Andreas Loukas: Attention is not all you need: pure attention loses rank doubly exponentially with depth. ICML 2021: 2793-2803
- Juraj Juraska, Marilyn A. Walker: Attention Is Indeed All You Need: Semantically Attention-Guided Decoding for Data-to-Text NLG. INLG 2021: 416-431
- Hsiang-Chun Chang, Hung-Jen Chen, Yu-Chia Shen, Hong-Han Shuai, Wen-Huang Cheng: Re-Attention Is All You Need: Memory-Efficient Scene Text Detection via Re-Attention on Uncertain Regions. IROS 2021: 452-459
- Wang Yin, Peng Lu, Zhaoran Zhao, Xujun Peng: Yes, "Attention Is All You Need", for Exemplar based Colorization. ACM Multimedia 2021: 2243-2251
- Yuan Cheng, Yanbo Xue: Looking at CTR Prediction Again: Is Attention All You Need? SIGIR 2021: 1279-1287
- Gedas Bertasius, Heng Wang, Lorenzo Torresani: Is Space-Time Attention All You Need for Video Understanding? CoRR abs/2102.05095 (2021)
- Yihe Dong, Jean-Baptiste Cordonnier, Andreas Loukas: Attention is Not All You Need: Pure Attention Loses Rank Doubly Exponentially with Depth. CoRR abs/2103.03404 (2021)
- Hongqiu Wu, Hai Zhao, Min Zhang: Not All Attention Is All You Need. CoRR abs/2104.04692 (2021)
- Yuan Cheng, Yanbo Xue: Looking at CTR Prediction Again: Is Attention All You Need? CoRR abs/2105.05563 (2021)
- Lina Achaji, Julien Moreau, Thibault Fouqueray, François Aioun, François Charpillet: Is attention to bounding boxes all you need for pedestrian action prediction? CoRR abs/2107.08031 (2021)
- Juraj Juraska, Marilyn A. Walker: Attention Is Indeed All You Need: Semantically Attention-Guided Decoding for Data-to-Text NLG. CoRR abs/2109.07043 (2021)

2020
- Myungsub Choi, Heewon Kim, Bohyung Han, Ning Xu, Kyoung Mu Lee: Channel Attention Is All You Need for Video Frame Interpolation. AAAI 2020: 10663-10671
- Sufeng Duan, Hai Zhao: Attention Is All You Need for Chinese Word Segmentation. EMNLP (1) 2020: 3862-3872
- Tsung-Han Wu, Chun-Cheng Hsieh, Yen-Hao Chen, Po-Han Chi, Hung-yi Lee: Hand-crafted Attention is All You Need? A Study of Attention on Self-supervised Audio Transformer. CoRR abs/2006.05174 (2020)
- Sanghyun Yoo, Young-Seok Kim, Kang Hyun Lee, Kuhwan Jeong, Junhwi Choi, Hoshik Lee, Young Sang Choi: Graph-Aware Transformer: Is Attention All Graphs Need? CoRR abs/2006.05213 (2020)
- Nikolay Bogoychev: Not all parameters are born equal: Attention is mostly what you need. CoRR abs/2010.11859 (2020)
- Cem Subakan, Mirco Ravanelli, Samuele Cornell, Mirko Bronzi, Jianyuan Zhong: Attention is All You Need in Speech Separation. CoRR abs/2010.13154 (2020)
- Liqiang Lin, Pengdi Huang, Chi-Wing Fu, Kai Xu, Hao Zhang, Hui Huang: One Point is All You Need: Directional Attention Point for Feature Learning. CoRR abs/2012.06257 (2020)

2019
- Tassilo Klein, Moin Nabi: Attention Is (not) All You Need for Commonsense Reasoning. ACL (1) 2019: 4831-4836
- Tassilo Klein, Moin Nabi: Attention Is (not) All You Need for Commonsense Reasoning. CoRR abs/1905.13497 (2019)
- Manjot Bilkhu, Siyang Wang, Tushar Dobhal: Attention is all you need for Videos: Self-attention based Video Summarization using Universal Transformers. CoRR abs/1906.02792 (2019)
- Sufeng Duan, Hai Zhao: Attention Is All You Need for Chinese Word Segmentation. CoRR abs/1910.14537 (2019)
- Thomas Dowdell, Hongyu Zhang: Is Attention All What You Need? - An Empirical Investigation on Convolution-Based Active Memory and Self-Attention. CoRR abs/1912.11959 (2019)

2017
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin: Attention is All you Need. NIPS 2017: 5998-6008
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin: Attention Is All You Need. CoRR abs/1706.03762 (2017)

retrieved on 2022-05-25 06:07 CEST from data curated by the dblp team
all metadata released as open data under CC0 1.0 license