Neural Networks, Volume 131, November 2020
- Xiao-Zhen Liu, Kaining Wu, Weihai Zhang:
Intermittent boundary stabilization of stochastic reaction-diffusion Cohen-Grossberg neural networks. 1-13
- Bruno Légora Souza da Silva, Fernando Kentaro Inaba, Evandro Ottoni Teatini Salles, Patrick Marques Ciarelli:
Fast Deep Stacked Networks based on Extreme Learning Machine applied to regression problems. 14-28
- Eduardo Paluzo-Hidalgo, Rocío González-Díaz, Miguel Angel Gutiérrez-Naranjo:
Two-hidden-layer feed-forward networks are universal approximators: A constructive approach. 29-36
- Damien Drix, Verena V. Hafner, Michael Schmuker:
Sparse coding with a somato-dendritic rule. 37-49
- Weihao Xia, Yujiu Yang, Jing-Hao Xue:
Unsupervised multi-domain multimodal image-to-image translation with explicit domain-constrained disentanglement. 50-63
- Assaf Cohen, Aviad Cohen, Nir Nissim:
ASSAF: Advanced and Slim StegAnalysis Detection Framework for JPEG images based on deep convolutional denoising autoencoder and Siamese networks. 64-77
- Xuechen Li, Nan Wang, Jungang Lou, Jianquan Lu:
Global μ-synchronization of impulsive pantograph neural networks. 78-92
- Zhao Kang, Xiao Lu, Jian Liang, Kun Bai, Zenglin Xu:
Relation-Guided Representation Learning. 93-102
- Callie Federer, Haoyan Xu, Alona Fyshe, Joel Zylberberg:
Improved object recognition using neural networks trained to mimic the brain's statistical properties. 103-114
- Chengdai Huang, Heng Liu, Xiangyun Shi, Xiaoping Chen, Min Xiao, Zhengxin Wang, Jinde Cao:
Bifurcations in a fractional-order neural network with multiple leakage delays. 115-126
- Vaibhav B. Sinha, Sneha Kudugunta, Adepu Ravi Sankar, Surya Teja Chavali, Vineeth N. Balasubramanian:
DANTE: Deep alternations for training neural networks. 127-143
- Shan Xue, Biao Luo, Derong Liu:
Integral reinforcement learning based event-triggered control with input saturation. 144-153
- Zhiying Fang, Han Feng, Shuo Huang, Ding-Xuan Zhou:
Theory of deep convolutional neural networks II: Spherical analysis. 154-162
- Chongyang Chen, Song Zhu, Min Wang, Chunyu Yang, Zhigang Zeng:
Finite-time stabilization and energy consumption estimation for delayed neural networks with bounded activation function. 163-171
- Han Xiao:
Hungarian layer: A novel interpretable neural layer for paraphrase identification. 172-184
- Tomaso Fontanini, Eleonora Iotti, Luca Donati, Andrea Prati:
MetalGAN: Multi-domain label-less image synthesis using cGANs and meta-learning. 185-200
- Liming Yang, Yakun Wen, Min Zhang, Xue Wang:
Twin minimax probability machine for pattern classification. 201-214
- Dingheng Wang, Guangshe Zhao, Guoqi Li, Lei Deng, Yang Wu:
Compressing 3DCNNs based on tensor train decomposition. 215-230
- Peng Yi, ShiNung Ching:
Synthesis of recurrent neural dynamics for monotone inclusion with application to Bayesian inference. 231-241
- Nijing Yang, Yongbin Yu, Shouming Zhong, Xiangxiang Wang, Kaibo Shi, Jingye Cai:
Exponential synchronization of stochastic delayed memristive neural networks via a novel hybrid control. 242-250
- Chunwei Tian, Lunke Fei, Wenxian Zheng, Yong Xu, Wangmeng Zuo, Chia-Wen Lin:
Deep learning on image denoising: An overview. 251-275
- Hongwei Jiang, Bin Zou, Chen Xu, Jie Xu, Yuan Yan Tang:
SVM-Boosting based on Markov resampling: Theory and algorithm. 276-290
- Hang Su, Yingbai Hu, Hamid Reza Karimi, Alois C. Knoll, Giancarlo Ferrigno, Elena De Momi:
Improved recurrent neural network-based manipulator control with remote center of motion constraints: Experimental results. 291-299
- Xiaoyang Liu, Zhigang Zeng, Donald C. Wunsch II:
Memristor-based LSTM network with in situ training and its applications. 300-311
- Zhicheng He, Jie Liu, Kai Dang, Fuzhen Zhuang, Yalou Huang:
Leveraging maximum entropy and correlation on latent factors for learning representations. 312-323