Neural Networks, Volume 1
Volume 1, Number 1, 1988
- Stephen Grossberg: Editorial. 1
- Teuvo Kohonen: An introduction to neural computing. 3-16
- Stephen Grossberg: Nonlinear neural networks: Principles, mechanisms, and architectures. 17-61
- Shun-ichi Amari, Kenjiro Maginu: Statistical neurodynamics of associative memory. 63-73
- R. Paul Gorman, Terrence J. Sejnowski: Analysis of hidden units in a layered network trained to classify sonar targets. 75-89
- Carver Mead, Misha Mahowald: A silicon model of early visual processing. 91-97
Volume 1, Number 2, 1988
- Allen I. Selverston: A consideration of invertebrate central pattern generators as computational data bases. 109-117
- Kunihiko Fukushima: Neocognitron: A hierarchical neural network capable of visual pattern recognition. 119-130
- Robert Hecht-Nielsen: Applications of counterpropagation networks. 131-139
- Christoph von der Malsburg: Pattern recognition by labeled graph matching. 141-148
- Demetri Psaltis, Cheol Hoon Park, John Hong: Higher order associative memories and their optical implementations. 149-163
Volume 1, Number 3, 1988
- Alice J. O'Toole, Richard B. Millward, James A. Anderson: A physical system approach to recognition memory for spatially transformed faces. 179-199
- Murali M. Menon, Karl G. Heinemann: Classification of patterns using a self-organizing neural network. 201-215
- Michael A. Cohen: Sustained oscillations in a symmetric cooperative-competitive neural network: Disproof of a conjecture about content addressable memory. 217-221
- Charles M. Newman: Memory capacity in neural network models: Rigorous lower bounds. 223-238
- János Komlós, Ramamohan Paturi: Convergence results in an associative memory model. 239-250
- Hiroyuki Miyamoto, Mitsuo Kawato, Tohru Setoyama, Ryoji Suzuki: Feedback-error-learning neural network for trajectory control of a robotic manipulator. 251-265
Volume 1, Number 4, 1988
- Walter J. Freeman, Yong Yao, Brian C. Burke: Central pattern generating and recognizing in olfactory bulb: A correlation learning rule. 277-288
- Bernard Angéniol, Gaël de La Croix Vaubois, Jean-Yves Le Texier: Self-organizing feature maps and the travelling salesman problem. 289-293
- Robert A. Jacobs: Increased rates of convergence through learning rate adaptation. 295-307
- Pierre Peretto: On the dynamics of memorization processes. 309-322
- Harvey J. Greenberg: Equilibria of the brain-state-in-a-box (BSB) neural model. 323-324
- Robert J. Jannarone, Kai F. Yu, Yoshiyasu Takefuji: Conjunctoids: Statistical learning modules for binary events. 325-337
- Paul J. Werbos: Generalization of backpropagation with application to a recurrent gas market model. 339-356
Volume 1, Supplement 1, 1988
- Subutai Ahmad, Gerald Tesauro: A study of scaling and generalization in neural networks. 3-6
- György Barna, Ron Chrisley, Teuvo Kohonen: Statistical pattern recognition with neural networks. 7-8
- Farokh B. Bastani, S. Sitharama Iyengar, Sandeep Gulati: An analysis of competing neural network knowledge representation strategies. 9
- Diane J. Blackwood, Wesley R. Elsberry, Samuel J. Leven: Competing network models and problem-solving. 10-17
- Maureen Caudill, Charles W. Butler: Comparison of internal processing of different network paradigms. 18
- David H. Collins: Two-dimensional pattern sequence prediction using high-order neural networks. 19-20
- William B. Feild Jr., Jainendra K. Navlakha: On Hopfield neural networks. 21-22
- Stephen I. Gallant: A neural network model for sequential tasks. 23
- Patrick Gallinari, Sylvie Thiria, Françoise Fogelman-Soulié: Comparing neural networks and data analysis. 24-25
- David B. Hertz, E. Lee, J. D. Lynch: Certainty worlds, risk judgments, and neural net design. 26-29
- George G. Lendaris: On comparing neural net training paradigms via graded pattern recognition tasks. 30-34
- L. Masih, T. John Stonham: Digital neural networks - A distributed architecture with self-evolving capabilities. 35-42
- S. J. Rak, P. J. Kolodzy: Invariant object recognition with the adaptive resonance (ART) network. 43
- P. A. Ramamoorthy, S. Ho: A neural network approach for implementing expert systems. 44
- Paul Rhodes: Pattern storage and recall performance in a Hebb network generalizing the cerebral cortex. 45-46
- Alan Rojer, Eric L. Schwartz: A new pattern classifier motivated by brain maps. 47-50
- J. F. Shepanski: Multilayer perceptron training using optimal estimation. 51
- Katsunori Shimohara, Yukio Tokunaga, Tadasu Uchiyama, Yoshimasa Kimura: A neural network system with an automatic generation mechanism for distorted patterns. 52
- Bruno Sirletti, Michel Verleysen, Andre M. Vandemeulebroecke, Paul G. A. Jespers: An algorithm for pattern recognition with VLSI neural networks. 53
- Guo-Zheng Sun, Hsing-Hen Chen, Yee-Chun Lee: Learning decision trees using parallel sequential induction network. 54-55
- Santosh S. Venkatesh, Girish Pancha, Demetri Psaltis: Composite algorithms for shaping attraction basins. 56-59
- Michael Werman: The capacity of k-gridgraphs as associative memory. 60-66
- Robert B. Allen, Mark E. Riecken: Interacting and communicating connectionist agents. 67-69
- Agnessa Babloyantz, Alain Destexhe, Jacques-A. Sepulchre: Selforganization and information processing of neural networks. 70-74
- Aviv Bergman: Variation and selection: An evolutionary model of learning in neural networks. 75-77
- Thomas M. Breuel: Problem-intrinsic bounds on sample complexity. 78
- A. E. Busch, L. E. H. Trainor: Neural network models with higher order neural interactions. 79-80
- Gianpiero Cattaneo, Nicolò Cesa-Bianchi: Microcanonical annealing on neural networks. 81-82
- David A. Cohen, C. Mannion, John Shawe-Taylor: Transformational theory of feedforward neural networks. 83-84
- Thomas J. Collins: Cerebral cortical parallel processing using a metric tensor. 85-87
- Todd R. Davies: Some notes on the probabilistic semantics of logistic function parameters in neural networks. 88
- Bart De Moor, Lieven Vandenberghe, Joos Vandewalle: Computing all invariant states of a neural network. 89-90
- Faiq Ali Fazal: An application-based study of the stochastic parallel computations in harmony theory. 91
- Marcelo Fogaça, Alan Kramer, Barbara Moore: Scalability issues in neural networks. 92
- Shohei Fujita: Self-organization in distributed operating system. 93
- B. Furman, J. Liang, Harold Szu: Constraint optimization neural network for adaptive early vision. 94
- C. Lee Giles, R. D. Griffin, T. Maxwell: Computational advantages of higher order neural networks. 95-99
- David Heath, Carl Diegert: On learning through competition. 100
- Karen A. Huyser, Mark A. Horowitz: Generalization in digital functions. 101
- J. Zachary Jacobson, Norman J. Pullman, William C. Treurniet: Transitions between network states may cause both equipotentiality and localization of function in cerebral cortex. 102
- Moshe Kam, Ari Naim, Kevin Atteson: The symmetric adaptive resonance theoretic model (SMART). 103
- Behzad Kamgar-Parsi, Behrooz Kamgar-Parsi: The importance of being synchronous in neural networks. 104
- Behzad Kamgar-Parsi, Behrooz Kamgar-Parsi, J. Anthony Gualtieri, Judy E. Devaney: Simultaneous fitting of several curves to point sets using neural networks. 105-107
- F. H. Ling, N. Xu, J. H. Xu: Non-empty box counting algorithm for calculating fractal dimensions and its applications in EEG analysis. 108-112
- Anaikuppam R. Marudarajan, Harold Szu: Neural computing approach to number theory problems. 113-115
- Barbara Moore: ART I and pattern clustering algorithms. 116-119
- R. K. Pearson: Ultrametrics, pseudo-ultrametrics, fuzzy sets, and pattern recognition. 120
- Jean-Claude Pérez, Jean-Michel Bertille: "FRACTAL CHAOS" a new neural network holographic model. 121-123
- C. R. Renfrew, Jon R. Malone, A. Todd: A classificatory network based on a study of the neocortex. 124
- Rod Rinkus: Learning as natural selection in a sensori-motor being. 125-126
- Terence D. Sanger: Optimal unsupervised learning. 127
- Samir Sayech, Mancel F. Tenorio: A learning rule for neural networks with an application to an optimization problem. 128-129
- William A. Sethares: A convergence theorem for the modified delta rule. 130-136
- Paul E. Stolorz, G. W. Hoffmann: Learning with recurrent networks and constant synaptic strengths. 137-138
- J. P. Sutton, L. E. H. Trainor: Nested neural circuits, multi-level memory and learning by selection. 139-140
- Hideo Tanaka, Satoshi Matsuda, Hiromi Ogi, Yoshio Izui, Hisao Taoka, Toshiaki Sakaguchi: Redundant coding for fault tolerant computing on Hopfield network. 141-142
- Gérard Y. Vichniac, Marianne Lepp, Martha Steenstrup: A neural network for the optimization of communications network design. 144-147
- Rodney Winter, Bernard Widrow: MADALINE Rule II: A training algorithm for neural networks. 148
- Xin Xu, W. T. Tsai, N. K. Huang: Information capacity of McCulloch Pitts' model. 149
- Xin Xu, W. T. Tsai, N. K. Huang: A generalized neural network model. 150
- Xin Xu, S. Chen, W. T. Tsai, N. K. Huang: A case study of solving optimization problems using neural networks. 151
- Yong Yao: Dynamic tunneling algorithm and simulated annealing circuit for global optimization. 152
- Alan L. Yuille, Daniel M. Kammen: Spontaneous symmetry-breaking energy functions, orientation selective cortical cells, and hypercolumnar cell assemblies. 153-156
- James A. Anderson, Arthur B. Markman, Susan R. Viscuso, Edward J. Wisniewski: Programming neural networks. 157-160
- John A. Barnden: Commonsense reasoning in Conposit, a quasi-connectionist register-array model. 161-162
- Tony Bell: Attractor transitions: A basis for sequential processing in neural networks? 163-164
- Maureen Caudill: Benchmarking the performance of backpropagation and counterpropagation networks. 165
- Tzi-Dar Chiueh, Rodney M. Goodman: Learning algorithms for neural networks with ternary weights. 166-167
- Yann LeCun: Using curvature information to improve back-propagation. 168-171
- Victor Eliashberg: The E-machines: Associative neural networks as nonclassical symbolic processors. 172-173
- Jeffrey E. Fookson, John Antrobus: Executive control in a PDP system: Automatization of task performance and mindwandering. 174
- Walter J. Freeman: Neurodynamics of pattern recognition in biological neural networks. 175
- Qian Gao, Zili Liu: Spatio-temporal associative memory and a high-order correlation neuronal network. 176-181
- Ronald L. Greene: Learning generalization in simple linear discriminant networks. 182
- Hiroaki Hara: Two storage mechanisms in random network system: Probabilistic model of memory. 183-186
- Nicolaos B. Karayiannis, Anastasios N. Venetsanopoulos: The correlational associative memory realizes Hebbian learning. 187
- K. Kato, K. Nakane: An associative memory model based on a quantification method. 188-189
- Jason M. Kinser, H. John Caulfield, C. Hester: Error-correcting neural networks. 190-198
- Christopher J. Matheus: Exemplar versus prototype network models for concept representation. 199
- Gee-gwo Mei, Wentai Liu: Design graph search problems with learning: A neural network approach. 200-201
- John E. Moody, Christian Darken: Speedy alternatives to back propagation. 202
- B. Moore, Tomaso A. Poggio: Representation properties of multilayer feedforward networks. 203
- Mary M. Moya, R. Joseph Fogler, Larry D. Hostetler: Back propagation for perspective-invariant pattern recognition in SAR imagery. 204
- Catherine Myers, Igor Aleksander: Learning algorithms for probabilistic neural nets. 205-207
- M. Pavel, Mark A. Gluck, Van Henkle: Comparing generalization by humans and adaptive networks. 208-210
- Tomaso A. Poggio: Learning, regularization and splines. 211-212
- Gareth D. Richards: Investigation of a layered network as an associative memory. 213-214
- Yehuda Salu: Associative retrieval and restoration of data in neural networks. 215-216
- Gregor Schöner, J. A. Scott Kelso: A dynamic pattern theory of learning and recall. 217
- Zoltan Schreter: Sequential processing by overlap and fatigue of memories. 218
- Noel E. Sharkey: A PDP system for paraphrasing routine knowledge vignettes. 219-221
- Dejan J. Sobajic, Dennis T. Lee, Yoh-Han Pao: Increased effectiveness of learning by local neural feedback. 222
- Sara A. Solla: Learning contiguity with layered neural networks. 223
- Kunio Suzuki, Chiaki Aoyama: Neural network simulation and pandemonium. 224-226
- DeLiang Wang, Irwin King: Three neural models which process temporal information. 227
- Shengrui Wang: Training multi-layered neural networks with T.R. based algorithm. 228-230
- Alexis Wieland, Russell Leighton: Shaping schedules as a method for accelerated learning. 231-237
- Herbert Axelrad, Christophe P. Bernard, Bertrand Giraud: Global changes in entropy and in spatial organisation of activity in a network of formal neurons with inhibitory interactions. 238-246
- Max S. Cynader: Some design features of visual cortex. 247
- Robert Fanelli, Charles Schnabolk, Theodore Raphan: Neural network modelling of velocity estimation during off-vertical axis rotation (OVAR). 248-249
- Masahiko Fujita: A proposed system of inhibition in the deeper layers of the superior colliculus. 250-251
- Fotios Giannakopoulos, Hanspeter A. Mallot: A nonlinear layered model of cortical dynamics. 252-253
- Richard Granger, Jose A. Ambros-Ingerson, Gary Lynch: Formal analysis of aggregate function of layer II cerebral cortex. 254
- Guenter W. Gross, M. H. Hightower, Jacek M. Kowalski: Multielectrode burst pattern feature extraction from mammalian networks in culture. 255
- Daniel K. Hartline: Models for simulation of real neural networks. 256-258
- Toshio Inui, Sei Miyake, K. Kani: Receptive field density of Y cells estimated by a model of human retina. 259