NIPS 1989: Denver, CO, USA
- David S. Touretzky: Advances in Neural Information Processing Systems 2 [NIPS Conference, Denver, Colorado, USA, November 27-30, 1989]. Morgan Kaufmann 1990, ISBN 1-55860-100-7
Part 1: Neuroscience
- James A. Simmons: Acoustic-Imaging Computations by Echolocating Bats: Unification of Diversely-Represented Stimulus Features into Whole Images. 2-9
- Clay Spence, John C. Pearson: The Computation of Sound Source Elevation in the Barn Owl. 10-17
- Ronald M. Harris-Warrick: Mechanisms for Neuromodulation of Biological Neural Networks. 18-27
- Shawn R. Lockery, Yan Fang, Terrence J. Sejnowski: Neural Network Analysis of Distributed Representations of Dynamical Sensory-Motor Transformations in the Leech. 28-35
- William Bialek, Fred Rieke, Robert R. de Ruyter van Steveninck, David Warland: Reading a Neural Code. 36-43
- Randall D. Beer, Hillel J. Chiel: Neural Implementation of Motivated Behavior: Feeding in an Artificial Insect. 44-51
- Kamil A. Grajski, Michael Merzenich: Neural Network Simulation of Somatosensory Representational Plasticity. 52-59
- Mark E. Nelson, James M. Bower: Computational Efficiency: A Common Organizing Principle for Parallel Computer Maps and Brain Maps? 60-67
- Bill Baird: Associative Memory in a Simple Model of Oscillating Cortex. 68-75
- Daniel M. Kammen, Christof Koch, Philip J. Holmes: Collective Oscillations in the Visual Cortex. 76-83
- Matthew A. Wilson, James M. Bower: Computer Simulation of Oscillatory Behavior in Cerebral Cortical Networks. 84-91
- Jack D. Cowan, A. E. Friedman: Development and Regeneration of Eye-Brain Maps: A Computational Model. 92-99
- David Servan-Schreiber, Harry Printz, Jonathan D. Cohen: The Effect of Catecholamines on Performance: From Unit to System Behavior. 100-108
- Michael C. Crair, William Bialek: Non-Boltzmann Dynamics in Networks of Spiking Neurons. 109-116
- Maurice Lee, James M. Bower: A Computer Modeling Approach to Understanding the Inferior Olive and Its Relationships to the Cerebellar Cortex in Rats. 117-124
- William R. Softky, Daniel M. Kammen: Can Simple Cells Learn Curves? A Hebbian Model in a Structured Environment. 125-132
- Alex Chernajvsky, John E. Moody: Note on Development of Modularity in Simple Cortical Models. 133-140
- G. T. Kenyon, Eberhard E. Fetz, R. D. Puff: Effects of Firing Synchrony on Signal Propagation in Layered Networks. 141-148
- Paul C. Rhodes: A Systematic Study of the Input/Output Properties of a 2 Compartment Model Neuron With Active Membranes. 149-159
- Dun-Sung Tang: Analytic Solutions to the Formation of Feature-Analysing Cells of a Three-Layer Feedforward Visual Information Processing Neural Net. 160-165
Part 2: Speech and Signal Processing
- Yuchun Lee, Richard Lippmann: Practical Characteristics of Neural Network and Conventional Pattern Classifiers on Artificial and Speech Problems. 168-177
- Kevin J. Lang, Geoffrey E. Hinton: Dimensionality Reduction and Prior Knowledge in E-Set Recognition. 178-185
- Hervé Bourlard, Nelson Morgan: A Continuous Speech Recognition System Embedding MLP into HMM. 186-193
- William Y. Huang, Richard Lippmann: HMM Speech Recognition with Neural Net Discrimination. 194-202
- John B. Hampshire II, Alex Waibel: Connectionist Architectures for Multi-Speaker Phoneme Recognition. 203-210
- John S. Bridle: Training Stochastic Model Recognition Algorithms as Networks can Lead to Maximum Mutual Information Estimation of Parameters. 211-217
- Yoshua Bengio, Renato de Mori, Régis Cardin: Speaker Independent Speech Recognition with Neural Networks and Speech Knowledge. 218-225
- Jim Mann: The Effects of Circuit Integration on a Feature Map Vector Quantizer. 226-231
- Terrence J. Sejnowski, Ben P. Yuhas, Moise H. Goldstein Jr., Robert E. Jenkins: Combining Visual and Acoustic Speech Signals with a Neural Network Improves Intelligibility. 232-239
- Susan Ciarrocca Lee: Using a Translation-Invariant Neural Network to Diagnose Heart Arrhythmia. 240-247
- Donald B. Malkoff: A Neural Network for Real-Time Signal Processing. 248-255
Part 3: Vision
- Michael Seibert, Allen M. Waxman: Learning Aspect Graph Representations from View Sequences. 258-265
- Richard S. Zemel, Michael Mozer, Geoffrey E. Hinton: TRAFFIC: Recognizing Objects Using Hierarchical Reference Frame Transformations. 266-273
- Daphna Weinshall, Shimon Edelman, Heinrich H. Bülthoff: A Self-Organizing Multiple-View Representation of 3D Objects. 274-281
- Pentti Kanerva: Contour-Map Encoding of Shape for Early Vision. 282-289
- Paul A. Viola: Neurally Inspired Plasticity in Oculomotor Processes. 290-297
- Toshiaki Okamoto, Mitsuo Kawato, Toshio Inui, Sei Miyake: Model Based Image Compression and Adaptive Data Representation by Interacting Filter Banks. 298-305
Part 4: Optimization and Control
- Jim Donnett, Tim Smithers: Neuronal Group Selection Theory: A Grounding in Robotics. 308-315
- Christopher G. Atkeson: Using Local Models to Control Movement. 316-323
- Michael I. Jordan, Robert A. Jacobs: Learning to Control an Unstable System with Forward Modeling. 324-331
- Michael Hormel: A Self-organizing Associative Memory System for Control Applications. 332-339
- Michael J. Carter, Franklin J. Rudolph, Adam J. Nucci: Operational Fault Tolerance of CMAC Networks. 340-347
- Oluseyi Farotimi, Amir Dembo, Thomas Kailath: Neural Network Weight Matrix Synthesis Using Optimal Control Techniques. 348-354
- Gintaras V. Reklaitis, Athanasios G. Tsirukis, Manoel Fernando Tenorio: Generalized Hopfield Networks and Nonlinear Optimization. 355-362
Part 5: Other Applications
- Ajay N. Jain, Alex Waibel: Incremental Parsing by Modular Recurrent Connectionist Networks. 364-371
- David S. Touretzky, Deirdre W. Wheeler: A Computational Basis for Phonology. 372-379
- C. Lee Giles, Guo-Zheng Sun, Hsing-Hen Chen, Yee-Chun Lee, Dong Chen: Higher Order Recurrent Networks and Grammatical Inference. 380-387
- Kurt R. Smith, Michael I. Miller: Bayesian Inference of Regular Grammar and Markov Source Models. 388-395
- Yann LeCun, Bernhard E. Boser, John S. Denker, Donnie Henderson, Richard E. Howard, Wayne E. Hubbard, Lawrence D. Jackel: Handwritten Digit Recognition with a Back-Propagation Network. 396-404
- Gale Martin, James A. Pittman: Recognizing Hand-Printed Letters and Digits. 405-414
- Yoshihiro Mori, Kazuki Joe: A Large-Scale Neural Network Which Recognizes Handwritten Kanji Characters. 415-422
- Yoshua Bengio, Samy Bengio, Yannick Pouliot, Patrick Agin: A Neural Network to Detect Homologies in Proteins. 423-430
- David S. Touretzky, Gillette Elvgreen III: Rule Representations in a Connectionist Chunker. 431-438
- Michael Mozer, Jonathan Bachrach: Discovering the Structure of a Reactive Environment by Exploration. 439-446
- Steven A. Harp, Tariq Samad, Aloke Guha: Designing Application-Specific Neural Networks Using the Genetic Algorithm. 447-454
- David Rogers: Predicting Weather Using a Genetic Memory: A Combination of Kanerva's Sparse Distributed Memory with Holland's Genetic Algorithms. 455-464
- Jakub Wejchert, Gerald Tesauro: Neural Network Visualization. 465-472
Part 6: New Learning Algorithms
- Bartlett W. Mel, Christof Koch: Sigma-Pi Learning: On Radial Basis Functions and Cortical Associative Learning. 474-481
- Avijit Saha, James D. Keeler: Algorithms for Better Representation and Faster Learning in Radial Basis Function Networks. 482-489
- Tony Bell: Learning in Higher-Order "Artificial Dendritic Trees". 490-497
- Jacob Barhen, Nikzad Benny Toomarian, Sandeep Gulati: Adjoint Operator Algorithms for Faster Learning in Dynamical Neural Networks. 498-508
- Conrad C. Galland, Geoffrey E. Hinton: Discovering High Order Features with Mean Field Modules. 509-515
- Tal Grossman: The CHIR Algorithm for Feed Forward Networks with Binary Weights. 516-523
- Scott E. Fahlman, Christian Lebiere: The Cascade-Correlation Learning Architecture. 524-532
- Stephen Jose Hanson: Meiosis Networks. 533-541
- John Kassebaum, Manoel Fernando Tenorio, Christoph Schaefers: The Cocktail Party Problem: Speech/Data Signal Separation Comparison between Backpropagation and SONN. 542-549
- David H. Ackley, Michael L. Littman: Generalization and Scaling in Reinforcement Learning. 550-557
- Richard Rohwer: The "Moving Targets" Training Algorithm. 558-565
- Les E. Atlas, David A. Cohn, Richard E. Ladner: Training Connectionist Networks with Queries and Selective Sampling. 566-573
- Steven J. Nowlan: Maximum Likelihood Competitive Learning. 574-582
- Michail Zak, Nikzad Benny Toomarian: Unsupervised Learning in Neurodynamics Using the Phase Velocity Field Approach. 583-589
- Amir F. Atiya, Yaser S. Abu-Mostafa: A Method for the Associative Storage of Analog Vectors. 590-595
Part 7: Empirical Analyses
- Yann LeCun, John S. Denker, Sara A. Solla: Optimal Brain Damage. 598-605
- Subutai Ahmad, Gerald Tesauro, Yu He: Asymptotic Convergence of Backpropagation: Numerical Experiments. 606-613
- Sheri L. Gish, W. E. Blanz: Comparing the Performance of Connectionist and Statistical Classifiers on an Image Segmentation Problem. 614-621
- Les E. Atlas, Ronald A. Cole, Jerome T. Connor, Mohamed A. El-Sharkawi, Robert J. Marks II, Yeshwant K. Muthusamy, Etienne Barnard: Performance Comparisons Between Backpropagation Networks and Classification Trees on Three Real-World Applications. 622-629
- Nelson Morgan, Hervé Bourlard: Generalization and Parameter Estimation in Feedforward Nets: Some Experiments. 630-637
- David Zipser: Subgrouping Reduces Complexity and Speeds Up Learning in Recurrent Networks. 638-641
- Yves Chauvin: Dynamic Behavior of Constrained Back-Propagation Networks. 642-649
- William P. Lincoln, Josef Skrzypek: Synergy of Clustering Multiple Back Propagation Networks. 650-657
Part 8: Theoretical Analyses
- Davi Geiger, Federico Girosi: Coupled Markov Random Fields and Mean Field Theory. 660-667
- Amir Dembo, Kai-Yeung Siu, Thomas Kailath: Complexity of Finite Precision Neural Network Classifier. 668-675
- Eric B. Baum: The Perceptron Algorithm Is Fast for Non-Malicious Distributions. 676-685
- Andrew G. Barto, Richard S. Sutton, Christopher J. C. H. Watkins: Sequential Decision Problems and Neural Networks. 686-693
- David J. C. MacKay, Kenneth D. Miller: Analysis of Linsker's Simulations of Hebbian Rules. 694-701
- Zoran Obradovic, Ian Parberry: Analog Neural Networks of Limited Precision I: Computing with Multilinear Threshold Functions. 702-709
- Fernando J. Pineda: Time Dependent Adaptive Neural Networks. 710-718
- Nathan Intrator: A Neural Network for Feature Extraction. 719-726
- Pierre Baldi, Yosef Rinott, Charles Stein: On the Distribution of the Number of Local Minima of a Random Function on a Graph. 727-732
- Anders Krogh, C. I. Thorbergsson, John A. Hertz: A Cost Function for Internal Representations. 733-740
Part 9: Hardware Implementation
- Stephen P. DeWeerth, Carver Mead: An Analog VLSI Model of Adaptation in the Vestibulo-Ocular Reflex. 742-749
- Christof Koch, Wyeth Bair, John G. Harris, Timothy K. Horiuchi, Andrew Hsu, Jin Luo: Real-Time Computer Vision and Robotics Using Analog VLSI Circuits. 750-757
- Srinagesh Satyanarayana, Yannis P. Tsividis, Hans Peter Graf: A Reconfigurable Analog VLSI Neural Network Chip. 758-768
- Alexander Moopenn, Tuan Duong, A. P. Thakoor: Digital-Analog Hybrid Synapse Chips for Electronic Neural Networks. 769-776
- John C. Platt: Analog Circuits for Constrained Optimization. 777-784
- Michael Brownlow, Lionel Tarassenko, Alan F. Murray, Alister Hamilton, Il Song Han, H. Martin Reekie: Pulse-Firing Neural Chips for Hundreds of Neurons. 785-792
- Tzi-Dar Chiueh, Rodney M. Goodman: VLSI Implementation of a High-Capacity Neural Network Associative Memory. 793-800
- Xiru Zhang, Michael McKenna, Jill P. Mesirov, David L. Waltz: An Efficient Implementation of the Back-propagation Algorithm on the Connection Machine CM-2. 801-809
- Fernando J. Nuñez, José A. B. Fortes: Performance of Connectionist Learning Algorithms on 2-D SIMD Processor Arrays. 810-817
- Ira Smotroff: Dataflow Architectures: Flexible Platforms for Neural Network Simulation. 818-825
Part 10: History of Neural Networks
- Jack D. Cowan: Neural Networks: The Early Days. 828-842