Information Fusion, Volume 5
Volume 5, Number 1, March 2004
- Belur V. Dasarathy: On reaching a new milestone - measuring our progress. 1-3
- Guo-Hong Wang, Shi-Yi Mao, You He, Zhi-Yu Che: Optimal decision fusion when priori probabilities and risk functions are fuzzy. 5-14
- Branko Ristic, Neil J. Gordon, Amanda Bessell: On target classification using kinematic data. 15-21
- Stefano Coraluppi, Craig Carthel: Recursive track fusion for multi-sensor surveillance. 23-33
- Bienvenu Fassinut-Mombot, Jean-Bernard Choquel: A new probabilistic and entropy fusion approach for management of information sources. 35-47
- Yong Xun, Mieczyslaw M. Kokar, Kenneth Baclawski: Control based sensor management for a multiple radar monitoring scenario. 49-63
- Piotr Gutkowski: Algorithm for retrieval and verification of personal identity using bimodal biometrics. 65-71
Volume 5, Number 2, June 2004
- Belur V. Dasarathy: Robust speech processing. 75
- Parham Aarabi, Belur V. Dasarathy: Robust speech processing using multi-sensor multi-source information fusion - an overview of the state of the art. 77-80
- Samy Bengio: Multimodal speech processing using asynchronous Hidden Markov Models. 81-89
- Georg F. Meyer, Jeffrey B. Mulligan, Sophie M. Wuerger: Continuous audio-visual digit recognition using N-best decision fusion. 91-101
- Parham Aarabi, Bob Mungamuru: The fusion of visual lip movements and mixed speech signals for robust speech separation. 103-117
- Omid S. Jahromi, Bruce A. Francis, Raymond H. Kwong: Relative information of multi-rate sensors. 119-129
- QingHua Wang, Teodor Ivanov, Parham Aarabi: Acoustic robot navigation using distributed microphone arrays. 131-140
- Ka-Yee Leung, Man-Hung Siu: Integration of acoustic and articulatory information with application to speech recognition. 141-151
Volume 5, Number 3, September 2004
- Belur V. Dasarathy: Does length matter? 155-156
- Xiaoxun Zhu, Yingqin Yuan, Chris Rorres, Moshe Kam: Distributed M-ary hypothesis testing with binary local decisions. 157-167
- Frédéric Dambreville, Jean-Pierre Le Cadre: Spatio-temporal multi-mode information management for moving target detection. 169-178
- Patrick Vannoorenberghe: On aggregating belief decision trees. 179-188
- Mieczyslaw M. Kokar, Jerzy A. Tomasik, Jerzy Weyman: Formalizing classes of information fusion systems. 189-202
- Anne-Sophie Capelle, Olivier Colot, Christine Fernandez-Maloigne: Evidential segmentation scheme of multi-echo MR images for the detection of brain tumors using neighborhood information. 203-216
- Abdelmalik Taleb-Ahmed, André Bigand, V. Lethuc, P. M. Allioux: Visual acuity of vision tested by fuzzy logic: An application in ophthalmology as a step towards a telemedicine project. 217-230
Volume 5, Number 4, December 2004
- Belur V. Dasarathy: A panoramic sampling of avant-garde applications of information fusion. 233-238
- P. Viswanath, M. Narasimha Murty, Shalabh Bhatnagar: Fusion of multiple approximate nearest neighbor classifiers for fast and efficient classification. 239-250
- Alexandre Jouan, Yannick Allard: Land use mapping with evidential fusion of features extracted from polarimetric synthetic aperture radar and hyperspectral imagery. 251-267
- Xiaohui Yuan, Jian Zhang, Bill P. Buckles: Evolution strategies based image registration via feature matching. 269-282
- Xiaoming Peng, Mingyue Ding, Chengping Zhou, Qian Ma: A practical two-step image registration method for two-dimensional images. 283-298
- Vasileios Megalooikonomou, Yaacov Yesha: Space efficient quantization for distributed estimation by a multi-sensor fusion system. 299-308
- Johan Schubert: Clustering belief functions based on attracting and conflicting metalevel evidence using Potts spin mean field theory. 309-318