Neural Computation

Notable scientific publications

* Data is provided for reference only

Generalized Discriminant Analysis Using a Kernel Approach
Neural Computation - Vol. 12 No. 10 - Pages 2385-2404 - 2000
G. Baudat, F. Anouar
We present a new method that we call generalized discriminant analysis (GDA) to deal with nonlinear discriminant analysis using kernel function operators. The underlying theory is close to the support vector machines (SVM) insofar as the GDA method provides a mapping of the input vectors into high-dimensional feature space. In the transformed space, linear properties make it easy to extend…
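The feature-space mapping the abstract describes can be illustrated with a two-class kernel Fisher discriminant, a close relative of GDA. This is a minimal sketch, not the paper's algorithm: the RBF kernel, regularizer, and toy data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gram matrix of an RBF kernel between row sets A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def kernel_fda(X, y, gamma=0.5, reg=1e-3):
    # Two-class kernel Fisher discriminant (Mika-style sketch):
    # maximize between-class vs. within-class scatter in feature space.
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    M_list, N = [], reg * np.eye(n)   # reg keeps N invertible
    for c in (0, 1):
        Kc = K[:, y == c]             # kernel columns for class c
        nc = Kc.shape[1]
        M_list.append(Kc.mean(axis=1))
        # Within-class scatter contribution, expressed via the kernel.
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    alpha = np.linalg.solve(N, M_list[1] - M_list[0])
    return alpha, K

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(30, 2), rng.randn(30, 2) + 3])  # two toy clusters
y = np.array([0] * 30 + [1] * 30)
alpha, K = kernel_fda(X, y)
proj = K @ alpha   # 1-D discriminant projection of the training points
```

Because the discriminant is solved entirely through the Gram matrix, the nonlinear map into feature space never has to be computed explicitly.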
Using Multilayer Perceptron Computation to Discover Ideal Insect Olfactory Receptor Combinations in the Mosquito and Fruit Fly for an Efficient Electronic Nose
Neural Computation - Vol. 27 No. 1 - Pages 171-201 - 2015
Luqman R. Bachtiar, Charles P. Unsworth, Richard D. Newcomb
The model organism, Drosophila melanogaster, and the mosquito Anopheles gambiae use 60 and 79 odorant receptors, respectively, to sense their olfactory world. However, a commercial “electronic nose” in the form of an insect olfactory biosensor demands very low numbers of receptors at its front end of detection due to the difficulties of receptor/sensor integration and functionalization. I…
Finite State Automata and Simple Recurrent Networks
Neural Computation - Vol. 1 No. 3 - Pages 372-381 - 1989
Axel Cleeremans, David Servan‐Schreiber, James L. McClelland
We explore a network architecture introduced by Elman (1988) for predicting successive elements of a sequence. The network uses the pattern of activation over a set of hidden units from time-step t − 1, together with element t, to predict element t + 1. When the network is trained with strings from a particular finite-state grammar, it can learn to be a perfect finite-state recognizer for t…
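The recurrence the abstract describes can be sketched as a forward pass of an Elman-style simple recurrent network. This is an illustrative toy, not the paper's trained model: the alphabet size, hidden width, and random weights are assumptions.

```python
import numpy as np

rng = np.random.RandomState(0)
n_in, n_hid = 5, 8                     # one-hot alphabet size, hidden units
W_xh = rng.randn(n_in, n_hid) * 0.1    # input -> hidden
W_hh = rng.randn(n_hid, n_hid) * 0.1   # previous hidden -> hidden (the recurrence)
W_hy = rng.randn(n_hid, n_in) * 0.1    # hidden -> next-element prediction

def step(x, h_prev):
    # Hidden state at time t combines element t with the hidden state
    # from time t - 1; the output is a distribution over element t + 1.
    h = np.tanh(x @ W_xh + h_prev @ W_hh)
    logits = h @ W_hy
    p = np.exp(logits - logits.max())
    return h, p / p.sum()

h = np.zeros(n_hid)
for t in [0, 2, 1]:                    # a toy symbol sequence
    x = np.eye(n_in)[t]
    h, p = step(x, h)
print(p.shape)  # (5,)
```

Training such a network on strings from a finite-state grammar amounts to fitting these three weight matrices so that `p` concentrates on the grammar's legal successors.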
Canonical Correlation Analysis: An Overview with Application to Learning Methods
Neural Computation - Vol. 16 No. 12 - Pages 2639-2664 - 2004
David R. Hardoon, Sándor Szedmák, John Shawe‐Taylor
We present a general method using kernel canonical correlation analysis to learn a semantic representation to web images and their associated text. The semantic space provides a common representation and enables a comparison between the text and images. In the experiments, we look at two approaches of retrieving images based on only their content from a text query. We compare orthogonaliz…
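The underlying linear CCA step, which the paper kernelizes, can be sketched by whitening the cross-covariance of two views and reading off its singular values. This is a minimal sketch under assumed toy data; the regularizer `reg` is an illustrative stabilizer, not the paper's setting.

```python
import numpy as np

def cca(X, Y, reg=1e-6):
    # Canonical correlations between two centered views X and Y.
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whiten each view, then SVD the whitened cross-covariance:
    # its singular values are the canonical correlations.
    Lx = np.linalg.cholesky(Cxx)
    Ly = np.linalg.cholesky(Cyy)
    M = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T
    return np.linalg.svd(M, compute_uv=False)

rng = np.random.RandomState(1)
Z = rng.randn(200, 2)                  # shared latent signal
X = Z + 0.1 * rng.randn(200, 2)        # view 1 (e.g. image features)
Y = Z + 0.1 * rng.randn(200, 2)        # view 2 (e.g. text features)
corrs = cca(X, Y)
```

Kernel CCA replaces the covariance matrices above with centered Gram matrices, so the same correlation structure can be found between nonlinear features of images and text.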
Nonlinear Component Analysis as a Kernel Eigenvalue Problem
Neural Computation - Vol. 10 No. 5 - Pages 1299-1319 - 1998
Bernhard Schölkopf, Alexander J. Smola, Klaus‐Robert Müller
A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map—for instance, the space of all possible five-pixel products in 16 × 16 images. We give the derivation of the method and p…
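The computation the abstract outlines reduces to an eigenvalue problem on the centered Gram matrix. Below is a minimal kernel PCA sketch along those lines; the RBF kernel choice, `gamma`, and toy data are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix, i.e. center the data in feature space.
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecompose; np.linalg.eigh returns eigenvalues in ascending order.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Rescale eigenvectors so feature-space principal axes have unit norm.
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Kc @ alphas   # nonlinear principal component scores

X = np.random.RandomState(0).randn(20, 3)
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (20, 2)
```

The feature space itself (e.g. all five-pixel products) is never constructed; only the n × n Gram matrix is needed.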
Neural Networks and the Bias/Variance Dilemma
Neural Computation - Vol. 4 No. 1 - Pages 1-58 - 1992
Stuart Geman, Elie Bienenstock, René Doursat
Feedforward neural networks trained by error backpropagation are examples of nonparametric regression estimators. We present a tutorial on nonparametric inference and its relation to neural networks, and we use the statistical viewpoint to highlight strengths and weaknesses of neural models. We illustrate the main points with some recognition experiments involving artificial data as well…
Regularization Theory and Neural Networks Architectures
Neural Computation - Vol. 7 No. 2 - Pages 219-269 - 1995
Federico Girosi, Michael Jones, Tomaso Poggio
We had previously shown that regularization principles lead to approximation schemes that are equivalent to networks with one layer of hidden units, called regularization networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well-known radial basis functions approximation schemes. This paper shows that regularization networks encompass…
A Parallel Mixture of SVMs for Very Large Scale Problems
Neural Computation - Vol. 14 No. 5 - Pages 1105-1114 - 2002
Ronan Collobert, Samy Bengio, Yoshua Bengio
Support vector machines (SVMs) are the state-of-the-art models for many classification problems, but they suffer from the complexity of their training algorithm, which is at least quadratic with respect to the number of examples. Hence, it is hopeless to try to solve real-life problems having more than a few hundred thousand examples with SVMs. This article proposes a new mixture of SVMs…
Training ν-Support Vector Classifiers: Theory and Algorithms
Neural Computation - Vol. 13 No. 9 - Pages 2119-2147 - 2001
Chih-Chung Chang, Chih‐Jen Lin
The ν-support vector machine (ν-SVM) for classification proposed by Schölkopf, Smola, Williamson, and Bartlett (2000) has the advantage of using a parameter ν for controlling the number of support vectors. In this article, we investigate the relation between ν-SVM and C-SVM in detail. We show that in general they are two different problems with the same optimal solution set. Hence, we may…
The Transition to Perfect Generalization in Perceptrons
Neural Computation - Vol. 3 No. 3 - Pages 386-401 - 1991
Eric B. Baum, Yuh‐Dauh Lyuu
Several recent papers (Gardner and Derrida 1989; Györgyi 1990; Sompolinsky et al. 1990) have found, using methods of statistical physics, that a transition to perfect generalization occurs in training a simple perceptron whose weights can only take values ±1. We give a rigorous proof of such a phenomenon. That is, we show, for α = 2.0821, that if at least αn examples are drawn from the uni…
Total: 80