Based on this fact, we propose a Soft Kernel Perceptron (SKP) in terms of the L2 norm, in which the regular perceptron is directly employed to solve the linearly separable problem determined by L2-norm soft-margin algorithms.
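The general mechanism can be illustrated with a minimal sketch (not the paper's exact SKP algorithm; all names and parameter choices below are illustrative): a dual-form kernel perceptron in which the L2 soft margin is emulated by adding 1/C to the kernel diagonal, a standard construction that renders the training set linearly separable in the induced feature space, so the ordinary perceptron update suffices.

```python
import numpy as np

def kernel_perceptron(X, y, kernel, C=1.0, epochs=100):
    """Dual-form kernel perceptron. The L2 soft margin is emulated by
    adding 1/C to the kernel diagonal, which makes the (shifted) problem
    linearly separable in feature space. Returns dual coefficients alpha."""
    n = len(X)
    K = np.array([[kernel(a, b) for b in X] for a in X])
    K += np.eye(n) / C              # L2 soft-margin diagonal shift
    alpha = np.zeros(n)
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            if y[i] * np.sum(alpha * y * K[:, i]) <= 0:
                alpha[i] += 1       # plain perceptron update on a mistake
                mistakes += 1
        if mistakes == 0:
            break                   # converged: every point separated
    return alpha

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * np.sum((np.asarray(a) - np.asarray(b)) ** 2))

# XOR: not linearly separable in input space, separable under an RBF kernel.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = np.array([-1, 1, 1, -1])
alpha = kernel_perceptron(X, y, rbf)
```

With a large `C` the diagonal shift vanishes and the plain kernel perceptron is recovered; the shift only guarantees separability, at the price of a softer margin.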

In this model, a self-adaptive feature-space expanding layer is added in front of the feedforward neural network to enrich the description of the original pattern; the nonlinearly separable problem can thus be transformed into a linear one, or at least a less nonlinear one, and the new model converges faster than the traditional feedforward neural network.
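As an illustration of the idea (a hypothetical expanding layer, not the model's actual one): appending a single product feature already turns XOR, the classic nonlinearly separable problem, into a linearly separable one that a single threshold unit can solve.

```python
import numpy as np

def expand(x):
    """Hypothetical expanding layer: append the product term x1*x2.
    In the augmented 3-D space, XOR becomes linearly separable."""
    x1, x2 = x
    return np.array([x1, x2, x1 * x2])

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]
# One linear threshold unit on the expanded features solves XOR:
w, b = np.array([1.0, 1.0, -2.0]), -0.5
preds = [int(np.dot(w, expand(x)) + b > 0) for x in X]
```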

For continuous perceptrons, however, despite their excellent record in applications, we have not found satisfactorily proved results for the linearly separable problem. Researchers have also attempted to establish the convergence of the online BP algorithm for nonlinear multilayer perceptrons.

The kernel method has recently attracted wide attention in the field of machine learning. It allows the efficient computation of a linear classification in a high-dimensional feature space, in place of a non-linearly separable problem in the low-dimensional input space.
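The identity behind this can be checked directly: a degree-2 polynomial kernel evaluates the inner product of an explicit quadratic feature map without ever constructing the feature vectors. The sketch below uses a 2-D input for readability.

```python
import numpy as np

def poly2_kernel(x, z):
    """Degree-2 homogeneous polynomial kernel: K(x, z) = (x . z)^2."""
    return float(np.dot(x, z)) ** 2

def phi(x):
    """Explicit feature map for 2-D inputs such that
    (x . z)^2 = phi(x) . phi(z)."""
    x1, x2 = x
    return np.array([x1 * x1, np.sqrt(2) * x1 * x2, x2 * x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])
# Identical results: the kernel computes the feature-space inner product
# in O(d) input-space operations.
assert np.isclose(poly2_kernel(x, z), phi(x) @ phi(z))
```

For higher degrees or the RBF kernel the explicit map grows combinatorially large (or infinite-dimensional), which is exactly why the kernel shortcut matters.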

The initial values thus obtained are utilized in the formulation of an eigenfunction solution to a non-separable problem in which the derivatives of the solution function are of interest, so that retention of analytic control is desirable.

If a satellite orbit is described by means of osculating Jacobi α's and β's of a separable problem, the paper shows that a perturbing force F makes them vary according to

Global convergence is proved for a partitioned BFGS algorithm, when applied on a partially separable problem with a convex decomposition.

This integer problem is formulated by a simple piecewise-linear underestimation of the separable problem.

The original problem is reduced to an equivalent separable problem by solving a multiple-cost-row linear program with 2n cost rows.

The artificial neural network model is a kind of nonlinear dynamical network system composed of a large scale of extensively interconnected simple computing elements. Owing to its parallel distributed processing, associative memory, self-organization, self-learning, and strong mapping abilities, it has shown broad application prospects in many fields. In this paper, from the viewpoint of pattern recognition, the artificial neural network technique and its applications to rotating machinery fault diagnosis are discussed; the network topology (i.e. the number of hidden layers and the number of hidden units) and its ability to form the demanded classification regions are also studied. Based on standard frequency-spectrum waveform features, represented as power ratios over nine different frequency intervals, five types of typical faults in rotating machinery are analyzed and diagnosed with the well-known perceptron networks trained by the back-propagation algorithm. In addition, this adaptive neural network method is compared with the traditional pattern recognition approach. The research results show that the artificial neural network technique has special pattern classification properties for high-dimensional and nonlinear pattern recognition problems because of its extensively interconnected nonlinear network architecture and its strong parameter self-learning ability. The classification ability of the network is a function of the number of hidden layers and hidden nodes. Adding more hidden nodes improves the learning speed somewhat, but the ability of the network to recall and generalize suffers. For a linearly separable problem, a single-layer perceptron network can be adopted. For a linearly nonseparable problem, multi-layer perceptron networks can be adopted. In practical applications, a two-layer perceptron network (one hidden layer) with suitable hidden nodes can form sufficiently complex decision regions; the number of hidden nodes is then determined according to the problem complexity. As a new pattern recognition approach, the artificial neural network technique is capable of solving the complex state recognition problems in fault diagnosis.
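The single-layer case mentioned above can be sketched as follows (a textbook perceptron with an illustrative learning rate and epoch limit, not the paper's diagnosis network): trained on AND, a linearly separable Boolean function, it converges to a separating weight vector; on XOR it would never converge, which is why the multi-layer variants are needed.

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=100):
    """Classic single-layer perceptron with the bias folded into the
    weight vector. Converges iff the problem is linearly separable."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:         # misclassified or on boundary
                w += lr * yi * xi
                errors += 1
        if errors == 0:
            break                               # converged
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([-1, -1, -1, 1])   # AND: linearly separable
w = train_perceptron(X, y_and)
```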

The linear and non-linear separability of Boolean functions are difficult problems, of which only the linearly separable problem for dimension n ≤ 7 had ever been discussed. This paper, on the basis of the classification complexity of n-dimensional Boolean functions, presents a concept of tolerantly linear classification of n-dimensional Boolean functions, and discusses some counting properties of n-dimensional hypercubes, with the counting results presented. All of these are referred to as the theoretical preparation for the further discussion of Boolean functions of tolerantly linear separability to be given in Part 2.

A network decomposition and combination algorithm is proposed, with which a nonlinearly separable problem can be decomposed into several linearly separable subproblems, each easily realized by a subnet. The subnets are then combined to form a network, which can be efficiently trained to solve the nonlinearly separable problem. The algorithm's convergence is proved, and several example studies were carried out. They show that this algorithm, which can be used to obtain hidden objects and to determine the number of hidden units in the hidden layer, is a very efficient and fast algorithm for training neural networks.
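A minimal sketch of the decomposition idea (not the paper's algorithm; the subproblem choice here is hand-picked for illustration): XOR decomposes into OR and NAND, each linearly separable and realizable by a single threshold unit, and an AND unit combines the two subnets into a network that solves the original nonlinearly separable problem.

```python
import numpy as np

def step(s):
    return 1 if s > 0 else 0

def unit(w, b):
    """A single threshold unit (one-node subnet): fires when w . x + b > 0."""
    return lambda x: step(np.dot(w, x) + b)

# Two linearly separable subproblems, each realized by one subnet:
or_net   = unit(np.array([1.0, 1.0]), -0.5)   # x1 OR x2
nand_net = unit(np.array([-1.0, -1.0]), 1.5)  # NOT (x1 AND x2)
# Combination layer: AND of the two subnet outputs solves XOR.
and_net  = unit(np.array([1.0, 1.0]), -1.5)

def xor_net(x):
    """Combined network: XOR(x1, x2) = OR(x1, x2) AND NAND(x1, x2)."""
    return and_net(np.array([or_net(x), nand_net(x)]))
```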