Department of Electrical & Computer Engineering 

Synopsis of Research

Our research is in the field of discriminant analysis, or (statistical) discrimination. It includes problems associated with the statistical separation of distinct classes and with the allocation of entities to classes, finite in number, say c. The existence of the classes is known a priori, and an entity of interest is assumed to belong to one (and only one) of the classes. Feature data on entities of known origin are available from the underlying classes. The allocation problem is to make an outright assignment of an entity of unknown class membership to one of the possible classes on the basis of its associated features; in this situation, statistical decision theory can be invoked to construct suitable allocation rules. The class-separation problem is to draw inferences about the relationship between class membership and the feature variables of the entity without allocating the entity to one of the possible classes; here the specific aim is to understand and to provide insight into the predictive structure of the feature variables.
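To make the allocation problem concrete, the following is a minimal sketch of a decision-theoretic allocation rule, assuming equal priors and Gaussian class-conditional densities with a shared covariance matrix (so the rule reduces to minimal Mahalanobis distance to the class means). The function names and the data-layout convention are illustrative, not taken from our papers.

```python
import numpy as np

def fit_allocation_rule(samples):
    """Estimate the rule from training data of known origin.

    samples: list of (m_i, n) arrays, one array per class.
    Returns the class means and the inverse of the pooled
    within-class covariance estimate.
    """
    means = [X.mean(axis=0) for X in samples]
    n_total = sum(X.shape[0] for X in samples)
    # Pooled (bias-corrected) within-class covariance.
    pooled = sum(
        (X - m).T @ (X - m) for X, m in zip(samples, means)
    ) / (n_total - len(samples))
    return means, np.linalg.inv(pooled)

def allocate(x, means, inv_cov):
    """Outright assignment of x to the class whose mean is
    nearest in Mahalanobis distance."""
    dists = [float((x - m) @ inv_cov @ (x - m)) for m in means]
    return int(np.argmin(dists))
```

Under the stated Gaussian assumptions this is the usual linear (Bayes-optimal) allocation rule; with unequal priors or covariances the distances would carry extra additive terms.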
In our papers we are concerned with the problem of class separation. Most of our work (except some recent conference papers [23,24,25]) is limited to linear class separation. In this case the problem as a whole is as follows: suppose we have c samples (sets of observations) drawn from n-dimensional distributions. These can be represented geometrically as c sample clusters in Euclidean n-space. We want to project these c sample clusters onto a Euclidean s-space (s < n) so that the c projected samples are maximally separated, i.e. a class separability measure (or discriminant criterion) attains its maximal value for this s-space. In practical situations, if we can find the basis vectors of this s-space, called discriminant vectors, we have a way of discriminating between samples from the c distributions by linear combinations of the n components of the feature vector. This reduction of the dimensionality of the observations is referred to in the pattern recognition literature as linear feature extraction, and is viewed as an important step in pattern recognition, because too many feature variables (relative to the sample size) can harm the performance of the sample allocation rule. Moreover, reducing the dimensionality of the observations facilitates visualization and understanding of the multivariate feature observations. In particular, one- and two-dimensional projections are widely used in interactive (man-machine) pattern recognition systems; such projections are helpful in exploring the relationships between the classes.
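A standard instance of such a linear feature extractor, given here only as an illustration of the general scheme (our papers study other discriminant criteria as well), is the Fisher-type criterion, whose discriminant vectors are the leading eigenvectors of the matrix pencil formed by the between-class and within-class scatter matrices:

```python
import numpy as np

def discriminant_vectors(samples, s):
    """Compute s discriminant vectors for the classical Fisher-type
    criterion: eigenvectors of Sw^{-1} Sb with the largest eigenvalues.

    samples: list of (m_i, n) arrays, one array per class;
    returns an (n, s) projection matrix. At most c - 1 of the
    eigenvalues are nontrivial for c classes.
    """
    grand_mean = np.vstack(samples).mean(axis=0)
    n = grand_mean.shape[0]
    Sw = np.zeros((n, n))  # within-class scatter
    Sb = np.zeros((n, n))  # between-class scatter
    for X in samples:
        m = X.mean(axis=0)
        Sw += (X - m).T @ (X - m)
        d = (m - grand_mean)[:, None]
        Sb += X.shape[0] * (d @ d.T)
    # Solve the generalized eigenproblem Sb v = lambda Sw v.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs[:, order[:s]].real
```

Projecting the sample clusters as `X @ W` with `W = discriminant_vectors(samples, 2)` yields the kind of two-dimensional discriminant plot used in interactive systems. (This sketch assumes Sw is nonsingular; in small-sample settings a regularized or pseudo-inverse variant would be needed.)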
We discuss problems where the class-conditional densities are unknown, and we compute the discriminant vectors from a collection of feature data of known origin, called design or training data in the pattern recognition literature. This problem was originally considered by R. Fisher. In recent times there have been many advances in discriminant analysis.

  Our research concerns:
· Novel discriminant criteria
· Interactive system for exploratory data analysis
· Method for estimating the significance of the control parameters of projection procedures
· Multiclass discriminant projections
· New methods for successive optimization of the discriminant criteria
· Comparative study of neural networks for multivariate data projection
· Discriminant analysis via neural network reduction of the class separation
