In our papers we are concerned with the problem of class
separation. Most of our works (except some recent conference papers [23,24,25])
are limited to linear class separation. In this case, the problem
as a whole is as follows: Suppose we have c samples (sets of observations)
drawn from n-dimensional distributions. These can be represented geometrically
as c sample clusters in Euclidean n-space. We want to project these c sample
clusters onto a Euclidean s-space (s < n), so that the c projected samples
are maximally separated, i.e. a class separability measure (or discriminant
criterion) has maximal value for this s-space. In practical situations,
if we could find the basis vectors of this s-space, called discriminant
vectors, we would have a way of discriminating between samples from
c distributions by linear combinations of the n-components of the feature
vector. This process of reduction of the dimensionality of the observations
is referred to in the pattern recognition literature as linear feature
extraction, and is viewed as an important step in pattern recognition.
This is because too many feature variables (relative to the sample size)
can harm the performance of the sample allocation rule. On the other hand,
the reduction of the dimensionality of the observations facilitates visualization
and understanding of the multivariate feature observations. In particular
one- and two-dimensional projections are widely used in interactive (man-machine)
pattern recognition systems. Such projections are helpful in exploring
relationships between the classes.
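As a concrete illustration of the procedure described above, the following sketch computes discriminant vectors for the classical Fisher criterion (ratio of between-class to within-class scatter), which is one common choice of class separability measure; the paper's own criterion may differ. The function names and the example data are hypothetical, introduced only for this illustration.

```python
import numpy as np

def discriminant_vectors(samples, s):
    """Linear feature extraction sketch: find s discriminant vectors
    maximizing a Fisher-type criterion (between-class scatter relative
    to within-class scatter). `samples` is a list of (m_i, n) arrays,
    one per class; returns an (n, s) projection basis."""
    n = samples[0].shape[1]
    overall_mean = np.vstack(samples).mean(axis=0)
    Sw = np.zeros((n, n))  # within-class scatter matrix
    Sb = np.zeros((n, n))  # between-class scatter matrix
    for X in samples:
        mu = X.mean(axis=0)
        D = X - mu
        Sw += D.T @ D
        d = (mu - overall_mean)[:, None]
        Sb += X.shape[0] * (d @ d.T)
    # Discriminant vectors are the leading eigenvectors of Sw^{-1} Sb.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs.real[:, order[:s]]

# Usage: project c = 3 sample clusters from Euclidean 4-space onto a 2-space.
rng = np.random.default_rng(0)
means = [(0, 0, 0, 0), (3, 0, 0, 0), (0, 3, 0, 0)]
samples = [rng.normal(loc=m, size=(50, 4)) for m in means]
W = discriminant_vectors(samples, s=2)        # W has shape (4, 2)
projected = [X @ W for X in samples]          # each cluster now lives in 2-space
```

The projected clusters can then be plotted directly, which is exactly the one- and two-dimensional visualization use case mentioned above.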