Learning filters in Gaussian process classification problems
|Title||Learning filters in Gaussian process classification problems|
|Publication Type||Conference Paper|
|Year of Publication||2014|
|Authors||Ruiz, P., J. Mateos, R. Molina, and A. K. Katsaggelos|
|Conference Name||2014 IEEE International Conference on Image Processing (ICIP)|
|Keywords||analysis representation, Bayes methods, Bayesian inference, Bayesian modeling, Brain modeling, channel bank filters, classification tasks, classification-filtering approach, Conferences, filter estimation, Gaussian distribution, Gaussian Process classification, Gaussian process classification problems, Gaussian processes, global optimality, iterative methods, iterative procedure, Joints, Kernel, learning filters, neighbor labeling, optimal filter bank, posterior distributions, sequence labeling, signal classification, Support vector machines|
Many real classification tasks are oriented to sequence (neighbor) labeling, that is, assigning a label to every sample of a signal while taking into account the sequentiality (or neighborhood) of the samples. This is normally approached by first filtering the data and then performing classification. As a consequence, the two processes are optimized separately, with no guarantee of global optimality. In this work we utilize Bayesian modeling and inference to jointly learn a classifier and estimate an optimal filterbank. Variational Bayesian inference is used to approximate the posterior distributions of all unknowns, resulting in an iterative procedure that alternately estimates the classifier parameters and the filterbank coefficients. In the experimental section we show, using synthetic and real data, that the proposed method compares favorably with other classification/filtering approaches, without the need for parameter tuning.
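To illustrate the joint classification/filtering idea described above, the following sketch alternates between updating a linear filter and a classifier so that both minimize the same labeling loss. This is only a point-estimate stand-in using logistic regression and gradient steps, not the paper's variational Bayesian GP formulation; the synthetic signal, filter length, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sequence-labeling task: a smooth latent signal determines the
# per-sample label; we only observe a noisy version of it.
n, k = 500, 7                                # signal length, filter length
clean = np.convolve(rng.standard_normal(n), np.ones(5) / 5, mode="same")
x = clean + 0.5 * rng.standard_normal(n)     # observed noisy signal
y = (clean > 0).astype(float)                # per-sample binary labels

def features(signal, filt):
    """Filter the signal, then build a [filtered value, bias] feature per sample."""
    f = np.convolve(signal, filt, mode="same")
    return np.column_stack([f, np.ones_like(f)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Joint learning: alternate gradient steps on the classifier weights and the
# filter taps, both driven by the same logistic loss (instead of optimizing
# the filter and the classifier separately).
filt = np.zeros(k)
filt[k // 2] = 1.0                           # start from the identity filter
w = np.zeros(2)
lr = 0.1
for _ in range(300):
    X = features(x, filt)
    p = sigmoid(X @ w)
    w -= lr * X.T @ (p - y) / n              # classifier step
    # Filter step: conv(x, filt) is linear in filt, so the partial derivative
    # of the filtered signal w.r.t. tap j is conv(x, e_j).
    grad_f = np.array([
        np.convolve(x, np.eye(k)[j], mode="same") @ ((p - y) * w[0]) / n
        for j in range(k)
    ])
    filt -= lr * grad_f

acc = np.mean((sigmoid(features(x, filt) @ w) > 0.5) == (y > 0.5))
```

Because the filter gradient is taken through the classification loss, the learned taps adapt to whatever smoothing best separates the labels, which is the motivation for joint optimization over the filter-then-classify pipeline.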