Variational Bayes Learning from Relevant Tasks Only

Reference:

Jaakko Peltonen, Yusuf Yaslan, and Samuel Kaski. Variational Bayes learning from relevant tasks only. In Learning from Multiple Sources Workshop, 13 December 2008, Whistler, Canada, 2008. Proceedings at http://web.mac.com/davidrh/LMSworkshop08/Schedule.html.

Abstract:

We extend our recent work on relevant subtask learning, a new variant of multi-task learning in which the goal is to learn a good classifier for a task-of-interest that has too few training samples, by exploiting "supplementary data" from several other tasks. It is crucial to model the uncertainty about which of the supplementary data samples are relevant for the task-of-interest, that is, which samples are classified in the same way as in the task-of-interest. We have shown that the problem can be solved by careful mixture modeling: all tasks are modeled as mixtures of relevant and irrelevant samples, and the model for irrelevant samples is flexible enough that the relevant model only needs to explain the relevant data. Previously we used simple maximum likelihood learning; here we extend the method to variational Bayes inference, which is more suitable for high-dimensional data. We compare the method experimentally to a recent multi-task learning method and to two naive methods.
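
The mixture-modeling idea described in the abstract can be summarized as follows (a minimal sketch based only on the abstract; the symbols pi_t, theta, and phi_t are illustrative notation, not taken from the paper). Each supplementary task t is modeled as a mixture of a shared "relevant" model, which is the same model used for the task-of-interest, and a flexible task-specific "irrelevant" model:

    p(c, x \mid t) \;=\; \pi_t \, p_{\mathrm{rel}}(c, x \mid \theta) \;+\; (1 - \pi_t) \, p_{\mathrm{irr},t}(c, x \mid \phi_t)

Here pi_t is the unknown proportion of relevant samples in task t. Because the irrelevant model p_{irr,t} can absorb the samples that are classified differently, the shared model p_rel only needs to explain the data that is classified as in the task-of-interest; variational Bayes inference then maintains approximate posterior distributions over the mixture assignments and parameters rather than point estimates.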

Suggested BibTeX entry:

@inproceedings{Peltonen08,
    author = {Jaakko Peltonen and Yusuf Yaslan and Samuel Kaski},
    booktitle = {Learning from Multiple Sources Workshop, 13 December 2008, Whistler, Canada},
    note = {Proceedings at \url{http://web.mac.com/davidrh/LMSworkshop08/Schedule.html}},
    title = {Variational {B}ayes Learning from Relevant Tasks Only},
    year = {2008},
}

See www.cis.hut.fi ...