UVM Theses and Dissertations

Format: Print
Author: Lu, Zhenyu
Dept./Program: Computer Science
Year: 2011
Degree: PhD
Abstract:
Classification techniques build predictive models from data described by a set of features (attributes) and associated labels (a discrete set of possible classes). One popular approach to classification is ensemble methods, which, instead of relying on a single classification model such as Decision Trees (DT), combine a set of models for prediction. Ensemble methods have been applied successfully to many classification tasks, as well as to other tasks such as relevance ranking and recommendation systems. An open question in ensemble methods is whether to construct an ensemble from a single model type (a homogeneous ensemble) or from a set of model types (a heterogeneous ensemble).
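To make the homogeneous/heterogeneous distinction concrete, the following minimal sketch (not taken from the dissertation; the dataset, model types, and combination rule are all assumed for illustration) builds a homogeneous ensemble of decision trees and a heterogeneous soft-voting ensemble of several model types using scikit-learn:

```python
# Illustrative sketch only: homogeneous ensemble (bagged decision trees)
# vs. heterogeneous ensemble (soft voting over several model types).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=20, n_classes=3,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Homogeneous ensemble: many models of a single type (decision trees).
homogeneous = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                                random_state=0)

# Heterogeneous ensemble: one model of each of several types, combined by
# averaging predicted class probabilities (soft voting).
heterogeneous = VotingClassifier(
    estimators=[("dt", DecisionTreeClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("nb", GaussianNB()),
                ("knn", KNeighborsClassifier())],
    voting="soft",
)

for name, model in [("homogeneous", homogeneous),
                    ("heterogeneous", heterogeneous)]:
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```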
This dissertation addresses four fundamental questions about heterogeneous ensembles: 1) whether we need heterogeneous ensembles: we demonstrate that heterogeneous ensembles can outperform homogeneous ensembles built from any of the involved classification model types alone; 2) how to construct appropriate heterogeneous ensembles: we introduce an algorithm called Adaptive Heterogeneous Ensembles (AHE) that automatically discovers appropriate combinations of classification model types; 3) why heterogeneous ensembles work: through empirical analysis we demonstrate that heterogeneous ensembles outperform homogeneous ensembles because different classification model types complement each other; and 4) when heterogeneous ensembles work: we find that the advantage of heterogeneous ensembles over other methods grows when the target data have more class labels. The efficacy of AHE is experimentally validated in the context of active learning. Extensive experiments on 18 UCI data sets show that AHE outperforms its homogeneous variants, as well as bagging, boosting, and the random subspace method (RSM) with random sampling.
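The abstract evaluates AHE in an active-learning setting. AHE itself is not reproduced here; the sketch below is only a generic pool-based active-learning loop with uncertainty sampling around a fixed heterogeneous voting ensemble, with the query strategy, labeling budget, and ensemble members all assumed for the example:

```python
# Hedged sketch: generic pool-based active learning with uncertainty
# sampling around a fixed heterogeneous ensemble. This is NOT the AHE
# algorithm (which adapts the mix of model types); it only illustrates
# the evaluation setting mentioned in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier

X, y = make_classification(n_samples=600, n_features=20, n_classes=4,
                           n_informative=12, random_state=1)

rng = np.random.default_rng(1)
labeled = list(rng.choice(len(X), size=20, replace=False))  # small seed set
pool = [i for i in range(len(X)) if i not in set(labeled)]

ensemble = VotingClassifier(
    estimators=[("dt", DecisionTreeClassifier(random_state=1)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("nb", GaussianNB())],
    voting="soft",
)

for step in range(30):                      # assumed labeling budget
    ensemble.fit(X[labeled], y[labeled])
    proba = ensemble.predict_proba(X[pool])
    # Uncertainty sampling: query the pool point whose most probable class
    # has the lowest predicted probability (least confident prediction).
    query = pool[int(np.argmin(proba.max(axis=1)))]
    labeled.append(query)
    pool.remove(query)

print("labeled examples used:", len(labeled))
```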