Ensemble Machine Learning [electronic resource] : Methods and Applications / edited by Cha Zhang, Yunqian Ma.
Material type: Text
Language: English
Publisher: Boston, MA : Springer US, 2012
Description: VIII, 332 p. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781441993267
Subject(s): Engineering | Computer science | Data mining | Computational Intelligence | Data Mining and Knowledge Discovery | Computer Science, general
Additional physical formats: Printed edition
DDC classification: 006.3
LOC classification: Q342
Online resources: Available online
Contents: Introduction of Ensemble Learning -- Boosting Algorithms: Theory, Methods and Applications -- On Boosting Nonparametric Learners -- Super Learning -- Random Forest -- Ensemble Learning by Negative Correlation Learning -- Ensemble Nystrom Method -- Object Detection -- Ensemble Learning for Activity Recognition -- Ensemble Learning in Medical Applications -- Random Forest for Bioinformatics.
It is common wisdom that gathering a variety of views and inputs improves decision making; indeed, it underpins a democratic society. Dubbed "ensemble learning" by researchers in computational intelligence and machine learning, this principle is known to improve a decision system's robustness and accuracy. Now, fresh developments are allowing researchers to unleash the power of ensemble learning in an increasing range of real-world applications. Ensemble learning algorithms such as "boosting" and "random forest" facilitate solutions to key computational problems such as face detection and are now being applied in areas as diverse as object tracking and bioinformatics. Responding to a shortage of literature dedicated to the topic, this volume offers comprehensive coverage of state-of-the-art ensemble learning techniques, including various contributions from researchers in leading industrial research labs. At once a solid theoretical study and a practical guide, the volume is a windfall for researchers and practitioners alike.
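The intuition behind the blurb's claim, namely that combining many views improves robustness and accuracy, can be illustrated with a minimal sketch. The simulation below is not from the book; it assumes hypothetical base learners that each classify correctly with probability 0.7, independently, and combines them by majority vote, the core idea behind ensemble methods such as bagging and random forests.

```python
import random

random.seed(0)

def noisy_classifier(true_label, base_accuracy):
    """A hypothetical base learner: returns the true binary label
    with probability `base_accuracy`, else the wrong one."""
    return true_label if random.random() < base_accuracy else 1 - true_label

def majority_vote(predictions):
    """Combine base-learner outputs by simple majority vote."""
    return 1 if sum(predictions) > len(predictions) / 2 else 0

def measured_accuracy(n_trials, n_members, base_accuracy):
    """Estimate accuracy of an n_members-strong ensemble over n_trials."""
    correct = 0
    for _ in range(n_trials):
        true_label = random.choice([0, 1])
        votes = [noisy_classifier(true_label, base_accuracy)
                 for _ in range(n_members)]
        if majority_vote(votes) == true_label:
            correct += 1
    return correct / n_trials

single = measured_accuracy(10_000, 1, 0.7)
ensemble = measured_accuracy(10_000, 25, 0.7)
print(f"single learner: {single:.3f}, 25-member ensemble: {ensemble:.3f}")
```

With 25 independent 70%-accurate voters, majority voting pushes accuracy well above 90%, which is exactly the robustness gain the blurb describes; real ensemble methods work to keep base learners accurate yet decorrelated so this effect holds in practice.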