Publication date: 2008-08  Publisher: Zhejiang University Press Co., Ltd.  Authors: Kaizhu Huang, Haiqin Yang, Irwin King, Michael R. Lyu  Pages: 169
Book Description
Machine Learning: Modeling Data Locally and Globally presents a novel and unified theory that attempts to seamlessly integrate different algorithms. Specifically, the book distinguishes the inner nature of machine learning algorithms as either "local learning" or "global learning." This theory not only connects previous machine learning methods and serves as a roadmap through various models, but, more importantly, it also motivates a theory that can learn from data both locally and globally, helping researchers gain deeper insight and a comprehensive understanding of the techniques in this field. The book reviews current topics, new theories, and applications.
Kaizhu Huang was a researcher at the Fujitsu Research and Development Center and is currently a research fellow at The Chinese University of Hong Kong. Haiqin Yang leads the image processing group at HiSilicon Technologies. Irwin King and Michael R. Lyu are professors in the Computer Science and Engineering Department of The Chinese University of Hong Kong.
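To make the local/global distinction concrete, here is a minimal sketch that is not taken from the book and uses standard scikit-learn estimators as stand-ins rather than the book's own MEMPM or M4 models: a generative Gaussian classifier plays the role of "global learning," since it fits a distribution over all of each class's data, while a linear SVM plays the role of "local learning," since its decision boundary depends only on the support vectors near the margin.

    # Illustrative sketch only: GaussianNB and SVC are stand-ins for the book's
    # global (MEMPM-style) and local (margin-based) learning paradigms.
    from sklearn.datasets import make_blobs
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB   # "global": models each class distribution
    from sklearn.svm import SVC                  # "local": boundary set by support vectors

    X, y = make_blobs(n_samples=400, centers=2, cluster_std=2.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    global_model = GaussianNB().fit(X_tr, y_tr)
    local_model = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)

    print("global (GaussianNB) accuracy:", global_model.score(X_te, y_te))
    print("local (linear SVM) accuracy: ", local_model.score(X_te, y_te))
    print("points the local model actually relies on:", local_model.n_support_.sum())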
Table of Contents
1 Introduction
  1.1 Learning and Global Modeling
  1.2 Learning and Local Modeling
  1.3 Hybrid Learning
  1.4 Major Contributions
  1.5 Scope
  1.6 Book Organization
  References
2 Global Learning vs. Local Learning
  2.1 Problem Definition
  2.2 Global Learning
    2.2.1 Generative Learning
    2.2.2 Non-parametric Learning
    2.2.3 The Minimum Error Minimax Probability Machine
  2.3 Local Learning
  2.4 Hybrid Learning
  2.5 Maxi-Min Margin Machine
  References
3 A General Global Learning Model: MEMPM
  3.1 Marshall and Olkin Theory
  3.2 Minimum Error Minimax Probability Decision Hyperplane
    3.2.1 Problem Definition
    3.2.2 Interpretation
    3.2.3 Special Case for Biased Classifications
    3.2.4 Solving the MEMPM Optimization Problem
    3.2.5 When the Worst-Case Bayes Optimal Hyperplane Becomes the True One
    3.2.6 Geometrical Interpretation
  3.3 Robust Version
  3.4 Kernelization
    3.4.1 Kernelization Theory for BMPM
    3.4.2 Notations in Kernelization Theorem of BMPM
    3.4.3 Kernelization Results
  3.5 Experiments
    3.5.1 Model Illustration on a Synthetic Dataset
    3.5.2 Evaluations on Benchmark Datasets
    3.5.3 Evaluations of BMPM on the Heart-Disease Dataset
  3.6 How Tight Is the Bound
  3.7 On the Concavity of MEMPM
  3.8 Limitations and Future Work
  3.9 Summary
  References
4 Learning Locally and Globally: Maxi-Min Margin Machine
  4.1 Maxi-Min Margin Machine
    4.1.1 Separable Case
    4.1.2 Connections with Other Models
    4.1.3 Nonseparable Case
    4.1.4 Further Connection with Minimum Error Minimax Probability Machine
  4.2 Bound on the Error Rate
  4.3 Reduction
  4.4 Kernelization
    4.4.1 Foundation of Kernelization for M4
    4.4.2 Kernelization Result
  4.5 Experiments
    4.5.1 Evaluations on Three Synthetic Toy Datasets
    4.5.2 Evaluations on Benchmark Datasets
  4.6 Discussions and Future Work
  4.7 Summary
  References
5 Extension I: BMPM for Imbalanced Learning
  5.1 Introduction to Imbalanced Learning
  5.2 Biased Minimax Probability Machine
  5.3 Learning from Imbalanced Data by Using BMPM
    5.3.1 Four Criteria to Evaluate Learning from Imbalanced Data
    5.3.2 BMPM for Maximizing the Sum of the Accuracies
    5.3.3 BMPM for ROC Analysis
6 Extension II: A Regression Model from M4
7 Extension III: Variational Margin Settings within Local Data
8 Conclusion and Future Work
References
Index