Positive Example Learning for Content-Based Recommendations: A Cost-Sensitive Learning-Based Approach

Publication Year:
2009
Repository URL:
https://aisel.aisnet.org/icis2009/188
Author(s):
Lee, Yen-Hsien; Hu, Paul Jen-Hwa; Cheng, Tsang-Hsiang; Hsieh, Ya-Fang
Tags:
Content-based Recommendation Systems; Cost-sensitive Learning; Single-class Learning; Positive Example-based Learning; Committee Machine
Article Description:
Existing supervised learning techniques can support product recommendations but are ineffective in scenarios characterized by single-class learning, i.e., training samples that consist of some positive examples and a much greater number of unlabeled examples. To address the limitations inherent in existing single-class learning techniques, we develop COst-sensitive Learning-based Positive Example Learning (COLPEL), which constructs an automated classifier from a single-class training sample. Our method employs cost-proportionate rejection sampling to derive, from the unlabeled examples, a subset likely to contain negative examples, according to their respective misclassification costs. COLPEL follows a committee machine strategy, constructing a set of automated classifiers that are used together to reduce the biases common to a single classifier. We use customers’ book ratings from the Amazon.com Web site to evaluate COLPEL, with PNB and PEBL as benchmarks. Our results show that COLPEL outperforms both PNB and PEBL, as measured by accuracy, positive F1 score, and negative F1 score.
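The sketch below illustrates the general idea the abstract describes: cost-proportionate rejection sampling over the unlabeled examples to extract a set of probable negatives, followed by a committee of classifiers trained on the positives plus each independently sampled negative set. The function names, the use of logistic regression, and the distance-based cost heuristic are illustrative assumptions for this sketch, not the authors' exact COLPEL procedure.

```python
# Hedged sketch: positive-plus-unlabeled learning via cost-proportionate
# rejection sampling and a committee of classifiers. The cost heuristic and
# base learner here are assumptions, not the paper's specific design.
import numpy as np
from sklearn.linear_model import LogisticRegression

def rejection_sample(X_unlabeled, costs, rng):
    """Keep each unlabeled example with probability cost / max(cost)."""
    keep = rng.random(len(costs)) < (costs / costs.max())
    return X_unlabeled[keep]

def train_committee(X_pos, X_unlabeled, costs, n_members=5):
    """Train several members, each on the positives plus an independently
    rejection-sampled set of probable negatives."""
    committee = []
    for seed in range(n_members):
        member_rng = np.random.default_rng(seed)
        X_neg = rejection_sample(X_unlabeled, costs, member_rng)
        X = np.vstack([X_pos, X_neg])
        y = np.concatenate([np.ones(len(X_pos)), np.zeros(len(X_neg))])
        committee.append(LogisticRegression(max_iter=1000).fit(X, y))
    return committee

def committee_predict(committee, X):
    """Average the members' positive-class probabilities, threshold at 0.5."""
    probs = np.mean([m.predict_proba(X)[:, 1] for m in committee], axis=0)
    return (probs >= 0.5).astype(int)

# Toy usage with synthetic data, purely for illustration.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=1.0, size=(50, 4))      # labeled positive examples
X_unl = rng.normal(loc=0.0, size=(500, 4))     # unlabeled examples
# Assumed cost heuristic: unlabeled points far from the positive centroid are
# more likely negative, so assign them a higher misclassification cost.
costs = np.linalg.norm(X_unl - X_pos.mean(axis=0), axis=1)
members = train_committee(X_pos, X_unl, costs)
print(committee_predict(members, X_unl[:10]))
```

The committee averaging step reflects the abstract's rationale: because each member sees a different sampled negative set, aggregating their outputs dampens the bias any single classifier might pick up from one particular sample.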