TY - JOUR
T1 - Using POMDPs for learning cost sensitive decision trees
AU - Maliah, Shlomi
AU - Shani, Guy
N1 - Funding Information:
This paper was partially supported by the ISF fund, under grant number 1210/18, and by the Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Center at the Ben Gurion University, Israel.
Publisher Copyright:
© 2020 Elsevier B.V.
PY - 2021/3/1
Y1 - 2021/3/1
N2 - In classification, an algorithm learns to classify a given instance based on a set of observed attribute values. In many real-world cases, testing the value of an attribute incurs a cost. Furthermore, there can also be a cost associated with the misclassification of an instance. Cost-sensitive classification attempts to minimize the expected cost of classification by deciding, after each observed attribute value, which attribute to measure next. In this paper we suggest Partially Observable Markov Decision Processes (POMDPs) as a modeling tool for cost-sensitive classification. POMDPs are typically solved through a policy over belief states. We show how a relatively small set of potentially important belief states can be identified, and define an MDP over these belief states. To identify these potentially important belief states, we construct standard decision trees over all attribute subsets, and the leaves of these trees become the state space of our tree-based MDP. At each phase we decide on the next attribute to measure, balancing the cost of the measurement against the classification accuracy. We compare our approach to a set of previous approaches, showing that our approach works better for a range of misclassification costs.
AB - In classification, an algorithm learns to classify a given instance based on a set of observed attribute values. In many real-world cases, testing the value of an attribute incurs a cost. Furthermore, there can also be a cost associated with the misclassification of an instance. Cost-sensitive classification attempts to minimize the expected cost of classification by deciding, after each observed attribute value, which attribute to measure next. In this paper we suggest Partially Observable Markov Decision Processes (POMDPs) as a modeling tool for cost-sensitive classification. POMDPs are typically solved through a policy over belief states. We show how a relatively small set of potentially important belief states can be identified, and define an MDP over these belief states. To identify these potentially important belief states, we construct standard decision trees over all attribute subsets, and the leaves of these trees become the state space of our tree-based MDP. At each phase we decide on the next attribute to measure, balancing the cost of the measurement against the classification accuracy. We compare our approach to a set of previous approaches, showing that our approach works better for a range of misclassification costs.
KW - Cost sensitive classification
KW - Decision trees
KW - MDP
KW - POMDP
UR - http://www.scopus.com/inward/record.url?scp=85097069792&partnerID=8YFLogxK
U2 - 10.1016/j.artint.2020.103400
DO - 10.1016/j.artint.2020.103400
M3 - Article
AN - SCOPUS:85097069792
VL - 292
JO - Artificial Intelligence
JF - Artificial Intelligence
SN - 0004-3702
M1 - 103400
ER -