Consider a node in a decision tree with n+ positive and n− negative training examples. If the node is split into its K children, the average weighted Gini index of the children is given by

Gini(children) = Σ_{k=1}^{K} (n_k / n) · Gini(T_k),

where n_k is the number of training examples associated with the child node T_k and n = Σ_k n_k = n+ + n−.

Apply the formula to calculate the average weighted Gini for each of the candidate test conditions A and B shown below. Based on their Gini values, which attribute, A or B, should be chosen to split the parent node?

Parent node: +: 20, −: 20

Candidate A:
  T1: +: 5,  −: 15
  T2: +: 15, −: 5

Candidate B:
  T1: +: 10, −: 10
  T2: +: 10, −: 10
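The calculation can be sketched in a few lines of Python. Note the child class counts below are one consistent reading of the (partly garbled) figure in the question; substitute your own counts if the original differs.

```python
def gini(counts):
    """Gini index of a node: 1 - sum of squared class proportions."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

def weighted_gini(children):
    """Average weighted Gini: sum over children of (n_k / n) * Gini(T_k)."""
    n = sum(sum(c) for c in children)
    return sum(sum(c) / n * gini(c) for c in children)

# Child class counts (positive, negative) as read from the figure above
A = [(5, 15), (15, 5)]
B = [(10, 10), (10, 10)]

print(weighted_gini(A))  # 0.375
print(weighted_gini(B))  # 0.5
```

Under this reading, split A yields a weighted Gini of 0.375 while B yields 0.5 (each of B's children is perfectly balanced, so the split gains nothing), so attribute A should be chosen since a lower Gini means purer children.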