Consider a node in a decision tree with n+ positive and n− negative training examples. Suppose the node is split into its k children. The average weighted Gini index of the children is given by

    Gini(children) = Σ_k (n_k / n) · Gini(t_k)

where n_k is the number of training examples associated with the child node t_k, and n = Σ_k n_k = n+ + n−.

Apply the formula to calculate the average weighted Gini index for each of the candidate test conditions A and B shown below. Based on their Gini values, which attribute, A or B, should be chosen to split the parent node?

Candidate A (three children):
  A1: +: 15, −: 5
  A2: +: 5,  −: 10
  A3: +: 10, −: 15

Candidate B (two children):
  B1: +: 20, −: 10
  B2: +: 10, −: 20
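The formula above can be checked numerically. Below is a minimal sketch that computes Gini(t) = 1 − Σ_c p_c² for each child and then the weighted average; the per-child (+, −) counts are my reading of the garbled figure in the post (A with three children, B with two, each summing to n+ = n− = 30), so treat them as an assumption.

```python
def gini(counts):
    # Gini index of one node: 1 minus the sum of squared class proportions
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def weighted_gini(children):
    # Average weighted Gini: sum over children of (n_k / n) * Gini(t_k),
    # where each child is a (n_pos, n_neg) tuple and n = sum of all n_k
    n = sum(sum(child) for child in children)
    return sum(sum(child) / n * gini(child) for child in children)

# Child counts as (assumed to be) read from the figure in the post
A = [(15, 5), (5, 10), (10, 15)]   # candidate A: three children
B = [(20, 10), (10, 20)]           # candidate B: two children

print(round(weighted_gini(A), 4))  # 0.4361
print(round(weighted_gini(B), 4))  # 0.4444
```

Under these counts the weighted Gini of A is lower than that of B, so A would be the preferred split (lower Gini means purer children).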