
Consider a node in a decision tree with n+ positive and n- negative training examples. If the node is split into its k children, which attribute should be chosen for the split?

Posted: Fri May 20, 2022 3:14 pm
by answerhappygod
Consider a node in a decision tree with n_+ positive and n_- negative training examples. If the node is split into its K children, the average weighted Gini index of the children is given by

Gini(children) = \sum_{k=1}^{K} \frac{n_k}{n} \, Gini(t_k),

where n_k is the number of training examples associated with the child node t_k and n = \sum_{k=1}^{K} n_k = n_+ + n_-.

Apply the formula to calculate the average weighted Gini index for each of the candidate test conditions A and B shown below. Based on their Gini values, which attribute, A or B, should be chosen to split the parent node?

Candidate A (three children):
Child A1: +: 15, -: 5
Child A2: +: 5, -: 10
Child A3: +: 10, -: 15

Candidate B (two children):
Child B1: +: 20, -: 10
Child B2: +: 10, -: 20
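A minimal Python sketch of how the formula applies, assuming the per-child class counts read off the reconstructed figure above (candidate A splitting into three children, candidate B into two); the helper names gini and weighted_gini are illustrative, not from the original post.

```python
def gini(pos, neg):
    """Gini index of a single node: 1 - p_+^2 - p_-^2."""
    n = pos + neg
    p_pos, p_neg = pos / n, neg / n
    return 1.0 - p_pos**2 - p_neg**2

def weighted_gini(children):
    """Average weighted Gini of a split: sum_k (n_k / n) * Gini(t_k)."""
    n = sum(pos + neg for pos, neg in children)
    return sum((pos + neg) / n * gini(pos, neg) for pos, neg in children)

# (positive, negative) counts per child, as assumed from the figure above
split_A = [(15, 5), (5, 10), (10, 15)]   # candidate A: three children
split_B = [(20, 10), (10, 20)]           # candidate B: two children

gini_A = weighted_gini(split_A)   # ~0.4361
gini_B = weighted_gini(split_B)   # ~0.4444

print(f"Gini(A) = {gini_A:.4f}")
print(f"Gini(B) = {gini_B:.4f}")
print("Split on", "A" if gini_A < gini_B else "B", "(lower weighted Gini)")
```

With these assumed counts, candidate A gives the lower weighted Gini (about 0.436 versus 0.444 for B), so A would be the attribute chosen to split the parent node.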