Consider a node in a decision tree with n+ positive and n- negative training examples. If the node is split into its k children, which attribute should be chosen based on the weighted Gini index?



Post by answerhappygod »

Consider a node in a decision tree with n+ positive and n- negative training examples. If the node is split into its k children, the average weighted Gini index of the children is given by

Gini(children) = Σ_k (n_k / n) · Gini(t_k),

where n_k is the number of training examples associated with the child node t_k and n = Σ_k n_k = n+ + n-.

Apply the formula to calculate the average weighted Gini for each of the candidate test conditions A and B shown below. Based on their Gini values, which attribute, A or B, should be chosen to split the parent node?

Candidate split A (three children):  +: 15, -: 5  |  +: 5, -: 10  |  +: 10, -: 15
Candidate split B (two children):    +: 20, -: 10  |  +: 10, -: 20
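Below is a minimal Python sketch of how the weighted Gini calculation could be carried out. It assumes the counts in the question are grouped as three children under A and two under B, which is the only grouping consistent with both attributes splitting the same parent of 30 positive and 30 negative examples; the function names and data layout are illustrative, not part of the original problem.

```python
# Weighted Gini calculation for candidate splits (illustrative sketch).

def gini(pos, neg):
    """Gini impurity of a single node: 1 - p_pos^2 - p_neg^2."""
    total = pos + neg
    if total == 0:
        return 0.0
    p_pos = pos / total
    p_neg = neg / total
    return 1.0 - p_pos ** 2 - p_neg ** 2

def weighted_gini(children):
    """Average weighted Gini over child nodes given as (pos, neg) count pairs."""
    n = sum(pos + neg for pos, neg in children)
    return sum((pos + neg) / n * gini(pos, neg) for pos, neg in children)

# Assumed grouping of the counts in the question:
split_A = [(15, 5), (5, 10), (10, 15)]   # three children under A
split_B = [(20, 10), (10, 20)]           # two children under B

print("Gini(A) =", round(weighted_gini(split_A), 4))  # ~0.4361
print("Gini(B) =", round(weighted_gini(split_B), 4))  # ~0.4444
```

Under this assumed grouping, attribute A yields the lower weighted Gini (about 0.436 versus about 0.444 for B), so A would be the preferred attribute for splitting the parent node.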