The gain for a split on an attribute of a training dataset in decision tree modeling is always positive, irrespective of the type of impurity measure.
True
False
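
For reference, the gain of a split compares the impurity of the parent node with the weighted impurity of its child nodes, where the impurity measure I(·) can be entropy, the Gini index, or the classification error:

$$\Delta = I(\text{parent}) - \sum_{j=1}^{k} \frac{N(v_j)}{N}\, I(v_j)$$

Here N is the number of training records at the parent node and N(v_j) is the number of records routed to the child node v_j.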
The validation error rate is the only reliable measure for
selecting models even if the validation set is too small.
True
False
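
One way to reason about the previous statement is to check how much a validation error estimate fluctuates when the validation set is small. Below is a minimal sketch assuming scikit-learn; the dataset, classifier, and validation-set sizes are arbitrary choices for illustration, not part of the question:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical setup: a fixed model, evaluated on repeatedly drawn
# small vs. larger validation sets to compare the spread of the error estimate.
X, y = make_classification(n_samples=2000, n_features=20, flip_y=0.1, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, train_size=500, random_state=0)

clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_train, y_train)

rng = np.random.default_rng(0)
for val_size in (20, 500):  # tiny validation set vs. a larger one
    errors = []
    for _ in range(200):
        idx = rng.choice(len(X_rest), size=val_size, replace=False)
        errors.append(1.0 - clf.score(X_rest[idx], y_rest[idx]))
    print(f"validation size {val_size:4d}: "
          f"mean error {np.mean(errors):.3f}, std {np.std(errors):.3f}")
```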
For a nearest neighbor classifier, if the parameter k is too small, then it may be susceptible to overfitting due to noise.
True
False
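
A quick sketch of the effect asked about above, assuming scikit-learn; the dataset and the amount of injected label noise are made up for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Binary classification problem with label noise (flip_y flips 20% of labels).
X, y = make_classification(n_samples=1000, n_features=10, flip_y=0.2, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

for k in (1, 25):  # very small k vs. a smoother choice of k
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    print(f"k={k:2d}  train accuracy={knn.score(X_train, y_train):.3f}  "
          f"test accuracy={knn.score(X_test, y_test):.3f}")
```

Comparing the gap between training and test accuracy at k=1 against k=25 shows how a very small k fits the noise in the training data.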
The conditional probability of variable Y given variable X, P(Y|X), is equal to the joint probability of X and Y, P(X, Y), divided by the probability of Y, P(Y).
True
False
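
For reference, the standard definition of conditional probability (for P(X) > 0) is:

$$P(Y \mid X) = \frac{P(X, Y)}{P(X)}$$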
Support Vector Machine is a lazy learner.
True
False
Consider a transaction dataset that contains five items, {A, B, C, D, E}. If the support of itemset {A, B} is the same as the support of itemset {A, B, C}, then the confidence of the rule {A, B} ⇒ {C} is 80%.
True
False
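
The relevant formula here is the confidence of an association rule, which is the ratio of the two supports mentioned in the question (σ denotes the support count):

$$c(\{A, B\} \Rightarrow \{C\}) = \frac{\sigma(\{A, B, C\})}{\sigma(\{A, B\})}$$

A ratio of two equal quantities is 1, i.e., 100%.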
Consider a transaction dataset that contains five items, {A, B, C, D, E}. If the support of itemset {A} is the same as the support of itemset {A, C}, then all transactions that contain item A may not contain item C.
True
False
In clustering, objects are grouped together based on the
principle of maximizing the interclass distance and minimizing the
intraclass distance.
True
False
Density-based clustering, e.g., DBSCAN, is less susceptible to noise.
True
False
DBSCAN works very well even if the dataset has clusters with
varying densities.
True
False
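
To get a feel for how DBSCAN handles noise points and clusters of different densities, here is a minimal sketch assuming scikit-learn; the blob positions, spreads, and the eps/min_samples values are arbitrary choices for illustration:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_blobs

# Two blobs with very different spreads (densities) plus some uniform noise.
X_dense, _ = make_blobs(n_samples=300, centers=[[0, 0]], cluster_std=0.3, random_state=2)
X_sparse, _ = make_blobs(n_samples=300, centers=[[6, 6]], cluster_std=2.0, random_state=2)
rng = np.random.default_rng(2)
X_noise = rng.uniform(-5, 12, size=(30, 2))
X = np.vstack([X_dense, X_sparse, X_noise])

# A single eps must trade off the two densities: a small eps fragments the
# sparse blob, while a large eps can merge clusters and absorb noise points.
for eps in (0.3, 1.0, 2.5):
    labels = DBSCAN(eps=eps, min_samples=5).fit_predict(X)
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    n_noise = int(np.sum(labels == -1))
    print(f"eps={eps:3.1f}: clusters={n_clusters}, noise points={n_noise}")
```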