
This set of Machine Learning (ML) Multiple Choice Questions & Answers (MCQs) focuses on Machine Learning Set 27.

Q1 | Which of the following is true about residuals?
  • A) Lower is better
  • B) Higher is better
  • C) A or B, depending on the situation
  • D) None of these
Q2 | Which of the following statements is true about outliers in linear regression?
  • A) Linear regression is sensitive to outliers
  • B) Linear regression is not sensitive to outliers
  • C) Can’t say
  • D) None of these
Q3 | Suppose you plotted a scatter plot between the residuals and predicted values in linear regression and you found that there is a relationship between them. Which of the following conclusions would you draw about this situation?
  • A) Since there is a relationship, the model is not good
  • B) Since there is a relationship, the model is good
  • C) Can’t say
  • D) None of these
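
A residual-versus-predicted plot like the one described in Q3 is easy to produce; the sketch below is a minimal illustration, assuming scikit-learn and matplotlib and using a synthetic dataset from make_regression (the data and parameter values are illustrative choices, not part of the question).

```python
# Minimal sketch: fit a linear model on toy data and plot residuals
# against predicted values. For a well-specified linear model the points
# should scatter randomly around zero; a visible pattern (as in Q3)
# suggests the model is not capturing the underlying relationship.
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression

X, y = make_regression(n_samples=200, n_features=3, noise=10.0, random_state=0)
model = LinearRegression().fit(X, y)

predicted = model.predict(X)
residuals = y - predicted

plt.scatter(predicted, residuals, s=10)
plt.axhline(0, color="red", linewidth=1)
plt.xlabel("Predicted values")
plt.ylabel("Residuals")
plt.show()
```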
Q4 | Naive Bayes classifiers are a collection of ______ algorithms
  • Classification
  • Clustering
  • Regression
  • All
Q5 | Naive Bayes classifiers use ______ learning
  • Supervised
  • Unsupervised
  • Both
  • None
Q6 | Features being classified are independent of each other in the Naïve Bayes classifier
  • False
  • True
Q7 | Features being classified are ______ of each other in the Naïve Bayes classifier
  • Independent
  • Dependent
  • Partial Dependent
  • None
Q8 | Conditional probability is a measure of the probability of an event given that another event has already occurred.
  • True
  • False
Q9 | Bayes’ theorem describes the probability of an event, based on prior knowledge of conditions that might be related to the event.
  • True
  • False
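
For reference on Q8 and Q9, the standard formulas are: conditional probability P(A | B) = P(A ∩ B) / P(B), and Bayes' theorem P(A | B) = P(B | A) · P(A) / P(B), which updates the prior P(A) using knowledge of the related event B.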
Q10 | The Bernoulli Naïve Bayes classifier assumes a ______ distribution
  • Continuous
  • Discrete
  • Binary
Q11 | The Multinomial Naïve Bayes classifier assumes a ______ distribution
  • Continuous
  • Discrete
  • Binary
Q12 | The Gaussian Naïve Bayes classifier assumes a ______ distribution
  • Continuous
  • Discrete
  • Binary
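
For context on Q10–Q12, scikit-learn ships these three Naïve Bayes variants as separate classes; the sketch below only shows how each is instantiated on tiny hand-made arrays (the data values are arbitrary illustrations).

```python
# Minimal sketch of the three common Naive Bayes variants in scikit-learn.
# BernoulliNB models binary features, MultinomialNB models discrete counts,
# and GaussianNB models continuous features with a normal distribution.
import numpy as np
from sklearn.naive_bayes import BernoulliNB, MultinomialNB, GaussianNB

y = np.array([0, 1, 0, 1])

X_binary = np.array([[1, 0], [0, 1], [1, 1], [0, 0]])                # 0/1 features
X_counts = np.array([[3, 0], [0, 2], [1, 4], [2, 1]])                # count features
X_real = np.array([[0.1, 2.3], [1.7, 0.4], [0.3, 2.1], [1.9, 0.2]])  # continuous features

print(BernoulliNB().fit(X_binary, y).predict(X_binary))
print(MultinomialNB().fit(X_counts, y).predict(X_counts))
print(GaussianNB().fit(X_real, y).predict(X_real))
```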
Q13 | The binarize parameter in scikit-learn's BernoulliNB sets the threshold for binarizing sample features.
  • True
  • False
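
On Q13: BernoulliNB in scikit-learn does expose a binarize parameter; below is a minimal sketch of its effect, with the 0.5 threshold and data values chosen arbitrarily for illustration.

```python
# Sketch: BernoulliNB thresholds each feature at `binarize` before fitting,
# so values above 0.5 here are treated as 1 and the rest as 0.
import numpy as np
from sklearn.naive_bayes import BernoulliNB

X = np.array([[0.2, 0.9], [0.8, 0.1], [0.7, 0.6], [0.1, 0.4]])
y = np.array([0, 1, 1, 0])

clf = BernoulliNB(binarize=0.5)   # threshold is an arbitrary example value
clf.fit(X, y)
print(clf.predict(X))
```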
Q14 | The Gaussian distribution, when plotted, gives a bell-shaped curve that is symmetric about the ______ of the feature values.
  • Mean
  • Variance
  • Discrete
  • Random
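
For reference on Q14, the Gaussian (normal) density is f(x) = (1 / (σ·√(2π))) · exp(−(x − μ)² / (2σ²)); it depends on x only through (x − μ)², which is why the bell curve is symmetric about the mean μ.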
Q15 | SVMs directly give us the posterior probabilities P(y = 1 | x) and P(y = −1 | x)
  • True
  • False
Q16 | Any linear combination of the components of a multivariate Gaussian is a univariate Gaussian.
  • True
  • False
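
A standard result related to Q16: if X follows a multivariate Gaussian N(μ, Σ), then for any fixed vector a the linear combination aᵀX follows the univariate Gaussian N(aᵀμ, aᵀΣa).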
Q17 | Solving a non-linear separation problem with a hard-margin kernelized SVM (Gaussian RBF kernel) might lead to overfitting
  • True
  • False
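
As an illustration of the setup in Q17: in scikit-learn a hard margin can be approximated with a very large C alongside an RBF kernel. The sketch below is illustrative only; the C and gamma values are arbitrary choices meant to mimic a hard margin on noisy data.

```python
# Sketch: an (approximately) hard-margin RBF SVM on noisy, non-linearly
# separable data. A very large C forbids margin violations, so the decision
# boundary can wrap tightly around noise points and overfit.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

hard_margin_rbf = SVC(kernel="rbf", C=1e6, gamma=10.0)  # large C ~ hard margin
hard_margin_rbf.fit(X_train, y_train)

print("train accuracy:", hard_margin_rbf.score(X_train, y_train))
print("test accuracy: ", hard_margin_rbf.score(X_test, y_test))
```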
Q18 | SVM is a ______ algorithm
  • Classification
  • Clustering
  • Regression
  • All
Q19 | SVM is a ______ learning method
  • Supervised
  • Unsupervised
  • Both
  • None
Q20 | The linear SVM classifier works by drawing a straight line between two classes
  • True
  • False
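
Relating to Q18–Q20: a linear SVM learns a linear decision boundary, which in two dimensions is the straight line w·x + b = 0. The minimal supervised-classification sketch below uses synthetic blobs purely for illustration.

```python
# Sketch: a linear SVM separating two classes. With a linear kernel the
# learned boundary is the hyperplane w.x + b = 0 (a straight line in 2-D).
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)

linear_svm = SVC(kernel="linear")
linear_svm.fit(X, y)

print("w (coefficients):", linear_svm.coef_)
print("b (intercept):   ", linear_svm.intercept_)
```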
Q21 | What is Model Selection in Machine Learning?
  • The process of selecting models among different mathematical models, which are used to describe the same data set
  • When a statistical model describes random error or noise instead of the underlying relationship
  • Finding interesting directions in data and novel observations / database cleaning
  • All of the above
Q22 | Which of the following are two techniques of Machine Learning?
  • Genetic Programming and Inductive Learning
  • Speech recognition and Regression
  • Both A & B
  • None of the Mentioned
Q23 | Even if there are no actual supervisors, ______ learning is also based on feedback provided by the environment
  • Supervised
  • Reinforcement
  • Unsupervised
  • None of the above
Q24 | It is necessary to allow the model to develop a generalization ability and avoid a common problem called ______.
  • Overfitting
  • Overlearning
  • Classification
  • Regression
Q25 | Techniques that involve the usage of both labeled and unlabeled data are called ______.
  • Supervised
  • Semi-supervised
  • Unsupervised
  • None of the above
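
As a pointer for Q25: scikit-learn's semi_supervised module trains on a mixture of labeled and unlabeled samples, with unlabeled targets conventionally marked as -1. The sketch below uses LabelPropagation and hides a random 70% of the labels purely for illustration.

```python
# Sketch: semi-supervised learning with LabelPropagation.
# Unlabeled samples are marked with the label -1.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.semi_supervised import LabelPropagation

X, y = make_classification(n_samples=200, random_state=0)

rng = np.random.RandomState(0)
y_partial = y.copy()
y_partial[rng.rand(len(y)) < 0.7] = -1   # hide roughly 70% of the labels

model = LabelPropagation()
model.fit(X, y_partial)                  # learns from labeled + unlabeled points
print(model.score(X, y))
```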