
ONE MARK QUESTION

1. What is meant by a bipartite graph?


A bipartite graph, also called a bigraph, is a graph whose vertices can be decomposed
into two disjoint sets such that no two vertices within the same set are
adjacent. A bipartite graph is a special case of a k-partite graph with k = 2.
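As an illustration (not part of the original notes), a graph can be tested for bipartiteness by 2-colouring it with a breadth-first search; the sketch below assumes the graph is given as an adjacency-list dictionary.

from collections import deque

def is_bipartite(adj):
    """Return True if the undirected graph (adjacency-list dict) is bipartite."""
    colour = {}
    for start in adj:                           # handle disconnected graphs
        if start in colour:
            continue
        colour[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in colour:
                    colour[v] = 1 - colour[u]   # put neighbour in the other set
                    queue.append(v)
                elif colour[v] == colour[u]:    # two adjacent vertices in the same set
                    return False
    return True

# Example: a 4-cycle is bipartite, a triangle is not.
print(is_bipartite({0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}))  # True
print(is_bipartite({0: [1, 2], 1: [0, 2], 2: [0, 1]}))             # False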

2. Which programming language is used to implement Mathematica?


Wolfram Language
3. Is breadth-first search considered a form of informed search?
No, breadth-first search is an uninformed (blind) search.
Informed search is also known as heuristic search.
Four common informed search methods are best-first search, greedy best-first search, A*
search and memory-bounded heuristic search.
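As an illustrative sketch (not part of the original notes), greedy best-first search always expands the frontier node with the lowest heuristic value h(n); the adjacency list and heuristic function below are assumptions.

import heapq

def greedy_best_first(start, goal, neighbours, h):
    """Greedy best-first search: repeatedly expand the node with the lowest h(n).
    `neighbours` maps a node to its successors; `h` estimates distance to goal."""
    frontier = [(h(start), start)]
    came_from = {start: None}
    while frontier:
        _, node = heapq.heappop(frontier)
        if node == goal:                        # reconstruct the path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nxt in neighbours[node]:
            if nxt not in came_from:
                came_from[nxt] = node
                heapq.heappush(frontier, (h(nxt), nxt))
    return None

A* search differs only in that it orders the frontier by g(n) + h(n), the cost so far plus the heuristic estimate.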

4. What is meant by batch learning?


Stochastic gradient descent is an iterative learning algorithm that uses a training
dataset to update a model. It has a number of hyperparameters, and two of them often
confuse beginners: the batch size and the number of epochs. Both are integer values
and seem to do the same thing, but they control different aspects of training:
- The batch size is a hyperparameter of gradient descent that controls the number
of training samples to work through before the model's internal parameters are
updated.
- The number of epochs is a hyperparameter of gradient descent that controls the
number of complete passes through the training dataset.
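As an illustrative sketch (not part of the original notes), the loop below shows how the batch size and the number of epochs control a mini-batch gradient-descent run; linear regression, the learning rate and the data X, y are assumptions made for the example.

import numpy as np

def minibatch_sgd(X, y, epochs=20, batch_size=16, lr=0.01):
    """Mini-batch gradient descent for linear regression (squared-error loss)."""
    n, d = X.shape
    w = np.zeros(d)
    for epoch in range(epochs):                  # one epoch = one full pass over the data
        order = np.random.permutation(n)
        for start in range(0, n, batch_size):    # parameters are updated once per batch
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)
            w -= lr * grad
    return w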

5. What is a category structure called if a category has several generalisations?


6. Which data-driven search approach has the smallest storage demands?

7. What is the name of the most widely used TDIDT system?


TDIDT is short for "top-down induction of decision trees". TDIDT techniques are used,
for example, for multi-label classification.
8. What is the graph theory term for a Bayes network graph?

9. What is a system called that implements rule-based systems in terms of genetic
algorithms?
Classifier system
10. What is the name of the kind of multicoloured diagram that defines proximity
polyhedrons around each training instance (IBL)?
A Voronoi diagram is a partitioning of the decision surface into convex polyhedral
surroundings of the training instances.
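As an illustration (not from the original notes), the Voronoi cell of a training instance is the set of points closer to it than to any other instance; the nearest-neighbour assignment below, assuming Euclidean distance and made-up data, computes exactly this partition.

import numpy as np

def voronoi_cell_index(points, queries):
    """For each query point, return the index of the closest training instance,
    i.e. the Voronoi cell (under Euclidean distance) the query falls into."""
    # pairwise squared distances: (m queries) x (n training points)
    d2 = ((queries[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)

training = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])
queries  = np.array([[0.5, 0.2], [3.0, 2.5]])
print(voronoi_cell_index(training, queries))   # [0 2]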
11. What kind of clustering technique is Chameleon?
A hierarchical clustering algorithm using dynamic modeling

12. Is theta subsumption an operation related to top down or bottom up ILP search?

13. Which is the first well known AI system of the EBL type?

14. What is meant by an exploitation scheme in reinforcement learning?

15. Which CBR phase follows after RETRIEVE?

16. Which two well-known AI researchers analyzed and criticized the Perceptron?

17. Name two typical properties of an autoencoder ANN.

18. What does the abbreviation LSTM stand for?


Long short-term memory (LSTM) is an artificial recurrent neural network (RNN)
architecture used in the field of deep learning.
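As an illustrative sketch (not part of the original notes), a single LSTM time step combines forget, input and output gates with a cell state; the stacked weight layout and sizes below are assumptions made for compactness.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W has shape (4*hidden, hidden+input); b has shape (4*hidden,)."""
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x]) + b
    f = sigmoid(z[0*hidden:1*hidden])           # forget gate
    i = sigmoid(z[1*hidden:2*hidden])           # input gate
    o = sigmoid(z[2*hidden:3*hidden])           # output gate
    g = np.tanh(z[3*hidden:4*hidden])           # candidate cell state
    c = f * c_prev + i * g                      # new cell state
    h = o * np.tanh(c)                          # new hidden state
    return h, c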
19. What is meant by a bifurcation in an associative memory?

20. Which kind of graph is characteristic for a Restricted Boltzmann Machine?

21. What is meant by Padding in a CNN?


1. Perform inner and outer products of two vectors (see the sketch after this list)
2. Distinguish between simple examples of the classical modes of deduction
3. Judge the correctness of very small logic programs for classical operations
4. Discriminate between illustrations of different situations such as linear separability,
overfitting etc.
5. Decide on the full set of attributes relevant for a specific category within a category
structure
6. Application of specific search strategies on simple examples
7. Small examples using ID3 algorithm
8. Calculations of probabilities for causes given evidence using Bayes' rule (see the
sketch after this list)
9. Simple calculations of the effect of crossover and mutation functions
10. Calculations of distances for metrics such as Euclidean, Manhattan, cosine or
chessboard (Chebyshev) distance (see the sketch after this list)
11. Calculations of proximity matrices
12. Simple exercise on theta subsumption
13. Perform a unification-based generalisation of an example for simple EBL cases
14. Evaluation of value functions for Monte Carlo solutions to reinforcement learning
problems
15. Simple similarity measures between a new case and a stored case
16. Weight updating for a Perceptron (see the sketch after this list)
17. Perform a single phase update in ANN: feedforward, backpropagation of error or
weight updates
18. Illustrate what an unfolding of a vanilla RNN looks like
19. Perform a simple update (learning step) of a Hebbian Network
20. Perform a simple update (learning step) of a Hopfield Network (see the sketch after
this list)
21. Perform a simple convolution operation based on very limited visual and receptive
fields (see the sketch after this list)
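For item 1 above, a minimal NumPy sketch of the inner and outer product of two vectors; the vectors are made up for illustration.

import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

inner = np.dot(a, b)        # scalar: 1*4 + 2*5 + 3*6 = 32
outer = np.outer(a, b)      # 3x3 matrix with entries a[i] * b[j]
print(inner)
print(outer)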
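For item 8 above, a small worked sketch of Bayes' rule, P(cause | evidence) = P(evidence | cause) P(cause) / P(evidence); the probabilities are invented for illustration.

# Hypothetical numbers: a cause with 1% prior, evidence observed in 90% of cases
# with the cause and in 5% of cases without it.
p_cause = 0.01                       # P(cause)
p_evid_given_cause = 0.90            # P(evidence | cause)
p_evid_given_not_cause = 0.05        # P(evidence | not cause)

# Total probability of the evidence.
p_evid = p_evid_given_cause * p_cause + p_evid_given_not_cause * (1 - p_cause)

# Bayes' rule.
p_cause_given_evid = p_evid_given_cause * p_cause / p_evid
print(round(p_cause_given_evid, 3))  # about 0.154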
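For item 10 above, a sketch of the Euclidean, Manhattan, chessboard (Chebyshev) and cosine distances between two example vectors (values invented).

import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 0.0, 3.0])

euclidean  = np.sqrt(((x - y) ** 2).sum())   # sqrt(9 + 4 + 0) ~ 3.606
manhattan  = np.abs(x - y).sum()             # 3 + 2 + 0 = 5
chessboard = np.abs(x - y).max()             # max(3, 2, 0) = 3
cosine     = 1 - x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
print(euclidean, manhattan, chessboard, cosine)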
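For item 16 above, a sketch of the classical Perceptron weight-update rule w <- w + eta * (target - output) * x on a single example; the learning rate, weights and input are invented.

import numpy as np

def perceptron_update(w, x, target, lr=0.1):
    """One Perceptron learning step: weights change only if the example is misclassified."""
    output = 1 if np.dot(w, x) >= 0 else 0
    return w + lr * (target - output) * x

w = np.array([0.2, -0.5, 0.1])        # weights (last component multiplies the bias input)
x = np.array([1.0, 1.0, 1.0])         # input with a constant bias component
print(perceptron_update(w, x, target=1))   # [0.3 -0.4 0.2]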
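For item 20 above, a sketch of Hebbian storage and a single update step of a Hopfield network on bipolar patterns; the stored pattern and the noisy probe are invented.

import numpy as np

def hopfield_weights(patterns):
    """Hebbian storage of bipolar (+1/-1) patterns: W = sum of outer products, zero diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)            # no self-connections
    return W

def hopfield_update(W, state):
    """One synchronous update step: s <- sign(W s)."""
    return np.where(W @ state >= 0, 1, -1)

patterns = np.array([[1, -1, 1, -1]])
W = hopfield_weights(patterns)
noisy = np.array([1, 1, 1, -1])       # stored pattern with one flipped bit
print(hopfield_update(W, noisy))      # recovers [ 1 -1  1 -1]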
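For item 21 above, a sketch of a valid (no padding) 2-D convolution of a small image with a 2x2 receptive field, computed CNN-style (i.e. as cross-correlation); the image and kernel values are invented.

import numpy as np

def conv2d_valid(image, kernel):
    """Plain 2-D convolution without padding (CNN-style cross-correlation)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

image  = np.array([[1., 2., 0.],
                   [0., 1., 3.],
                   [2., 1., 0.]])
kernel = np.array([[1., 0.],
                   [0., 1.]])        # responds to diagonal structure
print(conv2d_valid(image, kernel))   # [[2. 5.] [1. 1.]]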
