
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
ARTIFICIAL INTELLIGENCE

UNIT I INTRODUCTION AND PROBLEM SOLVING I
Possible 2 marks:

1. What is AI?
The study of how to make computers do things at which, at the moment, people are better.
2. What are the categories of AI?
1. Systems that act like humans. 2. Systems that think like humans. 3. Systems that think rationally. 4. Systems that act rationally.

3. What is meant by Turing test?
It was designed to give a satisfactory operational definition of intelligence. Turing defined intelligent behavior as the ability to achieve human-level performance in all cognitive tasks, sufficient to fool an interrogator.
4. What are the capabilities that a computer should possess?
The capabilities are: 1. Natural language processing. 2. Knowledge representation. 3. Automated reasoning. 4. Machine learning.
5. Define agent with example.
An agent is anything that can be viewed as perceiving its environment through sensors and acting upon that environment through actuators. Ex: human agents, robotic agents & software agents.
6. Define rational agent.
A rational agent is one that does the right thing. A system is rational if it does the right thing, given what it knows.
7. State the needs of a computer to pass the Turing test.
i) Computer vision: to perceive objects. ii) Robotics: to manipulate objects and move about.

8. What is called an omniscient agent?
It is one which knows the actual outcome of its actions and can act accordingly.
9. Define agent program.
The agent program is a concrete implementation, running on the agent architecture. It takes the current percept as input from the sensors and returns an action to the actuators.
10. Define agent function.
It is an abstract mathematical description that maps any given percept sequence to an action (a minimal sketch follows after question 11).
11. State the properties of task environment.
1. Fully observable vs Partially observable.
2. Deterministic vs Stochastic.

3. Episodic vs Sequential.
4. Static vs Dynamic.
5. Discrete vs Continuous.
6. Single agent vs Multi agent.
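The agent program and agent function definitions in questions 9 and 10 can be illustrated with a minimal sketch. The percept names and the rule table below are invented for the example, not taken from any particular textbook agent:

    # Minimal sketch of a simple reflex agent program (illustrative only;
    # the percepts and rules are hypothetical).

    def simple_reflex_agent(percept):
        """Agent program: concrete code that maps the CURRENT percept to an action."""
        rules = {
            "dirty": "Suck",
            "clean": "MoveRight",
        }
        return rules.get(percept, "NoOp")

    # The agent function, by contrast, is the abstract mapping from whole
    # percept SEQUENCES to actions; running the program on each percept in
    # turn realizes one such function.
    for percept in ["dirty", "clean", "dirty"]:
        print(percept, "->", simple_reflex_agent(percept))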

12. What are the basic kinds of agent program?
i) Simple reflex agents. ii) Model-based reflex agents. iii) Goal-based agents. iv) Utility-based agents.
13. Differentiate episodic vs sequential.
In an episodic task environment, the agent's experience is divided into atomic episodes. Each episode consists of the agent perceiving and then performing a single action, and the next episode does not depend on the actions taken in previous ones; for an agent spotting defective parts on an assembly line, for example, the current decision does not affect whether the next part is defective. In sequential environments, the current decision could affect all future decisions. Chess and taxi driving are sequential.
14. Define problem solving agent.
Problem solving agents decide what to do by finding sequences of actions that lead to desirable states.
15. What is backtracking search?
A variant of depth-first search called backtracking search uses still less memory. Only one successor is generated at a time rather than all successors. Each partially expanded node remembers which successor to generate next.
16. What do you mean by depth limited search?
The problem of unbounded trees can be alleviated by supplying depth-first search with a predetermined depth limit l. That is, nodes at depth l are treated as if they have no successors. This approach is called depth-limited search (a minimal sketch follows below).
17. What are the problems that arise when knowledge of the states or actions is incomplete?
1. Sensorless problems 2. Contingency problems 3. Exploration problems
18. What are the criteria used to evaluate an algorithm's performance?
1. Completeness 2. Optimality 3. Time complexity 4. Space complexity
19. Give examples of real world problems.
i) Route finding ii) Touring iii) Travelling salesperson iv) Robot navigation
20. What are the four components of a problem?
i) Initial state ii) Actions iii) Goal test iv) Path cost
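A minimal sketch of depth-limited search from question 16, written as a recursive depth-first search that stops at a fixed limit. The graph representation (a dict of successor lists) and the node names are assumptions made for the example:

    # Recursive depth-limited search (illustrative sketch, not from the syllabus text).
    # Returns a path to the goal, None on failure, or "cutoff" if the limit was hit.

    def depth_limited_search(graph, node, goal, limit, path=None):
        path = (path or []) + [node]
        if node == goal:
            return path
        if limit == 0:
            return "cutoff"            # nodes at depth l behave as if they had no successors
        cutoff_occurred = False
        for child in graph.get(node, []):
            result = depth_limited_search(graph, child, goal, limit - 1, path)
            if result == "cutoff":
                cutoff_occurred = True
            elif result is not None:
                return result
        return "cutoff" if cutoff_occurred else None

    # Example graph (hypothetical):
    graph = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "D": [], "E": []}
    print(depth_limited_search(graph, "A", "E", limit=2))   # ['A', 'C', 'E']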

21. What is called an uninformed search?
These strategies have no information about the number of steps or the path cost from the current state to the goal state; they can only distinguish a goal state from a non-goal state. Also known as blind search.

22. What is called informed search?
It is one that uses problem-specific knowledge beyond the definition of the problem itself and can find solutions more efficiently than an uninformed strategy.
23. Give the complexity of breadth-first search.
The time complexity is O(b^d), where d is the depth of the shallowest goal and b is the branching factor, the number of successors at each level.
24. What is iterative deepening search?
Iterative deepening search repeatedly applies depth-limited search with increasing depth limits (0, 1, 2, ...) until a goal is found, combining the low memory use of depth-first search with the completeness of breadth-first search (a minimal sketch follows after the question list below).
25. What is breadth first search?
The root node is expanded first, then all the nodes generated by the root node are expanded next, then their successors, and so on.
Possible 12 mark questions:
1. Explain in detail the history of Artificial Intelligence.
2. What is meant by PEAS? List out a few agent types and describe their PEAS.
3. Explain in detail the properties of task environments. Give their characteristics.
4. Explain in detail the four kinds of agent program.
5. Explain in detail the advantages and disadvantages of depth-first search.
6. Explain in detail iterative deepening depth-first search with an algorithm for it.
7. Describe in brief the depth-first search and breadth-first search algorithms and also mention their advantages.
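A minimal sketch of iterative deepening depth-first search (questions 24 and 6), built on a small depth-limited helper. The graph encoding and node names are assumptions made for the example:

    # Iterative deepening depth-first search (illustrative sketch).

    def dls(graph, node, goal, limit):
        """Depth-limited DFS returning a path, "cutoff", or None."""
        if node == goal:
            return [node]
        if limit == 0:
            return "cutoff"
        cutoff = False
        for child in graph.get(node, []):
            result = dls(graph, child, goal, limit - 1)
            if result == "cutoff":
                cutoff = True
            elif result is not None:
                return [node] + result
        return "cutoff" if cutoff else None

    def iterative_deepening_search(graph, start, goal, max_depth=50):
        # Re-run depth-limited search with limits 0, 1, 2, ... until the goal is found.
        for limit in range(max_depth + 1):
            result = dls(graph, start, goal, limit)
            if result != "cutoff":
                return result
        return None

    graph = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "E": ["F"]}
    print(iterative_deepening_search(graph, "A", "F"))  # ['A', 'C', 'E', 'F']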

UNIT II PROBLEM SOLVING II
POSSIBLE 2 MARKS:
1. Define greedy best-first search.
Greedy best-first search expands the node that is closest to the goal, on the grounds that this is likely to lead to a solution quickly. Thus, it evaluates nodes by using the heuristic function f(n) = h(n).

2. Define A* search.
A* search evaluates nodes by combining g(n), the cost to reach the node, and h(n), the estimated cost to get from the node to the goal: f(n) = g(n) + h(n) (a small sketch follows after question 7).
3. Define consistency.
A heuristic h(n) is consistent if, for every node n and every successor n' of n generated by any action a, the estimated cost of reaching the goal from n is no greater than the step cost of getting to n' plus the estimated cost of reaching the goal from n': h(n) <= c(n, a, n') + h(n').
4. What do you mean by recursive best-first search?
Recursive best-first search is a simple recursive algorithm that attempts to mimic the operation of standard best-first search, but using only linear space.
5. What are the reasons that hill climbing often gets stuck?
Local maxima: a local maximum is a peak that is higher than each of its neighboring states, but lower than the global maximum.
Ridges: ridges result in a sequence of local maxima that is very difficult for greedy algorithms to navigate.
Plateaux: a plateau is an area of the state-space landscape where the evaluation function is flat.
6. Define hill climbing search.
The hill climbing search algorithm is simply a loop that continually moves in the direction of increasing value, that is, uphill. It terminates when it reaches a peak where no neighbor has a higher value.
7. Mention the types of hill-climbing search.
Stochastic hill climbing, first-choice hill climbing, random-restart hill climbing.
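A minimal sketch of the A* evaluation f(n) = g(n) + h(n) from question 2, using a priority queue. The graph, edge costs and heuristic values are invented for the example:

    # A* search on a small weighted graph (illustrative sketch; data is hypothetical).
    import heapq

    def a_star(graph, h, start, goal):
        # frontier entries are (f = g + h, g, node, path)
        frontier = [(h[start], 0, start, [start])]
        best_g = {start: 0}
        while frontier:
            f, g, node, path = heapq.heappop(frontier)
            if node == goal:
                return path, g
            for child, step_cost in graph.get(node, []):
                g2 = g + step_cost
                if g2 < best_g.get(child, float("inf")):
                    best_g[child] = g2
                    heapq.heappush(frontier, (g2 + h[child], g2, child, path + [child]))
        return None, float("inf")

    graph = {"S": [("A", 1), ("B", 4)], "A": [("G", 5)], "B": [("G", 1)]}
    h = {"S": 4, "A": 5, "B": 1, "G": 0}
    print(a_star(graph, h, "S", "G"))   # (['S', 'B', 'G'], 5)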

8. Why is hill climbing search called a greedy local search?
Hill climbing is sometimes called greedy local search because it grabs a good neighbor state without thinking ahead about where to go next.
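A minimal sketch of the basic hill-climbing loop from questions 6 and 8, maximizing a one-dimensional objective. The objective function and the neighbor step are assumptions made for the example:

    # Basic hill climbing on a toy objective (illustrative sketch).

    def hill_climb(value, start, neighbors):
        current = start
        while True:
            # Greedily pick the best neighbor; stop when none improves on the current state.
            best = max(neighbors(current), key=value, default=current)
            if value(best) <= value(current):
                return current            # peak (possibly only a local maximum)
            current = best

    value = lambda x: -(x - 3) ** 2        # hypothetical objective with a peak at x = 3
    neighbors = lambda x: [x - 1, x + 1]   # integer neighborhood
    print(hill_climb(value, 0, neighbors))  # 3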

9. Define genetic algorithm.
A genetic algorithm is a variant of stochastic beam search in which successor states are generated by combining two parent states, rather than by modifying a single state.
10. Define linear programming problem.
A linear programming problem is one in which the constraints must be linear inequalities forming a convex region and the objective function is also linear. Such problems can be solved in time polynomial in the number of variables.
11. Define online search problems.
An online search problem can be solved only by an agent executing actions, rather than by a purely computational process. Assume that the agent knows the following:
ACTIONS(s), which returns a list of actions allowed in state s.

The step-cost function c(s, a, s'); note that this cannot be used until the agent knows that s' is the outcome.
GOAL-TEST(s).
12. Define constraint satisfaction problem.
A constraint satisfaction problem is defined by a set of variables X1, X2, ..., Xn and a set of constraints C1, C2, ..., Cm. Each variable Xi has a nonempty domain Di of possible values. Each constraint Ci involves some subset of the variables and specifies the allowable combinations of values for that subset.
13. Define linear constraints.
Linear constraints are constraints in which each variable appears only in linear form.
14. What are the types of constraints?
Unary constraints: a unary constraint restricts the value of a single variable.
Binary constraints: a binary constraint relates two variables. A CSP with only binary constraints can be represented as a constraint graph.
15. Define triangle inequality.
A heuristic h(n) is consistent if, for every node n and every successor n' of n generated by any action a, the estimated cost of reaching the goal from n is no greater than the step cost of getting to n' plus the estimated cost of reaching the goal from n': h(n) <= c(n, a, n') + h(n'). This is a form of the general triangle inequality.
16. Define game.
A game can be defined by the initial state, the legal actions in each state, a terminal test and a utility function that applies to terminal states.
17. What is alpha-beta pruning?
The problem with minimax search is that the number of game states it has to examine is exponential in the number of moves. We cannot eliminate the exponent, but we can effectively cut it in half. The trick is that it is possible to compute the correct minimax decision without looking at every node in the game tree. This technique is called alpha-beta pruning (a minimal sketch follows after question 20).
18. Define offline search.
Offline search algorithms compute a complete solution before setting foot in the real world and then execute the solution without recourse to their percepts.
19. Define the term backtracking search.
Backtracking search is used for a depth-first search that chooses values for one variable at a time and backtracks when a variable has no legal values left to assign.
20. When is a problem called commutative?
A problem is commutative if the order of application of any given set of actions has no effect on the outcome.
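A minimal sketch of minimax with alpha-beta pruning from question 17, on a small explicit game tree. The tree shape and the leaf utilities are invented for the example:

    # Minimax with alpha-beta pruning (illustrative sketch; the game tree is hypothetical).

    def alpha_beta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
        # Leaves are numbers (utility values); internal nodes are lists of children.
        if isinstance(node, (int, float)):
            return node
        if maximizing:
            value = float("-inf")
            for child in node:
                value = max(value, alpha_beta(child, False, alpha, beta))
                alpha = max(alpha, value)
                if alpha >= beta:
                    break               # prune: MIN will never allow this branch
            return value
        else:
            value = float("inf")
            for child in node:
                value = min(value, alpha_beta(child, True, alpha, beta))
                beta = min(beta, value)
                if alpha >= beta:
                    break               # prune: MAX will never allow this branch
            return value

    # MAX chooses among three MIN nodes with the leaf utilities below.
    tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
    print(alpha_beta(tree, maximizing=True))   # 3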

21. What do you mean by minimum remaining values?
Choosing the variable with the fewest legal values is called the minimum remaining values heuristic. It is otherwise called the most constrained variable or fail-first heuristic.
22. Define informed search strategy.
An informed search strategy is one that uses problem-specific knowledge beyond the definition of the problem itself and can find solutions more efficiently than an uninformed strategy.
23. What do you mean by the best first search approach?
Best first search is an instance of the general TREE-SEARCH algorithm in which a node is selected for expansion based on an evaluation function f(n).
24. Define heuristic function.
Best first search algorithms typically use a heuristic function h(n) that estimates the cost of the solution from n.
h(n) = estimated cost of the cheapest path from node n to a goal node.
Possible 12 mark questions:
1. Explain the following with an example. a) Greedy best first search. b) Recursive best first search.
2. Trace the operation of A* search applied to the problem of getting to Bucharest from Lugoj using the straight-line distance heuristic.
3. Invent a heuristic function for the 8-puzzle that sometimes overestimates, and show how it can lead to a suboptimal solution on a particular problem.
4. Relate the time complexity of LRTA* to its space complexity.
5. Describe a hill climbing approach to solve TSPs.
6. Describe a genetic algorithm approach to the travelling salesperson problem.
7. Explain backtracking search for CSPs with an example.
8. Explain the Minimax algorithm with an example.
9. Explain alpha-beta pruning in detail.

UNIT III KNOWLEDGE REPRESENTATION
POSSIBLE 2 MARKS:
1. What are the standard quantifiers of First Order Logic?
First Order Logic contains two standard quantifiers. They are: i) the universal quantifier ii) the existential quantifier.
2. Define universal quantifier with an example.
To represent "All elephants are mammals": if Raj is an elephant, represented by Elephant(Raj), then Raj is a mammal. The first order logic sentence is
∀x Elephant(x) => Mammal(x)

∀ refers to "For all". ∀x P, where P is any logical expression, is equivalent to the conjunction (i.e. the ^) of all sentences obtained by substituting the name of an object for the variable x wherever it appears in P. The above sentence is equivalent to

Elephant(Raj) => Mammal(Raj)
Elephant(John) => Mammal(John)
Thus it is true if, and only if, all the above sentences are true, that is, if P is true for all objects x in the universe. Hence ∀ is called the universal quantifier.
3. Define existential quantifier with an example.
Universal quantification makes statements about every object. Similarly, we can make a statement about some object in the universe without naming it, by using an existential quantifier. To say, for example, that King John has a crown on his head, we write
∃x Crown(x) ^ OnHead(x, John)
∃x is pronounced "There exists an x such that ..." or "For some x ...". The sentence says that P is true for at least one object x. Hence ∃ is called the existential quantifier.

4. Define nested quantifiers with an example.
Nested quantifiers are used to express more complex sentences using multiple quantifiers. For example, "Brothers are siblings" can be written as
∀x ∀y Brother(x, y) => Sibling(x, y)
Consecutive quantifiers of the same type can be written as one quantifier with several variables. For example, to say that siblinghood is a symmetric relationship, we can write
∀x, y Sibling(x, y) <=> Sibling(y, x)

5. Explain the connection between ∀ and ∃.
The two quantifiers are connected with each other through negation. It can be explained with the following example.
Eg: ∀x Likes(x, IceCream) is equivalent to ¬∃x ¬Likes(x, IceCream).
This means "Everyone likes ice cream" is equivalent to "there is no one who does not like ice cream".
6. What is the use of the equality symbol?
The equality symbol is used to make statements to the effect that two terms refer to the same object. Eg: Father(John) = Henry
7. Define higher order logic.
Higher order logic allows quantifying over relations and functions as well as over objects. Eg: "Two objects are equal if and only if all properties applied to them are equivalent":
∀x, y (x = y) <=> (∀p p(x) <=> p(y))

8. Define First Order Logic.
First Order Logic is a representation language that is far more powerful than propositional logic. First Order Logic commits to the existence of objects and relations.
Eg: "One plus two equals three"
Objects - one, two & three; Relations - equals; Functions - plus
9. What is called the declarative approach?
The representation language makes it easy to express the knowledge in the form of sentences. This simplifies the construction problem enormously. This is called the declarative approach.
10. State the aspects of a knowledge representation language.
A knowledge representation language is defined in two aspects:
i) Syntax: the syntax of a language describes the possible configurations that can constitute sentences.
ii) Semantics: it determines the facts in the world to which the sentences refer.
11. What is called entailment?
The generation of new sentences that are necessarily true given that the old sentences are true. This relation between sentences is called entailment.
12. What is meant by tuple?
A tuple is a collection of objects arranged in a fixed order and is written with angle brackets surrounding the objects.
{<Richard the Lionheart, King John>, <King John, Richard the Lionheart>}
13. What is Propositional Logic?
Propositional Logic is a declarative language because its semantics is based on a truth relation between sentences and possible worlds. It also has sufficient expressive power to deal with partial information, using disjunction and negation.
14. What is compositionality in propositional logic?
Propositional Logic has a third property that is desirable in representation languages, namely compositionality. In a compositional language, the meaning of a sentence is a function of the meaning of its parts. For example, the meaning of S1 ^ S2 is related to the meanings of S1 and S2.
15. Define symbols.
The basic syntactic elements of first order logic are the symbols that stand for objects, relations and functions. The symbols are of three kinds: constant symbols, which stand for objects; predicate symbols, which stand for relations; and function symbols, which stand for functions.
16. Define ground term, inference.
A term without variables is called a ground term.

The task of deriving new sentences from old ones is called inference.
17. Define Datalog.
The set of first order definite clauses with no function symbols is called Datalog. Eg: "The country Nono, an enemy of America" becomes Enemy(Nono, America). The absence of function symbols makes inference much easier.
18. What is pattern matching?
The inner loop of the forward chaining algorithm involves finding all possible unifiers such that the premise of a rule unifies with a suitable set of facts in the knowledge base. This is called pattern matching (a small sketch follows after question 20).
19. What is data complexity?
The complexity of inference as a function of the number of ground facts in the database is called data complexity.
20. Define Prolog.
Prolog programs are sets of definite clauses written in a notation somewhat different from standard first-order logic.
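A minimal sketch of forward chaining over ground definite clauses, in the spirit of the Datalog and pattern matching questions above. The rules and facts are ground instances loosely adapted from the Criminal(West) example mentioned in the next question; they are simplified for the illustration:

    # Forward chaining over ground definite clauses (illustrative sketch).
    # Each rule is (premises, conclusion); the rules and facts here are simplified.

    def forward_chain(rules, facts):
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for premises, conclusion in rules:
                # "Pattern matching": check whether all premises are known facts.
                if conclusion not in facts and all(p in facts for p in premises):
                    facts.add(conclusion)
                    changed = True
        return facts

    rules = [
        ({"American(West)", "Weapon(M1)", "Sells(West,M1,Nono)", "Hostile(Nono)"}, "Criminal(West)"),
        ({"Missile(M1)"}, "Weapon(M1)"),
        ({"Enemy(Nono,America)"}, "Hostile(Nono)"),
        ({"Missile(M1)", "Owns(Nono,M1)"}, "Sells(West,M1,Nono)"),
    ]
    facts = {"American(West)", "Missile(M1)", "Owns(Nono,M1)", "Enemy(Nono,America)"}
    print("Criminal(West)" in forward_chain(rules, facts))   # True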

21. What are the principal sources of parallelism?
The first, called OR-parallelism, comes from the possibility of a goal unifying with many different clauses in the knowledge base. Each gives rise to an independent branch in the search space that can lead to a potential solution, and the branches can be solved in parallel.
The second, called AND-parallelism, comes from the possibility of solving each conjunct in the body of an implication in parallel.
22. Define conjunctive normal form.
First order resolution requires that sentences be in conjunctive normal form, that is, a conjunction of clauses, where each clause is a disjunction of literals. Literals can contain variables, which are assumed to be universally quantified. For example, the sentence
∀x American(x) ^ Weapon(y) ^ Sells(x,y,z) ^ Hostile(z) => Criminal(x)
becomes, in CNF,
¬American(x) ∨ ¬Weapon(y) ∨ ¬Sells(x,y,z) ∨ ¬Hostile(z) ∨ Criminal(x)

23. Define Skolemization.
Skolemization is the process of removing existential quantifiers by elimination, replacing each existentially quantified variable with a Skolem constant or Skolem function.
24. What is the other way to deal with equality?
Another way to deal with equality is to use additional inference rules: demodulation and paramodulation.
25. Define the ontology of situation calculus.
Situations denote the states resulting from executing actions.

This approach is called situation calculus. Situations are logical terms consisting of the initial situation and all situations that are generated by applying an action to a situation. Fluents are functions and predicates that vary from one situation to the next, such as the location of the agent. Atemporal or eternal predicates and functions are also allowed.
Possible 12 mark questions:
1. Explain the various steps associated with the knowledge engineering process. Discuss them by applying the steps to any real world application of your choice.
2. What are the various ontologies involved in situation calculus?
3. How do you solve the following problems in situation calculus? a) Representational frame problems b) Inferential frame problems
4. Illustrate the use of first order logic to represent knowledge.
5. Explain the forward chaining and backward chaining algorithms with an example.
6. Explain the steps involved in representing knowledge using first order logic.
7. What do you understand by symbols, interpretations and quantifiers?
8. How are facts represented using propositional logic? Give an example.
9. Describe non-monotonic logic with an example.

UNIT IV LEARNING
1. What is learning?
Learning takes many forms, depending on the nature of the performance element, the component to be improved, and the available feedback.
2. What are the types of machine learning?
a) Supervised b) Unsupervised c) Reinforcement

3. Define the following:
a) Classification: learning a discrete-valued function is called classification.
b) Regression: learning a continuous function is called regression.
4. What is inductive learning?
The task of learning a function from examples of its inputs and outputs is called inductive learning.
5. When is a learning problem said to be realizable or unrealizable?
A hypothesis space consisting of polynomials of finite degree cannot represent sinusoidal functions accurately, so a learner using that hypothesis space will not be able to learn from sinusoidal data. A learning problem is realizable if the hypothesis space contains the true function; otherwise it is unrealizable.
6. What is a decision tree?
A decision tree takes as input an object or situation described by a set of attributes and returns a decision, the predicted output value for the input. The input can be discrete or continuous.

7. Define goal predicate.
In the restaurant example, the goal predicate WillWait is defined in terms of the following list of attributes:
a) Alternate b) Bar c) Fri/Sat d) Hungry e) Patrons f) Price g) Raining h) Reservation i) Type j) WaitEstimate
8. Define the kinds of functions that are a real problem for decision trees.
Parity function: returns 1 if and only if an even number of inputs are 1; an exponentially large decision tree is needed to represent it.
Majority function: returns 1 if more than half of its inputs are 1.

9. Define training set.
The positive examples are the ones in which the goal WillWait is true (X1, X3, ...); the negative examples are the ones in which it is false (X2, X5, ...). The complete set of examples is called the training set.
10. How do you assess the performance of the learning algorithm?
A learning algorithm is good if it produces hypotheses that do a good job of predicting the classifications of unseen examples. We do this on a set of examples known as the test set. It is convenient to adopt the following methodology (a minimal sketch of it follows after question 13):
a) Collect a large set of examples.
b) Divide it into two disjoint sets: the training set and the test set.
c) Apply the learning algorithm to the training set, generating a hypothesis h.
d) Measure the percentage of examples in the test set that are correctly classified by h.
e) Repeat steps (a) to (d) for different sizes of training sets and different randomly selected training sets of each size.
11. Define overfitting.
Whenever there is a large set of possible hypotheses, one has to be careful not to use the resulting freedom to find meaningless regularity in the data. This problem is called overfitting.
12. What is ensemble learning?
The idea of ensemble learning methods is to select a whole collection, or ensemble, of hypotheses from the hypothesis space and combine their predictions.
13. Define weak learning algorithm.
If the input learning algorithm L is a weak learning algorithm, it means that L always returns a hypothesis with weighted error on the training set that is slightly better than random guessing.
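A minimal sketch of the holdout methodology in question 10, using a trivially simple majority-class learner so the example stays self-contained. The dataset, the learner and the split ratio are all assumptions made for the illustration:

    # Holdout evaluation: split examples, train on one part, test on the other
    # (illustrative sketch; the dataset and learner are hypothetical).
    import random

    def majority_learner(training_set):
        """Returns a hypothesis h that always predicts the most common label."""
        labels = [label for _, label in training_set]
        most_common = max(set(labels), key=labels.count)
        return lambda x: most_common

    # Hypothetical examples: (attribute value, label)
    examples = [(i, i % 2) for i in range(40)]
    random.shuffle(examples)

    split = int(0.75 * len(examples))
    training_set, test_set = examples[:split], examples[split:]

    h = majority_learner(training_set)                                # step (c)
    accuracy = sum(h(x) == y for x, y in test_set) / len(test_set)    # step (d)
    print("test-set accuracy:", accuracy)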

14. Define computational learning theory.
Computational learning theory is a field at the intersection of AI, statistics, and theoretical computer science that analyzes learning algorithms.
15. What do you mean by a PAC-learning algorithm?
Any learning algorithm that returns hypotheses that are probably approximately correct is called a PAC-learning algorithm.

16. What is an error?
The error of a hypothesis h with respect to the true function f, given a distribution D over the examples, is the probability that h is different from f on an example:
Error(h) = P(h(x) ≠ f(x) | x drawn from D)
17. Define sample complexity.
The number of required examples, as a function of ε and δ, is called the sample complexity of the hypothesis space.
18. Define neural networks.
A neuron is a cell in the brain whose principal function is the collection, processing and dissemination of electrical signals. Artificial neural networks are inspired by networks of such neurons.
19. Define units in neural networks.
Neural networks are composed of nodes or units connected by directed links. A link from unit j to unit i serves to propagate the activation aj from j to i; each link also has a numeric weight that determines the strength of the connection.
20. Mention the types of neural structures.
a. Feed-forward networks b. Cyclic or recurrent networks.
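As a worked complement to questions 15-17: for a finite hypothesis space H, the commonly quoted PAC sample-complexity bound (quoted here from the general literature, not from this question bank) is

    N >= (1/ε) * (ln(1/δ) + ln|H|)

that is, roughly N examples suffice for a consistent learner to be approximately correct (error at most ε) with probability at least 1 - δ.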

21. Define epoch.
Each cycle through the examples is called an epoch. Epochs are repeated until some stopping criterion is reached, typically that the weight changes have become very small (see the training-loop sketch after question 24).
22. What do you mean by Bayesian learning?
Bayesian learning methods formulate learning as a form of probabilistic inference, using the observations to update a prior distribution over hypotheses. This approach provides a good way to implement Ockham's razor, but quickly becomes intractable for complex hypothesis spaces.
23. What is reinforcement?
The problem is this: without some feedback about what is good and what is bad, the agent will have no grounds for deciding which move to make. The agent needs to know when it has done something good and when it has done something bad, for example when it wins or loses. This kind of feedback is called a reward, or reinforcement.
24. Define passive learning.
In passive learning the agent's policy is fixed and the task is to learn the utilities of states; this could also involve learning a model of the environment.
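A minimal sketch tying together units (question 19) and epochs (question 21): a single perceptron-style unit trained with a simple weight-update rule. The learning rate, the data (the logical AND function) and the number of epochs are assumptions made for the example:

    # One unit with weighted links, trained for several epochs (illustrative sketch).

    def step(x):                      # threshold activation of the unit
        return 1 if x >= 0 else 0

    def train(examples, epochs=20, lr=0.1):
        weights = [0.0, 0.0]          # weights of the two input links
        bias = 0.0
        for _ in range(epochs):       # each full pass over the examples is one epoch
            for (x1, x2), target in examples:
                output = step(weights[0] * x1 + weights[1] * x2 + bias)
                error = target - output
                weights[0] += lr * error * x1
                weights[1] += lr * error * x2
                bias += lr * error
        return weights, bias

    # Hypothetical training data: the logical AND function.
    examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    weights, bias = train(examples)
    print([step(weights[0] * a + weights[1] * b + bias) for (a, b), _ in examples])  # [0, 0, 0, 1]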

25. Define the following:
a. Utility-based agent: learns a utility function on states and uses it to select actions that maximize the expected outcome utility.
b. Q-learning agent: learns an action-value function, or Q-function, giving the expected utility of taking a given action in a given state.
c. Reflex agent: learns a policy that maps directly from states to actions.
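A minimal sketch of the Q-learning agent's update from question 25(b): after each transition (s, a, r, s') it nudges Q(s, a) toward the observed reward plus the discounted best next value. The tiny chain environment, learning rate and discount factor are assumptions made for the example:

    # Tabular Q-learning on a 3-state chain (illustrative sketch; the environment is hypothetical).
    import random
    from collections import defaultdict

    states, actions = [0, 1, 2], ["left", "right"]
    alpha, gamma = 0.5, 0.9                      # learning rate and discount factor
    Q = defaultdict(float)                       # Q[(state, action)] -> expected utility

    def env_step(s, a):
        """Hypothetical dynamics: moving right from state 1 reaches the goal (state 2, reward 1)."""
        s2 = min(s + 1, 2) if a == "right" else max(s - 1, 0)
        return s2, (1.0 if s2 == 2 else 0.0)

    for _ in range(500):                         # episodes of random exploration
        s = 0
        while s != 2:
            a = random.choice(actions)
            s2, r = env_step(s, a)
            best_next = max(Q[(s2, b)] for b in actions)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])   # Q-learning update
            s = s2

    print(round(Q[(1, "right")], 2))             # close to 1.0, the value of reaching the goal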

Possible 12 mark questions:
1. Explain with a proper example how the EM algorithm can be used for learning with hidden variables.
2. Describe how decision trees could be used for inductive learning. Explain their effectiveness with a suitable example.
3. Explain explanation-based learning.
4. Discuss learning with hidden variables.
5. i) What do you understand by soft computing? ii) Differentiate conventional and formal learning techniques / theory and learning via forms of reward and punishment.
6. Discuss partial order planning with unbound variables.
7. With reference to planning, discuss progression and regression.
8. What are the languages suited for planning?

UNIT V APPLICATIONS
Possible two marks:
1. What is communication?
Communication is the intentional exchange of information brought about by the production and perception of signs drawn from a shared system of conventional signs. Most animals use signs to represent important messages.
2. Define language.
Language enables us to communicate most of what we know about the world.
3. Why would an agent bother to perform a speech act when it could be doing a regular action?
A group of agents exploring together gains an advantage by being able to do the following: query, inform, request, acknowledge, promise.
4. Differentiate formal language vs natural language.
Formal language: a formal language is defined as a set of strings. Each string is a concatenation of terminal symbols called words. For example, in the language of first order logic the terminal symbols include ^ and P, and a typical string is "P ^ Q"; a string that does not follow the syntax is not a member of the language. Formal languages always have a grammar.
Natural language: formal languages are in contrast to natural languages, such as Chinese or English, that have no strict definition but are used by a community of speakers. Natural languages have no grammar.
5. Define grammar.
A grammar is a finite set of rules that specifies a language.

Formal languages always have a grammar; natural languages have no grammar.
6. What are the component steps of communication?
Intention, generation, synthesis, perception, analysis, disambiguation, incorporation.
7. Define lexicon.
The list of allowable words is called the lexicon. The words are grouped into the categories or parts of speech familiar to dictionary users: nouns, pronouns and names to denote things; verbs to denote events; adjectives to modify nouns; and adverbs to modify verbs.
8. What are called open classes and closed classes?
Nouns, verbs, adjectives and adverbs are called open classes. Pronouns, articles, prepositions and conjunctions are called closed classes.
9. Define grammar overgeneration and undergeneration.
A grammar overgenerates if it generates sentences that are not grammatical. Ex: "I smell pit gold wumpus nothing east."
A grammar undergenerates if it rejects (fails to generate) sentences that are grammatical. Ex: "I think the wumpus is smelly."
10. Define parsing (or syntactic parsing).
Parsing is the process of finding a parse tree for a given input string. That is, a call to the parsing function PARSE, such as
PARSE("the wumpus is dead", E0, S)
should return a parse tree with root S whose leaves are "the wumpus is dead" and whose internal nodes are nonterminal symbols from the grammar E0.
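A minimal sketch of a grammar and a naive top-down parser for the sentence used in question 10. The grammar rules and lexicon below are a tiny invented fragment, far smaller than any real grammar of English, and the parser does not handle every backtracking case:

    # Naive recursive parser for a tiny context-free grammar (illustrative sketch;
    # the grammar fragment and lexicon are invented for this example).

    grammar = {
        "S":  [["NP", "VP"]],
        "NP": [["Article", "Noun"]],
        "VP": [["Verb", "Adjective"]],
    }
    lexicon = {
        "Article": ["the"], "Noun": ["wumpus"],
        "Verb": ["is"], "Adjective": ["dead"],
    }

    def parse(symbol, words):
        """Try to parse `words` as `symbol`; return (parse_tree, remaining_words) or None."""
        if symbol in lexicon:
            if words and words[0] in lexicon[symbol]:
                return (symbol, words[0]), words[1:]
            return None
        for rule in grammar[symbol]:
            children, rest = [], words
            for part in rule:
                result = parse(part, rest)
                if result is None:
                    break
                subtree, rest = result
                children.append(subtree)
            else:
                return (symbol, children), rest
        return None

    tree, rest = parse("S", "the wumpus is dead".split())
    print(tree)   # ('S', [('NP', ...), ('VP', ...)]) with no words left over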

11. Define semantic interpretation.
The extraction of the meaning of an utterance is called semantics. Semantic interpretation is the process of associating a First Order Logic expression with a phrase.
12. What are the properties of the intermediate form?
The intermediate form mediates between syntax and semantics. It has two key properties. First, it is structurally similar to the syntax of the sentence and thus can be easily constructed through compositional means. Second, it contains enough information that it can be translated into a regular first order logical sentence.
13. Define metaphor.
A metaphor is a figure of speech in which a phrase with one literal meaning is used to suggest a different meaning by way of an analogy.
14. What are the models of knowledge?
World model, mental model, language model, acoustic model.
15. Define discourse.
A discourse is any string of language, usually one that is more than one sentence long.
16. Define reference resolution.
Reference resolution is the interpretation of a pronoun or a definite noun phrase that refers to an object in the world.
17. Mention the list of coherence relations.
Enable or cause, explanation, ground-figure, evaluation, exemplification, generalization, violated expectation.

18. What is grammar induction?
Grammar induction is the task of learning a grammar from data.
19. What is information retrieval?
Information retrieval is the task of finding documents that are relevant to a user's need for information. The best known examples of information retrieval systems are the search engines on the World Wide Web. An information retrieval system can be characterized by:
1. A document collection 2. A query posed in a query language 3. A result set 4. A representation of the result set.
20. What is information extraction?
Information extraction is the process of creating database entries by skimming a text and looking for occurrences of a particular class of object or event and for relationships among those objects and events.
21. What is a context-sensitive grammar?
Context-sensitive grammars are restricted only in that the right-hand side must contain at least as many symbols as the left-hand side. The name context-sensitive comes from the fact that a rule such as A S B -> A X B says that an S can be rewritten as an X in the context of a preceding A and a following B.
22. Define language modeling.
The language modeling approach is one which estimates a language model for each document and then, for each query, computes the probability of the query given the document's language model.
23. What is a regular expression?
A regular expression defines a regular grammar in a single text string. Regular expressions are used in UNIX commands such as grep, in programming languages such as Perl, and in word processors such as Microsoft Word (a small sketch follows below).
24. What is a cascaded finite-state transducer?
A cascaded finite-state transducer consists of a series of finite-state automata, where each automaton receives text as input, transduces the text into a different format, and passes it along to the next automaton.
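A small sketch of the regular-expression idea from questions 20 and 23, using Python's re module rather than grep or Perl. The pattern and the sample text are invented for the example:

    # Extracting simple price mentions with a regular expression
    # (illustrative sketch; the pattern and text are hypothetical).
    import re

    text = "The basic model costs $249, the pro model $1,299, and shipping is free."
    price_pattern = re.compile(r"\$\d{1,3}(?:,\d{3})*")   # $ followed by digits with optional ,ddd groups

    print(price_pattern.findall(text))   # ['$249', '$1,299']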

Possible 12 mark questions:
1. Explain the Machine Translation System with a neat sketch. Analyze its learning probabilities.
2. Perform bottom-up and top-down parsing for the input "the wumpus is dead".
3. i) Describe the process involved in communication using the example sentence "the wumpus is dead". ii) Write short notes on semantic representation.
4. Explain briefly the following: i) Information retrieval ii) Information extraction.
5. Construct semantic net representations for the following: i) Pompeian(Marcus), Blacksmith(Marcus) ii) Mary gave the green flowered vase to her favorite cousin.
7. Construct partitioned semantic net representations for the following: i) Every batter hit a ball. ii) All the batters like the pitcher.
8. Illustrate learning from examples by induction with suitable examples.