
Hints and Answers to Selected Problems of the Exercises

Chapter 1. Introduction

1.1

Hint : See the first and second paragraphs of Section 1.2 (Fuzzy Systems).

1.2

Hint : See Table 1.2 (Soft Computing Techniques) and the subsequent text.

Chapter 2. Fuzzy Sets

2.1

Hint : Use a Venn diagram.

2.3 & 2.4.


Hint : Use graph paper. For various values of x, read off the value of μ(x), and then map
this value to the new value using the transformation function.
2.7

A = { 0, 1, 2, 3 }, B = { 2, 3, 5 }
R = { ( 0, 2 ), ( 0, 3 ), ( 0, 5 ), ( 1, 2 ), ( 2, 3 ), ( 2, 5 ), ( 3, 2 ) }
S = { ( 0, 2 ), ( 0, 3 ), ( 0, 5 ), ( 1, 3 ), ( 2, 5 ), ( 3, 5 ) }
T = { ( 2, 0 ), ( 3, 0 ), ( 3, 2 ), ( 3, 4 ), ( 5, 0 ), ( 5, 2 ) }

2.9

See Example 2.28 (α-cut).

2.10

F = 0.6/a + 0.2/b + 0.3/c + 0.9/d


F0.2 = { a, b, c, d } and 0.2F0.2 = 0.2/a + 0.2/b + 0.2/c + 0.2/d
F0.3 = { a, c, d } and 0.3F0.3 = 0.3/a + 0.3/c + 0.3/d
F0.6 = { a, d } and 0.6F0.6 = 0.6/a + 0.6/d
F0.9 = { d } and 0.9F0.9 = 0.9/d
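The α-cut decomposition above can be checked mechanically. The following sketch is not from the text: it represents F as a dict of membership grades, and `alpha_cut` and `decompose` are illustrative helper names.

```python
# Fuzzy set F = 0.6/a + 0.2/b + 0.3/c + 0.9/d as a dict of memberships
F = {'a': 0.6, 'b': 0.2, 'c': 0.3, 'd': 0.9}

def alpha_cut(fuzzy_set, alpha):
    """Crisp set of elements with membership >= alpha."""
    return {x for x, mu in fuzzy_set.items() if mu >= alpha}

def decompose(fuzzy_set):
    """Rebuild F as the union (max) of alpha * F_alpha over all distinct alphas."""
    rebuilt = {x: 0.0 for x in fuzzy_set}
    for alpha in set(fuzzy_set.values()):
        for x in alpha_cut(fuzzy_set, alpha):
            rebuilt[x] = max(rebuilt[x], alpha)   # fuzzy union is max
    return rebuilt

print(alpha_cut(F, 0.3))   # members with membership >= 0.3: a, c, d
print(decompose(F) == F)   # the decomposition theorem recovers F exactly
```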

2.11

See Example 2.32 (Fuzzy Cardinality).

2.12 & 2.13

See Examples 2.32 (Fuzzy Cardinality) and 2.33 (Fuzzy Extension principle).

Chapter 3.

Fuzzy Logic

3.1

Hint : Realize AND and OR using ¬ and →.

3.2

Hint : Apply truth table method.

3.3 & 3.4

See Example 3.3 (Validity of an argument).

3.5

Hint : Recall that a collection of statements is said to be consistent if they can all be true
simultaneously. Construct the truth table and check.
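The truth-table check can be sketched as follows; the three statements here (p → q, p, ¬q) are a hypothetical example, not the ones from the exercise.

```python
from itertools import product

def consistent(statements, variables):
    """Statements are consistent iff some truth assignment makes them all true."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(s(env) for s in statements):
            return True
    return False

implies = lambda a, b: (not a) or b

# Hypothetical example: { p -> q, p, ~q } is inconsistent
s1 = lambda e: implies(e['p'], e['q'])
s2 = lambda e: e['p']
s3 = lambda e: not e['q']

print(consistent([s1, s2, s3], ['p', 'q']))           # False
print(consistent([s1, lambda e: e['q']], ['p', 'q'])) # True
```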

3.10

See Example 3.19 (Zadeh's interpretation of fuzzy rule).

3.11 & 3.12

See Example 3.21 (Fuzzy reasoning with the help of Generalized Modus Ponens).

Chapter 4.

Fuzzy Inference Systems

4.1

See Sections 4.7.1 (Fuzzy air conditioner controller) and 4.7.2 (Fuzzy cruise controller).

Chapter 5.

Rough Sets

5.1

Proof follows from the definitions of equivalence relations and indiscernibility.

5.2

Let x ∈ B̲(U − X). Then [x]B ⊆ U − X ⟹ [x]B ∩ X = ∅ ⟹ x ∉ B̄(X) ⟹ x ∈ U − B̄(X). Therefore, B̲(U − X) ⊆ U − B̄(X). Similarly prove U − B̄(X) ⊆ B̲(U − X).
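The duality can also be verified on a toy example; the universe, partition, and set X below are assumptions for illustration only.

```python
def lower(partition, X):
    """B-lower approximation: union of equivalence classes wholly inside X."""
    return {x for block in partition for x in block if block <= X}

def upper(partition, X):
    """B-upper approximation: union of equivalence classes that intersect X."""
    return {x for block in partition for x in block if block & X}

# Hypothetical toy example: U partitioned by an equivalence relation B
U = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]
X = {1, 2, 3}

# Duality: lower(U - X) equals U - upper(X)
print(lower(partition, U - X) == U - upper(partition, X))  # True
```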

5.5

The reduced table is

#    Customer Name   Gender (GD)   Amount (A)   Payment Mode (P)
1    Mili            F             High         CC
2    Bill            M             Low          Cash
3    Rita            F             High         CC
4    Pam             F             High         CC
5    Maya            F             Medium       Cash
6    Bob             M             Medium       CC
7    Tony            M             Low          Cash
8    Gaga            F             High         CC
9    Sam             M             Low          Cash
10   Abu             M             Low          Cash
Considering the indiscernible set of objects { 1, 3, 4, 8 } we derive the rule : IF (Gender = F)
AND (Amount = High) THEN (Payment Mode = Credit Card). Obtain the other rules similarly.
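As a cross-check, the table can be encoded and the consistent rules extracted mechanically; `rules` is an illustrative helper, not the book's algorithm.

```python
# The reduced decision table of Problem 5.5 (conditions GD, A; decision P)
rows = [
    ('Mili', 'F', 'High',   'CC'),
    ('Bill', 'M', 'Low',    'Cash'),
    ('Rita', 'F', 'High',   'CC'),
    ('Pam',  'F', 'High',   'CC'),
    ('Maya', 'F', 'Medium', 'Cash'),
    ('Bob',  'M', 'Medium', 'CC'),
    ('Tony', 'M', 'Low',    'Cash'),
    ('Gaga', 'F', 'High',   'CC'),
    ('Sam',  'M', 'Low',    'Cash'),
    ('Abu',  'M', 'Low',    'Cash'),
]

def rules(rows):
    """One rule per indiscernible (GD, A) class; a class yields a rule only
    when all its objects share the same payment mode (no inconsistency)."""
    classes = {}
    for _, gd, a, p in rows:
        classes.setdefault((gd, a), set()).add(p)
    return {cond: next(iter(ps)) for cond, ps in classes.items() if len(ps) == 1}

for (gd, a), p in sorted(rules(rows).items()):
    print(f"IF Gender = {gd} AND Amount = {a} THEN Payment Mode = {p}")
```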

5.7

See Example 5.17 (Data Clustering).

Chapter 6.

Artificial Neural Networks : Basic Concepts

6.2

A 3-input, 1-output net with w1 = w2 = w3 = 1, and activation function

y_out = f ( y_in ) = 1, if y_in ≥ 2
                     0, otherwise
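A minimal sketch of this net, with the weights and threshold as given; note that with unit weights and threshold 2 it behaves as a majority gate over its three inputs.

```python
def neuron(x1, x2, x3, w=(1, 1, 1), theta=2):
    """3-input net with unit weights and a threshold activation at 2."""
    y_in = w[0]*x1 + w[1]*x2 + w[2]*x3
    return 1 if y_in >= theta else 0

# Fires exactly when at least two of the three inputs are 1
for x in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
    print(x, '->', neuron(*x))
```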

6.3

Hint : Draw two straight lines separating the sets A and B.

6.4

Hint : What is the equation of a plane that separates the two classes?

6.6

See Example 6.6 (Realizing the logical AND function through Hebb learning).

6.7

See Example 6.7 (Learning the logical AND function by a perceptron).

6.9

See Example 6.9 (Competitive learning through winner-takes-all strategy).

Chapter 7.

Elementary Pattern Classifiers

7.1

Hint : Take the training data in bipolar form and then apply Hebb learning rule.

7.3

See Example 7.4 (ADALINE training for the AND-NOT function).

7.5

See Example 7.5 (MADALINE training for the XOR function).

Chapter 8.

Pattern Associators

8.1

An n-input auto-associative net can store at most n - 1 patterns.


So,

1. Check whether the number of patterns is <= n - 1.
2. Check orthogonality of the input patterns.
3. Obtain the weight matrix of each individual pattern [See Sec. 8.1.1].
4. Obtain the weight matrix for storage of multiple patterns.

See Examples 8.5 (Storage of multiple bipolar patterns in auto-associative neural nets) and
8.6 (Recognizing patterns by auto-associative nets storing multiple patterns).
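Steps 3 and 4 can be sketched with Hebbian outer products; the two orthogonal bipolar patterns below are hypothetical examples, and breaking sign ties toward +1 is an assumed convention.

```python
import numpy as np

def autoassoc_weights(patterns):
    """Sum of outer products s s^T, one per bipolar pattern, diagonal zeroed."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for s in patterns:
        W += np.outer(s, s)
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def recall(W, x):
    y = np.sign(W @ np.array(x))
    return np.where(y == 0, 1, y)   # tie-breaking convention (assumed)

# Hypothetical orthogonal bipolar patterns (n = 4, so at most 3 can be stored)
p1 = [1, 1, -1, -1]
p2 = [1, -1, 1, -1]
W = autoassoc_weights([p1, p2])
print(recall(W, p1))   # recovers p1
print(recall(W, p2))   # recovers p2
```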

8.2

See Example 8.9 (Recognition of noisy input by hetero-associative net) for single pair
pattern association. Multiple associations are stored by adding up the individual weight
matrices.

8.3

Hint: Use bipolar representation of input string. See Example 8.10 (Computing the weight
matrix of a Hopfield net and testing its performance).

8.4

Find weight matrix for individual pattern association. Add weight matrices of individual
patterns to determine overall weight matrix. See Example 8.13 (Storage of multiple
associations on a BAM).

Chapter 9.

Competitive Neural Nets

9.1

See Example 9.1 (Clustering by MAXNET).

9.2

See Example 9.2 (Learning by Kohonen's self-organizing map).

9.3

See Examples 9.3 (Learning by LVQ net) and 9.4 (Clustering application of LVQ net).

9.4

See Examples 9.5 (Learning by ART1 net) and 9.6 (ART1 net operation).

Chapter 10. Backpropagation


10.2. Hint : You may consider a network with 8 nodes at the input layer corresponding to the
8 letters of the word "computer". A single output is sufficient for the Yes/No decision.
Design a procedure to map a given word to some numerical representation suitable to feed the
proposed net. Train the net with the given training set. Test the trained net with the given
test words. Does the outcome tally with your intuitive solution?
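One possible word-to-number mapping (an assumption, not the book's scheme) scales each letter's alphabet position into (0, 1], padding or truncating to 8 input values:

```python
def encode(word, length=8):
    """Map a word to `length` inputs: letter position / 26, zero-padded.
    This particular encoding is an assumption for illustration."""
    vals = [(ord(c) - ord('a') + 1) / 26 for c in word.lower() if c.isalpha()][:length]
    return vals + [0.0] * (length - len(vals))

x = encode('computer')
print(len(x))   # 8 values, one per input node
```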
Chapter 11. Elementary Search Techniques
11.1

Hint: Consider the entire X-Y plane as a 2-dimensional array of cells, each cell having unit
length and breadth. Then take g ( n ) = length of the actual path from the starting cell to the
current cell n, and h ( n ) = the Manhattan distance between n and the destination. See
Example 11.6 (An A* algorithm to solve a maze problem).
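The hint above can be sketched as a small A* routine over a cell grid; the maze below and the function names are illustrative.

```python
import heapq

def astar(grid, start, goal):
    """A* over unit cells; grid[r][c] == 1 marks an obstacle.
    g(n) = actual path length from start; h(n) = Manhattan distance to goal."""
    rows, cols = len(grid), len(grid[0])
    h = lambda n: abs(n[0] - goal[0]) + abs(n[1] - goal[1])
    frontier = [(h(start), 0, start)]       # (f, g, cell)
    best_g = {start: 0}
    while frontier:
        f, g, node = heapq.heappop(frontier)
        if node == goal:
            return g                        # length of the shortest path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                if g + 1 < best_g.get((r, c), float('inf')):
                    best_g[(r, c)] = g + 1
                    heapq.heappush(frontier, (g + 1 + h((r, c)), g + 1, (r, c)))
    return None                             # destination unreachable

maze = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(maze, (0, 0), (2, 0)))  # 6
```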

11.5

Hint : Include additional rules, e.g., <adverb> → very <adverb>. However, this will enable
you to put as many "very"s as you want. If at most one "very" is to be allowed, then
include production rules such as <adverb> → very <adverb1>, etc.


11.6

See Problem 11.7 (Solving the Satisfiability problem using AND-OR graph).

11.7

Hint : The production rules will be expanded as AND arcs. When there is a choice of
production rules, i.e. there are several rules with the same string on the left hand side, use
OR arcs.

11.10 See Problem 11.10 (Applying constraint satisfaction to solve cryptarithmetic puzzle).
11.11 See Problem 11.9 (Applying constraint satisfaction to solve crossword puzzle).
11.12 See the text on the Map colouring problem in Subsection 11.4.8.
11.13 See Problem 11.2 (Monkey and Banana problem).
Chapter 12. Advanced Search Strategies
12.1

Hint : The chromosomes will be binary strings of length n × k, where n is the number of
nodes and k = ceiling( log2 n ). Given such a chromosome, it is divided into n parts, each
consisting of k bits. The first k-bit substring encodes node 1, the 2nd k-bit substring encodes
node 2, and so on. As the function f is to be minimized, we may take 1/f as the fitness
function. The initial population is generated randomly.
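The encoding and decoding described above can be sketched as follows; the function names are illustrative.

```python
import math
import random

def decode(chromosome, n):
    """Split an n*k-bit string into n node labels of k bits each."""
    k = math.ceil(math.log2(n))
    return [int(chromosome[i * k:(i + 1) * k], 2) for i in range(n)]

def random_chromosome(n):
    """Randomly generated member of the initial population."""
    k = math.ceil(math.log2(n))
    return ''.join(random.choice('01') for _ in range(n * k))

# n = 5 nodes -> k = 3 bits per node
print(decode('000001010011100', 5))  # [0, 1, 2, 3, 4]
```

The fitness of a decoded chromosome would then be taken as 1/f, as the hint suggests.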

12.2

Hint : Encode the solutions as in Ex. 12.1. Use f directly as the energy function. Try with
Tmax = 100, Tmin = 0.01, α = 0.8. Tune the parameters if necessary.
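A generic annealing loop with the suggested parameter values might look like this; the 1-D test function is a hypothetical stand-in for f.

```python
import math
import random

def anneal(energy, neighbour, x0, t_max=100.0, t_min=0.01, alpha=0.8):
    """Simulated annealing with the suggested parameters
    (Tmax = 100, Tmin = 0.01, cooling rate alpha = 0.8)."""
    x, t = x0, t_max
    while t > t_min:
        cand = neighbour(x)
        delta = energy(cand) - energy(x)
        # Always accept improvements; accept uphill moves with prob e^(-delta/t)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = cand
        t *= alpha              # geometric cooling schedule
    return x

# Hypothetical 1-D test: minimise (x - 3)^2 starting from x = 0
random.seed(0)
best = anneal(lambda x: (x - 3) ** 2,
              lambda x: x + random.uniform(-1, 1),
              x0=0.0)
print(round(best, 1))           # close to 3
```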

Вам также может понравиться