
12/16/2018 Neural Net sample xnor

In [10]: from numpy import loadtxt


lines = loadtxt("datapraktikumci.txt", comments="#", delimiter=" ", unpack=False)

In [12]: lines

Out[12]: array([[ 1.,  1.,  1.],
                [ 1., -1., -1.],
                [-1.,  1., -1.],
                [-1., -1., -1.]])

In [13]: X=lines[:,0:2]
y=lines[:,2]

In [14]: X

Out[14]: array([[ 1.,  1.],
                [ 1., -1.],
                [-1.,  1.],
                [-1., -1.]])

In [15]: y

Out[15]: array([ 1., -1., -1., -1.])
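The same bipolar (+1/-1) truth table can be built directly in code, as a fallback sketch if datapraktikumci.txt is not available; the values below are copied from the arrays shown above.

```python
import numpy as np

# Inline equivalent of loadtxt("datapraktikumci.txt", ...): each row is
# (input1, input2, target), using the bipolar +1/-1 encoding above.
lines = np.array([
    [ 1.,  1.,  1.],
    [ 1., -1., -1.],
    [-1.,  1., -1.],
    [-1., -1., -1.],
])
X = lines[:, 0:2]   # the two input columns
y = lines[:, 2]     # the target column
```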

In [25]: from sklearn.neural_network import MLPClassifier

In [28]: mlp = MLPClassifier(activation='identity', hidden_layer_sizes=(2), learning_rate_init=1)

In [29]: mlp.fit(X,y)

Out[29]: MLPClassifier(activation='identity', alpha=0.0001, batch_size='auto',
                beta_1=0.9, beta_2=0.999, early_stopping=False, epsilon=1e-08,
                hidden_layer_sizes=2, learning_rate='constant',
                learning_rate_init=1, max_iter=200, momentum=0.9,
                n_iter_no_change=10, nesterovs_momentum=True, power_t=0.5,
                random_state=None, shuffle=True, solver='adam', tol=0.0001,
                validation_fraction=0.1, verbose=False, warm_start=False)

In [19]: predictions = mlp.predict(X)

In [20]: from sklearn.metrics import classification_report,confusion_matrix


print(confusion_matrix(y,predictions))

[[3 0]
[0 1]]
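By default `confusion_matrix` orders classes sorted ascending, so row/column 0 is class -1 and row/column 1 is class +1: the entry `cm[0, 0]` counts the three -1 patterns classified as -1, and `cm[1, 1]` the single +1 pattern. A minimal check of that layout, using the targets and (perfect) predictions shown above:

```python
from sklearn.metrics import confusion_matrix
import numpy as np

y_true = np.array([ 1., -1., -1., -1.])   # targets y from Out[15]
y_pred = np.array([ 1., -1., -1., -1.])   # the network's perfect predictions

# Rows are true classes, columns predicted classes, labels sorted: [-1, 1].
cm = confusion_matrix(y_true, y_pred)
```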


In [22]: from sklearn.metrics import accuracy_score


accuracy_score(y, predictions)

Out[22]: 1.0

In [24]: print(classification_report(y,predictions))

              precision    recall  f1-score   support

        -1.0       1.00      1.00      1.00         3
         1.0       1.00      1.00      1.00         1

   micro avg       1.00      1.00      1.00         4
   macro avg       1.00      1.00      1.00         4
weighted avg       1.00      1.00      1.00         4

In [30]: mlp.coefs_[0]

Out[30]: array([[ 0.78201101, -3.20909117],
                [ 1.20182077, -6.22694725]])
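A sketch of the computation behind `mlp.predict` for this network: with `activation='identity'` both layers are affine maps, so the whole net collapses to a single linear model. The lists below mirror the shapes of `mlp.coefs_` and `mlp.intercepts_`, but the numbers are hand-picked to realise the target function, not the actual fitted values.

```python
import numpy as np

def forward(X, coefs, intercepts):
    h = X @ coefs[0] + intercepts[0]      # hidden layer, identity activation
    logit = h @ coefs[1] + intercepts[1]  # output layer pre-activation
    # For two classes sklearn thresholds the logit at 0:
    # positive -> the larger class label (+1), otherwise -1.
    return np.where(logit.ravel() > 0, 1.0, -1.0)

X = np.array([[ 1.,  1.], [ 1., -1.], [-1.,  1.], [-1., -1.]])
coefs = [np.array([[1., 0.], [0., 1.]]),   # input -> 2 hidden units
         np.array([[1.], [1.]])]           # hidden -> 1 output
intercepts = [np.array([0., 0.]), np.array([-1.])]
labels = forward(X, coefs, intercepts)     # reproduces the targets above
```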

In [ ]:

