25.09.2019
Kristijan Šarić
EXACT BYTE d.o.o.
About me
● Java (desktop, web, mobile) applications
● Python (ML, web applications, scripts)
● JavaScript (Angular, React, …)
● Scala (web), purely functional programming
● Haskell (IOHK - explorer, wallet, cardano-shell, consulting for foreign companies)
● My own products (using ML)
○ https://www.emprovio.com/
○ https://contetino.com/
○ https://alenn.ai
○ Croatian sign language (in progress)
AI
● (Most) General form/term for anything relating to machines “thinking for
themselves”
● Artificial Intelligence is the broader concept of machines being able to carry
out tasks in a way that we would consider “intelligent” [1]
Machine learning
● “Rather than teaching computers everything they need to know about the
world and how to carry out tasks, it might be possible to teach them to learn
for themselves” [1]
● Probabilistic reasoning - what is the connection between inputs and outputs?
● Interesting field that emerged is Probabilistic programming which emphasizes
“reasoning under uncertainty”
Machine learning
● Supervised
● Unsupervised
● Reinforcement
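Supervised learning, the most common of the three, is "probabilistic reasoning about the connection between inputs and outputs" in miniature. A hedged sketch (not from the slides; the data and the least-squares fit are illustrative assumptions):

```python
import numpy as np

# Supervised learning in miniature: given example inputs X and
# labelled outputs Y, find the function connecting them.
X = np.array([0.0, 1.0, 2.0, 3.0])
Y = np.array([1.0, 3.0, 5.0, 7.0])  # generated by y = 2x + 1

# np.polyfit picks the line coefficients minimising squared error -
# the "learned" connection between inputs and outputs.
a, b = np.polyfit(X, Y, deg=1)
print(round(a, 2), round(b, 2))  # recovers slope 2 and intercept 1
```

The same idea scales up: a neural network is just a much more flexible function whose parameters are fitted to example input/output pairs.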
AI vs Machine learning
● Deep Blue, the AI that defeated the world chess champion in 1997, used a
method called tree search algorithms to evaluate millions of moves at every
turn
● Prolog (Zlatko presented), first-order logic - "there exists x such that x is
Socrates and x is a man"
● General intelligence, which is very broad and uses a sort of “transfer learning”
across different domains, is more general than ML - imagine a neural network
that learns to understand language by understanding pictures
Neural networks
[ 0.75, 0.35, 1.7, -4.5 ] → Magic → [ 0.63, -3.56, 0, -10.3 ]
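The "Magic" box that turns one vector of numbers into another can be sketched as plain arithmetic; here it is a matrix multiplication with made-up weights (only the input vector comes from the slide):

```python
import numpy as np

# "Magic" is just a function on numbers. One minimal candidate:
# multiply the input vector by a weight matrix.
x = np.array([0.75, 0.35, 1.7, -4.5])  # input vector from the slide
W = np.eye(4) * 2.0                    # hypothetical weights

def magic(v):
    # map a 4-vector to a 4-vector
    return W @ v

y = magic(x)
print(y.shape)  # same shape as the slide's output vector
```

Training a neural network amounts to finding weights `W` for which this function produces the outputs we want.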
Neural networks
[ [ 0.75 ], [ 0.35, 1.7 ], [ -4.5 ] ] → Magic → [ 0.63, -3.56, 0, -10.3 ]
Neural networks
[ [ [ 0.75 ] ], [ [ 0.35, -1.5 ], [ 1.7 ] ], [ [ -4.5 ], [ -6.4 ] ] ] → Magic → [ 0.63, -3.56, 0, -10.3 ]
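The progressively nested inputs on these slides are just arrays with more dimensions. A small sketch (the example arrays here are my own, chosen to be rectangular so numpy accepts them):

```python
import numpy as np

# More nesting = more dimensions. numpy makes the shape explicit.
flat   = np.array([0.75, 0.35, 1.7, -4.5])     # 1-D, shape (4,)
nested = np.array([[0.35, -1.5], [1.7, 0.0]])  # 2-D, shape (2, 2)
print(flat.ndim, nested.ndim)
```

Whatever the nesting, the "magic" stays the same kind of function: numbers in, numbers out.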
Neural networks
[ [ [ 0.75 ] ], [ [ 0.35, -1.5 ], [ 1.7 ] ], [ [ -4.5 ], [ -6.4 ] ] ] → Function → [ 0.63, -3.56, 0, -10.3 ]
import matplotlib.pyplot as plt
# a linear function: a straight line
plt.plot([0, 1, 2, 3], [0, 1, 2, 3])
Non-linear function
xs = np.linspace(-10, 10, 100).tolist()
Magic
[ [ [ 0.75 ] ], [ [ 0.35, -1.5 ], [ 1.7 ] ], [ [ -4.5 ], [ -6.4 ] ] ] → Magic → [ 0.63, -3.56, 0, -10.3 ]
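Applying a non-linear function to the `xs` above can be sketched without plotting; `tanh` is one assumed choice of non-linearity (the slide does not name one):

```python
import numpy as np

# A linear function cannot bend; a non-linear one like tanh can.
xs = np.linspace(-10, 10, 100)
ys = np.tanh(xs)  # squashes every input into the open interval (-1, 1)
print(ys.min(), ys.max())
```

This squashing behaviour is exactly what activation functions contribute inside the "magic" box on the next slides.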
Neural networks (some) details
Input (Neuron) → Output (Neuron)
Neural networks (some) details
is_human_frozen(X)
import numpy as np
# sigmoid(x) = 1 / (1 + e^(-x)); for x = -100 this is 1 ÷ (1 + 2.71828^100) ≈ 0
def sigmoid(X):
    return 1 / (1 + np.exp(-X))
Input -100 → Output 0
np.round(sigmoid(-100), 2) == 0
np.round(sigmoid(100), 2) == 1
Neural networks (some) details
import numpy as np
# sigmoid(x) = 1 / (1 + e^(-x)); for x = 100 this is 1 ÷ (1 + 2.71828^(-100)) ≈ 1
def sigmoid(X):
    return 1 / (1 + np.exp(-X))
Input 100 → Output 1
np.round(sigmoid(-100), 2) == 0
np.round(sigmoid(100), 2) == 1
Neural networks (some) details
Input 100 → ACTIVATION FUNCTION → Output 1
Neural networks (some) details
Input 100 (i1) → ACTIVATION FUNCTION → Output 1
is_frozen = i1 * w1
is_burned = i1 * w2
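The two weighted paths above can be sketched end to end; the weight values `w1` and `w2` below are illustrative assumptions, only the input `100` and the sigmoid come from the slides:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

i1 = 100     # temperature-like input from the slide
w1 = -1.0    # hypothetical weight feeding "is_frozen"
w2 = 1.0     # hypothetical weight feeding "is_burned"

# Each output neuron: weigh the input, then apply the activation function.
is_frozen = sigmoid(i1 * w1)  # sigmoid(-100), saturates near 0
is_burned = sigmoid(i1 * w2)  # sigmoid(100), saturates near 1
print(round(is_frozen, 2), round(is_burned, 2))
```

The network's "knowledge" lives entirely in `w1` and `w2`; training nudges them until the outputs match the labels.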
Neural networks (some) details
Input 100 (i1) → Hidden 1 = i1 * w1, Hidden 2 = i1 * w2 (+ Bias) → ACTIVATION FUNCTION → Output 1
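The bias term can be sketched the same way: each hidden neuron computes `activation(input * weight + bias)`. All numeric values here are made-up assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical weight and bias; the bias shifts where the
# activation "switches on", independent of the input scale.
i1, w1, b = 100, 0.1, -5.0
hidden = sigmoid(i1 * w1 + b)  # sigmoid(100 * 0.1 - 5.0) = sigmoid(5.0)
print(round(hidden, 2))
```

Without the bias, a sigmoid neuron is forced to be exactly 0.5 at input 0; the bias removes that restriction.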
Input → [ Hidden, Hidden, Hidden ] → Output
Neural networks (some) details
Input → [ Hidden, Hidden, Hidden ] → [ Hidden, Hidden, Hidden ] → Output
Neural networks (some) details
Input → [ Hidden, Hidden, Hidden ] → [ Hidden, Hidden, Hidden ] → Output
Deep neural network, Deep learning!
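A forward pass through such a stacked network can be sketched in a few lines of numpy; the layer sizes match the diagram, while the random weights and the `tanh` activation are assumptions standing in for trained values:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W, b):
    # one fully-connected layer: weights, bias, non-linear activation
    return np.tanh(W @ x + b)

# A tiny "deep" network: 1 input -> 3 hidden -> 3 hidden -> 1 output.
x = np.array([100.0])
W1, b1 = rng.normal(size=(3, 1)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 3)), np.zeros(3)
W3, b3 = rng.normal(size=(1, 3)), np.zeros(1)

h1 = layer(x, W1, b1)    # first hidden layer
h2 = layer(h1, W2, b2)   # second hidden layer - "deep" starts here
out = layer(h2, W3, b3)  # output layer
print(out.shape)
```

"Deep learning" is nothing more exotic than this composition repeated many times, with the weights fitted by training instead of drawn at random.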
Features, why neural networks
Input → Output
Features, why neural networks
Input (Image of animals) → Output (Is it a cat?)
Features, why neural networks
Input (Image of animals) → EXPERT → Algorithm (Classification) → Output (Is it a cat?)
Features, why neural networks
Input (Image of animals) → Algorithm (Feature extraction + classification) → Output (Is it a cat?)
Keras, initial example
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# Add a Dense layer: 2 inputs, 1 neuron
model.add(Dense(1, input_shape=(2,)))
model.summary()
Keras, test
# Add an input layer and a hidden layer with _ neurons
model.add(Dense(_, input_shape=(_,), activation='tanh'))
model.compile(optimizer='adam', loss='mse')
model.fit(Xs, Ys, epochs=5)
print("Final loss:", model.evaluate(Xs, Ys))
Keras, training
X Y
-100 -1
-50 -1
-30 -1
0 1
30 1
50 1
60 -1
100 -1
150 -1
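The training table can be turned into the `Xs`/`Ys` arrays that `model.fit` expects; a minimal sketch, assuming plain numpy arrays as inputs:

```python
import numpy as np

# The slide's training table as parallel arrays:
# inputs Xs and target labels Ys, one label per input.
Xs = np.array([-100, -50, -30, 0, 30, 50, 60, 100, 150])
Ys = np.array([  -1,  -1,  -1, 1,  1,  1, -1,  -1,  -1])
print(Xs.shape, Ys.shape)
```

Note the labels are not linearly separable (the `1` region sits between two `-1` regions), which is why the model needs a non-linear activation like `tanh` to fit them.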
Keras, training
# Compile your model
model.compile(optimizer='adam', loss='mse')
model.fit(Xs, Ys, epochs=5)
print("Final loss:", model.evaluate(Xs, Ys))
Keras, simple classification example
● MNIST
● https://en.wikipedia.org/wiki/MNIST_database
Keras, training
from __future__ import print_function
import keras
num_classes = 10
epochs = 12