They say that the best ideas sometimes come to you while you are in the shower, and this idea of how to explain two important Neural Network concepts –
Backpropagation and Stochastic Gradient Descent – actually did come to me as I was trying to set the perfect water temperature for my morning shower.
As I struggled to adjust the two shower handles – one that controlled scalding hot and the other that controlled flash freezing – it occurred to me that I
was a simple Neural Network (in spite of the “Snow Miser/Heat Miser” song running through my head). I was using Backpropagation to feed back the error between my
desired water temperature and the actual water temperature to my two-handled Neural Network, and Stochastic Gradient Descent to determine how much to adjust the
parameters (the weights) of those two handles.
While I don’t expect that the average person will ever write their own Neural Network program, the more you understand how these advanced technologies work,
the better prepared you will be to determine where and how to leverage AI, Machine Learning and Deep Learning technologies to uncover new sources of economic value
with respect to customer, product and operational insights.
Backpropagation is a mathematically based tool for improving the accuracy of a neural network’s predictions by gradually adjusting the weights until the actual
model results match the expected model results. Backpropagation solves the problem of finding the weights that deliver the best results.
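For readers who like code, what backpropagation computes for the two-handle faucet can be sketched in a few lines; this is a minimal illustration of the chain rule, not code from the article or its video, and all names, flow rates, and numbers are my own assumptions.

```python
# A toy "faucet" model: water temperature is a weighted mix of the hot and
# cold supply temperatures, where each weight is a handle position.
#   temp = w_hot * hot + w_cold * cold
#   loss = (temp - target) ** 2
# Backpropagation applies the chain rule to get d(loss)/d(weight) for each handle.

def forward(w_hot, w_cold, hot, cold):
    """Predicted water temperature for the given handle positions (weights)."""
    return w_hot * hot + w_cold * cold

def backward(w_hot, w_cold, hot, cold, target):
    """Gradient of the squared error with respect to each handle weight."""
    temp = forward(w_hot, w_cold, hot, cold)
    error = temp - target
    # Chain rule: d(loss)/dw = 2 * error * d(temp)/dw, and d(temp)/dw_hot = hot.
    grad_hot = 2 * error * hot
    grad_cold = 2 * error * cold
    return grad_hot, grad_cold
```

Note that backpropagation only tells us the size and direction of each weight's contribution to the error; deciding how far to turn each handle is the job of gradient descent, described next.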
Stochastic Gradient Descent is a mathematically based optimization algorithm (think first derivative, or slope, in calculus) used to minimize some cost function by
iteratively moving in the direction of steepest descent as defined by the negative of the gradient (slope). Gradient descent guides the updates being made to the
weights of our neural network model by pushing the errors from the model’s results back into the weights.
Figure 2: Bathroom Faucet Neural Network and Tweaking Hot and Cold Faucet Weights and Biases
My hand in Figure 3 measures the error – its size and direction – between the expected versus actual model results: blazing hot (Heat Miser),
mildly hot, slightly hot, slightly cold, mildly cold, chillingly cold (Snow Miser).
Figure 3: Measuring / Determining the Error Between Actual versus Optimal Results
I continue to measure the size and direction of the temperature error, which gets backpropagated to the faucet (the model), using Stochastic Gradient Descent to determine
how much to tweak the model parameters (the faucet settings) until I get the perfect outcome/result (see Figure 4).
Figure 4: Backpropagating the Error Back to the Neural Network Model in Order to Tweak the Model Weights
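The measure-the-error, backpropagate, tweak-the-handles loop of Figures 3 and 4 can be sketched end to end as a tiny training loop; the supply temperatures, starting handle positions, and learning rate below are illustrative assumptions, not anything measured in my shower.

```python
# Hypothetical end-to-end loop mirroring Figures 3 and 4: measure the error,
# backpropagate it through the faucet model, and nudge both handle weights
# a small step down the gradient until the water hits the target temperature.

def train_faucet(hot=40.0, cold=10.0, target=38.0, lr=0.0001, steps=2000):
    w_hot, w_cold = 0.5, 0.5                 # initial handle positions (weights)
    for _ in range(steps):
        temp = w_hot * hot + w_cold * cold   # forward pass: mix the water
        error = temp - target                # size and direction of the error
        # Gradient descent update: step opposite the gradient of (error ** 2).
        w_hot -= lr * 2 * error * hot
        w_cold -= lr * 2 * error * cold
    return w_hot * hot + w_cold * cold       # final water temperature
```

With these (made-up) numbers the loop settles within a hundredth of a degree of the 38-degree target; turn the learning rate up too far and, just like yanking a shower handle, the temperature overshoots and oscillates between Heat Miser and Snow Miser.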
For those interested, here is my video explaining the water faucet neural network (and yes, I know that I need to clean the grout).
Tags: #AI, #BigData, #DOBD, #DataAnalytics, #DataMonetization, #DataScience, #DeepLearning, #DesignThinking, #DigitalTransformation, #DigitalTwins
Comments
LMAO!!!
We had to write a program to solve this problem in freshman programming, circa 1979 at UC Santa Cruz.
Of course, we didn't know Runge-Kutta from a hole in the ground and just stole the formula.
Source: Using a Bathroom Faucet to Teach Neural Network Basic Concepts – Data Science Central, retrieved 9/30/2019, https://www.datasciencecentral.com/profiles/blogs/using-a-bathroom-faucet-to-teach-neural-network-basic-concepts