Wow… I never knew why the activation function was necessary, but only knew it needed to be there. Now I know! Thanks so much for making this very easy to understand!

I mean, 3B1B's videos on neural networks are great, and they have some great explanations, but in this video, you explained the "network" part of a neural network in just the right way… in five minutes!!

One of the best, if not the best, YouTube videos visually representing what is happening in the neural network and how the activation function splits the data. Thank you.

Excuse my French, but holy FUCK this is well done. The way you skip inessential bits so that the viewer can grasp the whole before going into details is great. I personally prefer this mode of education; even when I'm learning seriously, it becomes much faster to learn the details after understanding the goal.

This particular visualization of the mathematical parts I've been studying for so long caused more than one ah-ha moment for me. I'm so grateful for your work. 3Blue1Brown had a great primer a while back; you two should totally collaborate on one. Thank you, liked, subscribed, and getting notified 😉

I have watched so many NN teaching videos, but this is by far the most helpful and easy-to-understand video on neural networks. Finally I understand how the weights and the bias work. Thank you so much.

This is great. I've seen a lot on neural networks from other people (like 3Blue1Brown) but this gave me some serious insight into the sigmoid function! Thanks 🙂

This visual example is actually the best. I watched a learning course where it was explained, but I didn't understand how we build more complex equations for the decision until this point. The animations helped me a lot!

Been self-studying ML and NNs for over 4 months, and only now do I have a clearer understanding of why biases and activation functions (except ReLU) are used! This video is gold.

I can't thank you enough for this, brother. Finally I can get into neural networks. I'm a Python coder, but I never looked forward to neural networks because I didn't understand them from the core, and I thought learning them without knowing the core was just useless. But finally I can give it a try and go into the machine learning and neural network field… thank you so much.

The necessity of the hidden layer remained unexplained; for those interested, any expansion between input and output adds degrees of freedom between the elements. Too much depth, however, degrades the machine's ability to learn, since back-propagation becomes ineffective and effectively leads to forgetting.
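
The depth claim in the comment above can be sketched numerically. A sigmoid's derivative never exceeds 0.25, so gradients back-propagated through a deep stack of sigmoid layers shrink multiplicatively toward zero. This minimal Python sketch (illustrative only, not the video's code) shows the effect:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

# Multiply the per-layer derivative factors along one path through the
# network (weights taken as 1 for simplicity, and inputs at the sigmoid's
# steepest point, i.e. the best case for gradient flow).
grad = 1.0
for depth in range(1, 11):
    grad *= sigmoid_derivative(0.0)
    print(f"depth {depth:2d}: gradient factor = {grad:.2e}")
```

Even in this best case, the factor after ten layers is around 1e-6, which is the "ineffective back-propagation" the comment alludes to; ReLU-family activations are one common mitigation.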

No continuation of this series? I really wanted to finish the whole picture in my head, and you were the only one doing it right for me 🙁 Can I get private paid tutoring?

Thank you for including these evolving decision-boundary animations; it's something I haven't seen in explanations elsewhere. I'm still trying to understand how to match the complexity of the neural network to the problem, e.g. how many layers to use, but now I can see HOW more layers can increase the flexibility of decision boundaries.

Yeah, this is by far the best visual example of some of the basic math and its effects in NNs. Great job; I will be watching the rest of the series and have subbed!

I'm trying to understand the math here. Before you added the hidden layer, the math for each output was essentially r = X1*W1 + X2*W2 + b1 and p = X1*W3 + X2*W4 + b2. Then you added the hidden-layer math (the a's — I don't know how large a comment I can make on YouTube, so I'll refer to them as a's), which took the place of the previous r/p outputs. My main question is about the part where the decision boundary curves: the sigmoid function f(x) = 1/(1 + e^-x). How does this play into the previously established math? I'm not the best at understanding things the first time, but can anyone (unlikely, as this is a year-old video) help me understand where he put this function? If I'm understanding the video correctly, the a's and the outputs r/p now go through that function, but how would the math work out? Is it f(a1*w7 + a2*w8 + a3*w9 + b4), with f(x) = 1/(1 + e^-x)? I understand functions, but I'd love a simple breakdown or explanation of this specific one. Thanks!
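
A sketch of an answer to the question above, using the commenter's own w/b names (this is an illustrative guess at the setup, not the video's actual code): the sigmoid wraps every neuron's weighted sum, both at the hidden layer and at the output.

```python
import math

def sigmoid(x):
    # f(x) = 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

def forward(x1, x2, w, b):
    # Hidden layer: each activation is sigmoid(weighted sum + bias).
    a1 = sigmoid(x1 * w[0] + x2 * w[1] + b[0])
    a2 = sigmoid(x1 * w[2] + x2 * w[3] + b[1])
    a3 = sigmoid(x1 * w[4] + x2 * w[5] + b[2])
    # Output: the same pattern applied to the hidden activations --
    # this is the f(a1*w7 + a2*w8 + a3*w9 + b4) from the question.
    return sigmoid(a1 * w[6] + a2 * w[7] + a3 * w[8] + b[3])

# Example with all weights 1 and all biases 0:
out = forward(0.5, -0.5, w=[1.0] * 9, b=[0.0] * 4)
print(out)  # sigmoid(1.5) ≈ 0.8176
```

So the earlier linear expressions are still there; the sigmoid is simply applied to each of them, and that nesting of nonlinearities is what lets the decision boundary curve.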

I didn't find this introduction particularly easy or useful for a beginner. However, I did like it, because it showed a new "viewing angle" on the same topic, thus broadening my understanding.

Never saw a better explanation of this. Even having understood it before, it was super cool to see it adjusted by hand with sliders, and to see how the activation function changes it.

As a deep learning researcher who has taken courses both in college and in online MOOCs, I find this the best visual explanation of neural networks ever. Huge fan of your channel; I've been watching it for Unity content for a while now, and will now start watching it for everything. Keep up the great work, Sebastian.

Is there a way to measure how "complicated" an output graph we can get? Initially using only linear combinations, we could only split it in half. I'd imagine that with more neurons, we could capture more and more complicated distributions. Is there anyone studying the "topology" so to speak of the output distribution, as a function of the neural network's shape?
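
A tiny sketch of the baseline case the comment mentions: a single linear combination can only split the plane with a straight line, which is why more neurons and nonlinearities are needed for more complicated distributions. (The weights w1, w2 and bias b here are illustrative values, not from the video.)

```python
def linear_decision(x1, x2, w1=1.0, w2=-1.0, b=0.0):
    # The boundary w1*x1 + w2*x2 + b = 0 is a straight line; every point
    # is classified solely by which side of that line it falls on.
    return w1 * x1 + w2 * x2 + b > 0

print(linear_decision(2.0, 1.0))  # True:  x1 > x2
print(linear_decision(1.0, 2.0))  # False: x1 < x2
```

The broader question — how boundary complexity grows with network shape — is indeed an active research topic, often framed as counting the distinct regions a network can carve the input space into.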

Hello, Sebastian, thank you for the nice example and interesting videos. I want to ask which Python library you use to make such interesting animated graph visualizations? Although the plots look similar to https://www.desmos.com/

Seeing this in a visual aspect, and seeing the activation function cause non-linearity really helped everything click in my head. From 5 minutes I've gained a large amount of understanding I hadn't gotten elsewhere before. So thank you for that!

Excellent video. I have learned more in a few minutes here than I have learned from hours of watching other videos on the topic. The visuals really crystallized the principles in my mind. I could never figure out what the biases were for and I did not appreciate the full effect of the sigmoid function.

I’ve watched about a dozen neural network videos. I have taken Calculus in college and the math does not intimidate me. However you have shown how the weights and bias and activation function actively change the output function and it finally clicked in my brain what was going on. THANK YOU.

This is a stunningly clear explanation of how neural networks work, in just under 6 minutes! Amazing. You have exceptional clarity of thought, and it shows in how you explain things. Thank you.

An amazingly simple explanation for a not-so-simple topic.

This video was the one that gave me that 'click' moment, incredible man!

What graphing software did you use there, and how do you visualize it like that? I really need this for my research.

Incredible work. Never seen such a clear explanation of this topic. Thank you.

Tell me are you a GOD?? This is enlightening!

We need #4!

So that's why neural nets use a sigmoid function! Very clear explanation, thank you.

This is probably the best explanation I’ve ever seen of a neural network’s inner workings, and in only a few minutes.

This was great Sebastian! Thank you!

Seeing everyone comment on how well you explained this I realized I am not the only one thinking you did an amazing job!

Best video ever! 10/10

You explained in 5 minutes what my college teacher tried to explain in 4 weeks

You're a freaking mad genius. This is the best neural network video I've seen.

I wish I had seen this years ago 😭

This is amazing, great video!!

Awesome work!

What software did you use for visualization?

I love this video so much. The way you explained it so simply and using pure linear equations, it's beautiful. Thank you so much for this video.

This is the BEST high-level overview of neural networks for a n00b. Why isn't this more well known?!

best explanation i’ve ever seen

wtf

Dude, make more videos plz. I've seen many videos about neural networks and this one IS THE BEST. Great job.

I never subscribed to anyone this fast!

This is the best video I've found on neural networks, and I'm not even exaggerating.

I would never have imagined I could find a better explanation than the one from 3b1b, but here we are!

So basically it's math.

This was very clear and easy to follow! Great work!

Wonderful explanation of neural networks. I don't think you will read my comment, but I didn't understand the w stuff: what is it, and how do you get its value?

Wow what a great explanation! I've never seen anyone explain it like this before.

Didn't understand a damn thing, but very interesting :)

good start but you lost me the instant you said "in python"

Yes yes yes, that's the good shit

Whoa, this makes it so much easier to understand! Thanks!

After more than a month of searching, clearest video I've seen

BRILLIANT!!!!!!!!!!

I've watched hundreds of videos and couldn't catch the thing, until now. Amazing job!

Awesome.

I kept seeing graphs like these in AI videos and didn't know what they were until watching this video. Very nice. 🙂

Fantastic. Nice balance of theory and practicality.

4:40 mind = blown

Best explanation ever! Keep doing the great job!

You are persistently AWESOME!

Yeah, but can you do 'Long Division'??

Great video !

Oh yes! You injected this straight into my brain with these on-point 5+ minutes of visual explanations. Bravo! Keep it up! 🤘👏

Best video I've ever seen!

Wow never thought of it like that.

I love this movie!

This is so far beyond me but god I love these videos

this was actually such a good video, thank you!

Ahahaahhaha thanks a lot dude

Finish the series!!!!

Okay bro, you can go and teach LeCun and his friends

This is by far the best visual explanation I have ever seen explaining NN.

Omg thanks, great video

The best explanation I have ever found….

the best intuitive explanation for neural nets

thanks a lot

Still looking for #4 ;-;

This is the best explanation ive seen of neural networks

I finally begin to understand neural networks, thanks to you. Thank you <3

Wow, I wish I had watched this video months ago. The best explanation!

Why is this video not viral yet

Wow thank you that other video of this guy just talking shit I could not understand but this I can ❤️

This is the best illustration of a perceptron model I have ever seen!

This is really the best explanation I've ever seen.

Thank you very much for your GREAT work!

Is there a link to the code to run the visualizer, with the weights and biases we can play with?

I love the way you explain these things. It really sinks in 🙂

5:16 is the math

what program do you use for videos

Honestly the best explanation I have seen so far. Well done.

It's amazing how well done the animations you use in your videos are at explaining how "things" work. There must be a lot of work behind them! Thank you!

wow, thanks!

I love your videos and this one especially was AWESOME! Thanks for making videos.

Unity version?

I thought your name was Sebastian LEAGUE before? Mandela effect in play or some shit???

This is so understandable! Thank you!!!