Neural Networks from Scratch - P.5 Hidden Layer Activation Functions

285,641 views

sentdex

4 years ago

Neural Networks from Scratch book, access the draft now: nnfs.io
NNFSiX Github: github.com/Sentdex/NNfSiX
Playlist for this series: • Neural Networks from S...
Spiral data function: gist.github.com/Sentdex/454cb...
Python 3 basics: pythonprogramming.net/introdu...
Intermediate Python (w/ OOP): pythonprogramming.net/introdu...
Mug link for fellow mug aficionados: amzn.to/3bvkZ6B
Channel membership: / @sentdex
Discord: / discord
Support the content: pythonprogramming.net/support...
Twitter: / sentdex
Instagram: / sentdex
Facebook: / pythonprogramming.net
Twitch: / sentdex
#nnfs #python #neuralnetworks

COMMENTS: 973
@tuhinmukherjee8141 4 years ago
Bro, the effort he puts in to make us understand this stuff is highly admirable. Thanks for doing this, man. Will be waiting for pt. 6
@mayurpanpaliya 3 years ago
When will pt. 6 be released?
@trashtop1810 3 years ago
@@mayurpanpaliya He is waiting for you to buy the book haha
@robenromero4947 3 years ago
I am excited for part 6
@subhammishra5445 3 years ago
@@bossragegamer4081 5 months now :(
@Mohamm-ed 3 years ago
6 months
@ConorFenlon 3 years ago
Dude, you're a legend. Bought the ebook Pre-Order yesterday, absolutely CANNOT WAIT for full release. My favourite thing about your videos is your enthusiasm. For example, at 8:38, "What's so cool about ReLU is it's ALMOST linear, it's sooooo close to being linear, but yet that little itty-bitty bit of that rectified clipping at 0, is exactly what makes it powerful; as powerful as a sigmoid activation function, super fast, but this is what makes it work, and it's so cool! So WHY does it work??" Dude, I've never been so PUMPED to learn from someone with such enthusiasm in my LIFE. You take all the time you need to do this man, do it your way, and take your time, and you'll change the world. Thank you so much. Much love from Ireland. edit: spellings
@josephmejia9520 3 years ago
Seriously! PUMPED encompasses all my feels as I follow along.
@nishantsvnit 4 years ago
18:17 Seeing the neurons fire when activated and die when deactivated really helped to see what goes on under the hood of a neural network. Thanks for this really helpful animation and the whole nnfs initiative as a whole.
@Orchishman 4 years ago
Can you please explain how the activation point is getting changed by changing the bias? Doesn't that violate the activation function, which says y=x only when x>0?
@tuhinmukherjee8141 3 years ago
@@Orchishman The bias here is essentially setting the activation point: y = max(0, max(0, -x + 0.5) + 0.48) gives 0.48 for x greater than or equal to 0.5, which serves as the lower bound for the function.
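A quick numerical check of the nested-ReLU expression in the reply above (a minimal sketch; the -1 weight and the 0.5 and 0.48 biases are the values quoted in the reply, not necessarily the video's exact parameters):

```python
# Two chained neurons with ReLU activations:
# inner neuron: weight -1, bias 0.5; outer neuron: weight 1, bias 0.48
def relu(z):
    return max(0.0, z)

def output(x):
    return relu(relu(-x + 0.5) + 0.48)

# For x >= 0.5 the inner ReLU clips to 0, so the output flattens at 0.48:
print(round(output(0.5), 2))  # 0.48
print(round(output(2.0), 2))  # 0.48
# Below x = 0.5 the inner neuron is active and the output rises linearly:
print(round(output(0.0), 2))  # 0.98
```

So the outer bias sets the flat "deactivated" level, and the inner bias moves the x-value at which the neuron activates, which is the behavior shown in the video's animation.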
@nishantsvnit 3 years ago
@@Orchishman I created a graph so that you can play with the parameters and see for yourself how this is actually happening. I considered only the first two rows and the last row of neurons, to keep it simple (so 6 neurons in total in the hidden layer). I have numbered the neurons such that the first neuron of the first row has subscript 11, the second neuron of the first row has subscript 12, and so on (so, for example, the second neuron of the 8th row has subscript 82). Now, to simulate the movement happening in the video, adjust the slider for the variable w_22 and see what happens with the plot. You will see where the area of effect of the second neuron comes into play. You can also adjust the other sliders for weights and biases to see their influence on the output. Here is the link: www.desmos.com/calculator/gruatlyner When you open the plot, the values of the weights and biases are the same as seen in the video up to 17:01. I hope this helps.
@syedalamdar100 4 years ago
This is the first time I understand how to build a neural network. I love the work. My impatient side wishes that all the videos for this series were already available, but this will keep me hooked and awaiting your next post. Amazing job!
@jonathanmorgan4480 1 year ago
This is by far the best explanation of how neural nets work that I have ever found. This should be its own standalone teaching. The sine wave example with visuals - perfect! Thanks so much.
@patrickjdarrow 4 years ago
Dude. Having been here for the last 5-ish years, it's awesome to see how far your production level has come. Always good content, now shinier. Would love to see a video on how the videos themselves are made.
@crohno 4 years ago
I just wanted to thank you for all this stuff. I am in the process of getting a PhD in neuroscience, and artificial neural networks seem like a great tool to help with research. You make it really clear, and unlike other tutorials that tend to just show how to use certain libraries, you really get down to how they actually work. As soon as the book is out I am getting a physical copy!!
@JackSimpsonJBS 3 years ago
This series (and the book) are incredible! Such an amazing teacher - I can't wait for part 6 :)
@vedangpingle1914 4 years ago
40 mins! Oh boy, this is gonna be good
@satwikram2479 4 years ago
Yes😍
@shauryapatel8372 4 years ago
it IS good
@mariyanzarev6423 3 years ago
Hey sentdex, since the other parts are still in the works I'd like to give some feedback. Thanks for doing all this; the graphics help a ton to see how everything works. The only suggestion is to explain why the different concepts even exist, with some real-life examples. This looks like it would be great for someone experienced who has used activation functions and everything else you discuss, and now would like to see closely how it works. For a noob like me, it is not clear why they even exist, and it feels a bit like we are just listing different concepts without a clear picture of why, and what we are trying to achieve with this network. For example, when you were showing how well the ReLU fits the data, it's not clear if that is actually desirable, since it seems to overfit the data.
@mdimransarkar1103 2 years ago
It is all the result of years of experimentation; scientists just look for patterns.
@karthikkashyap4557 11 months ago
Hi, here's a link to my video where I've explained how ReLU helps in fitting lines to the data: ukposts.info/have/v-deo/caRliHpspJmItH0.html
@rkidy 3 months ago
For anyone that sees this in the future and agrees: this series generally balances practicality with understanding. I would heavily recommend also giving 3Blue1Brown's series on neural networks a look, as that focuses far more on understanding and doesn't really go into code that much.
@mizupof 3 years ago
This video just blew my mind. I still haven't bought the NNfS book yet, but that doesn't reflect how much I love to watch and re-watch your videos. This series will probably stay state-of-the-art for a long time. Thank you!
@NikhilSinghNeil 4 years ago
Finally a video giving clear insight into an activation function. This is by far the best explanation of activation functions I've come across. Really appreciate the work you've put behind this series and into getting to the crux of these topics.
@rakeshkottu 3 years ago
It's been a month, still waiting for part 6.
@yaron3479 3 years ago
Expect something huge
@josephastrahan6403 3 years ago
Also waiting :), it will be worth the wait though, given his quality.
@disappointedsquid 3 years ago
Yes, please
@Nightmare-or2yd 3 years ago
I emailed Harrison about it; he says that he is finishing the draft (which is nearly complete) before continuing the series.
@aidankemp-harper2559 3 years ago
@@Nightmare-or2yd yayyyyy
@2010karatekid 4 years ago
I took my first machine learning course last semester, and unfortunately all of the activities we did looked like those from the CS231 class you mentioned - no explanation, just code snippets and output. They were doable, but considering it was most students' first foray into Python, it was quite a rough time to say the least. However, I am extraordinarily pleased to have found your channel and this series in particular - your instruction has helped more in the last 5 videos than my entire semester at university. Thanks for doing what you do.
@josephmejia9520 3 years ago
I love how passionate he is throughout all these videos; it brings me joy while learning this subject.
@emado.7834 3 years ago
Bro, I just felt obligated to leave a comment for the perfect video you have made. This was literally the best visualization I have ever seen on youtube. This video deserves an oscar.
@muna4840 1 year ago
I'm going to be very good someday at building/training neural nets. It's all because of the curiosity that made me stumble on this fantastic playlist... now I'm reading your book and practicing (coding after reading between the lines and understanding the theory) and consulting this playlist and several other resources in order to gain a deeper understanding. Thank you so much for being really amazing.
@themetalcommand 3 years ago
Man, loved the video! So helpful and easy to learn from. Need pt. 6 sooner, too eager to learn about back-propagation and weight/bias adjustments!
@HarshithMK 4 years ago
Loving the series so far! All I do is wait for the next video to come out...
@ifmondayhadaface9490 3 years ago
You're the first person I've seen who actually explains how something like ReLU is so helpful and powerful. Looking forward to part 6!
@RoughlyAverage 4 years ago
I really struggled with the explanation on feature sets / features / samples / classes. I definitely don't think I fully get it (first time that has happened in this series so far!) The animation you mentioned would for sure help!
@nishantsvnit 4 years ago
For the spiral dataset:
- Features are the x-coordinates (x) and y-coordinates (y) of the points. In the code, there are 300 x and 300 y values associated with the 300 points.
- Feature sets are the pairs (x, y) that fully define one point in the dataset. In the code, there are 300 feature sets.
- Classes are the labels associated with the points. In the code, there are 3 classes defined by the colors red, blue, and green, and each feature set (x, y) corresponds to one of these 3 classes (with 100 points each).
- Samples are the combination of feature sets and classes that form the dataset. For example: (x = 0.2, y = -0.5, color = red) and (x = -0.5, y = -0.2, color = blue) are samples from the dataset.
Edit: Calling the function X, y = spiral_data(100, 3) creates samples belonging to 3 classes with 100 feature sets each. X (feature sets) is an array of shape (300, 2) and y (classes) is a vector of size 300.
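To make those shapes concrete, here is a minimal sketch of a spiral generator in the spirit of the "Spiral data function" gist linked in the description (the noise scale and spiral constants here are illustrative and may differ from the gist):

```python
import numpy as np

def spiral_data(points, classes):
    # 'points' feature sets per class, each with 2 features (x, y)
    X = np.zeros((points * classes, 2))             # feature sets
    y = np.zeros(points * classes, dtype=np.uint8)  # class labels
    for class_number in range(classes):
        ix = range(points * class_number, points * (class_number + 1))
        r = np.linspace(0.0, 1.0, points)  # radius grows along the arm
        t = np.linspace(class_number * 4, (class_number + 1) * 4, points)
        t += np.random.randn(points) * 0.2  # angle, plus a little noise
        X[ix] = np.c_[r * np.sin(t * 2.5), r * np.cos(t * 2.5)]
        y[ix] = class_number
    return X, y

X, y = spiral_data(100, 3)
print(X.shape)  # (300, 2) -> 300 feature sets, 2 features each
print(y.shape)  # (300,)   -> one class label per feature set
```

Each class gets its own arm of the spiral: 100 (x, y) feature sets per class, with the class number as the label.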
@iAmTheSquidThing 4 years ago
Same here. That's the only thing so far in this series which confused me.
@iAmTheSquidThing 4 years ago
@@nishantsvnit Ahh, so a "feature set" is essentially "the set of features which a sample has", but unlabelled?
@nishantsvnit 4 years ago
@@iAmTheSquidThing You are right. But it is better not to call it "unlabeled", because that is a term used for feature sets that have no labels assigned to them (which was not discussed in the video). In the example in the video, all the feature sets have corresponding labels (i.e., the 300 x, y coordinates belong to one of the 3 colors). So to rephrase your sentence, you can say that feature set and label are the two components that make up a sample. If there is no label, the sample (or feature set) is called unlabeled. For more information on these terminologies, I would encourage you to see this: developers.google.com/machine-learning/crash-course/framing/ml-terminology#examples
@userre85 4 years ago
@@nishantsvnit thanks
@bamitsmanas 4 years ago
It's wonderful what you're doing! I'm just loving the in-depth knowledge of this course. Although I'm in high school, I'm not finding it difficult to catch on!!👍👍
@user-ns8dl3vm5z 4 years ago
It's like my favorite TV series uploaded a new episode; this is gonna be wild as always. Thank you & Daniel :)
@RaviKiran-qd1cl 4 years ago
Thanks Sentdex. You make tough concepts look much simpler. Please continue to make these tutorials. Cheers.
@TheRelul 3 years ago
Man, this is just beautiful. Thank you and Daniel and the whole team responsible for this. You are bringing beauty into the world.
@Extorc 3 years ago
Hey, can you explain to me what an activation function is at all?
@TonyTheTrain 3 years ago
I know I'm late to the party, but the animations are amazing. I watched the double neuron part probably 20 times with the sound off to figure out what was going on. I had a recommendation for the animation, and as I was typing it, I realized that I STILL didn't fully understand what was going on. I've got it now - thank you for the animations! This would be MUCH more difficult without them. Specifically, the input of the second neuron going "backwards" was bending my brain.
@pnptea173 3 years ago
Part 6 can't come quickly enough, loving this series
@VinayAggarwal 3 years ago
Huge respect!!!! You are really going above and beyond to make people understand this stuff. You are setting new milestones for educators around the world. ❣️ Appreciate your efforts.
@palashchanda9308 3 years ago
Waiting for P.6 eagerly..
@sandeshtulsani1517 3 years ago
same
@maxwellcrafter 3 years ago
@@sandeshtulsani1517 Same
@srikarraoayilneni7074 4 years ago
Well, that's a long wait. Honestly, I'll wait forever. 😂😂 But here it is, finally. ♥️
@thecathode 3 years ago
The detailed explanation and animations of fitting the sine wave are awesome!
@williamflores7323 4 years ago
I've been waiting for this moment all week, oh Lord
@zendr0 2 years ago
This is the best explanation video on the WWW of how activation functions work 🚀. And thank you to the one who put his time and effort into creating such beautiful animations for us. Thank you very much ❤
@parasjain3211 4 years ago
This week was the hardest to pass of this quarantine! Please don't make us wait so long 🙏🏻🙏🏻🥺🥺
@Hacker097 4 years ago
idk mate, probably not a good idea to put pressure on him to upload more
@kris10an64 4 years ago
@@Hacker097 He is showing appreciation
@AliAbbas367 3 years ago
I cannot explain how amazing your way of explaining is. I just watched all of your videos in one go and now I am waiting for another one. Thank you.
@shanka8518 3 years ago
Very good content. Really shows the intuition behind how a neural network works. Hopefully pt. 6 comes out soon
@nivrak5411 4 years ago
Me: Sees new video by sentdex about neural networks
Hand: Invents FTL travel to click the video
4 years ago
Is there any paper for this optimizer? I've never heard of one before. How does it work?
@HT79 4 years ago
Perhaps the author could help us out... Hey @Daniel Optimizer Kukiela, please tell us about your optimizer!
@dhruvdwivedy4192 4 years ago
Hello Daniel, nice to see you here😂😂❤️
@user-ns8dl3vm5z 4 years ago
You are a legend man
@whoisabishag3433 4 years ago
How Does This Help Irene To SLAP ME"?" 👠😋😎
@josephastrahan6403 3 years ago
When he was saying optimizer, he was saying that the guy literally did it "by hand". So there is no optimizer; it was done by a human :P, if I understood correctly.
@masterfloort 3 years ago
The animations are so smooth I could just sit there watching a graph all day!
@tasnimnishatislam7607 4 years ago
Hey, I am a beginner in machine learning and you genuinely have been an inspiration. Thanks for existing!
@PaderRiders 3 years ago
Hey man! Thanks for your awesome videos! I'm interested in this topic and you're explaining it pretty well! I'm waiting for your next video 😉 Greetings from Germany and keep on producing 🚀
@benicamera1577 3 years ago
Lol, PaderRiders is interested in nerd stuff XD
@erichartz594 3 years ago
I think an animation would be immensely helpful for absorbing the section about features and classes. Got lost for a while between data set, feature set, and feature class
@yarutgruter4925 3 years ago
Please tell me this series didn't die out, I'm loving this so much!
2 years ago
wow, the example of linear vs non-linear activation functions is amazing! This series is pure gold
@satwikram2479 3 years ago
Why has NNFS stopped?
@reyboyvideogames 3 years ago
Not stopped; sentdex is doing the draft first before uploading the new video, I think
@Hunar1997 4 years ago
Next course: deep Q-learning from scratch XD
@alicemystery5520 4 years ago
Omg, that was impressive as always, sentdex! The visuals are a big help. Thanks for all of your tutorials. Naturally, I will buy the book to show my appreciation and continue on, as this channel has become the edge of what you can do with Python.
@shrideepgaddad8721 4 years ago
Hi, I noticed that you did not paste the code for generating a dataset in the description. Also, thanks for the new video!
@nishantsvnit 4 years ago
The link to the code is there in the description under "Spiral data function"
@michaelparker6868 3 years ago
At 29:27, if you freeze the screen you can see it. I copied the text into a Jupyter notebook and it worked.
@judedavis92 3 years ago
When is the next one coming?
@sayakbanerjee7214 4 years ago
Had been waiting for this for nearly 2 weeks now... Thank you @sentdex ❤️
@user-yg2xq1jq7t 5 months ago
This entire series has been amazing. I really appreciate your effort to simplify and get things to a granular level. Kudos to you.
@gamalaburdene5243 4 years ago
I never comment on videos, but I've been following sentdex for the last couple of years and this is amazing. Please keep up the good work; thank you for teaching me so many things.
@sentdex 4 years ago
Thank you! Will do!
@unixtreme 3 years ago
I want to get the book, but tbh I'm on the fence; my brain doesn't allow me to sit and go through paper. If this series resumes, then I will, because it will be good as a complement, but not as a main means of studying in 2020.
@shammabeth 3 years ago
Bought your book, but still eagerly waiting for the next video. These are very well produced deep dives that are easy to understand.
@heavymetalqueenxtc 4 years ago
That was great. Especially the explanation for why you need a non-linear activation function.
@splch 4 years ago
what optimizer do u use?
noob: ADAM
intellectual: Daniel
@subratkishoredutta4132 3 years ago
When is part 6 going to be released?
@OfficialYouTube3 3 years ago
He said expect it sometime between June 2022 and December 2038
@subratkishoredutta4132 3 years ago
@@OfficialYouTube3 that's a long wait dude🙂
@OfficialYouTube3 3 years ago
@@subratkishoredutta4132 Yes, but Sentdex is a busy man... writing books, running a YouTube channel, maintaining a website. Did you know he is raising three different families on two different continents? (that last one is a secret)
@subratkishoredutta4132 3 years ago
@@OfficialYouTube3 yes he is..
@kelpdock8913 3 years ago
@@OfficialYouTube3 that last one is wrong, he got a neural network to do the other two
@Sionlockett 4 years ago
Those 3blue1brown API animations are amazing. MASSIVE production value. That really helped me understand this video on another level, thanks.
@luciorossi75 3 years ago
The sine wave fitting is so informative! Kudos on the animations
@heyrmi 4 years ago
There has to be a place like heaven inside heaven for you.
@shauryapatel8372 4 years ago
aren't you the guy that was faster than light?
@kelpdock8913 3 years ago
@@shauryapatel8372 yeah he was
@horticultural_industries 2 years ago
My guy cannot decide where to put his camera
@fodaseodinheiro 3 years ago
Your classes are awesome. Really appreciate it!
@DSinghsLAB 3 years ago
Excellent work!! Please don't abandon so many of us; continue with part 6 and beyond... loads of respect for the time you put into making these videos. Thank you from the most hidden layer of my heart for such explanations!!
@sentdex 3 years ago
Working on the book atm. Videos after
@DSinghsLAB 3 years ago
@@sentdex Sure, no problem!! I promise I won't learn neural nets from any other source till you are back in business!! Warm wishes for the book!!
@HT79 4 years ago
Finally! First view and first comment 😍
@danielwit5708 4 years ago
Live success! Yay!
@haztec. 4 years ago
Neu - ral - net. It's in the brain.
@michaeljburt 2 years ago
Absolutely brilliant explanation as to why a non-linear activation function can lead to good mapping of desired non-linear outputs. This is actually an extremely pertinent topic in my field of study (electrical engineering, power systems, which for three-phase AC circuits have non-linear power flow solutions). Seeing "how" these ReLU neurons can model non-linear functions is absolutely mind-blowing. Bravo!
@allanmcelroy 3 years ago
Great series of videos, with clear, thorough explanations. Can't wait for future parts (so much so that I've ordered the e-book :) )
@davidgomez79 4 years ago
At 25:10: when I code in Python and I'm under 80 characters for my line of code, I rename my variables extra long just to end up at 82 characters, to trigger the pep8 lovers. I hate pep8.
@kris10an64 4 years ago
Why do you hate pep8?
@davidgomez79 4 years ago
@@kris10an64 It has unreasonable rules that shouldn't always apply, and people are too strict with them. Raymond Hettinger himself agrees. It's supposed to make code more readable, but it's very flawed, especially since Python is already based on indentation. The 80-characters-per-line rule is the worst one. If you have nested if blocks, or if you like to work with lambda functions and iterators, lines can easily become long, and wrapping them can make the code blocky and hard to read, which is the very thing it was meant to avoid. In many cases following pep8 isn't the best option. Are you one of those pep8 absolutists? It also feels very restricting.
@davidgomez79 4 years ago
@@kris10an64 Search for a video on youtube titled "Beyond Pep8", where Raymond Hettinger talks about his dislikes of pep8 too, and how some of its aspects are silly at best.
@davidgomez79 4 years ago
@@kris10an64 Here is how I like to code. Maybe it comes from me preferring C++, but these two functions can take two strings with hex values and XOR (or AND) them together:

def stringXOR(a, b): return ('0' * len(a if a > b else b) + '%02X' % (int(a, 16) ^ int(b, 16)))[-len(a if a > b else b):]

def stringAND(a, b): return ('0' * len(a if a > b else b) + '%02X' % (int(a, 16) & int(b, 16)))[-len(a if a > b else b):]

Make that pep8 friendly and it looks like hell.
@davidgomez79 4 years ago
@@kris10an64 Here's another example. This is how I like to reverse a hex string:

def byteFlop(hexstr): return ''.join(reversed([hexstr[y:y+2] for y in range(0, len(hexstr), 2)]))

Show me a pep8 version that is better.
@AlfrihPetruFeras 3 years ago
I'm desperately waiting for the next episode... please don't make us wait longer! You're doing a great job, and your effort is really appreciated.
@BB-sd6sm 4 years ago
This is the best video explanation I have ever seen on activation functions. Bravo
@mdmotiurrahmansagar1170 4 years ago
Best tutorial visually, verbally, programmatically and conceptually. Thanks for enlightening us, sentdex.
@divyanshusahu6413 4 years ago
This is the most valuable channel to me, and hopefully to many others on youtube. PLEASE UPLOAD THE NEXT VIDEO SOON
@alrineusaldore6764 1 year ago
Ever since I heard of ReLU, I questioned why it is better than sigmoid and the others, even though it looks like 2 linear functions put together. Now I finally understand how it works and why it's so efficient! I also understand linearity and non-linearity much better than before, and my thirst for knowing why and how it all happens is satiated. Thank you for these amazing videos!
@dolomikal 4 years ago
These keep getting better and better. I'll say again, this is exactly what I was looking for to get into AI as a (currently) non-AI dev. You go deep enough to understand what's going on behind the scenes, but stay at a high enough level that it doesn't feel like an advanced math course. Truly an art. Great work!
@acidtears 4 years ago
That's because he hasn't talked about backpropagation yet lol. You should watch 1-2 videos of 3blue1brown just to understand how derivatives work, as that will increase your understanding immensely. Also, the math isn't that complicated, as you usually just need to understand it once, and then you can apply it globally to other network architectures as well, as they tend to operate on the same underlying principles.
@alexaddison1783 4 years ago
Can't wait till this channel exceeds 1,000,000 subs; it's well deserved!!
@witek_smitek 4 years ago
FINALLY someone explained to me, in an easy, visual way, what impact the layers have. I was always wondering why the heck we need 2 layers, or why 8 neurons in each, and why not 100? Thanks a lot! You are doing great work. I can't wait for the next part!!
@abdullrahmanmohamed2155 3 years ago
I cannot thank you enough for that magnificent effort. I'm awaiting the following parts.
@yogiisdaman 1 year ago
you have a remarkable skill for explaining things concisely yet understandably, thank you for your videos!
@jordanbarnes1870 4 years ago
Yessssss been waiting for nearly 2 weeks I can't wait
@winnumber101 3 years ago
Got so much respect for this man's illustrative prowess
@tanishqvyas8387 4 years ago
I've learnt more Python from your channel than from any other resource out there
@HomeBologn 3 years ago
This series is the best one you've ever done, hands down. Easiest to follow, helpfully illuminated by the manim animation (manimations?). 11/10
@Ideophagous 3 years ago
Great series though! I'm loving it! Looking forward to the next episodes.
@lewisdrakeley9631 2 months ago
Fantastic video! I never understood the need for activation functions; now I get it completely. Incredible work, thank you!
@user-pl2cx2vm4l 10 months ago
Man, this is the best neural network tutorial that I've ever seen. Thank you and keep going!
@shobhitbishop 4 years ago
Hi Harrison, just wanted to thank you for this awesome series on NN; it really helps me a lot in understanding things clearly from scratch. You have built confidence in me that yes, I can also learn this complex topic! Thank you 😊
@sentdex 4 years ago
You're very welcome!
@ianik 1 year ago
Quality content for free. Second time going through the series. You are a good man
@Micha-ky6um 3 years ago
Dude, you are such a good teacher; thank you for these vids! Waiting in anticipation for part 6
@Gingnose 3 months ago
There are very few resources online for learning, in an intuitive way, what neurons and synapses in silico (nodes and activation functions) do. This video is already 3 years old but still holds the crown, sir.
@AlizerLeHaxor 2 years ago
This is amazing. I'm coding along in C#, and this is the first time I actually understand how neural networks work.
@elishashmalo3731 4 years ago
This is so much fun!!! Can't wait for part 6
@jaewooko4527 3 years ago
Thanks for revisiting this topic. It helped me restart my study of TensorFlow.
@tanishqvyas8387 4 years ago
Really looking forward to the next video. Please keep making videos for this series, covering each and every topic in the field of neural networks. I wish there were a certification course from sentdex which we could take, learn from, write an exam for, and get certified.
@AnilDhulappanavar 1 year ago
Hats off, Sentdex!!! This has been very helpful for me in learning a little bit of neural networks. Really appreciate all the effort that has gone into this.
@GroterRonald 3 years ago
Just bought your book! Thank you for your clear explanations👌
@sidharthgiri1610 4 years ago
okay, period. the animations and explanation are MINDBLOWING
@sidchakravarty 2 months ago
This is one of the BEST explanations of why ReLU works. I took 24 screenshots of this video alone because of the amount of detail it has. Eagerly waiting for the book to arrive today!!!
@1OJosh 3 years ago
This is amazing, I love your channel. I watch this every day. I'm going to show my dad this whole playlist. He's going to love it, and then we're going to have some fun trying to do something similar but simpler, I guess xP
@arjunp3574 2 years ago
Thank you so much for really making me understand the working of the activation function. After seeing this video, my motivation to learn neural networks skyrocketed. The effort you put into this video is overwhelming, and I really appreciate you from the bottom of my heart. Once again, thank you ❤️❤️❤️❤️
@turanyasar3375 3 years ago
Awesome work! You're like a prophet of science. I hope this series goes on to the end.