Activation Functions In Neural Networks Explained | Deep Learning Tutorial

  35,663 views

AssemblyAI

1 day ago

Get your Free Token for AssemblyAI Speech-To-Text API 👇
www.assemblyai.com/?...
In this video we are going to learn about Activation Functions in Neural Networks. We go over:
* The definition of activation functions
* Why they are used
* Different activation functions
* How to use them in code (TensorFlow and PyTorch)
Deep Learning In 5 Minutes video: • Deep learning in 5 min...
Different activation functions we go over:
Step Functions, Sigmoid, TanH, ReLU, Leaky ReLU, Softmax
Timestamps:
00:00 Introduction
00:35 Activation Functions Explained
01:48 Different activation functions
05:23 How to implement them
06:20 Get your Free AssemblyAI API link now!
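
The activation functions listed above can be sketched in plain NumPy (the video itself shows TensorFlow and PyTorch; this is an illustrative stand-in, not code from the video):

```python
import numpy as np

def step(x):
    # Binary step: 1 for non-negative input, else 0
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Squashes input into (0, 1); typical for binary classification outputs
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1); a common hidden-layer choice
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through unchanged, zeroes out negatives
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope alpha for negative inputs
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Converts raw scores into probabilities that sum to 1
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()
```

In TensorFlow and PyTorch the same functions are available off the shelf (e.g. `tf.keras.activations.relu`, `torch.relu`), so these definitions are only for seeing the math.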

COMMENTS: 24
@_Anna_Nass_ 2 months ago
OMG, you actually made this easy to understand. I can't believe it. The animations are so helpful. Thank you immensely!
@reireireireireireireireirei 2 years ago
Actuation functions.
@kaiserkonok 1 year ago
🤣
@draziraphale 1 year ago
These videos from Assembly AI are excellent. Distilled clarity
@alpeshdongre8196 5 months ago
🎯 Key Takeaways for quick navigation:
01:35 🧠 *Activation functions are crucial in neural networks because they introduce non-linearity, enabling the model to learn complex patterns. Without them, the network collapses into a stacked linear regression model.*
02:43 🔄 *The sigmoid function, commonly used in the last layer for binary classification, outputs probabilities between 0 and 1, squashing very negative or very positive inputs.*
03:25 ⚖️ *Hyperbolic tangent, ranging from -1 to +1, is often chosen for hidden layers. ReLU (Rectified Linear Unit) is simple but effective, outputting the input for positive values and 0 for negatives, though it can suffer from the dying ReLU problem.*
04:32 🔍 *Leaky ReLU is a modification of ReLU that prevents neurons from becoming "dead" during training by allowing a small output for negative inputs. Useful in hidden layers to avoid the dying ReLU problem.*
05:13 🌐 *The softmax function is employed in the last layer for multi-class classification, converting raw inputs into probabilities; the class with the highest probability is then chosen.*
Made with HARPA AI
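
The first takeaway above (without activation functions, stacked layers collapse into a single linear model) can be checked numerically; this is an illustrative NumPy sketch, not code from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation in between: y = W2 @ (W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

x = rng.normal(size=3)
y_stacked = W2 @ (W1 @ x + b1) + b2

# The same map collapses into one linear layer: y = W @ x + b
W = W2 @ W1
b = W2 @ b1 + b2
y_single = W @ x + b

# Identical outputs: the extra layer added no expressive power.
# Inserting a non-linearity (e.g. ReLU) between the layers is what
# breaks this collapse and lets depth matter:
y_nonlinear = W2 @ np.maximum(0.0, W1 @ x + b1) + b2
```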
@igrok878 1 year ago
Thank you. Good pronunciation and good content.
@terrylee6904 1 year ago
Excellent Presentation.
@wagsman9999 10 months ago
Thank you. I am a little smarter now!
@bernardoolisan1010 1 year ago
Very good video!
@thepresistence5935 2 years ago
Explained clearly
@AssemblyAI 2 years ago
thank you!
@narendrapratapsinghparmar91 4 months ago
Thanks for this informative video
@anurajms 9 months ago
thank you
@muskduh 1 year ago
thanks
@joguns8257 1 year ago
Superb introduction. Other videos have just been vague and hazy in approach.
@AssemblyAI 1 year ago
Glad you liked it
@rashadloulou 7 months ago
We could apply an AI tool to this video to replace actuation with activation :D
@bezelyesevenordek 5 months ago
nice
@be_present_now 1 year ago
Good video! One thing I want to point out is that the presenter talks too fast; a slower pace would make the video great!
@valentinleguizamon9957 19 days ago
❤❤❤❤
@canygard 2 months ago
Why was the ReLU neuron so depressed? ...It kept getting negative feedback, and couldn't find any positive input in its life.
@sumanbhattacharjee7550 5 months ago
real life Sheldon Cooper
@brianp9054 1 year ago
It was said already, but worth the emphasis: 'actuation' function 🤣🤣🤣. Repeat after me, one two and three: A-C-T-I-V-A-T-I-O-N. Great, now keep doing it yourself until you stop saying actuation function...
@Huffman_Tree 1 year ago
Ok I'll give it a try: Activatizeron!