Attention Is All You Need

  609,826 views

Yannic Kilcher

1 day ago

arxiv.org/abs/1706.03762
Abstract:
The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles, by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.0 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature. We show that the Transformer generalizes well to other tasks by applying it successfully to English constituency parsing both with large and limited training data.
Authors:
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin

COMMENTS: 288
@finlayl2505 · 3 years ago
Friendship ended with LSTM; Transformer is now my best friend.
@EvgenSuit · 1 year ago
As far as I know, there are only a small number of Transformers for audio problems.
@electric_mind · 1 year ago
LSTMs generally perform better when it comes to short sequences, and remember, the LSTM is the revolution that led to the birth of the Transformer. I love both of them!
@klam77 · 1 year ago
LSTM sequentialization is kludged inside Transformers. Pay attention.
@jamesbedwell4715 · 1 year ago
Same, but with GRU.
@st0a · 8 months ago
Friendship ended with Transformers; Retentive Networks are now my best friend.
@tanmayjain6791 · 2 months ago
Nobody knew this paper would change the world.
@RobotProctor · 3 years ago
I've watched this maybe 5 times over 1 year, each time getting more and more from it. I think I finally intuitively understand how this works. Thanks for your work and your time!
@niedas3426 · 1 year ago
This has been my experience with ML in general: I have to re-read papers and books over and over again, and each time I understand more. It's hard, but it pays off to finally get a grasp of such an almost mystical concept.
@GuinessOriginal · 1 year ago
It's a little bit more complicated than just predicting the next word based on the last, which is the take a lot of people have on it.
@electric_sand · 1 year ago
@@niedas3426 How's it going... Honestly, this is how I feel sometimes, having to go through multiple videos and blog posts just to grasp concepts.
@niedas3426 · 1 year ago
@@electric_sand Honestly, still making steady progress. I am now at a place where I am much, much further along. I've mainly been preoccupied with datasets (e.g. reducing file storage size, faster reading and calculations, PyTorch IterDataPipes) and realised it would help me to go back more to the fundamentals (linear algebra, calculus, probability, pandas, numpy, data structures, built-in methods, etc.). It's been fun, overall. :)
@electric_sand · 1 year ago
@@niedas3426 Thanks for your response. Same here, I decided to go back to the fundamentals as well... I simply got tired of struggling through papers. Wish you the best, mate.
@TimKaseyMythHealer · 1 year ago
Finally, someone is drawing vectors to describe what is meant by encoding with vectors, and how the vectors relate to one another. So many talk about this, but barely understand the details.
@herp_derpingson · 4 years ago
I was searching for a channel like "Two Minute Papers", but not two minutes in length and one that goes in depth. I think I found it! Subbed!
@kema8628 · 5 years ago
The explanation of querying a key-value pair is really nice.
@gorgolyt · 3 years ago
I recommend looking at the paper, because they use exactly this analogy. I found their description very helpful: "An attention function can be described as mapping a query and a set of key-value pairs to an output, where the query, keys, values, and output are all vectors. The output is computed as a weighted sum of the values, where the weight assigned to each value is computed by a compatibility function of the query with the corresponding key."
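The quoted definition maps almost line-for-line onto code. Below is a minimal NumPy sketch of scaled dot-product attention in the sense the paper describes; the shapes and random inputs are purely illustrative, not anything used in the video.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Output = weighted sum of values; weights come from query-key dot products."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # compatibility of each query with each key
    # softmax over the keys, shifted by the row max for numerical stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy example: 2 queries attending over 3 key-value pairs, dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

In the full Transformer this is applied per head after learned linear projections of Q, K, and V; the sketch above is just the core lookup mechanism.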
@languagemodeler · 9 months ago
It's amazing to have this explanation of the paper that is responsible for all of the AI interest and innovation happening now, described as merely 'interesting' shortly after it came out. I love it.
@jugsma6676 · 6 years ago
By far the best explanation of the paper "Attention Is All You Need". Well explained. Thanks, Yannic Kilcher!
@jugsma6676 · 3 years ago
@bunch of nerds BERT is also a Transformer, but bidirectional (forward and backward) over the input sentence. And GPT is a generative (autoregressive) version of the Transformer. Both are language models able to predict and understand input sentences.
@jugsma6676 · 3 years ago
@bunch of nerds It's much better to use built-in, stable tools than to do it from scratch.
@clray123 · 3 years ago
It's still like listening to a bad student struggling to explain something they don't themselves understand. And shame on the Google researchers for doing such a poor job explaining themselves, but I guess that's typical of the majority of research papers out there: these people just don't care to teach their ideas to others (except maybe a very narrow circle with whom they have already communicated via other channels).
@rednas195 · 1 year ago
@@clray123 What parts do you think are explained poorly? To me it feels like Yannic understands the paper quite well, but I'm interested in what you think he might not understand all too well.
@prashanthkurella4500 · 1 year ago
Who knew this paper would change how we look at sequences forever?
@shandou5276 · 5 years ago
Very well done! I agree with the other comments that this is the clearest explanation I have seen so far. Thanks for the great work!
@dariodemattiesreyes3788 · 4 years ago
Really good explanation. You know how to provide the essence without getting lost in the details. Details might be important later, but the most important thing at first is the very nature of the strategy, and you provided it crystal clear. Thanks!!!
@vijeta268 · 4 years ago
You have done an excellent job of explaining the attention method in simple words. Thanks so much!
@mdnayemuddin5595 · 2 years ago
I just got a clear understanding of how the positional encoder works here. Kudos to you. Great explanation!
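The sinusoidal positional encoding the commenter refers to follows the paper's formula, PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). A minimal NumPy sketch (assuming an even d_model; the sequence length and dimension below are illustrative):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encodings: sin on even dims, cos on odd dims."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]       # (1, d_model/2)
    angles = pos / np.power(10000, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)               # even dimensions
    pe[:, 1::2] = np.cos(angles)               # odd dimensions
    return pe

pe = positional_encoding(10, 16)
print(pe.shape)  # (10, 16): one encoding vector per position
```

These vectors are simply added to the token embeddings, giving the otherwise order-blind attention layers a notion of position.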
@chandlerclement1365 · 5 years ago
Excellent video, thank you so much for illustrating these concepts so clearly.
@akhilvenkataraju7791 · 3 years ago
Thank you so much, Yannic Kilcher. The paper seemed complex, but you "encoded", performed "multi-head attention" and "decoded" it in such a simple way (: An amazing job! Undoubtedly the best explanation.
@deaths1l3nce · 3 years ago
Thank you very much! This has helped me a lot. All I could find on this specific paper was confusing and hard to understand; I think it was explained extremely well in your video! Please make more of these, I think you might help lots of people :D
@Julian-tf8nj · 5 years ago
VERY helpful, thanks! I'd love to see a "part 2" ...
@jsphyan · 8 months ago
This is beautiful, I really appreciate your work! Thank you
@YtongT · 4 years ago
An amazing explanation, truly amazing. I can't say how much I appreciate you putting the dot product and softmax into intuitive and easy-to-understand words. Very grateful.
@tassoskat8623 · 3 years ago
Great video and very unique amongst most machine learning videos on YouTube. Thank you!
@deleteme924 · 6 years ago
You have the best videos about machine learning I've seen, comparable perhaps only to 3Blue1Brown, but his videos aren't about topics this advanced. It would be really nice if you could make more!
@snippletrap · 3 years ago
Arxiv Insights and Henry AI are pretty good too.
@ambujmittal6824 · 3 years ago
@@snippletrap Arxiv Insights sadly stopped posting a long time back, and I personally find Henry AI's discussions to be superficial. Try Chris McCormick and the reading groups by Rachael, though. :)
@peterhojnos6705 · 3 years ago
Who do you mean by "Rachael reading groups"?
@48956l · 2 years ago
Definitely both great channels, but the comparison doesn't do justice to just how good 3Blue1Brown's animations are. Here Yannic writes on a tablet, lol. Not really comparable.
@renehaas7866 · 3 years ago
I really appreciate that you are making these videos.
@BrettHannigan · 1 year ago
Excellent explanation of Transformers. Clear, easy to follow, and great information. Thanks!
@alexandrostsagkaropoulos · 9 months ago
Just an exceptional explanation. You clear things up so much!
@aidangomez6004 · 3 years ago
This is a great summary, thanks for making this!!
@owenmarkley446 · 5 months ago
This is by far the best explanation I've seen of this paper. I'm writing a review of this paper for a class and wouldn't have been able to do it without your video! Immensely grateful!
@fahds2583 · 3 years ago
You have such a cool state of mind... it really adds to making your teaching style more interesting.
@vikaskumarjha9 · 4 years ago
Thank you so much. You explain it so well in very simple terms.
@sophiaxia3240 · 2 years ago
By far the most intuitive explanation. Thanks!
@olegshpynov · 4 years ago
Great explanation of the Transformer model. Thanks a lot!
@michaelmuller136 · 4 years ago
Thank you, that was very informative and explained well!
@prasitamukherjee5864 · 2 years ago
Thank you for the super neat explanation. It cleared up a lot of things.
@VinBhaskara_ · 6 years ago
Great explanation. Please keep posting such summaries of great papers. Thanks!
@Don-gk9ss · 5 years ago
The best Transformer video I have watched. Well explained.
@magnuswiklander8204 · 1 year ago
Fun to see this today after all the recent successful Transformer results! (June 2022) Thanks Yannic, keep it up!!
@julinamaharjan6987 · 4 years ago
Very intuitive explanation. Thank you!
@lsqshr · 6 years ago
Really awesome job! I was puzzling over what the key-value pairs are. Thanks a lot!
@anisakhlyan8581 · 4 years ago
Thank you! This is a very good explanation, which I actually used in presenting this paper. Cheers, man!
@user-wt7ut4xj5r · 5 years ago
Thank you so much. Your videos are so helpful.
@halehdamirchi146 · 3 years ago
This was really helpful, thank you!
@suyashshrivastava8317 · 3 years ago
Thank you so much for this. Excellent explanation.
@sahanagk4011 · 3 years ago
This explanation is amazing!! Thank you for this.
@ankitbhardwaj1956 · 4 years ago
Thanks a lot for this explanation video!!
@tyfoodsforthought · 3 years ago
This was wonderful. Thank you!!!
@fisherh9111 · 11 months ago
This is excellent. Thank you so much for sharing!
@hamedgholami261 · 2 years ago
I really understood the subject, thanks for your clear explanation.
@Aiducateur · 5 years ago
Thank you soooo much!! Made the whole thing easier to grasp!
@jabusch24 · 5 years ago
Very good intro. Many videos don't focus on visual explanation, which you definitely covered. I'd be thrilled to see a video that goes more into depth, also how exactly the decoding is done once it's trained, and how embeddings could be obtained for other tasks. But other than that, very, very well done!
@MsShiloni · 6 years ago
Great video! Really explanatory in a short time.
@teddy5474 · 2 years ago
Best explanation I've seen on this topic!
@LightFykki · 5 years ago
Great explanation, thanks!
@qidichen1756 · 3 years ago
One of the best explanations!!!
@marjansherafati6913 · 3 years ago
Thank you very much, amazing explanation! 🙏🏼🙏🏼
@sarahpanda1167 · 4 years ago
Very helpful!! Thank you!
@ziku8910 · 1 year ago
Very intuitive explanation here, thank you!
@garrettosborne4364 · 3 years ago
Thanks Yannic, great videos.
@sebchap24 · 10 months ago
Quite an amazing explanation! Thanks a lot.
@pouriababvey1214 · 4 years ago
Very good explanation, thank you!
@Luxcium · 2 months ago
This sounds like someone reading a paper without realizing it was to become the third-biggest thing to happen to humanity, after a pandemic and, from my own perspective, an invasive war in Europe: the spark of AI that would come with ChatGPT and the expansion of generative imaging like Stable Diffusion and Midjourney 😅🎉🎉🎉🎉 I would love to know how many subscribers you gained from back then to just before ChatGPT, and from ChatGPT up to nowadays 😅😅😅😅 You are such an amazing communicator ❤
@kevind.shabahang · 3 years ago
Thank you. Very clear.
@aminzaiwardak6750 · 4 years ago
Thanks a lot, you explained it very well.
@dailygrowth7967 · 2 years ago
Thanks, I really enjoy your content!
@shaxar001 · 4 years ago
Thank you for the video! I really enjoyed it, especially the key, value and query part :) Your explanation makes it a lot more intuitive to understand.
@starlite5097 · 1 year ago
Thanks, nice video. You've come a long way since then, I'm sure, especially with the Open Assistant stuff.
@dalissonfigueiredo1180 · 2 years ago
Very well explained, thank you.
@nguyentrung1452 · 2 years ago
Great explanation. Thank you, master.
@TijsZwinkels · 11 months ago
Yeah, I'm late to the party, but I'd say this video is still very relevant. I've read the paper several times and watched multiple blog posts and videos, but the Q, K, V mechanism especially never really clicked until watching this. Using dot products between Q and K as a lookup mechanism. Ingenious! Thanks for this video!
@pr3st0n2 · 5 years ago
Great insight! Thanks :]
@dip7777 · 5 years ago
Very well explained. Thank you for making this video!
@RealStonedApe · 3 months ago
Love how 'All you need is attention' also applies to me in terms of understanding this video. Time to chug down some Adderall and take notes!! Also, probably not a good start when I have no idea what a vector even is... Take it in bit by bit. Robert Pirsig reading Thoreau style, ya dig?! Anyways, plz pray for me. Any God will do: Joseph Smith, Allah, Bill Murray, etc. Pray for my attention, pray for my soul, pray for Bill Murray. Love ❤🎉
@spinner4 · 1 year ago
Why couldn't there be a YouTube explanation like this from the authors of the paper? It would be very helpful for humanity right now. But this is quite helpful.
@NtcPedroPeterNews · 3 years ago
Amazing video! Thank you so much for the explanation!
@danecchio6621 · 2 years ago
Thank you so much for the explanation.
@maratkopytjuk3490 · 5 years ago
Enjoyed your video, wanted to give more thumbs up :D Great explanation: not too deep, but it gives a very nice overview of the architecture. Previously I was lacking the concept of "what is a key", but your explanation made it clear to me that the query looks for e.g. "verb with context to cats" or something :) Please go on with your work!
@bmackey · 5 years ago
I am very lucky to have found this video indeed!
@shrikanthsingh8243 · 4 years ago
Thank you so much, it was a very good explanation.
@arunantony3207 · 3 years ago
Great explanation!
@maxgriffiths6968 · 3 years ago
Great video - thanks!
@rommeltito123 · 3 years ago
Good that you were interrupted at 17:15. I had to strain my ears and go full volume to hear you. After that it was better.
@constanceyang4886 · 5 years ago
Very good explanation!
@xingyubian5654 · 1 year ago
Always wondered what keys, values, and queries are. Thank you for the clear explanation!
@goldfishjy95 · 2 years ago
Thank you so much!!!
@nchahine · 3 years ago
I always thought about doing a YouTube channel like this, but I guess I don't need to because you are so good at this. Thanks!
@alfred17686 · 5 years ago
This was such a good explanation. I've been trying to really understand these, but until now I hadn't found a good resource. Cheers, man!
@MuslimFriend2023 · 1 month ago
Man, I just found your channel. All the best, insh'Allah.
@arslanali900 · 2 months ago
You are amazing!
@carlkenner4581 · 1 year ago
This will never catch on. (Kidding)
@azizasaber8471 · 2 years ago
WELL explained!
@twobob · 1 year ago
Yeah, that was a really good explanation.
@astrobearmusic1977 · 3 years ago
I had to revisit this video several times, but I think Transformers finally clicked for me. Thank you!
@bernhardvoggenberger9850 · 3 years ago
As a student, I find your videos very helpful!
@tommykelly6840 · 2 years ago
You are literally amazing.
@archywillhe1379 · 4 years ago
Love your videos, Yannic, such a soothing voice. I miss Germany :D
@thecumbackkid9490 · 3 years ago
Just beautiful, man!!
@GunHolsters · 2 years ago
Thank you, now I understand.
@shaghayeghrabbanian609 · 2 years ago
Great explanation, thanks.
@simons6512 · 2 years ago
Super explanation, one of the best I've found so far. Cool to see someone from Switzerland engaging on this platform. Keep it up!
@PaulFidika · 5 months ago
I'm watching the history of AGI being built right here.
@Darthvanger · 2 years ago
21:30 - thanks for the great softmax explanation! I've had the "aha" moment :)
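The softmax step praised here is what turns raw dot-product scores into attention weights that sum to one. A minimal, numerically stable sketch (the example scores are made up for illustration):

```python
import numpy as np

def softmax(x):
    """Map arbitrary scores to positive weights that sum to 1."""
    e = np.exp(x - np.max(x))  # subtracting the max avoids overflow
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # e.g. query-key compatibilities
weights = softmax(scores)
print(weights)  # weights sum to 1; the largest score gets the largest weight
```

In attention, these weights then scale the value vectors, so a key that matches the query strongly dominates the output.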
@edwardhu7883 · 6 years ago
Great lecture.