How can an AI understand language? Computer-human communication is undergoing a revolution: AI can now listen to, understand, and speak back to us in far more powerful ways than ever before. On this episode, hear Scott Leishman discuss how AI can now write news articles, blog posts, poetry, and novels, and how recent work is making it easier than ever to build incredibly powerful AI applications that communicate with human beings.
-- TIMING --
00:48 Scott’s background in computer science at FICO, CoreLogic, Nervana Systems (which exited to Intel for $400M in 2016), and Intel
06:56 What is Natural Language Processing (NLP)?
11:40 What was the significance of GPT-3’s release this year?
16:31 What can GPT-3 do? (explain it to somebody who doesn’t follow the field).
19:15 NLP is having its “ImageNet moment” – what does that mean? (Technical explanation)
25:39 Simplifying NLP for less-technical listeners
28:17 Standing on the shoulders of giants: Pre-trained models are making it easier to build AI applications
30:05 What kinds of new use cases are possible with the current state of the art in NLP?
33:29 Apple Knowledge Navigator – are we there yet?
37:25 Where does NLP live in the AI stack?
41:34 What are you doing with NLP at XOKind?
49:47 What should people be doing to improve their chances of working in this space?
-- LINKS --