Large Language Models and the Future of NLP

Recently we have seen the emergence of large pretrained language models such as GPT-3. Unlike with previous generations of models, "just" interacting with these models in natural language is a viable path to state-of-the-art performance on many useful tasks.

In this talk, I will discuss what these large models, which are difficult to train but easy to interact with, might mean for the field of NLP, and how "prompt engineering" is emerging as a new, useful skill set.
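The idea behind prompt engineering is to phrase a task as natural-language text and let a pretrained model complete it, rather than training a task-specific model. The following minimal sketch illustrates this with the Hugging Face transformers library and the publicly available EleutherAI/gpt-neo-125M checkpoint; the specific model and prompt are illustrative choices, not part of the talk.

    # A minimal sketch of prompting a pretrained causal language model.
    # Assumes the `transformers` library is installed; any sufficiently
    # large causal LM checkpoint could be substituted.
    from transformers import pipeline

    generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")

    # The task is expressed purely as text: no fine-tuning, just a prompt.
    prompt = (
        "Translate English to French.\n"
        "English: Where is the train station?\n"
        "French:"
    )

    result = generator(prompt, max_new_tokens=20, do_sample=False)
    print(result[0]["generated_text"])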

About the speaker

Connor Leahy

AI Researcher at Aleph Alpha GmbH

Connor is a founding member of EleutherAI, a decentralized research collective focused on open-source AI research. He works as a researcher both with EleutherAI and at the German startup Aleph Alpha.

His research focuses primarily on building, controlling, and understanding large language models and aligning them to human values.

NLP Summit

When

Sessions: October 5 – 7
Trainings: October 8 – 9

Contact

nlpsummit@johnsnowlabs.com

Presented by

John Snow Labs