Large Language Models and the Future of NLP
Recently we have seen the emergence of large pretrained language models such as GPT-3. Unlike with previous generations of models, “just” interacting with these models in natural language is a viable path to state-of-the-art performance on many useful tasks.
In this talk, I will discuss what these large, difficult-to-train, but easy-to-interact-with models might mean for the field of NLP, and the rise of “prompt engineering” as a new and useful skill set.
AI Researcher at Aleph Alpha GmbH
Connor is a founding member of EleutherAI, a decentralized collective working on open-source AI research. He works as a researcher both with EleutherAI and at the German startup Aleph Alpha.
His research focuses primarily on building, controlling, and understanding large language models and aligning them to human values.