At Money20/20, AI Wereld sits down with Douwe Kiela, one of the Dutch pioneers who conceived and shaped the idea of Retrieval-Augmented Generation.
After introducing the technique as a researcher at FAIR, he led groundbreaking NLP initiatives at Hugging Face. Now, as CEO of
Contextual AI, he’s bringing RAG to life inside demanding enterprise environments—think customers like Qualcomm.
Where did your interest in AI begin?
“Back in high school. I was good with computers and taught myself to program, hack—all of it. Ironically, that’s why I didn’t want to study computer science. I started with philosophy, but soon realized it wasn’t the easiest path to a job. Through Liberal Arts and Sciences I landed in Cognitive Artificial Intelligence, and eventually studied logic—the foundations of mathematics. Then I did a second master’s at Cambridge, where I got into natural language processing (NLP), AI for language.”
Why philosophy?
“Because it tackles questions without clear answers. And because we still know so little about how the brain works. One way to understand it better is to explore how you can build brains in computers.”
Was moving from a Dutch university to Cambridge a culture shock?
“Not really. After high school I spent a gap year in Asia and gained research experience at Stanford and NYU. Cambridge felt like the next logical step.”
How did you end up at Facebook Research?
“During my PhD at Cambridge I interned with Léon Bottou, a well-known AI researcher. That went well. When Facebook AI Research (FAIR) launched, they asked me to join. I spent five years there—started as a postdoc and ended as research lead for NLP.”
What made you a good fit for that work?
“I’m good at articulating ideas with precision. It’s a form of communication that’s a lot like mathematics.”
What was it like working at Facebook?
“FAIR was fairly independent from the rest of Facebook. We didn’t work with production data—we did fundamental research. I published on language, multimodality, and contributed to projects like PyTorch and RAG (Retrieval-Augmented Generation).”
What exactly is RAG—and why does it matter?
“RAG pairs a language model with search. Instead of having a model generate everything from its internal knowledge, you feed it relevant context from an external source. That makes answers more accurate and up-to-date. It’s like having the model find evidence before it responds.”
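The retrieve-then-answer loop Kiela describes can be sketched in a few lines. This is an illustrative toy, not Contextual AI's implementation: the document list, the word-overlap scoring (a stand-in for a real retriever), and the prompt format are all assumptions made for the example.

```python
def retrieve(query, documents, k=2):
    """Rank documents by naive word overlap with the query
    (a stand-in for a real dense or sparse retriever) and
    return the top k as context."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query, documents):
    """Assemble the augmented prompt: retrieved evidence first,
    then the user's question, so the model answers from the
    supplied context rather than from its internal knowledge."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return f"Use only this context to answer:\n{context}\n\nQuestion: {query}"


# A tiny illustrative corpus; in practice this would be a company's
# own document store with millions of entries.
docs = [
    "Qualcomm designs system-on-chip products for mobile devices.",
    "RAG feeds retrieved passages to a language model at inference time.",
    "Amsterdam is the capital of the Netherlands.",
]

prompt = build_prompt("What does RAG feed to a language model?", docs)
print(prompt)
```

The augmented prompt, not the bare question, is what gets sent to the language model, which is what keeps answers grounded in current, verifiable sources.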
You later joined Hugging Face. What changed when ChatGPT launched?
“ChatGPT wasn’t a technological revolution, but it was the first time language models truly worked as a product. Suddenly everyone knew about it. That’s when I realized: the market is ready for a company that gets RAG right. That became Contextual AI, which I co-founded.”
Was fundraising difficult?
“No. I was in Silicon Valley, knew many VCs, was connected to Stanford, and had a solid track record. As soon as people heard I was starting a company, my LinkedIn inbox blew up. But you choose an investor like you choose a colleague: someone you trust and click with.”
What exactly does Contextual AI do?
“We provide a platform for building agents that can search millions of documents using RAG. They answer complex questions using a company’s own data. That’s far more powerful than what you get from a standard language model like ChatGPT.”
Who are your customers?
“Qualcomm, for example. We help thousands of engineers find answers inside extremely complex technical documentation. That’s where our platform really stands out.”
Where is enterprise AI adoption right now?
“In the US, companies see this as a fundamental shift. In Europe, people are still too cautious. There’s not enough ambition. AI will change everything—from work to social structures. But if we do it right, people will actually have more room to be human.”
What should the Dutch or European governments do?
“Invest seriously in innovation. Don’t spread the money thin—focus it on a single top institute. Pay US-level salaries if you want to keep top talent. Otherwise, we’ll fall behind.”
Finally: which skills matter most in the AI era?
“Clear, precise communication. Prompting is essentially stating exactly what you want an AI to do. Philosophy helps with that. AI will handle the boring tasks—we need to keep thinking, talking, and understanding.”