Go Summarize

Perplexity CEO Aravind Srinivas, Thursday Nights in AI

Outset Capital · 2023-07-18
13K views | 1 year ago
💫 Short Summary

Aravind Srinivas, the CEO of Perplexity AI, discusses the company's journey, its approach to building an answer engine, the value of working alongside OpenAI, and the factors that determine long-term defensibility in the AI landscape. He emphasizes the pragmatism of their approach and the need to focus on their own journey while navigating a rapidly changing AI industry.

✨ Highlights
📊 Transcript
This section introduces the second event in the Thursday Nights in AI series, co-hosted by Outset Capital and Generally Intelligent, featuring Aravind Srinivas, the co-founder and CEO of Perplexity AI.
00:00
Outset Capital is an early-stage fund that invests in pre-seed and seed companies.
Generally Intelligent is a research company focused on creating more capable and robust systems for digital environments.
Perplexity AI is the world's first generally available conversational answer engine.
Aravind Srinivas talks about his journey from being a PhD student at Berkeley to working in AI at various companies and eventually founding Perplexity AI.
04:00
He originally came to Berkeley for a PhD in AI and was interested in deep reinforcement learning (RL).
Inspired by the TV show Silicon Valley, he started working on generative models for lossless compression.
He saw the potential of AI after reading about the early days of Google at the library.
He became fascinated by the idea of people with PhDs starting companies.
He explored the idea of building infrastructure for AI.
The speaker discusses the concept of "doing what the customer wants" and the importance of being motivated by a problem one deeply cares about.
08:00
The focus was on building infrastructure for search, inspired by Larry Page and Sergey Brin.
The company decided to raise a small amount of cash, build the product without infrastructure, and then slowly invest in infrastructure over time.
The current plan is to work alongside OpenAI and build their own models in the future.
The speaker believes it is practically feasible to build their own models up to a certain level (GPT-3.5) with the funding they have.
The approach is considered pragmatic and not necessarily bold, given the fast-paced and competitive AI landscape.
The speaker discusses the defensibility and long-term success of a company in the AI space, emphasizing the importance of having a large user base and investing in proprietary models and indexes.
12:00
Long-term defensibility is achieved through having a large user base and a high-quality product that users love.
Investing in proprietary models and indexes is vital for the company's success.
Two ways of building a company: focusing on infrastructure first or rolling out a product and then investing in infrastructure.
The speaker's company chose to raise a small amount of cash, build the product without infrastructure, and slowly invest in infrastructure later.
The current plan is to work alongside OpenAI while building their own models to complement OpenAI's models.
Aravind Srinivas says that being a founder is challenging and fast-paced, and that he is not bold enough to raise a huge amount of money without a clear plan.
16:00
He believes in proving the world wrong by being successful at the intersection of what is right.
The ability to build a GPT-4-level model with $10 million in funding may be more advantageous than simply having $500 million.
Scarcity cannot be faked, and the one who has more at stake and is more focused on winning will eventually succeed.
He mentions that Inflection's funding round doesn't change their destiny; it's more for OpenAI to worry about.
The speaker discusses the advantage of being a startup with a smaller user volume than Google, and potential future cost-reduction strategies.
22:00
Smaller models trained explicitly for retrieval-augmented generation
Cheaper hardware and more efficient inference techniques expected in the future
Not needing to worry about Google's user volume for a long time
Building their own index will also be a big cost reduction
The speaker reiterates that not having Google's user volume keeps serving costs manageable for their startup, and outlines future cost-reduction strategies, including smaller models trained explicitly for retrieval-augmented generation and expected cheaper hardware.
28:00
The speaker also expects hardware to become cheaper over time and more tricks to emerge for making inference more efficient.
Building their own index will be a big cost reduction as well, since once content is indexed, the retrieval cost is not significant.
💫 FAQs about This YouTube Video

1. What is Perplexity AI and how does it compare to existing search engines?

Perplexity AI is a conversational answer engine that combines large language models with search indexes to provide more comprehensive and accurate answers. It aims to fulfill the human need for information and answers, and the speaker suggests that it outperforms traditional search engines like Google in terms of relevance and ad-free results.

2. Why did Aravind Srinivas decide to launch Perplexity AI?

Aravind Srinivas, the CEO of Perplexity AI, was inspired by the early days of Google and the idea of people with PhDs starting companies. His background in AI and a desire to create a more capable and robust system led him to launch Perplexity AI, which focuses on making more generally capable, robust, and safer agents.

3. How does Perplexity AI work under the hood?

Perplexity AI combines a traditional search index with the reasoning power and text-transformation capabilities of large language models. It reformulates user queries, retrieves relevant links and paragraphs, and provides concise, informative answers with citations.
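A minimal sketch of such a retrieval-augmented answering loop is shown below. The helper callables (`complete_fn`, `search_fn`) and the prompt wording are assumptions for illustration, not Perplexity's actual internals:

```python
from typing import Callable, List, Tuple

# Hypothetical sketch of a retrieval-augmented answer pipeline; it does not
# reflect Perplexity's real implementation. `complete_fn` is any LLM text
# completion call; `search_fn` returns (url, paragraph) pairs from an index.

def answer(
    user_query: str,
    complete_fn: Callable[[str], str],
    search_fn: Callable[[str], List[Tuple[str, str]]],
) -> str:
    # 1. Reformulate the raw question into a concise search query.
    search_query = complete_fn(
        f"Rewrite this question as a concise web search query:\n{user_query}"
    )

    # 2. Retrieve relevant links and paragraphs from the search index.
    results = search_fn(search_query)

    # 3. Generate a concise answer that cites each source by number.
    sources = "\n".join(
        f"[{i + 1}] {url}\n{paragraph}"
        for i, (url, paragraph) in enumerate(results)
    )
    prompt = (
        "Answer the question using only the sources below, citing them "
        "inline as [1], [2], ...\n\n"
        f"Sources:\n{sources}\n\nQuestion: {user_query}"
    )
    return complete_fn(prompt)
```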

4. How does the speaker suggest preventing or reducing hallucinations in the AI model?

The speaker suggests that the core principle of the product is to only say what can be cited, similar to the practices in academia and journalism. By pulling up content from credible sources and using it for generating answers, the AI model can reduce hallucinations. However, the speaker acknowledges that there are still some cases where the model's output may be ambiguous or incorrect, and further improvements are being pursued.
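As an illustration of the "only say what can be cited" principle (not the speaker's described implementation), one could post-filter an answer so that only sentences carrying a valid citation marker survive, assuming the numbered-citation format from the sketch above:

```python
import re

# Illustrative post-check, assuming inline numeric citations like [1], [2].
# This is one hypothetical way to enforce "only say what can be cited",
# not how Perplexity actually does it.

def keep_only_cited_sentences(answer: str, num_sources: int) -> str:
    valid = {str(i) for i in range(1, num_sources + 1)}
    kept = []
    # Split on sentence-ending punctuation followed by whitespace.
    for sentence in re.split(r"(?<=[.!?])\s+", answer.strip()):
        cited = set(re.findall(r"\[(\d+)\]", sentence))
        # Keep a sentence only if it cites at least one real source.
        if cited & valid:
            kept.append(sentence)
    return " ".join(kept)
```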

5. What factors will help bring costs down and make Perplexity AI more profitable in the future?

The key factors for bringing costs down and making Perplexity AI more profitable include smaller models trained explicitly for retrieval-augmented generation, cheaper hardware over time, and building their own index, which will be a big cost reduction as well.