Hey friends,
A quick update on the $200 version of ChatGPT.
I've actually found myself using Grok (the free version) a lot more for two simple reasons: it's faster and the results are damn good.
I still plan on keeping the subscription (mostly to report any personal breakthroughs), but in the spirit of transparency I want you all to try Grok too.
NotebookLM is blowing my mind
If I'm honest, I've committed most of my mind space to ChatGPT and Claude - not paying much attention to Google's AI products.
But my friend Tiago Forte released this simple and detailed tutorial, which motivated me to give NotebookLM a shot.
Fundamentally, NotebookLM is built differently from the traditional "chat-based" LLMs we've grown used to. Here's how:
It's "Document-centric" There's much more emphasis on you uploading your own source material.
It's a "closed system" Since your documents anchor the chat, it's much more accurate and it cites your sources. Which means you don't have to fear hallucinations as much.
It's more than just chat The output can be audio, study guides, FAQs and briefing documents.
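For the technically curious: that grounding works roughly like what engineers call retrieval-augmented generation. The app pulls the passages from your uploads that are most relevant to your question, then has the model answer from only those passages, with citations. Here's a minimal sketch of the idea in Python (my own toy illustration under simplifying assumptions, not Google's actual implementation):

```python
import re

# Toy sketch of document-grounded Q&A (retrieval-augmented generation).
# My illustration of the general technique -- NOT NotebookLM's actual code.

def words(text):
    """Lowercase a string and split it into a set of word tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question, sources, k=2):
    """Rank source passages by naive word overlap with the question."""
    q = words(question)
    return sorted(sources, key=lambda s: len(q & words(s)), reverse=True)[:k]

sources = [
    "DeepSeek-R1 was trained with large-scale reinforcement learning.",
    "Study guides summarize key themes and define important terms.",
    "Reinforcement learning rewards a model for correct final answers.",
]

context = retrieve("How was DeepSeek trained?", sources)
# A real system would now prompt the LLM with ONLY these passages and ask it
# to cite which one supports each claim -- that's why hallucinations drop.
print(f"Answer using only these sources: {context}")
```

Because the model is told to answer from your documents instead of its memory, every claim can point back to a passage you can check.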
NotebookLM use cases
So when should you use NotebookLM? Remember, it works particularly well when you have a lot of source material and can't tolerate hallucinations.
It's even more powerful if your source material is proprietary, such as conversation transcripts, internal analyses, unique data sets, and custom research.
Some professional use cases include:
Investment Research: Use quarterly reports, competitor analyses, and forecasts to identify trends and actionable insights
Strategic Planning: Analyze case studies, industry reports, and regulatory documents to create strategic plans
Client Management: Maintain dedicated notebooks for client projects with meeting notes, deliverables, and communication records
Learning New Skills: Organize tutorials, guides, and resources into structured learning paths
For my use case, I followed Tyler Cowen's advice on trying to understand what makes AI "exponential" - specifically the technical process called Reinforcement Learning.
(BTW, if you are non-technical and want to geek out on what's actually happening under the hood, you MUST WATCH this 3.5-hour video by Andrej Karpathy. It should be required viewing for EVERYONE, yet sadly, few will watch it!)
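If "Reinforcement Learning" sounds abstract, here's the core loop as a toy Python sketch (my own illustration, nothing like DeepSeek's actual training code): a model tries answers, a reward signal scores them, and the model shifts toward whatever got rewarded.

```python
import random

# Toy reinforcement learning loop: try, score, adjust, repeat.
actions = ["A", "B", "C"]            # hypothetical answers a model could give
weights = {a: 1.0 for a in actions}  # the policy's (tiny) learnable parameters

def reward(action):
    # Stand-in reward signal; for reasoning models this might be a checker
    # that verifies the final answer (here, "B" is the "correct" one).
    return 1.0 if action == "B" else 0.0

for step in range(1000):
    # Sample an action in proportion to the current weights (the "policy")
    action = random.choices(actions, weights=[weights[a] for a in actions])[0]
    # Reinforce: bump the weight of rewarded actions (learning rate 0.1)
    weights[action] += 0.1 * reward(action)

print(max(weights, key=weights.get))  # "B" ends up dominating
```

The real thing runs this try-score-adjust loop over billions of parameters and hard reasoning problems, but the shape of the idea is the same.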
Responding quickly to new information
Remember when DeepSeek showed up one day and the entire stock market tanked?
It was because they had made a huge breakthrough in Reinforcement Learning - one that could've paved the way for much less computationally intensive AI.
And all of a sudden, for investors, product managers, consultants, and entrepreneurs, it became a competitive advantage to understand what the heck was going on with DeepSeek.
And to do it quickly.
Here's how you could get up to speed quickly using NotebookLM.
Step 1. Find and upload your inputs
This is where things get crazy.
You can upload up to 50 different information sources. This includes:
PDFs
Google Docs
Presentations
Website links
YouTube links
Audio files
Custom text
Each source can be up to 500,000 words. Multiplied by 50!
For a total of 25 million words!
For context, ChatGPT ($200 version) can take ~150,000 words, which must be pasted directly into the chat window as one giant block of text.
So in my case, I uploaded:
4 academic papers (including the paper from DeepSeek themselves)
4 blog post tutorials on Reinforcement Learning (an example)
5 YouTube videos (an example)
Step 2. Create your personalized podcast
The coolest feature, IMO, is the ability to create a customized podcast.
NotebookLM takes all the sources and effectively summarizes them for you in a conversation between two AI voices.
It sounds really good and is extremely powerful for people who don't like to learn via books.
First, I gave it some constraints - since all those papers are written for PhD-level AI experts, I needed to tailor the output to the level of someone who is "smart, but non-technical."
You can listen to the podcast it generated for me below.
(You can also "interrupt" the podcast and ask the hosts to go on tangents or dig deeper into a topic.)
Step 3. Customize your learning materials
Once your sources are in NotebookLM, there are a few ways to engage with your information.
First, you can chat and ask any questions.
How should investors in GPUs consider the impact of DeepSeek?
The nice thing is that the response (which I can't copy-paste) is heavily annotated. This allows you to quickly click through to the source materials and verify the validity of any claim.
In conclusion, investors in GPU companies should view DeepSeek's emergence as a signal of a potentially evolving AI landscape. While the demand for GPUs will likely remain strong, the emphasis on efficiency, distillation, open-source initiatives, and algorithmic innovation suggests that growth in GPU demand might become more nuanced and less solely dependent on the insatiable need for the most powerful, cutting-edge hardware for every AI application. Investors should pay attention to GPU companies that are not only pushing the boundaries of computational power but also innovating in ways that support efficient AI development and deployment across a wider range of hardware.
Next, you can create:
Study Guides with short-answer quizzes, essay questions, an answer key, and a glossary of important terms.
Briefing Docs that summarize key themes, concepts, examples, and quotes, and conclude with the main takeaways.
FAQs with concise answers that simplify complex topics.
Timelines that organize information sequentially or logically.
I could see this being very powerful for studying and immersing yourself in a new topic.
One demographic who loves NotebookLM? College students.
They can upload all of their textbooks, handouts, PDFs, lecture notes, and supplementary reading, then interact with all of their learning materials in one place.
How does NotebookLM fit into your "AI Portfolio"?
It's too early to tell. Maybe the learn-something-via-a-podcast format is a novelty that will wear off quickly.
And since everything is heavily cited, at times it can feel like a rigid interaction (versus the yolo style of ChatGPT or Claude).
But it does lead to an interesting thought experiment:
Everyone has access to ChatGPT.
With time, everyone will get better at prompting ChatGPT.
So that competitive advantage will fade over time.
What will remain unique?
Access to unique data - whether it's conversations, analyses, unique insights, or proprietary data sources. And the ability to interact with this unique data - irrespective of the LLM - will ultimately serve as the differentiator.
See you next week!
Much love,
Khe
PS Here's a video of my entire process of loading and querying NotebookLM.
PPS I’m doing a personal 30 days of Deep Research challenge. Follow along on X/Twitter.