504 flashcards from one series. Zero textbooks. Two AIs and one algorithm trained on 700 million reviews.
This is everything I wish someone had shown me when I started learning English, laid out in the exact order you should do it.
Every step builds on the last. But if you actually follow the system, not just read it and tell yourself "I'll try this later", you can go from barely understanding dialogue to watching full episodes without pausing in about one season.
Part I: Forget Everything You Think You Know About Language Learning
Most people think learning a language is about apps. Downloading Duolingo. Doing 15 minutes a day. Getting a streak.
Language learning is about input.
What actually does the work is context, repetition at the right intervals, and the emotional anchoring that happens when you hear a word in a real scene, spoken by a real person, in a situation that actually matters.
Duolingo teaches you *"the cat is on the table"* 47 times in a cartoon interface. You feel productive. You learn nothing you can actually use. You definitely cannot understand two people arguing in a courtroom.
The problem with every language app is the same: words in a vacuum. No context. No emotion. No stakes. Your brain tags them as irrelevant and deletes them.
That's biology. Not opinion.
Part II: The Forgetting Curve
Hermann Ebbinghaus proved this in 1885. You forget 40% of new information within days. 90% within a month.
That's the forgetting curve. Doesn't matter how smart you are. Doesn't matter how motivated.
The only thing that beats the forgetting curve is spaced repetition: reviewing information at precisely calculated intervals, right before you're about to forget it.
Each review strengthens the memory. The intervals get longer. After 5 to 6 reviews spread over weeks, the word is basically permanent.
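To make the interval math concrete, here's a toy schedule in Python. This is a simplified illustration of interval growth, not Anki's actual algorithm; the 2.5 growth factor is borrowed from classic SM-2-style schedulers.

```python
def schedule(reviews=6, first_interval=1.0, growth=2.5):
    """Toy spaced-repetition schedule: each successful review
    multiplies the next interval (in days) by a fixed growth
    factor, roughly how classic SM-2-style schedulers behave.
    A sketch, not Anki's real scheduler."""
    days, day, interval = [], 0.0, first_interval
    for _ in range(reviews):
        day += interval          # next review happens `interval` days later
        days.append(round(day, 1))
        interval *= growth       # memory got stronger: wait longer
    return days

print(schedule())  # [1.0, 3.5, 9.8, 25.4, 64.4, 162.1]
```

Six reviews and the card is sitting more than five months out, which is why a word starts to feel permanent after a few weeks of reviews.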
This is not optional. This is the foundation.
Part III: Why Series and Not Apps
A word you hear while Jimmy McGill is arguing his way out of a felony *hits different* than a word on a green owl's flashcard.
Neuroscience has a term for this: contextual encoding. When a word is attached to emotion, a visual scene, a human voice, your brain creates multiple memory pathways. Not one. Multiple. The word sticks because your brain decided it matters.
An episode of any decent series has 50 to 100 words worth learning. Phrasal verbs. Slang. Idioms that no translator handles correctly. Google Translate sees *"cut it out"* and genuinely thinks you want to physically cut something.
The gap between what a translator gives you and what a native speaker actually means is where AI becomes useful.
Part IV: The Pipeline
Three tools. Each does one thing.
NotebookLM extracts the words from the episode transcript. Claude formats them into Anki-ready flashcards. Anki drills them into your long-term memory using an algorithm trained on 700 million reviews.
That's the whole system.
Step 1: Get the Transcript
Google [series name] transcript season X episode Y. Full scripts exist for thousands of series across multiple free sites. You will find yours.
Copy the full text. If the page is too long to copy cleanly, save it as a PDF with any browser extension. One click.
Takes 30 seconds.
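If you saved the page as HTML instead of PDF, Python's standard library can strip the markup for you. A minimal sketch; real transcript sites differ, so you may need to target specific elements instead of grabbing all visible text.

```python
from html.parser import HTMLParser

class TranscriptExtractor(HTMLParser):
    """Collect visible text from a saved transcript page,
    skipping <script> and <style> blocks."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skipping = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skipping += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skipping:
            self._skipping -= 1

    def handle_data(self, data):
        if not self._skipping and data.strip():
            self.parts.append(data.strip())

def transcript_text(html: str) -> str:
    parser = TranscriptExtractor()
    parser.feed(html)
    return "\n".join(parser.parts)

page = ("<html><body><script>var x = 1;</script>"
        "<p>JIMMY: S'all good, man.</p></body></html>")
print(transcript_text(page))  # JIMMY: S'all good, man.
```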
Step 2: NotebookLM
Open NotebookLM. New notebook. Paste the transcript as a text source or upload the PDF.
Why NotebookLM and not GPT or anything else?
NotebookLM is a closed system. It only works with sources you give it. Doesn't pull from the internet. Doesn't hallucinate. Doesn't invent example sentences that sound plausible but aren't from your episode.
Every word it finds and every example sentence it gives you comes *directly* from your transcript. For this task that's exactly what you need. You want real lines from the series you're about to watch, not stuff the AI made up.
It's free. Handles full episode transcripts without truncating. GPT tends to summarize instead of extracting everything and gives you 20 items when you asked for 50.
Once the source is loaded, send a prompt along these lines: extract the 50 most difficult words and expressions (B1 and above), each with the exact line of dialogue it appears in and a translation.
Every example sentence it returns is a real line of dialogue from the episode. Not generated. Real.
If the output is too easy: *"Too basic. Only B2-C1. Remove everything below B1."* It recalibrates.
Bonus: NotebookLM can generate a podcast-style Audio Overview about your transcript. Two AI voices breaking down the content of the episode in English. Free listening practice on the material you're about to watch. Just press generate.
Step 3: Claude
NotebookLM extracts. But Anki needs a specific format: two columns, CSV.
Claude handles structured data formatting better than any model I've tested. Gemini loses rows. GPT muddles columns. Claude outputs clean CSV every time.
Open Claude. Paste the NotebookLM output and ask for a two-column CSV: the translation on the left, the original English line on the right. Claude returns clean CSV.
Copy. Paste into Google Sheets. Verify two clean columns. Download as .csv.
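That verification is worth automating once you're processing a whole season. A small stdlib sketch; `check_anki_csv` is a hypothetical helper, not part of Anki or Sheets, and assumes the two-column layout described above.

```python
import csv
import io

def check_anki_csv(text: str):
    """Sanity-check a two-column Anki CSV before import: every row
    needs a non-empty front and back. A hypothetical helper, not
    part of Anki; adjust if your deck uses extra columns."""
    good, problems = [], []
    for n, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        if len(row) != 2 or not row[0].strip() or not row[1].strip():
            problems.append((n, row))   # flag for manual fixing
        else:
            good.append(row)
    return good, problems

sample = '"front in your language","cut it out"\n"row with a missing back"\n'
good, problems = check_anki_csv(sample)
print(len(good), len(problems))  # 1 1
```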
You can try doing everything in one model. But when you ask one AI to extract, translate, format, and output structured data at the same time, quality drops. Skips words. Breaks columns. Hallucinates transcriptions.
Two models, two steps, clean output every time.
Step 4: Anki
Anki is free flashcard software with spaced repetition. Shows you a card right before you're about to forget it. Not earlier (wasted review). Not later (already forgotten). At the exact point where recalling it strengthens the memory the most.
Free on desktop and Android. $25 on iOS, one time.
Import: Open Anki → Create deck → name it after your series → File → Import → select CSV → Separator: comma → Column 1 = Front, Column 2 = Back → Import.
Done. Front side: sentence in your language. Back side: original English dialogue from the episode.
FSRS: the part that actually matters
In 2023, Anki replaced its scheduling algorithm. The old one, SM-2, hadn't changed since 1987. Think about that. An algorithm from the Reagan era was deciding when you should review your flashcards.
The replacement is FSRS (Free Spaced Repetition Scheduler). Trained on 700 million real reviews from 20,000 users. Uses machine learning to model *your personal* forgetting curve. Not a generic one. Yours.
20 to 30% fewer reviews at the same retention rate. Over a year that's hundreds of hours saved.
How to turn it on:
1. Deck options (gear icon)
2. Scroll to FSRS
3. Toggle on
4. Desired Retention: 0.90
Don't go higher. 0.90 → 0.95 doubles your daily workload. 0.97 quadruples it. The research confirms 0.90 is the sweet spot.
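Those workload numbers fall out of the forgetting-curve math. A sketch using the constants from the published open-source FSRS-4.5 curve (an assumption about the exact version; the shape is the same across recent releases):

```python
# FSRS models recall with a power-law forgetting curve. These constants
# are from the published FSRS-4.5 curve; the point here is the workload
# math, not exact Anki behavior.
DECAY = -0.5
FACTOR = 19 / 81  # chosen so retention is exactly 0.90 when t == stability

def interval_for(retention: float, stability: float = 1.0) -> float:
    """Days until predicted recall drops to `retention`,
    for a card with the given memory stability (in days)."""
    return (stability / FACTOR) * (retention ** (1 / DECAY) - 1)

base = interval_for(0.90)         # equals the stability itself, 1.0
print(interval_for(0.95) / base)  # ~0.46: intervals halve, reviews double
print(interval_for(0.97) / base)  # ~0.27: roughly 3-4x the reviews
```

Pushing desired retention up shrinks every interval, so the same deck generates far more daily reviews for a few percentage points of recall.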
Learning steps: 1m 10m. Nothing longer than 12 hours.
Buttons: Again and Good only. Hard and Easy mess with FSRS optimization. Two buttons. That's the whole interface.
Part V: The Part Everyone Gets Backwards
Most people: watch episode, don't understand, feel frustrated, Google three words, forget them by next week.
Flip the order.
Cards first. Episode second.
Anki loads 50 words into short-term memory. 10 to 15 minutes. Then you watch. Every word you just learned shows up in a real scene. With a human voice. With emotion. With context.
Your brain doesn't treat this as studying. It treats it as recognition. And recognition is the fastest path from short-term to long-term memory.
The forgetting curve flattens because your first review isn't another card. It's a real conversation on screen. Different medium. Same day.
This connects to Krashen's i+1 principle. Language acquisition happens when input is just above your current level. A series without prep is way beyond you. You zone out. A series after 15 minutes of cards sits right at the edge of your comprehension. Hard enough to grow. Easy enough to enjoy.
English subtitles only. Subtitles in your native language kill the process. Your brain latches onto the translation and stops processing English entirely.
This is the whole system working at once. Extraction, drilling, and immersion in the same hour.
Part VI: Upgrades
1. Pronunciation. Install AwesomeTTS or HyperTTS in Anki. Adds audio to every card through Azure Neural TTS. Or just ask NotebookLM to include IPA with every word. Already in the pipeline.
2. One-tap deletion. Mobile Anki lets you assign gestures. Set *"delete card"* to tap top-right corner. Word you already know? One tap. Gone. No menus.
3. Definitions over translation. Want monolingual cards? Ask Claude to add a third column with an English-only definition. Or copy unfamiliar phrases into Perplexity on the fly.
4. Batch processing. One series, 20 to 24 episodes. One episode per day. By end of season: 500 to 1,000 cards. B2 to C1 vocabulary from content you actually wanted to watch. Not from a textbook.
5. NotebookLM as tutor. After loading a transcript, ask it anything. *"What legal jargon appears in this episode?"* *"Give me 10 sentences using 'get away with.'"* Answers from the source only. No hallucinations.
Part VII: The Complete Pipeline
1. Find transcript online, copy or save as PDF
2. Load into NotebookLM
3. Extract 50 complex words with examples from the transcript
4. Send to Claude, get CSV
5. Import into Anki, FSRS on, retention 0.90
6. Study cards: 10 to 15 minutes
7. Watch the episode with English subtitles
8. Repeat
5 minutes of prep. 10 minutes of cards. Then the episode.
Bigger Than Language
This works for any content in any language. Podcasts. YouTube. TED Talks. Audiobooks. Anything with text.
You're not grinding someone else's vocabulary list. You're building a system that turns what you already watch into a language course personalized to your level.
AI does extraction. Machine learning handles the schedule. You watch series.
Everyone wants to learn a language. Nobody wants to study one.
This removes the studying. What's left is the series, 10 minutes of cards, and the slow realization that you actually understand what they're saying.
That feeling is worth more than any certificate.
More on AI and tech: @phosphenq
Save this. Send it to whoever's been "going to learn English" for three years.
Tg: https://t.me/+-0Rqj5XdaklmNTk6
