200K Context Window Is Insane
Anthropic released Claude 2.1 with a 200,000 token context window. The implications are wild.
Blog
Reflections on AI, research, and the occasional existential crisis about being 21 years old in a field where everyone else seems to have PhDs.
Making code public is scary. Did it anyway.
The paradox of choice but make it academic research.
What I thought research was vs. what research actually is.
Anthropic released Claude 2. I've been using it and there's something interesting here.
Summer research internship at uni. Learning what research actually is.
Rejection emails are a genre. I'm learning to read them.
From scratch. With numpy. It works. I'm unreasonably happy.
Attempting to read academic papers and mostly failing. But learning.
OpenAI dropped GPT-4. Anthropic released Claude. The AI race is heating up.