Anthropic announced two significant updates to its Claude AI platform on March 13, 2026. The company is temporarily doubling Claude usage limits for most users while also making the full 1-million-token context window generally available for its latest models, Claude Opus 4.6 and Claude Sonnet 4.6. The doubled limits extend beyond the main chat interface to Claude Code, Cowork, Claude for Excel, and Claude for PowerPoint.
The updates expand Claude's ability to process very large amounts of information in a single request and give users additional capacity during a two-week promotional period.

Temporary Usage Boost for Claude Users
Anthropic confirmed that Claude usage limits are temporarily doubled from March 13 through March 27 across several subscription tiers: Free, Pro, Max, and Team plans.
The increased limits apply automatically during off-peak hours on the Claude platform, with no action required from users. Off-peak hours are any time outside 8 AM to 2 PM ET on weekdays; on weekends, the doubled limits apply around the clock. Enterprise accounts are not included in the promotion.
For many users, the temporary boost provides additional room to experiment with longer prompts, extended conversations, and more complex AI workflows without hitting standard usage caps.
1M-Token Context Window Now Generally Available
Alongside the usage boost, Anthropic announced that the 1-million-token context window is now generally available for:
- Claude Opus 4.6
- Claude Sonnet 4.6
The context window determines how much information a model can analyze in a single request. With a 1M-token context window, Claude can process extremely large inputs, including long research documents, entire codebases, or extensive datasets.
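To get a feel for what a 1M-token window holds, a rough sketch can estimate whether a document fits. The figure of roughly four characters per token is a common rule of thumb for English text, not an exact tokenizer count, so treat the result as an approximation:

```python
# Rough sketch: estimate whether a document fits in a 1M-token context window.
# The ~4 characters-per-token ratio is a heuristic for English text, not an
# exact tokenizer count -- real token counts vary with content and language.

CONTEXT_WINDOW_TOKENS = 1_000_000
CHARS_PER_TOKEN = 4  # rough heuristic

def estimate_tokens(text: str) -> int:
    """Very rough token estimate based on character count."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(text: str, window: int = CONTEXT_WINDOW_TOKENS) -> bool:
    """True if the estimated token count fits within the window."""
    return estimate_tokens(text) <= window

# A ~3 MB text file (about 3 million characters) comes out to ~750,000
# estimated tokens -- comfortably inside a 1M-token window:
doc = "x" * 3_000_000
print(estimate_tokens(doc))   # 750000
print(fits_in_context(doc))   # True
```

By this estimate, the full window corresponds to roughly 4 MB of plain text, which is why entire codebases and long research documents become feasible in one request.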
Anthropic also removed the previous long-context pricing premium, meaning large prompts now cost the same per token as shorter ones under standard pricing.
The expanded context window significantly increases the amount of media that can be included in a single request, supporting hundreds of images or document pages within one interaction.
Why the 1M Context Window Matters
Long-context capabilities have become an increasingly important area of competition among leading AI developers.
A larger context window allows AI models to:
- Analyze large documents and datasets
- Work with entire software repositories
- Maintain longer reasoning chains
- Support complex agent-style workflows
For developers, the upgrade means that large codebases or documentation sets can be included directly in a single prompt without requiring multiple summarization steps.
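As a hypothetical illustration of that workflow, the sketch below concatenates every Python file in a project into a single message. The payload shape follows Anthropic's Messages API; the model identifier is an assumption based on the model name in this article, and the paths and question are placeholders:

```python
# Hypothetical sketch: packing an entire codebase into one request payload.
# The payload shape follows Anthropic's Messages API; the model id is an
# assumed string based on the "Claude Sonnet 4.6" name in the article.
import pathlib

def build_codebase_prompt(root: str, question: str) -> dict:
    """Concatenate every .py file under `root` into a single user message."""
    parts = []
    for path in sorted(pathlib.Path(root).rglob("*.py")):
        # Prefix each file with its path so the model can cite locations.
        parts.append(f"# File: {path}\n{path.read_text()}")
    codebase = "\n\n".join(parts)
    return {
        "model": "claude-sonnet-4-6",  # assumed model id
        "max_tokens": 4096,
        "messages": [
            {"role": "user", "content": f"{codebase}\n\n{question}"},
        ],
    }

# With a real API key, the payload could then be sent via the official SDK:
#   client = anthropic.Anthropic()
#   response = client.messages.create(
#       **build_codebase_prompt("my_project", "Where is the auth bug?"))
```

Because long-context pricing now matches standard per-token pricing, this single-request approach no longer carries a premium over splitting the codebase into summarized chunks.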
For everyday users, the temporary usage boost provides additional capacity to explore Claude's long-context capabilities during the two-week period.
A Growing Race in Long-Context AI
Anthropic's update reflects a broader industry push to expand the amount of information AI systems can process in a single interaction.
As companies compete to build more capable AI models, long-context systems are becoming central to applications such as research analysis, coding assistance, and complex knowledge workflows.


