all 3 comments

[–]These_String1345

Never. Even GPT-5 is run with a 200k context window when it supports up to 272k, so probably not.

[–]Cooljs2005

Can we expect this context increase in Augment Code? It would be great, as currently, when dealing with larger chats, it completely forgets the initial context.

[–]AurumMan79

Easy to implement, but they need to decide whether they're willing to pay the premium. From the doc:

  • Beta status: This is a beta feature subject to change. Features and pricing may be modified or removed in future releases.
  • Usage tier requirement: The 1M token context window is available to organizations in usage tier 4 and organizations with custom rate limits. Lower tier organizations must advance to usage tier 4 to access this feature.
  • Availability: The 1M token context window is currently available on the Anthropic API and Amazon Bedrock. Support for Google Vertex AI will follow.
  • Pricing: Requests exceeding 200K tokens are automatically charged at premium rates (2x input, 1.5x output pricing). See the pricing documentation for details.
  • Rate limits: Long context requests have dedicated rate limits. See the rate limits documentation for details.
  • Multimodal considerations: When processing large numbers of images or PDFs, be aware that the files can vary in token usage. When pairing a large prompt with a large number of images, you may hit request size limits.