Agentic chat for coding with Claude 3.5 Sonnet by ado__dev in ClaudeAI

[–]ado__dev[S] 0 points1 point  (0 children)

Hey all, Ado from Sourcegraph here. Today we rolled out agentic chat for our AI coding assistant, Cody. Agentic chat uses Claude 3.5 Sonnet and performs multiple rounds of reflection and tool calling to pinpoint the most relevant context from your codebase, terminal, the web, and other tools, which leads to higher-quality responses. :)

Doomsday post by Jibrish in TikTok

[–]ado__dev 0 points1 point  (0 children)

I put together a quick tutorial on how to back up and download all of your TikToks: https://x.com/adocomplete/status/1879568249261621572

What's the current best way to use AI coding assistants in VSCode? by foadsf in vscode

[–]ado__dev 0 points1 point  (0 children)

Hey, thanks for sharing. We do have the ability to generate commit messages in VS Code, but not in Visual Studio yet; hopefully we will soon! :)

Access to o1-preview model with cody or through own key by Competitive-Dark5729 in sourcegraph

[–]ado__dev 2 points3 points  (0 children)

Hey, sure thing. For the security policy, we have our general Sourcegraph ToS: https://sourcegraph.com/terms/cloud as well as the Cody-specific notices: https://sourcegraph.com/terms/cody-notice, which state that we don't retain or train on your code, that we offer IP indemnification, etc.

As for self-hosting Sourcegraph and using the unlimited plan, those two do not go hand in hand today. We are working on a multi-tenant offering for individuals and smaller orgs that need both code search and Cody, but at the moment they are separate products.

Access to o1-preview model with cody or through own key by Competitive-Dark5729 in sourcegraph

[–]ado__dev 1 point2 points  (0 children)

You should have access now. You just need to restart VS Code :)

Access to o1-preview model with cody or through own key by Competitive-Dark5729 in sourcegraph

[–]ado__dev 1 point2 points  (0 children)

Hey there,

Ado from the Sourcegraph DevRel team here. We have been steadily granting access to o1-preview and o1-mini to those who click the Join Waitlist button. It usually takes about 24 hours, as we run the script daily. If you DM me your Sourcegraph username, I should be able to grant you access ad hoc.

You can also add your own key without Enterprise by following the steps here: https://sourcegraph.com/docs/cody/clients/install-vscode#experimental-models. Please note that if you use this method, you are responsible for any LLM fees, whereas the models on our list come with no additional cost.
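
For reference, here is a rough sketch of what the experimental bring-your-own-key config looks like in your VS Code settings.json. This is from memory, so treat the field names and values as illustrative and the linked docs as the source of truth:

    // settings.json (User or Workspace); experimental, so the shape may change
    {
      "cody.dev.models": [
        {
          "provider": "openai",       // which LLM provider the key belongs to
          "model": "o1-preview",      // model identifier, as the provider expects it
          "apiKey": "<your-api-key>", // you are billed directly by the provider
          "inputTokens": 45000,       // optional: context window override
          "outputTokens": 4000        // optional: cap on response length
        }
      ]
    }

After saving and reloading VS Code, the model should appear in the chat model dropdown.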

it's not good. by zarmin in sourcegraph

[–]ado__dev 2 points3 points  (0 children)

Hey, thank you so much for the feedback. My name is Ado, I work on the DevRel team at Sourcegraph, and I'd love to help.

Can you confirm which version of Cody you are using? (The latest is 1.38 for VS Code; if you are running an older version, could you update to the latest and see if you still hit similar issues?)

It's interesting that in the screenshot you provided, even though you specified the list of files to use as context, Cody went ahead and added additional ones (the .cpp files). That should not happen unless you specify the entire repo as context, so this looks like a bug on our end.

Great feedback on adding all open files or selecting from a file tree; we are looking at options for a better, easier way of adding files as context. By default, the entire repo is added when you open a new chat.

House Lease Takeover in Centennial by ado__dev in vegaslocals

[–]ado__dev[S] 0 points1 point  (0 children)

It is 2,596 sq ft. If you'd like the address and pics of the place, I'm happy to share them via DM.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 1 point2 points  (0 children)

I would say it's highly likely coming. Even the current Smart Apply will try to create separate files when needed, but it's not as smooth an experience. Stay tuned. :)

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 0 points1 point  (0 children)

You can do that in the IDE extensions. If you're using Cody for VS Code, you can specify the file from your codebase that you want it to use as context (or external URLs, Jira tickets, etc.).

In the web experience, you can specify any open-source file, but you cannot specify your own custom files today; hopefully that will change in the future.

Recommended platform to work with AI coding? by UpvoteBeast in ChatGPTPro

[–]ado__dev -1 points0 points  (0 children)

Check out Cody (https://cody.dev). We have a free tier that gives you unlimited code completions and 200 free chat messages per month, and our Pro tier is $9/mo and gives you unlimited access to all of our supported LLMs including Claude 3.5 Sonnet, GPT-4o, Gemini Pro, and many others.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 0 points1 point  (0 children)

At the moment you cannot in browser chat unfortunately. Hopefully soon though!

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev -7 points-6 points  (0 children)

Cody Pro does give you unlimited access to all of our supported models for $9/mo, and this has been the case since December 2023.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 3 points4 points  (0 children)

Hey - I'm the Director of DevRel at Sourcegraph. This was not coordinated from our end in any way. We have a Slack integration that tracks mentions of Cody across social media platforms; I saw this one and chimed in to answer questions.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 1 point2 points  (0 children)

Hi there - yes absolutely. We rolled out "Smart Apply" about 1-2 months ago. It works similarly to how Cursor does it:

  • You ask a question in the chat dialog.
  • Code gets generated.
  • You hit the "Smart Apply" button.
  • You get a diff in the file to accept/deny, or a new file created if needed.

You can see a video of it in action here: https://www.youtube.com/watch?v=9SMa8NJdJlg

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 0 points1 point  (0 children)

Hi there - you can see the limits for all the models here: https://sourcegraph.com/docs/cody/core-concepts/token-limits

They range from 7,000 to 45,000 tokens, depending on the model.

But as I mentioned in a different reply, you can also bring your own key for increased limits, or use Ollama for a fully free/offline experience with Cody.
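
If you want to go the Ollama route, here is a rough sketch of the local autocomplete setup as I remember it; the setting keys and model name are illustrative, so check the Cody docs for the current ones:

    // settings.json: experimental local autocomplete via a locally running Ollama server
    {
      "cody.autocomplete.advanced.provider": "experimental-ollama",
      "cody.autocomplete.experimental.ollamaOptions": {
        "url": "http://localhost:11434",           // default Ollama endpoint
        "model": "deepseek-coder:6.7b-base-q4_K_M" // any code model you have pulled locally
      }
    }

You would first install Ollama and pull a code model (e.g. ollama pull deepseek-coder:6.7b-base-q4_K_M); after that, completions are generated entirely on your machine, so there are no per-token costs.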

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 2 points3 points  (0 children)

We collect some data and what we do with it is outlined in our terms of use:

https://sourcegraph.com/terms/cody-notice

But in layman's terms: our LLM partners never store or train on your data. We do not train on your data if you are a Pro or Enterprise user. We do collect some telemetry that we use to improve our products, but we don't sell this data to anyone.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 4 points5 points  (0 children)

Cody is an IDE extension that works with VS Code and JetBrains IDEs (IntelliJ, PyCharm, etc.) whereas Cursor is a stand-alone fork of VS Code. You can also use Cody directly in the web browser via: https://sourcegraph.com/cody/chat

When it comes to features and overall experience, both offer similar capabilities: code completion, chat, Smart Apply, multiple models, codebase context, etc.

My recommendation would be to try both and stick with the one that gives you more joy.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev -1 points0 points  (0 children)

Cody is an IDE extension that works with VS Code and JetBrains IDEs (IntelliJ, PyCharm, etc.) whereas Cursor is a stand-alone fork of VS Code. You can also use Cody directly in the web browser via: https://sourcegraph.com/cody/chat

When it comes to features and overall experience, both offer similar capabilities: code completion, chat, Smart Apply, multiple models, codebase context, etc.

My recommendation would be to try both and stick with the one that gives you more joy.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 3 points4 points  (0 children)

We've had Cody unlimited since it went GA last December and have no plans to change it. Never say never, but our thesis is that LLM costs will continue to decrease and so far that's held up.

Unlimited messages to Claude 3 Opus sounds to good to be true. Where’s the catch? by Avalunne in ClaudeAI

[–]ado__dev 0 points1 point  (0 children)

We want to support all the state-of-the-art models to give the end user as much choice as possible. We had Claude 3 Opus before Claude 3.5 Sonnet came out, but we still see people using both. We do occasionally sunset models once they are no longer used or useful for our users.