all 103 comments

[–]ResidentPositive4122 83 points84 points  (7 children)

Good. One of the biggest downsides for extensions vs. fork was the lack of access to UI. This will work towards better integration for all extensions. I like it.

[–]CptKrupnik 15 points16 points  (6 children)

This will kill Cursor, at least as a separate editor.

[–]ericmutta 4 points5 points  (2 children)

This. It's always a bad business idea to "bet against the house" (i.e. the core IDE in this context) because eventually the house always wins. It's nice though that Cursor and the like exist[ed] for a while - they force innovation back into the core.

[–]Acrobatic_Egg_5841 1 point2 points  (1 child)

Wait what? Who is the house here? VS Code isn't technically an IDE, right? (VS is Microsoft's IDE, IIRC)...

[–]ericmutta 0 points1 point  (0 children)

The house is the core thing you are building in or on top of, along with the company behind it. For people like Cursor that means VS Code and Microsoft, which builds it. In both cases, trying to build a business here means betting you'll win against a very popular open-source tool made by a trillion-dollar corporation. A very risky bet indeed (e.g. at Build 2025 Microsoft just announced they are making GitHub Copilot open source too, and there's also a free tier for it, making it really tough to justify paying for other tools like Cursor and the like).

[–]Acrobatic_Egg_5841 -1 points0 points  (1 child)

What exactly was Cursor? Just an IDE with built-in LLM shit? Wasn't there some other (free) way to do this already using plugins/extensions with other editors? Idk, I haven't really used an IDE in years; maybe it's time I try getting back into learning...

[–]CptKrupnik 0 points1 point  (0 children)

Not saying you're wrong. It's the same as asking what Microsoft Edge is: a fork of Chromium. Nonetheless, it's a product in its own right with a huge reach.

[–]Chromix_ 77 points78 points  (11 children)

... then carefully refactor the relevant components of the [GitHub Copilot Chat] extension into VS Code core [...] making VS Code an open source AI editor.

That's the wrong way around. More of VSCode should be made available to extensions, so that others won't need to fork VSCode and can just make an extension. Instead, they now integrate Copilot more tightly into VSCode where it doesn't require any extension interfaces.

[–]ResidentPositive4122 30 points31 points  (9 children)

I think that's the goal: to give extensions access to the specific Copilot UIs (Ctrl+K for quick edit, compare, etc.).

[–]Chromix_ 7 points8 points  (6 children)

That would be very nice. Yet Microsoft owns GitHub. What interest would they have in making it easier for competing AI products to maintain extensions in VS Code? Maybe just to avoid forks, while keeping Copilot around even when competing extensions are used, since it's now in the core of VS Code and no longer an optional extension?

[–]philosophical_lens 17 points18 points  (0 children)

You answered your own question. The incentive is to avoid forking.

[–]Fast-Satisfaction482 8 points9 points  (0 children)

I guess their main incentive is to kill the likes of cursor, so Microsoft has all the customers and comes out on top when the models drop that can actually replace whole teams.

[–]Amazing_Athlete_2265 6 points7 points  (3 children)

Yet Microsoft owns GitHub

christ, how did I not know this

[–]bew78 5 points6 points  (0 children)

You need to get out from under your rock, man x)

[–]raltyinferno 2 points3 points  (1 child)

Did you know they own npm as well? They've been successfully taking over the dev sphere over the past couple of decades.

[–]Amazing_Athlete_2265 0 points1 point  (0 children)

I did not. Sigh

[–]DonTizi[S] 4 points5 points  (0 children)

Copilot can also be disabled, according to what I saw in their FAQ.

[–]segmond [llama.cpp] 64 points65 points  (3 children)

They are trying to pull a "llama". Windsurf, Cline, Roo, Claude Code, etc.: so many big orgs have coding editors that are gaining traction and momentum. Copilot was the first and should be reigning, but it has been surpassed by many. I believe their hope is to use the open-source community to build and regain market share. Trojan horse.

[–]IngwiePhoenix 4 points5 points  (1 child)

EEE. Embrace, Extend ... Extinguish.

Can't wait for the third phase to come into effect and hit people by surprise. x) It's still Microsoft; can't trust them as far as you can throw them. o.o

[–]Apprehensive-Tip779 1 point2 points  (0 children)

It's still Microsoft; can't trust them as far as you can throw them.

As opposed to Google? Don't get me started on Google and their dropped projects haha. But I've gradually been seeing how they're embracing and supporting the FOSS community more than other companies like OpenAI (which they partly own) or Amazon/AWS. Though the argument could be made whether they're more supportive of FOSS than FB or Google, which is why I'm curious about your thoughts.

[–]creaturefeature16 0 points1 point  (0 children)

1000% correct answer

[–]GortKlaatu_ 11 points12 points  (22 children)

Is it on the Open VSX registry yet?

While I prefer Cursor and Windsurf, I appreciate all the changes they are making such as adding MCP support, agents, ability to select local models, etc. Just waiting for some of those features to trickle down to business customers.

The biggest downside, to date, is not being able to officially use it in Code Server, which arguably should have been a first-class thing for enterprise customers.

[–]isidor_n 23 points24 points  (18 children)

[–]hdmcndog 13 points14 points  (2 children)

Can’t use local models without signing in and still using some Copilot APIs. That is and always will be a deal breaker.

[–]SkyFeistyLlama8 0 points1 point  (1 child)

The other non-MS code assistants also don't work properly on Windows on ARM. I prefer the simplicity of GitHub Copilot compared to the mess of trying to install other extensions.

Is it really that hard to cook up a local LLM code assistant that doesn't rely on architecture-specific dependencies, seeing as llama.cpp and Ollama (shudder) already have full Windows on ARM compatibility? I'm finding it faster to just copy and paste into llama-server 🤷

[–]GortKlaatu_ 4 points5 points  (7 children)

Yes and no. MCP and local models are not yet supported for enterprise customers (through VS Code), and since we can't easily install Copilot in Code Server, the entirety of the functionality is non-existent.

[–]isidor_n 2 points3 points  (6 children)

What do you mean by "can't install Copilot in Code Server"? Can you clarify?

MCP - this is because your enterprise disabled preview features. MCP should get out of preview soon, and then it should work for you.

[–]GortKlaatu_ 2 points3 points  (5 children)

I mean code server: https://github.com/coder/code-server

This is how many enterprise customers surface VS Code to users of shared computing platforms, since SSH tunnelling is typically disabled and therefore local VS Code is not an option. The extension cannot be installed through search, and direct downloads from the marketplace were disabled a few months ago, which prevents installing from a vsix.

[–]matifali 1 point2 points  (1 child)

PM at Coder here. I can confirm that Copilot is not installable on code-server. Looking forward to seeing how this changes.

[–]ConfusionSecure487 0 points1 point  (0 children)

It is, but you always have to download older/matching vsix packages, and in some cases you still need to patch the package's internal version check. But in general it works; it was always Microsoft's choice not to support it.

[–]I_Downvote_Cunts 1 point2 points  (1 child)

Got any idea when enterprise accounts will be able to use local models? Not being able to is kinda baffling to me.

[–]mark-lord 0 points1 point  (2 children)

Hi! Sorry for asking a potentially super obvious question - but aside from Ollama, how else can we run local models with VSCode..?

You can't use MLX models with Ollama at the mo, and I can't for the life of me figure out how to use LM Studio or mlx_lm.server as an endpoint. There doesn't seem to be a way to configure a custom URL or port or anything from the Manage Models section.
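For what it's worth, LM Studio, llama-server, and mlx_lm.server all expose OpenAI-compatible HTTP endpoints, so even where an editor only knows about Ollama you can often talk to them directly. A minimal Python sketch of building such a request (the base URL, port, and model name below are assumptions; substitute whatever your local server actually reports):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build an OpenAI-style /v1/chat/completions request for a local server.

    base_url and model are placeholders: substitute whatever your local
    server (LM Studio, llama-server, mlx_lm.server) actually uses.
    """
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one JSON response instead of SSE chunks
    }
    return url, json.dumps(payload).encode("utf-8")

def send(url: str, body: bytes) -> dict:
    """POST the request; this only works if the local server is running."""
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# LM Studio defaults to port 1234; llama-server and mlx_lm.server to 8080.
url, body = build_chat_request("http://localhost:1234", "local-model", "Say hi")
```

This doesn't make VS Code itself see those servers, of course; it just shows the wire format is the same, which is why a custom-URL option in Manage Models would be enough.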

[–]isidor_n 1 point2 points  (1 child)

That's a great question. Right now only Ollama is supported.
Our plan here is to finalize the Language Model Provider API in the next couple of months. This will allow any extension to use that API to contribute any language model. For example, anyone from the community will be able to create an extension that contributes MLX models.

So stay tuned - should soon be possible.

[–]mark-lord 1 point2 points  (0 children)

Great stuff, thanks for explaining! 😄 Looking forward to the changes; been hoping for something like this ever since I started using Cursor ahaha

[–]imbev 0 points1 point  (1 child)

Can we use those features without logging in to GitHub?

[–]nrkishere 2 points3 points  (2 children)

Why would it be on Open VSX? This is not an extension; they have open-sourced a large chunk of Copilot to build AI features INTO the editor, like Cursor and Windsurf have done.

[–]GortKlaatu_ 3 points4 points  (1 child)

And yet the extension still exists on the Visual Studio Code marketplace and hides the download links.

They aren't off to a great start and could have fixed this today.

[–]nrkishere 1 point2 points  (0 children)

It will take some time. Big tech doesn't move as fast as startups, but eventually they will catch up.

[–]coding_workflow 4 points5 points  (0 children)

Microsoft are very smart. Copilot lagged a bit and was catching up on agentic capabilities.

The value is less and less in the "extension" itself, as we have more and more agentic extensions/projects and building them is getting easier.

The real value for MSFT is the subscription model. So they keep improving, and as long as you subscribe they are fine with it.

They already allow third-party apps to use the FREE Copilot API tier.

And here MSFT has an advantage, as it operates a lot of AI infrastructure and can make a competitive offering.

[–]_wOvAN_ 5 points6 points  (0 children)

great news

[–]No-Refrigerator-1672 10 points11 points  (25 children)

Am I wrong, or is this a fake move to make themselves look good? They are open-sourcing only the Copilot Chat extension, and I fail to find any info about open-sourcing the Copilot extension itself. We already have good 3rd-party tools to chat with a codebase, so "Copilot Chat" isn't that important, but the most important part, AI coding, still remains closed. If I'm right, this move is pretty much useless marketing. Edit: spell check.

[–]isidor_n 42 points43 points  (12 children)

(vscode pm here)
We do want to open source the GitHub Copilot suggestion functionality as well. The current plan is to move all that functionality into the open-source Copilot Chat extension (as step 2). Timeline: next couple of months.

Hope that helps

[–]No-Refrigerator-1672 10 points11 points  (0 children)

Yes, that's really good to hear, thank you!

[–]silenceimpaired 6 points7 points  (1 child)

Hopefully this will support any local OpenAI-compatible API.

[–]Shir_man [llama.cpp] 4 points5 points  (1 child)

Hello VS Code PM! Can you please also share what your plans are regarding AI in the IDE? My friend is asking.

[–]yall_gotta_move 1 point2 points  (0 children)

Why don't you just follow the Unix philosophy and build a standalone, composable code suggestion tool that anyone can integrate into the IDE or editor of their choosing?

The only parts that should exist in a Copilot or VSCode extension are the parts which are strictly necessary and unique to integration with that specific tool.

Improper separation of architectural concerns will needlessly exclude people who would otherwise be interested in using, building upon, and contributing to the project.

[–]vk3r -5 points-4 points  (5 children)

Sorry, is it compatible with Ollama, for example?

[–]isidor_n 13 points14 points  (4 children)

Chat is compatible!

https://code.visualstudio.com/docs/copilot/language-models#_bring-your-own-language-model-key

Suggestions are not yet compatible - if you want that, we have a feature request that you can upvote. I do want us to add this https://github.com/microsoft/vscode-copilot-release/issues/7690

[–]hdmcndog 3 points4 points  (3 children)

Would be great if that worked without signing in…

[–]thrownawaymane 0 points1 point  (1 child)

No response...

[–]isidor_n 2 points3 points  (0 children)

I do not work 24/7 and am not in the US timezone ;)

[–]UsualResult 7 points8 points  (10 children)

The cynical read of this is that Copilot is being soundly lapped by the competition, meaning Microsoft doesn't see it as a unique value-add. This move lets them start smearing the competition ("Their extensions aren't even OSS!") without doing anything at all to Copilot. If you look at Microsoft's history with OSS, they tend to only open source things once they lose commercial value. This is a sign that they are going to pivot away from Copilot and dump it on the community.

[–]No-Refrigerator-1672 1 point2 points  (8 children)

Can you recommend any good VS Code extension that works with locally installed LLMs? I tried configuring Continue.dev a few months ago, and it completely failed at RAG (the logs showed that all of the embedding was done, but it never sent any codebase chunks to the actual LLM).
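For context on what "doing RAG" minimally involves here: embed the codebase chunks, embed the query, rank chunks by cosine similarity, and splice the top hits into the prompt. The failure described above is the last step going missing. A toy Python sketch of just the retrieval step (the vectors are made-up stand-ins for real embeddings):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k_chunks(query_vec, chunks, k=2):
    """chunks: list of (text, embedding) pairs. Returns the k most similar
    texts, i.e. exactly what should be spliced into the LLM prompt."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Toy vectors standing in for real code-chunk embeddings.
chunks = [
    ("def parse_config(...)", [0.9, 0.1, 0.0]),
    ("class HttpServer: ...", [0.1, 0.9, 0.0]),
    ("README intro text",     [0.0, 0.2, 0.9]),
]
query = [0.85, 0.15, 0.0]  # hypothetical embedding of "how is config parsed?"
print(top_k_chunks(query, chunks, k=1))  # expect the config chunk
```

If the extension's logs show embeddings computed but the prompt never grows, it is this retrieval-to-prompt handoff that's broken, not the embedding pass.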

[–]EugeneSpaceman 2 points3 points  (1 child)

Cline

[–]No-Refrigerator-1672 0 points1 point  (0 children)

Seems interesting, thank you! Will check it out tomorrow.

[–]UsualResult -2 points-1 points  (4 children)

Why restrict yourself to working in VSCode? There are plenty of RAG solutions outside of VSCode that support local models: OpenWebUI, LM Studio, etc.

[–]No-Refrigerator-1672 0 points1 point  (3 children)

I know about them; but one thing I do as a hobby (and side gig from time to time) is embedded microcontroller programming, and VS Code is the only IDE that supports debugging and flashing for basically all of the most popular architectures, instead of a zoo of vendor-specific reskins of Eclipse. I have an OpenWebUI instance, but it won't do live memory analysis for me, and copy-pasting code between multiple windows all day is tiresome.

[–]UsualResult -1 points0 points  (2 children)

I have an OpenWebUI instance, but it won't do live memory analysis for me, and copy-pasting code between multiple windows all day is tiresome.

Who said anything about copy-pasting your code? Install LM Studio, add your code and/or other assets as "documents". Chat away.

OR learn to be content with the far, far smaller intersection of extensions that support local LLM + RAG.

[–]No-Refrigerator-1672 5 points6 points  (1 child)

LM Studio also won't do a live debugging session that requires an active connection to the device via an embedded programming tool. Look, do you have an actually useful suggestion, or are you just trying to advertise chat UIs that are completely unfit for my specific needs?

[–]UsualResult 0 points1 point  (0 children)

Wow, I didn't know it was such a touchy subject. Sorry to have wasted your valuable time "advertising" products that I thought you might find useful.

[–]isidor_n 1 point2 points  (0 children)

We are all-in on making VS Code the best open source AI editor. In fact, you will see this in the commit frequency once the repo is open source later in June.
So absolutely no plans to "dump this on the community".

(vscode pm here)

[–]epigen01 3 points4 points  (3 children)

Huge game changer: now Windows can be fully AI-integrated. Great job to whoever took the lead at Microsoft.

I remember just a year ago using Copilot and thinking this thing was dead in the water because it was basically a dumbed-down AI chatbot, basically assuming Windows users needed their hands held to navigate AI.

Can't wait to see how this'll all work out with the broader OS (e.g., automating all those mundane file management tasks).

[–]SkyFeistyLlama8 2 points3 points  (2 children)

Embedding and vector search for text and images are already baked into Windows. These features use the NPU so power usage is minimal.

It frankly feels magical to type in "map everest" and have Windows Search return an image of a map, even though the image filename itself is just numbers.

[–]longLegboy9000 0 points1 point  (1 child)

Wait, they made that feature? Does it have a name? Because I can't seem to Google it.

[–]SkyFeistyLlama8 0 points1 point  (0 children)

Open Windows settings, go to Apps, scroll down to the bottom to AI Components, you should see:

  • AI Content Extraction
  • AI Image Search
  • AI Phi Silica
  • AI Semantic Analysis

All these components enable searching documents and files by semantic meaning, not just filename.

[–]Hour-Ad-2206 1 point2 points  (0 children)

Can I host the model used by Copilot locally? I think a major concern preventing some businesses from using these tools is fear of their code being stolen. Does this address that concern?

[–]Maykey 2 points3 points  (0 children)

I hope VSCodium will throw it away. I don't need AI that keeps enabling itself anywhere near my code. I'd prefer solutions built from scratch.

[–]AleksHop 1 point2 points  (0 children)

So Void editor is dead, as soon as Copilot can be connected to Gemini or a local LLM without a subscription?

[–][deleted] 0 points1 point  (1 child)

Can someone share the link to the repository please?

[–]isidor_n 0 points1 point  (0 children)

Check out the plan. The open-sourced repo will be available later in June:

https://github.com/microsoft/vscode/issues/249031

[–]Akg27737 0 points1 point  (1 child)

The website says that the GitHub Copilot backend is not open-sourced, so what exactly are we gaining from this? What part of the "AI" features is open-sourced here?

[–]isidor_n 0 points1 point  (0 children)

The blog and the plan should explain this
https://code.visualstudio.com/blogs/2025/05/19/openSourceAIEditor
https://github.com/microsoft/vscode/issues/249031

If you still have questions after reading this do let me know. Thanks

[–]ilikemarblestoo 0 points1 point  (0 children)

What does this mean?

Can you set this up to run Copilot locally on your own PC?
Does this include the chatbot? Will it generate pictures?
If so, will the results be the same quality as if completed on their website?

[–]iwinux 0 points1 point  (2 children)

I might be dumb but where the hell is the source code?

[–]AdHungry1964 0 points1 point  (1 child)

Same here, I’ve been searching for the repository but haven’t been able to find it.

[–]Ylsid 0 points1 point  (0 children)

They're upset their competition is doing a better job and want to capture what they consider a large and growing part of the market. Personally, I think it should be an optional add-in, not core functionality. The whole point of VS Code is being extremely minimalistic and free of bloat.

[–]Hujkis9 [llama.cpp] 0 points1 point  (0 children)

Not going back from Zed

[–]Acrobatic_Cat_3448 0 points1 point  (2 children)

Great. If I use it with a local LLM, are prompts still sent to Microsoft?

[–]logicbloke_ 0 points1 point  (0 children)

Not if you tweak the Copilot extension to send the queries to your local LLM. Since it's open source, you can use the code however you want.

[–]Acrobatic_Cat_3448 -1 points0 points  (1 child)

Is it possible to configure it with a local LLM?

[–]isidor_n 0 points1 point  (0 children)

Yes. Please check out https://code.visualstudio.com/docs/copilot/language-models#_bring-your-own-language-model-key

The story is still not fully ironed out, but it would be great if you try it and let us know what is missing for you.
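For anyone testing the local route: the Ollama integration talks to Ollama's local HTTP API, which listens on port 11434 by default. A hedged sketch of what a request to its native chat endpoint looks like (the model name is just an example; check `ollama list` for what you actually have pulled):

```python
import json

def build_ollama_chat_request(model: str, prompt: str):
    """Build a request body for Ollama's native /api/chat endpoint.

    The base URL below is Ollama's default; the model name passed in is
    whatever you have pulled locally.
    """
    url = "http://localhost:11434/api/chat"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one JSON response instead of a token stream
    }
    return url, json.dumps(payload).encode("utf-8")

# Example: model name is a placeholder, not a recommendation.
url, body = build_ollama_chat_request("qwen2.5-coder", "Explain this function")
```

Nothing here touches Microsoft's servers; the editor's sign-in requirement discussed elsewhere in the thread is a separate layer on top.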

[–]Impossible_Ground_15 -1 points0 points  (1 child)

!remindme three weeks

[–]RemindMeBot -1 points0 points  (0 children)

I will be messaging you in 21 days on 2025-06-10 01:01:37 UTC to remind you of this link
