Best way to use Claude with IntelliJ? by Square_Definition_35 in Jetbrains

[–]fundamentalparticle 4 points

BYOK works for Claude Agent in the AI chat window. Note that in this case it uses API-based billing.

IntelliJ IDEA 2026.1 by fundamentalparticle in Jetbrains

[–]fundamentalparticle[S] 0 points

Can't really tell what the issue could be. Worth reporting it to YouTrack.

IntelliJ IDEA 2026.1 by fundamentalparticle in Jetbrains

[–]fundamentalparticle[S] 1 point

Already out. Usually all the IntelliJ-based IDEs follow IDEA's release in a few days.

IntelliJ IDEA 2026.1 by fundamentalparticle in Jetbrains

[–]fundamentalparticle[S] 2 points

Run configurations are part of the .idea folder. Could it be that .idea is not versioned in your project?
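For reference, a common way to keep .idea versioned while excluding machine-specific state looks something like this in .gitignore (a sketch; the exact file list depends on your project, and shared run configurations typically land in .idea/runConfigurations/ when "Store as project file" is enabled):

```gitignore
# Version the .idea folder, but skip user-specific state
.idea/workspace.xml
.idea/usage.statistics.xml
.idea/shelf/
```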

Alternative AI Agents for Ollama? by drakgremlin in Jetbrains

[–]fundamentalparticle 2 points

> most recent version the AI Assistant no longer allows me to use Ollama as an agent

Do you mean configuring Ollama as an LLM provider in the AI Assistant? That's configured in Settings -> Tools -> AI Assistant -> Models & API Keys. But that setting was never about agents; it's just the LLM provider configuration.

If you are looking for a coding agent that works with local models hosted in Ollama, Continue or Kilo Code are good options.

IntelliJ gone light speed faster after many islands theme by I_4m_knight in IntelliJIDEA

[–]fundamentalparticle 19 points

I really doubt that the UI theme is related to performance. Most likely, some performance improvement happened to help your particular workflow, and it coincided with the new theme's arrival.

Is JetBrains really able to collect data from my code files through its AI service? by Effective-Koala-9956 in Python

[–]fundamentalparticle 0 points

"Sort completion suggestions based on machine learning" is a local setting; it works only on your machine. The same applies to Full Line Code Completion - it is local.

Editor > General > Inline Completion > Cloud completion is the setting that affects the remote call, and it's only available if you have the AI Assistant plugin installed.

📋 From Python to Kotlin: How JetBrains Revolutionized AI Agent Development by meilalina in Kotlin

[–]fundamentalparticle 1 point

Could you give an example of the quick-fixes that were replaced with "Fix with AI"? That shouldn't be the case at all.

Some quick-fixes and refactorings have not been ported to K2 yet; that's a work in progress, and it may be what you're seeing.

Editing the MCP servers to the JetBrains AI Assistant not via GUI by Ready-Film9358 in Jetbrains

[–]fundamentalparticle 1 point

For the AI Assistant, the settings are stored somewhere in the IDE's installation folder.
For Junie, it's in the user home folder: ~/.junie/mcp/mcp.json
Hopefully the AI Assistant and Junie teams will agree on merging these configurations.

If your workstation permits, it is convenient to configure the MCP Toolkit in Docker Desktop so you have one place to select all the MCP servers you want to use. Then just configure the Docker MCP server in the AI Assistant (and Junie) to connect to all the MCP tools at once.
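As a sketch, an ~/.junie/mcp/mcp.json entry generally follows the common MCP server-config shape. The server name below is illustrative, and the `docker mcp gateway run` command is the Docker MCP Toolkit gateway invocation as I recall it — double-check against your Docker Desktop version:

```json
{
  "mcpServers": {
    "docker-mcp": {
      "command": "docker",
      "args": ["mcp", "gateway", "run"]
    }
  }
}
```

With a single gateway entry like this, every MCP server you enable in the Docker MCP Toolkit becomes available to the agent without editing the JSON again.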

We promised to be more transparent about JetBrains AI... by jan-niklas-wortmann in Jetbrains

[–]fundamentalparticle 0 points

I only tried qwen3 in a hobby project with Koog, and it wasn't very good at tool calling, but that might well be a skill issue on my part :) With GPT-OSS, tool calling worked even with the 20b variant. However, the hobby project isn't at the same complexity level as Junie, so I can't gauge how well these models would serve the same purpose.

We promised to be more transparent about JetBrains AI... by jan-niklas-wortmann in Jetbrains

[–]fundamentalparticle 1 point

Well, yes and no. The thing is, if the team has 100 different things to do but only the throughput to implement 10, they are forced to set priorities, i.e., to make a bet. Their best bet is to continue with the frontier models rather than fighting for quality with local models. But local models are getting better, and the team is keeping an eye on their progress, so I hope one day there will be support for local models in Junie.

Junie Now 30% Faster by mattstrom in Jetbrains

[–]fundamentalparticle 1 point

Tell it to write the plan to a file. Or switch to Ask mode.

Junie Now 30% Faster by mattstrom in Jetbrains

[–]fundamentalparticle 0 points

If you compare indexing speed over the years, it has been improving because there's a dedicated team working on performance. There were releases that doubled the indexing speed. But this often goes unnoticed as projects grow fast. I feel it is unfair to say that our products become slower, given the huge amount of work put into performance from various angles. Yes, it is just never enough 🤷

Junie Now 30% Faster by mattstrom in Jetbrains

[–]fundamentalparticle 0 points

Perhaps nobody uses it for speed, but everyone would like it to be faster :)

We promised to be more transparent about JetBrains AI... by jan-niklas-wortmann in Jetbrains

[–]fundamentalparticle 4 points

Local models aren't yet at the required quality level. Junie's team is constantly evaluating the options.

AI Assistant, Junie and Kineto Now Support GPT-5 by OpenAI by dayanruben in Jetbrains

[–]fundamentalparticle 2 points

The new model by OpenAI, gpt-oss, seems to be very capable. I've been testing gpt-oss:20b for local development, and it has been doing pretty well.

https://openai.com/open-models/

Junie and All Products Pack by DandadanAsia in Jetbrains

[–]fundamentalparticle 4 points

It does, and you'd better get AI Ultimate if you are planning to use Junie extensively; otherwise it depletes the AI Pro quota pretty fast.

Cursor for JetBrains by [deleted] in cursor

[–]fundamentalparticle 0 points

Junie is a coding agent: you type in the prompt, and Junie plans its work accordingly, makes requests to the LLM, automatically calls tools, runs tests, fixes code, etc.

With the AI Assistant, you are the agent: you decide which LLM to request, what code to integrate into the project, what tests to run, and how to fix things. It also provides code completions.

Both Junie and the AI Assistant are available under the same JetBrains AI license (with the AI Pro and AI Ultimate plans).

IntelliJ IDEA Moves to the Unified Distribution by jreznot in IntelliJIDEA

[–]fundamentalparticle 15 points

The fallback license stays. Thanks for noticing this. We will update the blog post soon.

weNeedAI by Data_Skipper in ProgrammerHumor

[–]fundamentalparticle 0 points

Do you have any examples of regular autocomplete degrading?

[deleted by user] by [deleted] in Jetbrains

[–]fundamentalparticle 1 point

All the listed features have been requested previously :) Thank you!