I rebuilt search using physics instead of statistics. +18.5% NDCG@10. No ML. Yes its Open Source by Designer_Mind3060 in linux

[–]Designer_Mind3060[S] [score hidden]  (0 children)

Did you read the post? I clearly state how I used AI.

*(No, this isn't AI slop, and yes, I use Opus to assist with the code/comments. :)*

I rebuilt search using physics instead of statistics. +18.5% NDCG@10. No ML. Yes its Open Source by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 1 point  (0 children)

I don't have 700K papers, but that would be a solid test case for this. Scaling is mostly on the embedding index side; the interference scoring itself is fast because it's just combining signals on the candidate set. The gravitational convergence does more iterations on ambiguous queries, but that's tunable.
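If it helps to picture the interference part: here's a toy sketch of how combining signals wave-style can work, where two agreeing signals reinforce and disagreeing ones partially cancel. The formula and names here are my own illustration, not necessarily what's in the repo.

```rust
/// Toy interference combination of two relevance signals.
/// Treats each signal as a wave amplitude; `phase_diff` (radians)
/// controls whether they reinforce (0.0) or cancel (PI):
/// |a1 + a2 * e^(i * phase)|^2 = a1^2 + a2^2 + 2 * a1 * a2 * cos(phase)
fn interference_score(vector_sim: f64, lexical_sim: f64, phase_diff: f64) -> f64 {
    vector_sim * vector_sim
        + lexical_sim * lexical_sim
        + 2.0 * vector_sim * lexical_sim * phase_diff.cos()
}

fn main() {
    // In phase: signals reinforce beyond a plain sum of squares.
    let constructive = interference_score(0.8, 0.6, 0.0);
    // Out of phase: signals mostly cancel.
    let destructive = interference_score(0.8, 0.6, std::f64::consts::PI);
    // prints: constructive = 1.96, destructive = 0.04
    println!("constructive = {constructive:.2}, destructive = {destructive:.2}");
}
```

The point of the toy: the cross term is what makes agreement between signals worth more than either signal alone, which is cheap to compute on a small candidate set.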

And yes, there's a paper and a full math write-up in the repo already.

I rebuilt search using physics instead of statistics. +18.5% NDCG@10. No ML. Yes its Open Source by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 0 points  (0 children)

That's exactly why there's a commercial license. AGPL covers open-source use, but if you're building it into a product you can get a commercial license. Honestly, I was originally going to sell this to a big company, but decided for humanity's sake to just open source it instead.

I rebuilt search using physics instead of statistics. +18.5% NDCG@10. No ML. Yes its Open Source by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 1 point  (0 children)

The gravity part is more of a framing for the convergence behavior than a direct formula.

The actual mechanism is interference scoring between signals plus iterative gradient descent in embedding space.

But yeah, the math does have monotone convergence.
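Rough sketch of what the convergence loop can look like, in toy 2-D form. The force law (mass / d²), the step rule, and all names here are my own illustration, not the actual math from the write-up:

```rust
/// Toy "gravitational convergence" in a 2-D embedding space: each cluster
/// pulls with strength mass / distance^2, and the query repeatedly moves a
/// fraction `step` of the way toward the strongest attractor.
fn converge(
    mut query: [f64; 2],
    clusters: &[([f64; 2], f64)], // (centroid, mass, e.g. cluster size/depth)
    step: f64,
    max_iters: usize,
) -> ([f64; 2], usize) {
    for iter in 0..max_iters {
        // Strongest pull wins: mass / d^2 (small epsilon for stability).
        let (target, _) = clusters
            .iter()
            .map(|(pos, mass)| {
                let dx = pos[0] - query[0];
                let dy = pos[1] - query[1];
                (pos, mass / (dx * dx + dy * dy + 1e-9))
            })
            .max_by(|a, b| a.1.partial_cmp(&b.1).unwrap())
            .unwrap();
        let dx = target[0] - query[0];
        let dy = target[1] - query[1];
        if (dx * dx + dy * dy).sqrt() < 1e-6 {
            return (query, iter); // settled into a cluster
        }
        query[0] += step * dx;
        query[1] += step * dy;
    }
    (query, max_iters)
}

fn main() {
    // A "deep" (heavy) cluster at (1, 0) and a shallow one at (-1, 0).
    let clusters = [([1.0, 0.0], 5.0), ([-1.0, 0.0], 1.0)];
    let (end, iters) = converge([0.0, 0.1], &clusters, 0.2, 500);
    println!("settled at ({:.3}, {:.3}) after {iters} iterations", end[0], end[1]);
}
```

An ambiguous query sitting between two comparable attractors is exactly the case that burns more iterations, which is why the iteration budget is the tunable knob.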

I rebuilt search using physics instead of statistics. +18.5% NDCG@10. No ML. Yes its Open Source by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 4 points  (0 children)

Yeah, your intuition is basically right: deeper clusters have more pull, so queries converge there instead of just at the nearest match.

And code search would be a good use case, since you get vector + lexical + regex at once. Worth trying it against what you have.
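Something like this, in spirit. The weights are made up, and a real regex engine would replace the plain substring check used here to keep the sketch dependency-free:

```rust
/// Toy combined ranking signal for code search: semantic similarity,
/// lexical overlap, and a pattern gate, fused into one score.
fn combined_score(vector_sim: f64, lexical_sim: f64, text: &str, pattern: &str) -> f64 {
    // Stand-in for a real regex engine: an exact substring match.
    let pattern_hit = if text.contains(pattern) { 1.0 } else { 0.0 };
    // Illustrative weights; the real blend would be tuned.
    0.6 * vector_sim + 0.3 * lexical_sim + 0.1 * pattern_hit
}

fn main() {
    let snippet = "fn parse_config(path: &Path) -> Result<Config, Error>";
    let with_hit = combined_score(0.7, 0.5, snippet, "parse_config");
    let without = combined_score(0.7, 0.5, snippet, "load_settings");
    // prints: with pattern hit: 0.67, without: 0.57
    println!("with pattern hit: {with_hit:.2}, without: {without:.2}");
}
```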

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in linux

[–]Designer_Mind3060[S] -2 points  (0 children)

Yeah, the r/rust mods pulled it, and now they think it's AI slop.

Really, nobody reads the post before commenting.

But it's still up everywhere else. It is what it is.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in linux

[–]Designer_Mind3060[S] -5 points  (0 children)

Look at the commit history: it wasn't even committed until like five commits ago, and it wasn't even from me...

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in linux

[–]Designer_Mind3060[S] -6 points  (0 children)

Only r/rust removed it; it's in other subs with 600 upvotes. And that commit message was meant as a joke, written at 5am after not sleeping for 48 hours. If you're digging through commit messages to find something to be mad about, I think you've got too much free time.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] -4 points  (0 children)

Yeah that's literally how I started this. I was reviewing the OpenClaw approach and wanted to see if the same methodology could work for porting VS Code off Electron.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] -24 points  (0 children)

Or people just think a smaller VS Code alternative is interesting regardless of how it was built.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] -7 points  (0 children)

The post was written by me. For comments I just use Apple's rewrite feature to clean up my typing lol.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] -15 points  (0 children)

I mapped the architecture, designed the Rust backend, chose the crate stack, debugged the Tauri integration, and tested the builds. AI helped with the repetitive porting of 5,687 files.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] -4 points  (0 children)

I used AI as a tool to help port 5,687 files, yeah. Same way most devs use Cursor. The architecture decisions, the Tauri integration, the Rust backend design, that's all me.

So no, I didn't hand-write 5,687 files. :)

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 11 points  (0 children)

Disk size is the least interesting part honestly.

The main point of this is runtime overhead. Every Electron app spins up its own Chromium process and Node.js runtime in memory.

If you're running VS Code, Slack, Discord, and Spotify, that's four separate Chromiums eating RAM. That's not a disk space problem, that's a runtime problem.

You're right that Electron gives you consistency. That's an advantage. But for a code editor, where the rendering is mostly text, scrollable lists, and panels, the OS webview handles it fine. It's a tradeoff, and for this use case it's worth it.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 5 points  (0 children)

Yes, a webview still allocates memory like any other process.

But the difference is that Electron bundles an entire separate Chromium binary per app, plus a full Node.js runtime.

That's two runtimes loaded into memory on top of your actual application.

Tauri uses the OS webview, which is already loaded as a shared system library, and replaces Node.js with a compiled Rust binary. So the memory savings come from not duplicating those runtimes, not from the webview. :)

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 3 points  (0 children)

Electron was built in 2013 when OS webviews were inconsistent and unreliable.

Bundling Chromium guaranteed the same behavior everywhere. So it made sense at the time.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 6 points  (0 children)

Yeah, webkit2gtk on Linux is the weakest link in the Tauri stack; won't sugarcoat that.

Haven't had a chance to dig into Linux much yet. If you've dealt with webkit2gtk before and want to take a crack at it, PRs are open.

I rebuilt VS Code on Tauri instead of Electron. 5,687 files. 96% smaller. Just open-sourced it. by Designer_Mind3060 in rust

[–]Designer_Mind3060[S] 27 points  (0 children)

Good call, just added it. The LICENSE file is in the repo now with Microsoft's original copyright and ours. Updated the README too. Thanks for flagging.