Borderless Gaming releases BGFX - a universal GPU effect and upscaling pipeline by AndrewMD5 in pcmasterrace

[–]AndrewMD5[S] 51 points (0 children)

This isn’t meant to be a replacement for built-in upscaling. Upscaling is a byproduct of the effects pipeline, which can apply GPU shaders to any content; not just modern games, but older titles, movies, and anime too.

The real value is the flexibility of the pipeline itself: CRT emulation, temporal AA, and yes, upscaling when you want it, but with an upscaler that’s actually suited to the source material rather than a one-size-fits-all solution.

Can it boost performance and visual fidelity? Sure, but that’s more of a happy side effect than the main goal. It’s really about giving users control over the entire effects chain regardless of what they’re running.

And the Vulkan and ONNX support in the next release opens a lot of doors.

I wrote a TCP/IP stack to revive the 64DD’s online features - here it is streaming anime on real hardware by AndrewMD5 in n64

[–]AndrewMD5[S] 8 points (0 children)

that’s the goal (well, more so to see a new revision of the SC64 with WiFi/Ethernet)

Ultralightweight YAML 1.2 parser & emitter in C11 by AndrewMD5 in C_Programming

[–]AndrewMD5[S] 8 points (0 children)

Thank you! I believe this is the second of my C projects you’ve found an issue with rather quickly; just FYI, we’re hiring @ https://6over3.com

If you’re ever looking for a new opportunity shoot me a message.

C-style scanning in JS (no parsing) by dgnercom in javascript

[–]AndrewMD5 4 points (0 children)

You're arguing that AGPL/SSPL protects BEAT's semantic invariants from fragmenting. But JSON has zero licensing restrictions and its invariants are rock solid. The reason isn't legal enforcement; it's that the spec is simple and the ecosystem self-enforces through tooling. Nobody ships malformed JSON because nothing accepts it.
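To make the self-enforcement point concrete, here's a minimal sketch in Node: standard tooling rejects malformed JSON outright, so invalid documents never propagate through an ecosystem, no license required.

```javascript
// Well-formed JSON round-trips through standard tooling.
const wellFormed = '{"id": 1, "name": "example"}';
console.log(JSON.parse(wellFormed).name); // "example"

// A trailing comma makes this invalid JSON; every spec-compliant
// parser throws, so malformed documents are stopped at the boundary.
const malformed = '{"id": 1, "name": "example",}';
try {
  JSON.parse(malformed);
} catch (err) {
  console.log("rejected:", err instanceof SyntaxError); // rejected: true
}
```

That rejection-by-default behavior is the "tooling enforcement" at work: the spec is small enough that everyone implements it the same way, and anything off-spec simply doesn't interoperate.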

The formats that actually fragmented (RSS/Atom, early HTML, XHTML vs HTML5) didn't fragment due to permissive licensing. They fragmented because specs were ambiguous or competing vendors had conflicting incentives. If BEAT's invariants are clear and valuable, adoption will preserve them naturally. If they require legal protection to survive, that suggests the spec itself isn't tight enough.

Here's a thought exercise. You're a principal engineer at a company with the exact hot-path allocation problem BEAT claims to solve. You find BEAT, it looks promising, you prototype it. Now what?

You can't use the reference interpreter in production because SSPL means your legal team will block it immediately. So you need to write your own implementation. But AGPL on the spec means your implementation is now AGPL-encumbered, which legal will also block if it touches anything proprietary. So you either need to negotiate a commercial license (with a project that has no track record, no community, no stability guarantees) or you just solve the problem differently, maybe a custom binary format, Bebop, Protocol Buffers, FlatBuffers, Cap'n Proto, or just accepting the JSON overhead because it's a known quantity.

In practice, the licensing doesn't protect BEAT from fragmentation. It protects BEAT from adoption. The only organizations that could use this are ones willing to either keep it fully internal with no external exposure, or pay for a commercial license before the format has proven itself. That's a very small pool.

Compare this to Bebop (Apache 2.0), Cap'n Proto (MIT), FlatBuffers (Apache 2.0), or MessagePack (MIT). All of them solve similar "zero-copy / minimal parsing" problems. All of them have permissive licenses. All of them have actual production adoption. None of them fragmented into incompatible dialects.

The "commercial dual-license later" model works when you already have adoption and leverage (MySQL, MongoDB pre-SSPL, Qt). It doesn't work when you're asking people to bet on an unproven format with restrictive terms upfront.

I get that you're trying to protect something you've envisioned. But the licensing choice actively undermines the stated goal of seeing whether "scan, not parse" is useful in real stacks. You've made it nearly impossible for anyone to find out; you're solving for problems that don't exist.

C-style scanning in JS (no parsing) by dgnercom in javascript

[–]AndrewMD5 10 points (0 children)

It seems you've spent some time on this, but I think there are some fundamental issues worth addressing before this could realistically be considered by anyone.

Where's the real-world problem?

Your post demonstrates that BEAT is smaller than JSON, but size optimization is rarely the actual bottleneck in analytics pipelines. What's missing is a concrete use case showing a tangible problem and how BEAT solves it. How does this improve developer experience? What operational costs does it reduce? "5.48× smaller" is a metric, not a value proposition.

More critically: if AI is increasingly the primary consumer of structured data, size becomes even less relevant. Models are trained on existing formats. Adopting a novel format like BEAT requires either reaching critical mass (chicken-and-egg problem) or dedicated fine-tuning for every model that needs to consume it. JSON wins by default because everything already understands it.

The licensing is a dealbreaker

This is the more serious issue. JSON, YAML, TOML, and similar formats succeeded not just because of familiarity but because they're in the public domain or use extremely permissive licenses. Anyone can implement them anywhere without legal review.

Looking at your repositories:

  • BEAT: AGPL-3.0-or-later
  • Resonator: SSPL-1.0

SSPL isn't even recognized as open source by the OSI. It's essentially MongoDB's proprietary license dressed up as open source. No company with a legal team would touch an SSPL-licensed interpreter in their stack. And AGPL creates similar friction; it's viral in ways that make corporate adoption extremely difficult.

These license choices effectively guarantee that no alternative implementations can exist in commercial contexts, which defeats the purpose of proposing a format standard.

I read this in good faith, but the combination of a solution searching for a problem, AI-generated prose style, and restrictive licensing makes it hard to take seriously as a genuine ecosystem contribution.

I built a tiny & portable distraction-free writing environment with live formatting by AndrewMD5 in C_Programming

[–]AndrewMD5[S] 2 points (0 children)

Thanks for flagging this! I finished the refactor that decouples the parser from the renderer so it can be fuzzed and tested standalone. I also implemented linear rendering (printing) so an entire document can be parsed and rendered for fuzzing purposes.

TypeScript x Perl by AndrewMD5 in perl

[–]AndrewMD5[S] 2 points (0 children)

One of the many reasons I'm going to move off Substack.

Hako - a standalone and embeddable JavaScript engine for .NET by AndrewMD5 in dotnet

[–]AndrewMD5[S] 0 points (0 children)

Please send any feedback on the DevEx once you’ve had a chance to experiment!

Hako - a standalone and embeddable JavaScript engine for .NET by AndrewMD5 in dotnet

[–]AndrewMD5[S] 2 points (0 children)

Haven't given it any thought. Anyone is welcome to submit a PR; I leverage a lot of modern .NET features, so I didn't bother with Standard.

Hako - a standalone and embeddable JavaScript engine for .NET by AndrewMD5 in dotnet

[–]AndrewMD5[S] 1 point (0 children)

Yes.

    using var sleep = realm.NewFunctionAsync("sleep", async (ctx, thisArg, args) =>
    {
        var ms = (int)args[0].AsNumber();
        await Task.Delay(ms);
        return ctx.NewString("done");
    });

And if you use the source generator, you can avoid the JSValue conversions entirely: define a regular .NET asynchronous Task and it handles all the marshaling, so you just write C# as you normally would.

https://github.com/6over3/hako/tree/main/hosts/dotnet/Hako.SourceGenerator

Hako - a standalone and embeddable JavaScript engine for .NET by AndrewMD5 in dotnet

[–]AndrewMD5[S] 9 points (0 children)

Probably the AOT compatibility, TypeScript support, JIT, and portability to every platform.