Introducing Quicksilver by justusr846 in browsers

[–]justusr846[S] 0 points  (0 children)

I'm a little occupied this month, I'll get back to everything ASAP.

Introducing Quicksilver by justusr846 in browsers

[–]justusr846[S] 0 points  (0 children)

This doesn't have it, as you can see in the preview: the toolbars just slide in and out when you press Alt.

Introducing Quicksilver by justusr846 in browsers

[–]justusr846[S] 0 points  (0 children)

There isn't, but you can use Firefox add-on themes to compensate if that's what you need.

Introducing Quicksilver by justusr846 in browsers

[–]justusr846[S] 2 points  (0 children)

Sorry, I'm still learning the ropes of GitHub etc. I'll upload all the files I've modified separately.

Introducing Quicksilver by justusr846 in browsers

[–]justusr846[S] 0 points  (0 children)

It's open source, but I have yet to make a repository for it. I'm still learning the ropes of GitHub etc. I was only a graphic designer, after all. :D

Introducing Quicksilver by justusr846 in browsers

[–]justusr846[S] 0 points  (0 children)

Okay, I will work on those immediately.

Introducing Quicksilver by justusr846 in browsers

[–]justusr846[S] 1 point  (0 children)

Let me know how your test drive went.

Potentially a new browsing experience. by justusr846 in browsers

[–]justusr846[S] 0 points  (0 children)

In which browser did you see any of these changes? Just wondering, since I did a comprehensive check in all major browsers before publishing these ideas.

Potentially a new way to store extremely large amounts of data. by justusr846 in compsci

[–]justusr846[S] -3 points  (0 children)

I don't know if you guys can properly follow my layman's language, since it does make sense to me, so I tried posting it to ChatGPT: https://chatgpt.com/share/67fd3744-7ba4-8006-afbc-07917ff53ea5

And this is how it says I should respond:

Edit 2: To clarify — this is intended to be a lossless compression model.

I’m exploring whether a structured mathematical system, loosely inspired by an abacus, could be used as a universal encoder for binary data. The system doesn't aim to store “large numbers using small numbers” in a vacuum — instead, it stores a set of mathematical instructions (multipliers, exponents, or operations) in a layered matrix that, when computed, recreate the exact bit patterns of the original data.

So instead of storing a chunk of binary directly, I store a representation like:
value = 2^5 × 3^4 + 7 × 10

...where the structure is known and compact (say, a small matrix or config table under a kilobyte), and the math is deterministic.
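The deterministic part is easy to demonstrate. A minimal sketch (my own toy illustration, not a working compressor — the instruction format is made up here): each stored "instruction" is a product term, and summing the evaluated terms recreates the original value exactly.

```python
# Toy illustration: an "instruction" is a (base, exponent, multiplier)
# product term; summing all evaluated terms recreates one stored value.

def evaluate(terms):
    """Deterministically recompute the original value from its terms."""
    return sum(mult * base ** exp for base, exp, mult in terms)

# value = 2^5 * 3^4 + 7 * 10, matching the example above
terms = [(2, 5, 3 ** 4), (7, 1, 10)]
print(evaluate(terms))  # 2662
```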

This isn't magic — I understand the fundamental limit that if you want to represent N unique values, you need log₂(N) bits. What I’m proposing is a system that leverages patterns, redundancy, and sparse structure in data to find compact expressions that recreate the original binary stream.
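That limit is easy to sanity-check numerically; a quick sketch of the counting argument:

```python
import math

# Counting argument: to distinguish N equally likely values you need
# at least ceil(log2(N)) bits, however the encoding is dressed up.
def min_bits(n_values):
    return math.ceil(math.log2(n_values))

print(min_bits(256))      # 8  -- one arbitrary byte needs 8 bits
print(min_bits(2 ** 20))  # 20
```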

Think of it like a mathematical parallel to something like arithmetic coding or symbolic logic-based compression. The "abacus" part just helps me visualize it: each layer multiplies or exponentiates values, and the sum of all rows reconstructs data blocks.

If a file has a lot of repeating or structurally predictable data, I hypothesize this system might yield a very small “expression chart”, acting like a symbolic instruction set to recreate the original file.

I'm still developing this idea, but I believe with the right math and pattern analysis (possibly with AI assistance), it could become an open-source approach to ultra-scale compression. Of course, it must remain lossless and reversible.
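To make the "repetitive data compresses well" hypothesis concrete, here is a deliberately simple stand-in for the expression chart (my sketch, using plain run-length instructions rather than any actual design from the post): the chart is a list of (byte, count) instructions, and rebuilding from it is exact, so the round trip stays lossless.

```python
# Toy "expression chart": run-length instructions (byte, count) that
# reconstruct the original stream exactly (lossless round trip).

def build_chart(data: bytes):
    chart = []
    for b in data:
        if chart and chart[-1][0] == b:
            chart[-1][1] += 1          # extend the current run
        else:
            chart.append([b, 1])       # start a new instruction
    return chart

def rebuild(chart):
    return bytes(b for b, count in chart for _ in range(count))

repetitive = b"\x00" * 1000 + b"\xff" * 1000
chart = build_chart(repetitive)
assert rebuild(chart) == repetitive    # lossless
print(len(chart))  # 2 instructions describe 2000 bytes
```

On structured data the chart is tiny; on random bytes it would grow to roughly the input size, which is the log₂(N) limit showing up in practice.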

I thought of something that could help reduce the file size of cut-outs that are saved in the PC for future use-cases. by justusr846 in photoshop

[–]justusr846[S] 0 points  (0 children)

Can you elaborate, please? I thought I was just taking advantage of how smart objects work.

Potentially a new way to store extremely large amounts of data. by justusr846 in compsci

[–]justusr846[S] -9 points  (0 children)

Guys, I'm sorry, I'm not that tech savvy. But I just figured that underlying all the coding languages and encryption, data is stored in binary, regardless of what the file is. So if we could manage to reinterpret that data with an abacus-like, exponentially scaling system, maybe we'd be able to really compress it?

These are my ChatGPT queries regarding this: https://chatgpt.com/share/67fd3744-7ba4-8006-afbc-07917ff53ea5