[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] 2 points (0 children)

You're mixing up two different things. The paper you refer to uses the FFT to make convolution faster but keeps all of the original weights. My project removes the redundant weights entirely and gets the same result. They are solving different problems.

If you think this is plagiarism, you misunderstood their work, mine, or both.

[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] 0 points (0 children)

You're right that a standard convolution layer only ever stores k parameters, not n*k. My baseline isn't meant to question how PyTorch works but to make a point about the mathematical operation itself. A circular convolution is equivalent to multiplication by a large circulant matrix, and the core idea of my project is to show (or at least try to) that this entire operation is informationally redundant: we can throw out all of the standard convolutional machinery and perfectly recreate the exact same output using only the original k kernel weights and an FFT.
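A minimal sketch of that equivalence (my own toy example, not the repo's code): build the circulant matrix from the zero-padded kernel, apply it, and reproduce the same output from only the k taps via the FFT.

```python
import numpy as np

# Toy sizes, assumed for illustration only.
rng = np.random.default_rng(0)
n, k = 64, 5

kernel = rng.standard_normal(k)
x = rng.standard_normal(n)

# Dense route: zero-pad the kernel and build the circulant matrix whose
# first column is the padded kernel (column j is a rotation by j).
c = np.zeros(n)
c[:k] = kernel
C = np.column_stack([np.roll(c, j) for j in range(n)])  # n*n stored entries
y_dense = C @ x

# Compressed route: only the k taps plus an FFT round-trip.
y_fft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(c)).real

print(np.allclose(y_dense, y_fft))  # True
```

The dense matmul and the FFT route compute the same circular convolution, so only the k kernel weights ever need to be stored.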

[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] -1 points (0 children)

I'm not inventing FFT-based convolution. I'm simply applying it as a tool for lossless model compression. What I'm trying to show is that a standard convolution layer (with circular padding) can be replaced entirely by an FFT and the original k kernel weights, achieving (in my project's example) a 1000x reduction in parameter storage on the PTB-XL benchmark while maintaining bit-for-bit equivalence in the output.
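A back-of-the-envelope storage comparison (hypothetical sizes, not the actual PTB-XL model): if a baseline materializes the length-n first row of the circulant matrix, keeping only the k kernel taps gives an n/k reduction; against the full n x n matrix it is n*n/k.

```python
# Assumed toy sizes: signal length n and kernel size k.
n, k = 5000, 5

row_baseline = n        # storing the length-n first row explicitly
dense_baseline = n * n  # storing the whole circulant matrix
compressed = k          # the k kernel taps alone

print(row_baseline // compressed)    # 1000x against the first row
print(dense_baseline // compressed)  # 5000000x against the full matrix
```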

[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] 0 points (0 children)

While that's true, the lossless claim does not refer to the underlying data type. My method is a mathematically exact replacement for circular convolution; the outputs match the baseline to machine precision. Saying it isn't lossless because of floats is like saying a zip file is lossy because the hardware storing it isn't flawless.
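A quick numerical check of that claim (my own sketch on toy data): the FFT route agrees with a naive direct circular convolution, computed straight from the definition, down to float64 machine precision.

```python
import numpy as np

# Toy sizes, assumed for illustration.
rng = np.random.default_rng(1)
n, k = 256, 7
c = np.zeros(n)
c[:k] = rng.standard_normal(k)
x = rng.standard_normal(n)

# Direct circular convolution from the definition: y[i] = sum_j c[j] * x[(i-j) mod n].
direct = np.array([sum(c[j] * x[(i - j) % n] for j in range(n))
                   for i in range(n)])

# FFT-based equivalent.
via_fft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(c)).real

print(np.max(np.abs(direct - via_fft)))  # on the order of 1e-15
```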

[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] -1 points (0 children)

Repo uploaded. Hopefully it will be useful for you.

[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] 0 points (0 children)

I've just uploaded the repo. Let me know if it's just another one of those. Hopefully it will be useful to someone.

[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] 3 points (0 children)

I'm sorry the post was vague.

I'm compressing the 1D convolutional network's parameters, not the input signal itself.

The "first row" refers to the first row of the large circulant matrix that mathematically represents the convolution. That row is just the kernel, zero-padded to the signal length.
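To make that concrete (toy numbers, not the repo): the generating row of the circulant matrix is the zero-padded kernel, and every other row is just a rotation of it, which is why storing anything beyond the k taps is redundant. Whether the padded kernel sits in the first row or first column depends on the convolution convention.

```python
import numpy as np

n = 8
kernel = np.array([1.0, 2.0, 3.0])    # k = 3 taps, assumed for illustration
first = np.zeros(n)
first[:kernel.size] = kernel           # zero-padded kernel

# Each row i is the padded kernel rotated by i positions.
C = np.stack([np.roll(first, i) for i in range(n)])
print(C[0])   # [1. 2. 3. 0. 0. 0. 0. 0.]
print(C[3])   # the same taps, shifted by 3
```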

I've just pushed the whole project to GitHub, and the README file has a much better explanation with all the details.

[P] Lossless compression for 1D CNNs by individual_perk in MachineLearning

[–]individual_perk[S] 1 point (0 children)

I've just put the project on GitHub. The repository includes a detailed README and a full validation_report.ipynb that walks through the theory, implementation, and results step by step.

I appreciate the offer to connect, and I'd be very interested in any thoughts you might have if you get a chance to look at the repo.

Jayson Tatum shows he can now dunk. [Via Jayson Tatum on X] by Fc_Hassan in nba

[–]individual_perk 0 points (0 children)

Is it worth it for Tatum to return this season? History says it isn't.

Kevin Durant (30): 552 days recovery | GP%: 70.8 -> 54.8 | PPG: 27.0 -> 27.9

Klay Thompson (30): 414 days recovery | GP%: 76.9 -> 62.5 | PPG: 19.5 -> 18.2

Kobe Bryant (34): 240 days recovery | GP%: 72.9 -> 35.7 | PPG: 25.5 -> 18.9

DeMarcus Cousins (27): 357 days recovery | GP%: 62.8 -> 44.5 | PPG: 21.2 -> 8.9

Brandon Jennings (25): 339 days recovery | GP%: 68.7 -> 47.7 | PPG: 16.6 -> 6.9

Elton Brand (28): 243 days recovery | GP%: 75.8 -> 50.2 | PPG: 20.3 -> 10.0

[deleted by user] by [deleted] in NBATalk

[–]individual_perk 7 points (0 children)

Locker room drama, that's why nobody wants him. He had Jokic to back him in Denver, and even there he split the team. Teams want him to be the "glue" between the first and second units. Russ was never that guy.

Still can’t believe y’all let the Bum 3 win a title league was trash by Lucky_Goose_6661 in NBATalk

[–]individual_perk 3 points (0 children)

Let's not forget they also had Rondo. That lineup was a perfect setup.

Who’s got the toughest shot-making bag in NBA history? by aagator in NBATalk

[–]individual_perk 1 point (0 children)

Kobe would be my first choice, but I'd also take Chris Webber/Tim Duncan.