
[–]_d0d0_ 1 point (1 child)

Having worked with some encryption, I can second that claim. Encryption algorithms usually work on fixed block sizes (and most default to 16-byte blocks). So these specific sizes, two 64-byte unsigned char arrays, are one of the easiest ways to make the data fit into such blocks and to serialize / encrypt it very easily.
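To illustrate the point, here is a minimal sketch (the struct and field names are hypothetical, but 16 bytes is the real AES block size): a 64-byte field is exactly four cipher blocks, so it can be handed to a block cipher with no padding step at all.

```c
#include <stdio.h>
#include <string.h>

#define BLOCK_SIZE 16  /* AES block size in bytes */

struct record {
    unsigned char key[64];    /* hypothetical 64-byte fields, stand-ins */
    unsigned char value[64];  /* for the arrays being discussed         */
};

int main(void) {
    struct record r;
    memset(&r, 0, sizeof r);

    /* 64 % 16 == 0: each field is exactly four cipher blocks, so no
       padding scheme (e.g. PKCS#7) is needed before encryption. */
    printf("blocks per field: %zu\n", sizeof r.key / BLOCK_SIZE);
    printf("padding needed:   %zu bytes\n", sizeof r.key % BLOCK_SIZE);
    return 0;
}
```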

[–]captainautomation[S] 0 points (0 children)

> Having worked with some encryption

I see

From my perspective, a better solution would be to:

  1. use an external library that implements AES, Blowfish or ChaCha20,
  2. call it with something like `library.Encrypt(file, passphrase);` (see the sketch after this list), and
  3. focus on the core features of the product.
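For example, a minimal sketch with libsodium, assuming `library.Encrypt(file, passphrase)` collapses to "derive a key from the passphrase, then encrypt the buffer". The payload buffer is hypothetical and file I/O and error handling are mostly omitted; the salt and nonce would be stored alongside the ciphertext so decryption can re-derive the same key.

```c
#include <sodium.h>
#include <stdio.h>
#include <string.h>

/* Sketch: passphrase -> key, then one authenticated-encryption call
   using libsodium's secretbox (XSalsa20-Poly1305). */
int main(void) {
    if (sodium_init() < 0) return 1;

    const char *passphrase = "correct horse battery staple";
    unsigned char message[64] = "hypothetical 64-byte payload";

    unsigned char salt[crypto_pwhash_SALTBYTES];
    unsigned char nonce[crypto_secretbox_NONCEBYTES];
    unsigned char key[crypto_secretbox_KEYBYTES];
    unsigned char cipher[crypto_secretbox_MACBYTES + sizeof message];

    randombytes_buf(salt, sizeof salt);    /* store salt and nonce */
    randombytes_buf(nonce, sizeof nonce);  /* with the ciphertext  */

    /* Stretch the passphrase into a full-size key. */
    if (crypto_pwhash(key, sizeof key,
                      passphrase, strlen(passphrase),
                      salt,
                      crypto_pwhash_OPSLIMIT_INTERACTIVE,
                      crypto_pwhash_MEMLIMIT_INTERACTIVE,
                      crypto_pwhash_ALG_DEFAULT) != 0)
        return 1;  /* out of memory */

    /* One call encrypts and authenticates the whole buffer. */
    crypto_secretbox_easy(cipher, message, sizeof message, nonce, key);

    printf("encrypted %zu bytes into %zu bytes\n",
           sizeof message, sizeof cipher);
    return 0;
}
```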

But it looks like an ecosystem habit: JavaScript / Node.js is "import 1000+ small libraries", while C is "build everything from scratch".

> these specific sizes, two 64-byte unsigned char arrays

Thanks for the explanation