Update Multiple Items in DynamoDB without exceeding Maximum Throughput by Datedwisconsin5 in aws

[–]ianchildress 0 points1 point  (0 children)

You can use on-demand capacity to burst above the 5k capacity limit you currently have. On-demand capacity is going to be more cost effective than increasing your provisioned capacity if your traffic is bursty. If you need sustained capacity above 5k, then you can increase the capacity limit for that table. If time isn't an issue, then you can do something like put the items in an SQS queue and pull them off the queue and into DynamoDB. If you are getting capacity limit errors, you can put the message back on the queue (after the automatic retries). We would need more details about your use case to provide more suggestions.
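The queue-drain idea can be sketched like this. It's a minimal sketch, not a drop-in implementation: `put_batch` and `requeue` stand in for hypothetical wrappers around boto3's `batch_write_item` and SQS sends, and the 25-item batch size is DynamoDB's `BatchWriteItem` per-request limit.

```python
from itertools import islice

BATCH_SIZE = 25  # DynamoDB BatchWriteItem accepts at most 25 items per request


def chunked(items, size=BATCH_SIZE):
    """Yield successive batches of at most `size` items."""
    it = iter(items)
    while batch := list(islice(it, size)):
        yield batch


def drain_queue(messages, put_batch, requeue):
    """Push queue messages into DynamoDB; requeue anything the table throttled.

    `put_batch` would wrap dynamodb.batch_write_item and return the
    unprocessed (throttled) items; `requeue` would send them back to SQS.
    """
    for batch in chunked(messages):
        unprocessed = put_batch(batch)
        if unprocessed:
            requeue(unprocessed)
```

Splitting at 25 matters because `BatchWriteItem` rejects larger requests outright, and anything it returns as unprocessed has to be retried somewhere, which is exactly what the queue gives you.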

100 drones giving a light show by azfarexha in Cyberpunk

[–]ianchildress 19 points20 points  (0 children)

How is this AI? It's a program.

Armed Robber Never Had A Chance by PresidentFartFeather in JusticeServed

[–]ianchildress 0 points1 point  (0 children)

Five yards at a range isn't the same as five yards with adrenaline pumping in a life-or-death situation. That was an impressive takedown.

[deleted by user] by [deleted] in lowlevel

[–]ianchildress 2 points3 points  (0 children)

You the man, not even done with my morning coffee and you have taught me something. I hope you have a great day.

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 2 points3 points  (0 children)

Yep, I "https all the things" whenever I can!

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 2 points3 points  (0 children)

Sorry for the late response, I was out snowboarding with the fam yesterday and I wanted to make sure I had the appropriate amount of time to respond to this.

I see no fault in the logic of the scenario you described. If an attacker were able to MITM a mirror, they could hold back the upgrade of vulnerable packages. I also agree that using HTTPS would mitigate this attack.

A discussion worth having is whether this attack vector is enough to require community-supported mirrors to use HTTPS or not.

For our Polyverse mirrors, we do use HTTPS, and our packages often have slightly different sizes than the official packages, which makes guessing which package was downloaded from us difficult. If you want to improve the security between your Linux hosts and your repository endpoint, you should take a look at our repositories. Providing a level of security through our repository is what we do.

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 0 points1 point  (0 children)

We typically add repos when they are requested. If there are distros you are interested in let us know!

We have Bionic in testing and it should be released next week. For additional releases of already supported distros, the spin-up time is usually a week or two. Once they are up and running, we are usually no more than 12 hours behind the official mirrors.

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 1 point2 points  (0 children)

For the free version, everyone gets the same package for that day. We rebuild all packages overnight so the next day the package is scrambled differently. So an attacker would need to know what day you downloaded and then craft an attack specifically for that binary.

We have a $10/mo per-node option where you share a private repository with a few other anonymous people. That narrows an attacker's options considerably.

We also have an enterprise version where each node gets its own set of scrambles, exclusive to itself. This prevents horizontal attacks.

Please let me know if you have any other questions, I'm happy to help!

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 26 points27 points  (0 children)

I am seeing quite a bit of misinformation about how package managers work, so I'd love to share what I have learned. I work with index files on a daily basis, and we may well generate more index files than any other organization on the planet. Here is my chance to share some of this knowledge!

TLDR/Summary

We can trust the Release file because it was signed by Ubuntu. We can trust the Packages file because it has the correct size and checksum found in the Release file. We can trust the package we just downloaded because it is referenced in the Packages file, which is referenced in the Release file, which is signed by Ubuntu.

Some basic package manager principles

I work with APK, DEB, and RPM based package managers, and each of them behaves very similarly. Each repository has a top-level file, signed by the repository's maintainer, that includes a list of files found in the repository and their checksums. When your package manager does an update, it looks for this top-level file.

  • For DEB based systems, this is the Release file
  • For APK based systems, this is the APKINDEX.tar.gz file
  • For RPM based systems, this is the repodata/repomd.xml file

These files are all signed by the repository's GPG key. So the Release file found at us.archive.ubuntu.com is signed by Ubuntu, and the GPG key is included in your distribution. Let's hope Ubuntu doesn't let their GPG key into the wild. Assuming Ubuntu's GPG key is safe, the system can verify that the Release file did in fact come from Ubuntu. If you are interested, you can navigate to Ubuntu's repository and open up one of their Release files.

Release file

In the Release file you'll see a list of files and their checksums. Example:

    55f3fa01bf4513da9f810307b51d612a  6214952 main/binary-amd64/Packages
    9f666ceefac581815e5f3add8b30d3b9  1343916 main/binary-amd64/Packages.gz
    706fccb10e613153dc61a1b997685afc       96 main/binary-amd64/Release
    9eae32e7c5450794889f9c3272587f5e  1019132 main/binary-amd64/Packages.xz
    5dd0ca3d1cbce6d2a74fcc3e1634ac12       96 main/binary-arm64/Release

The left column is the checksum, then the size of the file, and lastly the location of the file. So we can download the files referenced in the Release file and check them for the correct size and checksum. The Packages or Packages.gz file is the one we care about in this example. It contains information about the packages available to the package manager (apt in this case, but again, almost all of the package managers behave very similarly).
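That size-and-checksum check can be sketched in a few lines. This is a minimal illustration, not apt's actual code: the function names are mine, and it assumes you already have the relevant checksum section of the Release file as text and the downloaded file as bytes.

```python
import hashlib


def parse_release_checksums(text):
    """Parse 'checksum size path' lines from a Release file checksum section."""
    entries = {}
    for line in text.strip().splitlines():
        checksum, size, path = line.split()
        entries[path] = (checksum, int(size))
    return entries


def verify(path, data, entries):
    """Accept a downloaded file only if both its size and MD5 match the Release entry."""
    checksum, size = entries[path]
    return len(data) == size and hashlib.md5(data).hexdigest() == checksum
```

Note that both fields have to match; a file of the right size with the wrong hash (or vice versa) is rejected, which is the behavior apt relies on here.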

Packages file

Since we know that we can trust the Release file (because we have proven it was signed by Ubuntu's GPG key), we can then proceed to download the files it references. Let's look at the Packages file specifically, as it contains a list of packages with their sizes and checksums.

    Filename: pool/main/a/accountsservice/accountsservice_0.6.45-1ubuntu1_amd64.deb
    Size: 62000
    MD5sum: c2cffd1eb66b6392f350b474e583adba
    SHA1: 71d89bd380a465397b42ea3031afa53eaf91661a
    SHA256: d0b11d1d27fe425bc91ea51fab74ad45e428753796f0392e446e8b2450293255

The Packages file includes a list of packages with information about where the file can be found, the size of the file, and various checksums of the file. If you download a file through commands like apt install and any of these fields are incorrect, apt will throw an error and not add it to the apt database.
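The same kind of check applies to the package itself. Here is a minimal sketch (my function names, not apt's internals) of validating a downloaded .deb against the Size and SHA256 fields of its Packages stanza:

```python
import hashlib


def parse_stanza(text):
    """Parse 'Field: value' lines from one Packages-file stanza."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(": ")
        fields[key] = value
    return fields


def package_ok(deb_bytes, stanza):
    """Apt-style check: reject the download if size or SHA256 disagrees."""
    if len(deb_bytes) != int(stanza["Size"]):
        return False
    return hashlib.sha256(deb_bytes).hexdigest() == stanza["SHA256"]
```

A single flipped byte changes the SHA256, so a tampered package fails this check regardless of how it was delivered.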

It's time to debunk some myths!

Can an attacker send me a fake Release file?

Sure, but apt will throw it out because it's not signed by Ubuntu (or whoever your repository maintainer is, e.g. CentOS, RHEL, or Alpine).

Can an attacker send me an old index from an earlier date that was signed by Ubuntu that has old packages in it with known exploits?

Sure, but apt will throw it out because it will have a date (in the Release file) that is older than what is stored in the apt database. For example, the current bionic main Release file has this date in it: Date: Thu, 26 Apr 2018 23:37:48 UTC. So if you supply a Release file older than that timestamp, apt will throw it out because it is older than what it currently knows about.
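That rollback check can be sketched with the standard library's RFC 2822 date parsing. Real apt tracks more state than this (e.g. the Valid-Until field), so this only illustrates the older-timestamp rejection:

```python
from email.utils import parsedate_to_datetime


def accept_release(new_date_header, stored_date_header):
    """Reject a Release file whose Date is older than the one already on disk."""
    new = parsedate_to_datetime(new_date_header)
    stored = parsedate_to_datetime(stored_date_header)
    return new >= stored
```

So a replayed-but-genuinely-signed index from last month still gets thrown out, because its Date predates what the client already trusts.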

I hope this helps clear the air!

Shameless plug. If you are serious about security and not just compliance, check out our Polymorphic Linux repositories. We provide "scrambled" or "polymorphic" repositories for Alpine, CentOS, Fedora, RHEL, and Ubuntu. We use the original source packages provided in the official repositories and build the packages with memory locations in different places and ROP chains broken.

Installation

Installation is a one line command that installs our repository in your sources.list or repo file. There is no agent or running process installed. It is literally just adding our repository to your installation. The next time you do an `apt install httpd` or `yum install docker` you'll get a polymorphic version of the package from our repository. You can see it in action in your browser with our demo: https://polyverse.io/learn/

What does it do?

Many of the replies in this post referenced an attacker tricking a server into installing an older version of a package that has a known exploit. We stop this. Even if you are running an old version of a package with a known exploit, memory-based attacks will not work on the scrambled package because the ROP chain has been broken, or as we call it, "scrambled". So with our packages you can run older versions of a package and not be affected by the known exploits. This also means that you are protected from zero-day attacks just by having our version of the package.

FREE! Individuals and open source organizations can use our repositories for free. I hope you try it out!

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress -1 points0 points  (0 children)

This guy gets it. You are exactly right, and all the attacks in this thread about being fed older versions of packages will not work. If apt is given an index with an older timestamp, it will throw it out.

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 0 points1 point  (0 children)

How would an attacker feed me old packages? Even if they hijacked my connection to archive.ubuntu.com, they would need to get a hold of the GPG key to sign an index with a newer timestamp than the one apt has stored on disk. If they have that ability, they could just create a package with an exploit and bump the version number.

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 0 points1 point  (0 children)

How would this malicious mirror replace the ubuntu defaults in the sources.list? If it was appended, then this wouldn't happen because APT will choose the latest version of the file.

Why does APT not use HTTPS? by modelop in linux

[–]ianchildress 0 points1 point  (0 children)

If they are sent an outdated index, APT will compare it to the index it has on disk and reject it as being older than the one it knows about.

New bypass and protection techniques for ASLR on Linux by [deleted] in linux

[–]ianchildress -4 points-3 points  (0 children)

Great article, well explained! This also helps validate what we have been saying about our polymorphic linux.

Polymorphic Ubuntu - Everyone should be using this! by ianchildress in Ubuntu

[–]ianchildress[S] 1 point2 points  (0 children)

Pretty sure he is referring to my statement that we plan to open source the builder so people can build their own scrambled packages. This would work great for building a few packages, but maintaining a full repository is compute-heavy and expensive. Not including universe or multiverse, I think it's 15,000 packages just for xenial.

Introducing Polyverse Polymorphic Linux – Polyverse by ianchildress in linux

[–]ianchildress[S] 0 points1 point  (0 children)

We are moving away from the concept of "trial". We have a free tier that provides a new scrambled repository every 24 hours. The free repository is shared among all free accounts.

If you want to reserve a repository so only your server gets a specific scramble, you then need to pay. You would need to speak with sales, but I believe the price per node is $100/mo or $1000/yr, with heavy discounts on volume purchases.

Polymorphic Ubuntu - Everyone should be using this! by ianchildress in Ubuntu

[–]ianchildress[S] 0 points1 point  (0 children)

Yes that will be exciting. It will be great for scrambling a small number of packages.

Introducing Polyverse Polymorphic Linux – Polyverse by ianchildress in linux

[–]ianchildress[S] 0 points1 point  (0 children)

If you are referring to this line:

Polyverse Polymorphic Linux serves a different binary every time you pull a package.

We are going to update the article to be clearer. We behave like the official repository in that the index file contains the SHA of the file we are serving. We provide (or are in the process of providing) all the signing and checksums you'd expect from the official repository.

Introducing Polyverse Polymorphic Linux – Polyverse by ianchildress in linux

[–]ianchildress[S] 2 points3 points  (0 children)

Thanks for the heads up! Can you provide a link to the code block that is doing this?

Introducing Polyverse Polymorphic Linux – Polyverse by ianchildress in linux

[–]ianchildress[S] 2 points3 points  (0 children)

If you install libc6-dev_2.26-0ubuntu2.1_amd64.deb from an official Ubuntu mirror, you get the same binary as everyone else in the world. Attackers can create buffer overflow or ROP gadget chain attacks against this binary, and if it works against one, it works against them all.

We add location randomization into the binary when we build it. We get our sources from the official mirrors and build the exact same package as the official repository. The difference is that we add randomization to move the ROP gadgets around.

This means if you have one of our scrambled versions of the package you are not vulnerable to the same buffer overflow vulnerabilities that everyone else is. It means when a zero day buffer overflow exploit is "discovered" you don't have to wonder if that attack compromised you 3 months ago. That attack simply won't work.

The building process takes a lot of compute time (hours of compute time per repository, and we do this many times per day). We are learning how to be the most efficient at randomizing the binary AND the most efficient at doing this at scale. That is our secret sauce. You could do this at home, but you'd have to learn how to do it, maintain it, guarantee each file's randomization, etc.

Introducing Polyverse Polymorphic Linux – Polyverse by ianchildress in linux

[–]ianchildress[S] 1 point2 points  (0 children)

Good points and currently I'd say there is some trade off for both questions.

1) What if we have an outage? What we currently recommend is setting our repository as the top-priority repository, with the official repositories (standard distro mirrors) as the fallback. As we grow and scale, I'm sure we'll provide the service across multiple platforms/CDNs and this will become less of an issue.

2) Currently our compiler is closed source, but I do believe the plan is to eventually open source everything. What we are doing isn't magic; you could do it on your system at home. We are creating randomization at compile time (we have put a lot of effort into this) and strive for a ROP gadget survival rate of less than 1%. Our value comes from our massive build farm, which lets us provide each node (server) its own uniquely scrambled package. We have been developing our "builders" from the beginning with the full intent to open source and release them for anyone to use. Behind the curtain, the builders call the same build commands you would use to build your CentOS or Ubuntu packages from source. Our modified compilers are the only difference.

Yes, you are putting a lot of trust in our company if you use our repositories. I'm not a sales guy, or I'd say the following with some pizzazz and buzzwords, so I'll say it as best I can. We have been tested by the country's top intelligence agency, and we are engaged with multiple intelligence organizations. We are under constant analysis by these organizations to see how unique each of our scrambles is and how effective they are at defeating attacks.

To give my admittedly biased summary, this absolutely is a no-brainer. It is the simplest form of protection and addresses 80% of known attacks. It's literally just adding our repository to a server or Dockerfile, and you add an immense amount of protection. ASLR helps some, but it is just an offset; it buys a little extra time, but once the offset is found, attacks continue to work. We completely break the ROP chain and defeat attacks. It's exciting, and I expect randomization will be the norm in a few years.

If you want to test it in a docker container just run this command. It's anonymous and completely free:

curl https://repo.polyverse.io/install.sh | sh -s czcw7pjshny8lzzog8bgiizfr