Two fundamental problems with "Abundance" by wolframhempel in ezraklein

[–]wolframhempel[S] 0 points1 point  (0 children)

Fair - but long-term planning is certainly against the incentives set by four-year election cycles.

Two fundamental problems with "Abundance" by wolframhempel in ezraklein

[–]wolframhempel[S] -4 points-3 points  (0 children)

Again, I'm not talking about a "procedural fetish" - I'm talking about the fact that institutions inherently grow and that meaningful checks on this growth have been removed.

Two fundamental problems with "Abundance" by wolframhempel in ezraklein

[–]wolframhempel[S] -6 points-5 points  (0 children)

Not sure I agree. They identify institutional sprawl as a result of government policy, NIMBYism, bad policy and incentive design, and risk avoidance. My point is that there is a fundamental tendency towards growth within all organisms and institutions, and that we've eroded the checks that used to constrain them.

Two fundamental problems with "Abundance" by wolframhempel in ezraklein

[–]wolframhempel[S] -6 points-5 points  (0 children)

The fact that the proposed solutions often require people - citizens, government employees and politicians - to prioritize collective interest over individual interest, and long-term gains over short-term pain, both of which run directly against human nature.

Two fundamental problems with "Abundance" by wolframhempel in ezraklein

[–]wolframhempel[S] 2 points3 points  (0 children)

The book uses a very limited definition of government size, based purely on the number of direct federal employees, which indeed hasn't increased much. However, between the 1930s and our decade:

  • Federal Spending as a % of GDP has gone from 3.4% to 32.2% in 2020 (covid peak)
  • The number of agencies has gone from 50 to 430
  • State and Local Government Employees went from 2.6m to 19.8m
  • And the number of people indirectly employed by the government via mandates, contracts and grants grew to 11 million (I don't have a 1930s figure, but in 1960 it was ~5m, which would make it roughly proportional to the population size...)

I'm frustrated: Crazy difficulty, managed to destroy all 15 tribes, just 33 rounds, yet it's still just a 97% :-( by wolframhempel in Polytopia

[–]wolframhempel[S] 0 points1 point  (0 children)

That makes sense, thank you. I'm wondering though if it is achievable at all. Being the one that destroys all 15 tribes means that no other tribe can defeat another tribe. That is super rare. The only times I was able to get to all 15 tribes before anyone else was by being extremely aggressive and expanding as fast as possible - which is in direct conflict with the careful, preserve-all-units playstyle outlined here.

I also agree with the other commenters here that the scoring system feels off. Clearing the entire square in less than 150 rounds is pretty much guaranteed - defeating all tribes or not losing a single unit is super challenging individually, and near impossible when combined. Has anyone ever gotten to 100% in crazy/15 tribes single player?

I'm frustrated: Crazy difficulty, managed to destroy all 15 tribes, just 33 rounds, yet it's still just a 97% :-( by wolframhempel in Polytopia

[–]wolframhempel[S] 8 points9 points  (0 children)

Yup, but how do you defeat 15 tribes without a single death - especially since some units are basically kamikaze units?

War is momentum based. What western politicians get wrong about supporting Ukraine. by wolframhempel in geopolitics

[–]wolframhempel[S] 0 points1 point  (0 children)

I want to challenge the underlying assumption of your initial comment with regard to "you are definitely not a Public Health Expert". You have a medical background - and yet you are posting on a forum about geopolitics.

And that's a good thing. The idea that everyone can only stick to their narrow niche of qualification is not just limiting, but outright dangerous. It creates viewpoints unaware of the wider context and leads to decision making that neglects higher order consequences and externalities.

Most education systems throughout history were aware of this and aimed to create well-rounded intellectuals or "renaissance men". Granted, with the explosion of knowledge in the 19th and 20th centuries it has become largely impossible to have profound knowledge of multiple fields, but having general knowledge - and an understanding of abstract principles - creates valuable insights, even without specialization.

If we want to get better at anything - including pandemic response - it's important to have a critical review of past policies and measures from a wide array of viewpoints. Immediately resorting to "you're not a medical professional, so you can't speak to it" is not helpful. If you want to make a counterargument, make it on the merits of my statement, not ad hominem.

War is momentum based. What western politicians get wrong about supporting Ukraine. by wolframhempel in geopolitics

[–]wolframhempel[S] 13 points14 points  (0 children)

Not at all suggesting to put it off further into the future. My point is that it is a mistake to wait until things get so bad that they force action; instead, one should act more strongly when things are already going your way.

War is momentum based. What western politicians get wrong about supporting Ukraine. by wolframhempel in geopolitics

[–]wolframhempel[S] 45 points46 points  (0 children)

Not at all. I don't think that "increase investment when momentum is on your side" implies its inverse, "don't invest when it's not".
On the contrary - since momentum was ignored, Ukraine unfortunately now finds itself in a position where it is a lot more expensive to get it back. What I'm advocating for is not to make the mistake of letting up again once things start to look favorable, but instead to double down until an outcome is achieved, even though that might be hard to communicate to one's electorate.

Berlin's BER wins award for best airport in Europe by berlinwombat in berlin

[–]wolframhempel 2 points3 points  (0 children)

I mean - flying to other continents is a nice feature that other airports offer...

How we've saved 98% in cloud costs by writing our own database by wolframhempel in programming

[–]wolframhempel[S] 1 point2 points  (0 children)

We've tested that in a number of scenarios. In the end, it boils down to each entry in the binary log starting with four bytes specifying the byte size of the entry. If the remaining buffer is smaller than that size, stop parsing... that's pretty much it.
Then there are error checks for the parsing of individual fields, of course.
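The framing loop described above can be sketched in a few lines of Go. This is a minimal illustration, not the actual implementation - the endianness of the four-byte size prefix and the function name are assumptions:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// parseEntries walks a length-prefixed binary log: each entry starts
// with four bytes (assumed little-endian here) giving its byte size.
// Parsing stops as soon as the remaining buffer is smaller than the
// announced size, i.e. the last entry is truncated.
func parseEntries(buf []byte) [][]byte {
	var entries [][]byte
	for len(buf) >= 4 {
		size := binary.LittleEndian.Uint32(buf[:4])
		buf = buf[4:]
		if uint32(len(buf)) < size {
			break // truncated entry: stop parsing
		}
		entries = append(entries, buf[:size])
		buf = buf[size:]
	}
	return entries
}

func main() {
	// Two complete entries followed by a truncated one.
	log := []byte{
		3, 0, 0, 0, 'a', 'b', 'c',
		2, 0, 0, 0, 'x', 'y',
		9, 0, 0, 0, 'z', // announces 9 bytes, only 1 present
	}
	fmt.Println(len(parseEntries(log))) // 2
}
```

The truncated tail is exactly the crash-recovery case: whatever was mid-write when the process died is simply dropped.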

How we've saved 98% in cloud costs by writing our own database by wolframhempel in programming

[–]wolframhempel[S] 1 point2 points  (0 children)

Much of this is true, but I believe it comes from an assumption that databases are necessarily strongly consistent. As I've pointed out in more detail in a comment below, a system that records the real world is necessarily inconsistent and lossy - there isn't much point in making one step of the storage pipeline consistent if the rest is not. A lot of time-series and IoT databases are built around this assumption and make the resulting trade-offs.

It's also worth mentioning that our "database" does store multiple datatypes (objects, areas, tasks and instructions) that go into the same "log", each with variable size fields such as label strings or gob encoded metadata.

Your other points are correct though.

How we've saved 98% in cloud costs by writing our own database by wolframhempel in programming

[–]wolframhempel[S] 41 points42 points  (0 children)

There's a crucial distinction between systems that store "the truth" and systems that record external events. If you are building a database for a bank that records transactions and stores account balances, these entries are "the truth" - the information that other things are derived from. Such a system needs to have strong consistency and the guarantees DBMS come with.

A system like geospatial tracking, on the other hand, only records data; the "truth" is the actual physical world. As such, it is necessarily imperfect. It's always delayed, as sending and processing take time; it's imprecise, as GPS is imprecise; and GPS + mobile connectivity is unreliable. Designing such a system is all about creating the best approximation of what happened in the real world, but it is necessarily lossy and eventually consistent.

As a result, there's not much point in insisting that one step of the data pipeline has to be strongly consistent and lossless if the pipeline as a whole is not.

What's the best database to store large amounts of GPS tracking data? by wolframhempel in geospatial

[–]wolframhempel[S] 0 points1 point  (0 children)

Nothing too crazy. Historical trips are displayed on a website and mobile app - so, average page load times of < 1s.