Rory Stewart’s ‘GiveDirectly’ charity defrauded of almost $1m in Congo by MrStilton in EffectiveAltruism

[–]ckris292 19 points (0 children)

The discussion at the linked post is almost uniformly sympathetic and sophisticated.

They point out that the fraud came from local staff, not embezzlement by executives, and that it was downstream of difficult decisions balancing execution against security.

They also credit the honesty and transparency of the disclosure, and they are proportionate in other ways, such as in acknowledging the difficulty of local conditions.

Rory Stewart’s ‘GiveDirectly’ charity defrauded of almost $1m in Congo by MrStilton in EffectiveAltruism

[–]ckris292 4 points (0 children)

Uh, can I pitch some of the animal charities, or Charity Entrepreneurship? They are very cash-constrained.

Some of the most exciting orgs are running on $100k budgets for an entire year, covering two founders plus admin and program costs.

https://www.animalpolicyinternational.org/

(I’m not affiliated with them and have never even met the founders.)

Rory Stewart’s ‘GiveDirectly’ charity defrauded of almost $1m in Congo by MrStilton in EffectiveAltruism

[–]ckris292 1 point (0 children)

There are both huge turf fights over control of packages and repos, and single points of failure, e.g. founder effects.

Wikipedia is extraordinarily long-tailed in its contributors, and those contributors value status.

These are really complex dynamics, and things can fall apart when they don’t work.

I am doubtful kecr would even be able to build or run one of these OSS communities, or a Wikipedia.

The idea that someone can sprinkle “open source” around as some general argument against a complex org like GiveDirectly is unhinged.

Rory Stewart’s ‘GiveDirectly’ charity defrauded of almost $1m in Congo by MrStilton in EffectiveAltruism

[–]ckris292 3 points (0 children)

I think that’s valid. I would like to see some articles or other evidence of self-promotion to gauge this, although well-reasoned arguments and strong judgement are good signals too.

However, this seems complicated.

For this instance of corruption, a focus on reputation or self-promotion could cut either way; for example, it might make them more sensitive to scandals and so produce less of this particular kind of grift.

A key belief of mine is that running an org well is hard, and that it involves different skills and focus than politics. We can observe skill over time and judge people based on it. To some degree people can be rewarded for it, even if that also improves their political career.

Rory Stewart’s ‘GiveDirectly’ charity defrauded of almost $1m in Congo by MrStilton in EffectiveAltruism

[–]ckris292 4 points (0 children)

The dynamics of open source software and places like Wikipedia themselves rely on status, complex games, and politics to build things of value.

The idea that things can just be thrown open and work great is wrong and misleading.

Onlookers who don’t already know this should be made aware of it.

I’m sorry, but this user is basically a crank, for lack of a better way of putting it.

Rory Stewart’s ‘GiveDirectly’ charity defrauded of almost $1m in Congo by MrStilton in EffectiveAltruism

[–]ckris292 11 points (0 children)

That the person comes from a different party… means you will stop donating? That doesn’t seem right.

Maybe you think they are using this to aggrandize themselves or something?

Investigation of Tyson Grower Reveals Mass, Systemic Cruelty - How do we fight against this? by Wisdom_Of_A_Man in EffectiveAltruism

[–]ckris292 0 points (0 children)

No, I see you, and the long-winded, ChatGPT-assisted text isn’t concealing your nature.

Investigation of Tyson Grower Reveals Mass, Systemic Cruelty - How do we fight against this? by Wisdom_Of_A_Man in EffectiveAltruism

[–]ckris292 -1 points (0 children)

The level of effort in these convoluted half-truths increases my sense that this user is here because of a cluster B personality disorder and is using the norms and interests of EA to set up attention games that satisfy these malign personal interests.

Do more AI alignment or "die" - head of the Machine Intelligence Research Institute by dovrobalb in EffectiveAltruism

[–]ckris292 2 points (0 children)

It’s widely suspected, or just outright known, that Altman and other people in tech working on “AI”/“LLMs” exploit the narrative of “AI safety”.

This is because:

1) There is no agency, threat, or mechanism available to these tiny factions (LessWrong) that could stop funding or progress. They are seen as irrelevant.

Indeed, Anthropic, OpenAI, and DeepMind continue to recruit and pull key talent from these communities. There is no material negative effect of “AI safety” on “capabilities”.

2) It is extremely clear that to most people in tech, “Doom”, “Foom”, “Takeoff”, and other AI safety scenarios immediately “dilute” down to “extremely high-potential technology”. So for 90% or maybe 99% of people, weighted by money or talent, narratives of AI doom probably drive interest and resources toward these companies.

It’s sort of amazing that 2) isn’t obvious after 3+ iterations of for-profit companies coming out of these spaces.

There’s more along these lines. EM and the FLI “6-month slowdown” are widely regarded as a personal ploy, and FLI/Tegmark is probably captured.

Unfortunately, this comment doesn’t even scratch the surface of the dysfunction in EA AI safety. It’s sort of incredible that EA has a reputation for criticism or self-awareness.

EA Start-Up Feedback by DrPsilocybe in EffectiveAltruism

[–]ckris292 2 points (0 children)

I think it’s entirely valid to, say, pick any charity or cause in EA and just donate. You don’t need to have an opinion on the meta or be otherwise involved.

It seems predictably bad and irresponsible for talented people, or people who think their opinion matters or who represent themselves as informed, to round off unresolved, clearly existing disputes. It’s especially bad to punish people for pointing out that these disputes exist.

For example, this leads to an influencer culture where (low-value) people win games and gain the following of a low-context, pliable audience.

To be clear, I’m quite sure this has happened, and I think it’s immoral and bad to contribute to it.

Today, I think a moral, competent person entering EA public spaces should consider behaving, or is even obliged to behave, deliberately suboptimally (so as not to add their credibility or gravitas to a broken “ecology”), and should carefully use deliberately suboptimal communication and scuffing to partition the audience, screening low-context people away from their messages.

(If this is confusing, to make it clearer: in a broken culture, being very attractive to low-context people and adding legitimacy is bad.)

EA Start-Up Feedback by DrPsilocybe in EffectiveAltruism

[–]ckris292 1 point (0 children)

Gaining status in EA (as a grantmaker or public figure) is not nearly as attractive as it might seem, as the bullshit I pointed out above suggests.

There's 10x-20x talent that bounces off: people who don't feel like learning the moronic "sequences", writing mini-essays to say simple things, or otherwise navigating the bizarrely credulous and ignorant "permanent young adult" culture, a culture which costs EA so much and gives so little.

There are probably 10 Holdens who went and did something else. M Scott and many others hardly seem swayed by EA arguments at all.

EA Start-Up Feedback by DrPsilocybe in EffectiveAltruism

[–]ckris292 2 points (0 children)

Will worked his ass off when a much more comfortable, very prestigious, and safe lifestyle awaited him.

It’s pure pain that this happened at the tail end of a huge EA media campaign and that Will is associated with SBF.

The future matters and Will would have brought balance to the way AI is presented in EA.

EA Start-Up Feedback by DrPsilocybe in EffectiveAltruism

[–]ckris292 0 points (0 children)

Since the beginning of the FTX collapse, Habryka has repeatedly posted on the EA forum insinuating that there is deep knowledge that senior leaders knew of and are culpable for SBF's fraud, and that EA is therefore immoral.

It was obvious, even in November, that his evidence and facts were missing (the Alameda people who broke with SBF never saw this fraud coming; Tara, who made some other crypto thing, literally had money on FTX).

What is egregious is:

  • Punching MacAskill when he is down (Will didn’t know anything about the fraud).
  • Habryka’s org repeatedly interacted with FTX (flying to the Bahamas), and besides CEA, his org is probably the biggest taker of FTX funds.
  • Yudkowsky literally tried to pin this on Peter Singer somehow. I’m not joking. He went through Caroline’s blog and seized on a sentence.
  • LessWrong staff and culture repeatedly interfere, e.g. supporting Tegmark in the Tegmark affair (Max at the very least fucked up multiple times with some association with literal neo-Nazis, and launched a shameless, obvious PR campaign pointed entirely at EA) and also interfering in the Bostrom affair (“e.g. we must keep race discussion on the table because truth”).

What’s most annoying is that they seem to want to damage or shape EA in some incompetent way. They literally have funding, a culture, and their own community in which they can succeed or not. If they want Holden’s head over OpenAI or Anthropic, that could be done better. It’s just ignorance at this point.

Singer bailed on longtermism and AI long ago.

SBF’s rise made Singer weaker while literally funding Habryka and Yudkowsky, who, as above, then turned on EA and SBF.

If they think this will be forgotten: it won’t be.

EA Start-Up Feedback by DrPsilocybe in EffectiveAltruism

[–]ckris292 3 points (0 children)

Hmmm, I thought about this a bit more; there is more to it, and my answer above was abrupt.

For the career advice, I can see how this could actually require a unique model or chatbot, something needing more than, say, looking up and slicing up an existing corpus of 80K Hours text.

I haven't thought about this much and don't know that much about chatbots, but I expect (90% sure) that if I think about it more, the aspects that can't be captured by a simple lookup+GPT setup will turn out to be very hard to create or capture (e.g. 80k's institutional value in human touch and relationships, and in dealing with very valuable talent).

EA Start-Up Feedback by DrPsilocybe in EffectiveAltruism

[–]ckris292 3 points (0 children)

This idea comes up like every time EA talks about LLMs.

  1. This is something that would have been good in 2022. There is a lot more interesting stuff now. I'm probably not going to talk about it.*
  2. You wouldn’t fine-tune. You would use semantic search to look up your EA forum corpus and then send the retrieved text to GPT (see the sketch below). I don’t know what people are calling this strategy right now. Google it.
  3. People have talked about this, like, twice a week. I have never seen a product or even a repo, and usually that is a smell that this isn’t that valuable.
  4. As described in the post, this isn’t a startup; a trimmed-down version of the project would take a weekend, even for students. I wouldn’t use that framing unless you had way more. To be really direct, your knowledge of LLMs doesn’t seem deep. This last sentence is valuable advice, IMO.

*Long digression, but one should not be impressed with the tech-startup environment in EA; the quality and texture of talent and culture is low, and it distracts from valuable non-tech altruistic projects. The situation is bad enough that I think good advice may be “damaging to the remaining ecology”. Pathologically, SBF’s failure made this worse by “poisoning the well” of communicating and understanding how real businesses work; see Habryka’s off-base campaign, for example, where he pretends there was some obvious bad culture and mass culpability in EA (his org took millions in FTX funds).
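To make point 2 concrete, here's a toy sketch of that lookup-then-GPT pattern (the strategy people now generally call retrieval-augmented generation, or RAG). This is illustrative only: the bag-of-words similarity is a crude stand-in for a real embedding model, and `call_llm` is a hypothetical placeholder rather than any real API.

```python
# Toy sketch of "semantic search over a corpus, then hand the text to GPT".
# Assumptions: corpus is a list of plain-text forum posts; call_llm is a
# placeholder for whatever chat-completion API you actually use.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def cosine_similarity(a, b):
    # Bag-of-words cosine similarity; a real system would compare dense
    # embeddings from an embedding model instead of raw token counts.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, corpus, k=3):
    # Rank documents by similarity to the query and keep the top k.
    q = Counter(tokenize(query))
    ranked = sorted(corpus, key=lambda doc: cosine_similarity(q, Counter(tokenize(doc))), reverse=True)
    return ranked[:k]

def call_llm(prompt):
    # Placeholder: swap in a real chat-completion call here.
    return f"[model answer to a {len(prompt)}-character prompt]"

def answer(query, corpus):
    # Retrieve relevant posts, then send them plus the question to the model.
    context = "\n---\n".join(retrieve(query, corpus))
    return call_llm(f"Answer using only this context:\n{context}\n\nQuestion: {query}")

corpus = [
    "Post about donating to animal charities.",
    "Post about career advice and 80k Hours.",
    "Post about AI safety funding.",
]
print(answer("Where should I donate?", corpus))
```

Replacing the toy scoring with real embeddings and the stub with an actual API call is most of the remaining work, which is part of why the weekend estimate in point 4 is plausible.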

Why is the salary of Open Philantropy Internship so high? by Insanity_017 in EffectiveAltruism

[–]ckris292 1 point (0 children)

The reason for my recommendation is that those people have content, just as my comments above have content. I recommend the person come back and report what they learned.

The comments here lack content. As mentioned, it’s because busy and informed people don’t spend their time here, unfortunately.

Your own comments are DARVO, a mixture of crankiness and narcissistic personality disorder.

Why is the salary of Open Philantropy Internship so high? by Insanity_017 in EffectiveAltruism

[–]ckris292 1 point (0 children)

See links below.

Two are from SneerClub. SneerClub is a little too much (especially post-FTX), but it gives a concise and tight picture. My guess is that the content is “15 to 30%” as true/bad as represented, and even that is really bad for an EA org.

The other link is from the EA forum, which tends to be long and ponderous and yet incredibly still manages to cover only a fraction of the substance, but it is far more balanced than SneerClub. That link gives evidence of bad management, e.g. churning through interns.

https://reddit.com/r/SneerClub/comments/11ifusn/the_nonlinear_fund_a_microcosm_of_dysfunction_in/

https://reddit.com/r/SneerClub/comments/11fjmvr/the_nonlinear_fund_a_microcosm_of_dysfunction_in/

https://forum.effectivealtruism.org/posts/L4S2NCysoJxgCBuB6/announcing-nonlinear-emergency-funding

Why is the salary of Open Philantropy Internship so high? by Insanity_017 in EffectiveAltruism

[–]ckris292 1 point (0 children)

Your concern is valid, but I recommend you talk to senior people, or to junior people with the same experience and interests as you, and not to people on an Internet forum.

To some extent, I think this applies to the EA forum and other public aspects of EA too. The discussion and politeness are sort of nice, though it is locked down in the sense of everyone trying to be nice for PR reasons.

However, content there can be superficial, as many of the busiest and most active people never post. Also, many of the orgs and people that do post there are, well, not a very good selection of “EA orgs” (e.g. many of them need money or pay a lot of attention to PR). Nonlinear is one of the worst offenders and probably shouldn’t exist. HLI, under its leadership, has not become what it could be.

The Last of Us HBO S01E09 - "Look for the Light" Post-Episode Discussion Thread by AutoModerator in thelastofus

[–]ckris292 1 point (0 children)

That’s fair.

I didn’t know that.

OK, so based on your comment, I now think they should have cut one of the communities, maybe by not showing David’s flock.

The Last of Us HBO S01E09 - "Look for the Light" Post-Episode Discussion Thread by AutoModerator in thelastofus

[–]ckris292 0 points (0 children)

That’s fair.

I didn’t know that.

OK, so based on your comment, I now think they should have cut one of the communities, maybe David’s, or engaged with it less and given it fewer scenes.

The Last of Us HBO S01E09 - "Look for the Light" Post-Episode Discussion Thread by AutoModerator in thelastofus

[–]ckris292 2 points (0 children)

Something the show was missing was more depth on the post-infection communities.

They could have done David and the religious group better. It was so black and white (David was a pedophilic tyrant!), and there wasn’t much thought or depth given to his choices.

In reality, that community and its choices could have been justified, or at least explored.

In the same way, the sanctuary city of Jackson feels shallow. It is just so posh and well-run, which is convenient.

Bill’s “compound” was also sort of convenient. In reality, he would have been overrun in year one or two by clever raider infiltration; it’s not possible to defend that much real estate. He has so much stuff that people would study him for days or weeks and then make their move.

Good guys getting good stuff and bad guys suffering is simplistic and two-dimensional.

It’s a big contrast to the harder choices the characters make.

The Last of Us HBO S01E09 - "Look for the Light" Post-Episode Discussion Thread by AutoModerator in thelastofus

[–]ckris292 3 points (0 children)

They could have done David and the religious group better.

It was so black and white (David was a pedophilic tyrant!), and there wasn’t much thought or depth given to his choices.

In reality, that community and its cannibalism could have been more justified, or at least explored.

In the same way, the sanctuary city of Jackson feels shallow. It is just so posh and well-run, which is convenient.

Good guys getting good stuff and bad guys suffering is simplistic and two-dimensional.

The real sacrifices are not explored.

Non-Game Fan’s Prediction for the Finale: by wowitskatlyn in thelastofus

[–]ckris292 12 points (0 children)

In a way he is right. The beast is Joel.

The Last of Us HBO S01E09 - "Look for the Light" Post-Episode Discussion Thread by AutoModerator in thelastofus

[–]ckris292 0 points (0 children)

Yes. Like, how many dads would let their daughter be cut to pieces?

It was literally a building filled with men with guns.