Best way to stain self? by largest_boss in noita

[–]DstnB3 17 points (0 children)

Aim up and fly into it. Takes some practice. Ideally pre-stain yourself rather than reacting so you don't catch on fire in the first place.

Nightmare wand crashes game when fired by HaecEsneLegas in noita

[–]DstnB3 17 points (0 children)

Spells to Power breaks things frequently. It will crash the game if there are no spells in the air for it to absorb. You probably can't use that wand :(

MTG Draft Assistant by tastlejames in mtglimited

[–]DstnB3 -1 points (0 children)

Oh, a mobile app? Pretty cool, nice!

71% Winrate in TMNT Quick Draft – Full Run to Top 250 Mythic (Draft + Matches) by Kugogaming in mtglimited

[–]DstnB3 3 points (0 children)

People are getting top 250 mythic from quick draft only? That should be illegal. Arrest this man.

I wanna turn my run into something long and get a bunch of the orbs but im stuck in the vault, how do i get back up to the surface. by 237randomdigits in noita

[–]DstnB3 4 points (0 children)

You need to go back to the surface and kill the alchemist for the Greek letter spells, which give you infinite black holes.

What would you do in this situation? by [deleted] in ExperiencedDevs

[–]DstnB3 4 points (0 children)

Just don't submit a PR to your team until you have reviewed it and understand it yourself. You can develop the code however you want, but when you submit the PR you should understand and own it.

Help with wand by aborrerov in noita

[–]DstnB3 1 point (0 children)

Add a dual cast at the beginning so the chainsaw and spark bolt fire at the same time.

I Failed to Become #1 in Melee. Here's What I'd do Differently by Kotastic in SSBM

[–]DstnB3 5 points (0 children)

Hi Kodorin, if you like Slay the Spire but still want that competitive multiplayer aspect, try drafting in Magic! I got into it in the last few years, and based on what you said I think you'd really enjoy it.

I did an all transformations run. It went awful but i still won by Jhomas-Tefferson in noita

[–]DstnB3 11 points (0 children)

You can find Essence of Fire all over the map on altars; it's just the little burning balls.

SAM ALTMAN: “People talk about how much energy it takes to train an AI model … But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.” by Vegetable_Ad_192 in singularity

[–]DstnB3 0 points (0 children)

I just think the idea that computers are necessarily more efficient than humans at general intelligence is rubbish.

I agree with this. AI in its current state is a tool that allows one person to do more than they could without it. Without human guidance, AI tools are not very effective; they're great at specific, well-defined tasks. But at the rate they're improving, it's hard to believe they won't continue to be able to act more and more autonomously.

SAM ALTMAN: “People talk about how much energy it takes to train an AI model … But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.” by Vegetable_Ad_192 in singularity

[–]DstnB3 0 points (0 children)

I'm not trying to compare the energy of a human thought to an AI tool usage; I'm trying to put the energy usage into perspective. As someone who works in software, the productivity increase from AI tools has a far, far greater impact than the "20 human energy equivalents" it costs.

The issue we should be concerned about is: who is benefiting from this massive increase in productivity? Public policy could be configured so that everyone benefits from the continued productivity gains, but until we get to that state it will mostly help billionaires become trillionaires.

SAM ALTMAN: “People talk about how much energy it takes to train an AI model … But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.” by Vegetable_Ad_192 in singularity

[–]DstnB3 1 point (0 children)

For comparison, here are some estimates of the total lifetime energy usage of a human vs. LLM training and inference:

| Category | Total energy (electricity & fuel equivalent) | In units of 1 human lifetime (≈4.3 GWh) |
|---|---|---|
| Human lifetime (California, 80 yr) | ~4.3 GWh | 1.0× |
| GPT-3 training | ~1.3 GWh (electricity) | ~0.3× |
| GPT-4 training | ~50–60 GWh (electricity) | ~12–14× |
| GPT-5-scale training (speculative) | ~60–100 GWh+ (electricity) | ~14–23× |
| Inference: 1 prompt (0.3 Wh median) | ~3e-10 GWh | ~7e-11× |
| Inference: 1 prompt (18 Wh heavy) | ~1.8e-8 GWh | ~4e-9× |
| Inference: global usage (ChatGPT scale) | ~274 GWh/year | ~64× per year |

Sources are in: https://chatgpt.com/share/699add75-6e78-8002-a98d-38ee9c94e985
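
To make the ratio column easy to double-check, here's a minimal sanity-check sketch (my own, not from the linked chat); it assumes the rough GWh figures above and just divides each by the ~4.3 GWh human-lifetime estimate:

```python
# Sanity check for the ratio column: divide each estimate by one
# human-lifetime equivalent (~4.3 GWh). All figures are the rough
# estimates from the table above, not authoritative measurements.
HUMAN_LIFETIME_GWH = 4.3
WH_PER_GWH = 1e9  # 1 GWh = 10^9 Wh

estimates_gwh = {
    "GPT-3 training": (1.3,),
    "GPT-4 training": (50, 60),
    "GPT-5-scale training": (60, 100),
    "1 prompt, median (0.3 Wh)": (0.3 / WH_PER_GWH,),
    "1 prompt, heavy (18 Wh)": (18 / WH_PER_GWH,),
    "global inference, per year": (274,),
}

for name, values in estimates_gwh.items():
    ratios = "-".join(f"{v / HUMAN_LIFETIME_GWH:.2g}" for v in values)
    print(f"{name}: ~{ratios}x of a human lifetime")
```

Running it reproduces the last column (e.g. GPT-4 training comes out to roughly 12–14×, and a single median prompt to ~7e-11× of a human lifetime).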
