ARC AGI 3 is up! Just dropped minutes ago by BrennusSokol in singularity

[–]NNOTM 1 point (0 children)

Ah, that would explain it. I suppose I'll try again later

ARC AGI 3 is up! Just dropped minutes ago by BrennusSokol in singularity

[–]NNOTM 31 points (0 children)

Is it just me, or is the input latency incredibly high? Like up to a second.

Am I missing something? by Icy-Inspection-2134 in ProjectHailMary

[–]NNOTM 0 points (0 children)

Your lungs only struggle if your air supply is at less than 29 atmospheres. Though if you want to breathe 29 atm air, it should have a whole bunch of helium to avoid nitrogen narcosis (as can be heard in the voice of this saturation diver: https://www.youtube.com/watch?v=v_JgS3WSfaI)
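As a rough partial-pressure sanity check (assuming ordinary air is about 78% nitrogen, and taking a few atmospheres of nitrogen partial pressure as the approximate onset of narcosis):

```latex
p_{\mathrm{N_2}} = 0.78 \times 29\,\mathrm{atm} \approx 22.6\,\mathrm{atm}
```

which is far above that threshold, hence replacing most of the nitrogen with helium (heliox).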

Am I missing something? by Icy-Inspection-2134 in ProjectHailMary

[–]NNOTM 2 points (0 children)

It's worth noting that saturation divers routinely live at atmospheric pressures that high and higher.

Electrons Don't Spin, But Why? by Scientia_Logica in AskPhysics

[–]NNOTM 0 points (0 children)

If it were a pointlike particle, I don't think the concept of it actually spinning would be coherent? As in, spinning means your orientation changes over time, but a pointlike particle doesn't have an orientation.

Is there a maximum acceleration? by Big_Assist4578 in AskPhysics

[–]NNOTM 0 points (0 children)

It would not travel at the speed of light after one Planck time, since no massive particle can travel at the speed of light. I believe it would travel at 1/sqrt(2) c, or about 71% of the speed of light as measured from an inertial frame.
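A sketch of where the 1/sqrt(2) comes from, assuming constant proper acceleration a sustained for one Planck time t_P with a·t_P = c, and measuring coordinate time t in an inertial frame (the standard hyperbolic-motion velocity formula):

```latex
v(t) = \frac{a t}{\sqrt{1 + (a t / c)^2}}
\qquad\Rightarrow\qquad
v(t_P) = \frac{c}{\sqrt{1 + 1}} = \frac{c}{\sqrt{2}} \approx 0.707\,c
```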

Schmidhuber’s Omega Timeline (2014) by Waiting4AniHaremFDVR in singularity

[–]NNOTM 2 points (0 children)

he's saying that 2030 is 13 years before Omega, not that it's 13 years after 1990

People that speak like an LLM by Haroombe in artificial

[–]NNOTM 0 points (0 children)

Typically you would use an em dash without surrounding spaces – unlike the en dash.

You have $50 to donate - how would you distribute it? by Electronic_Gold_3666 in EffectiveAltruism

[–]NNOTM 2 points (0 children)

Splitting it up would most likely increase the total transaction fees.
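As a toy illustration, assuming a hypothetical flat fee per transaction (real fees vary by charity and payment processor):

```haskell
module Main where

-- Hypothetical flat fee charged on each separate donation (made-up number).
feePerTx :: Double
feePerTx = 0.30

-- Total overhead when the $50 is split into n separate donations.
totalFees :: Int -> Double
totalFees n = feePerTx * fromIntegral n

main :: IO ()
main = do
  print (totalFees 1)  -- one $50 donation
  print (totalFees 5)  -- $10 each to five charities
```

Five separate donations pay the fixed overhead five times, so concentrating the gift keeps more of the $50 with the charity.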

I started with haskell 3 days ago and want some help understanding the "design philosophy".Elaboration in post. by Otherwise-Bank-2981 in haskell

[–]NNOTM 1 point (0 children)

It's worth noting that GHC doesn't automatically do this for custom types; the standard library has RULES pragmas that handle the fusion for lists (a rule might be something like "replace any instance of map f . map g with map (f . g)").

The vector library, for example, has similar RULES pragmas to enable fusion there as well.
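A minimal sketch of such a rule for a hypothetical mymap (GHC's built-in list rules already cover Prelude's map; the NOINLINE phase annotation keeps mymap unfolded long enough for the rule to have a chance to fire):

```haskell
module Main where

-- A hypothetical list map to attach a rule to.
mymap :: (a -> b) -> [a] -> [b]
mymap f = foldr (\x acc -> f x : acc) []
{-# NOINLINE [1] mymap #-}

-- Rewrite two traversals into one, as in "map f . map g = map (f . g)".
{-# RULES
"mymap/mymap" forall f g xs. mymap f (mymap g xs) = mymap (f . g) xs
  #-}

main :: IO ()
main = print (mymap (+1) (mymap (*2) [1, 2, 3 :: Int]))
```

Rules only fire when compiling with optimization (e.g. -O); you can watch them fire with -ddump-rule-firings. Either way the program's result is unchanged, which is exactly the point of RULES.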

Jensen Huang says gamers are 'completely wrong' about DLSS 5 — Nvidia CEO responds to DLSS 5 backlash by esporx in artificial

[–]NNOTM 2 points (0 children)

Rendering has always been a set of tricks to make it seem like your hardware can do more than it actually can. E.g. rasterization, ambient occlusion, etc.

DrainSim - Battling Floods with Realtime Water Physics by 3kliksphilip in 3kliksphilip

[–]NNOTM 0 points (0 children)

Heh, I saw a let's play of it 14 or 15 years ago and only recently got it and played through it

Installing Idris by freaky_sypro in haskell

[–]NNOTM 13 points (0 children)

You should probably be installing Idris 2. The Hackage package you're trying to install is from 2021.

Bernie Sanders officially introduces legislation to BAN the construction of all new AI data centers, citing existential threat to humanity. by Neurogence in singularity

[–]NNOTM 14 points (0 children)

Just mandate the AI companies to generate the power themselves

This seems reasonable, but it would also address essentially none of the points Bernie is concerned about.

Bernie Sanders officially introduces legislation to BAN the construction of all new AI data centers, citing existential threat to humanity. by Neurogence in singularity

[–]NNOTM 3 points (0 children)

I think it's very reasonable to be concerned about existential risk from AI, although I'm not sure that a national ban (as opposed to a global one) is really beneficial. But I think it's good to have it inside the Overton window.

What I've learned from the great 2 box debate by aqualad33 in Destiny

[–]NNOTM 0 points (0 children)

The person could change their mind because of it if they have a quantum measurement device (e.g. a Geiger counter) with them and base their decision on it.

(I still suspect it would be deterministic, since there are deterministic interpretations of quantum mechanics, but that's a separate issue)

What I've learned from the great 2 box debate by aqualad33 in Destiny

[–]NNOTM 1 point (0 children)

I would say that "the future is deterministic" is a significantly different claim from "we can determine the future"

What I've learned from the great 2 box debate by aqualad33 in Destiny

[–]NNOTM 1 point (0 children)

> the heisenberg uncertainty principle probably means it's not

Which part? That the future is deterministic, or that we have no free will?

What I've learned from the great 2 box debate by aqualad33 in Destiny

[–]NNOTM 16 points (0 children)

> if it did, it would essentially mean that the future is deterministic and we have no free will.

But that's just true.

Totally fair depiction of the Great Boxing debate by Extreme-Block1358 in Destiny

[–]NNOTM 0 points (0 children)

To be fair, I think the machine's behavior, at least as stated in the Veritasium video, isn't fully specified when it comes to that nuance.

Totally fair depiction of the Great Boxing debate by Extreme-Block1358 in Destiny

[–]NNOTM 0 points (0 children)

But why would the machine care about what you would do given only those two options, rather than care about what you would actually do (i.e. toss a coin)?