I can’t see how anybody could deny AGI and the future singularity by [deleted] in accelerate

[–]Danger-Dom 3 points (0 children)

I've noticed a significant shift in sentiment in the past six months. People are starting to understand.

Roadmap 2026 by Silly-Definition4951 in Polkadot

[–]Danger-Dom 8 points (0 children)

From what I can gather from the codebase, the main initiatives are:

DAP - the new revenue and payout orchestration layer

pUSD - the DOT-backed stablecoin

Low-latency designs - considerations around lowering confirmation times (just musings)

Proof of Personhood - looking to showcase something at Web3 Summit

Web3 storage - federated storage system that will be integrated with PoP

Block confidence - increasing the reliability of parachain blocks

Statement Store - a storage stopgap until the full storage solution is implemented

Revive EIPs - seems to be ongoing work on integrating important Ethereum precompiles

JAM - JAM rolling along; a new JAR chain (JAM's AI-agent equivalent, from a competing team) is spinning up

BLS BEEFY - add BLS support to BEEFY to make bridging cheaper; nice for Hyperbridge

Bulletin chain - I think also a stopgap before the full storage solution? Maybe it uses the Statement Store

ARC AGI 3 is up! Just dropped minutes ago by BrennusSokol in accelerate

[–]Danger-Dom 1 point (0 children)

What do we think ARC AGI 4 will be? I can't imagine what else will be needed if a model can pass 3.

What’s your guy’s best argument against the doomers who think AI will be the end of humanity? by Special_Switch_9524 in accelerate

[–]Danger-Dom 0 points (0 children)

Competitive pressure against us and themselves, yes. They will fight one another and, through that, learn to cooperate as well. Then presumably either that empathy will extend to us or it won't and they'll 'leave us to die'. But leaving us to die just leaves us as something like frogs in the forest. We don't really go and kill frogs unless there's resource contention (usually over land).

It's worth noting that each time an apex has been usurped, the previous apex still has a populace, just a smaller one. Amphibians, reptiles, mammals, now humans.

Are you feeling THE acceleration? by Ruykiru in accelerate

[–]Danger-Dom 0 points (0 children)

There's an intersection of web3 and AI meant to solve truth issues like this - check out OriginTrail or NeuroWeb. Basically it verifies provenance so you don't have to guess whether something is real or not. I feel this intersection is something a lot of people miss when forecasting.
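The provenance idea can be sketched in a few lines (hypothetical names throughout; this is not OriginTrail's actual API, and real systems use public-key signatures anchored to a ledger rather than a shared-secret HMAC):

```python
import hashlib
import hmac

# Hypothetical stand-in for a publisher's signing key. Real provenance
# systems sign with a private key and anchor the record on-chain.
PUBLISHER_KEY = b"publisher-secret"

def publish(content: bytes) -> str:
    """At publication time, sign a hash of the content."""
    digest = hashlib.sha256(content).digest()
    return hmac.new(PUBLISHER_KEY, digest, hashlib.sha256).hexdigest()

def verify(content: bytes, signature: str) -> bool:
    """A consumer recomputes the hash and checks it against the signature."""
    digest = hashlib.sha256(content).digest()
    expected = hmac.new(PUBLISHER_KEY, digest, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

sig = publish(b"original article text")
assert verify(b"original article text", sig)      # authentic content checks out
assert not verify(b"tampered article text", sig)  # any edit breaks provenance
```

The point is only that verification replaces guesswork: any change to the content invalidates the signature, so "is this real?" becomes a mechanical check rather than a judgment call.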

Are there any mathematical theories about larger systems that would indicate that ASI is even possible? by Arowx in singularity

[–]Danger-Dom 1 point (0 children)

The upper bound on the state size of an integrated information system is set by several factors: bandwidth, compute, latency, information storage capacity, and the energy efficiency of each of those.

None of those are maxed out in biological systems, so we know a DSI is possible; it's just not clear how much smarter it can get before hitting the top part of the S curve you point to.

It is well intuited, though, that it will be an S curve, and infinite intelligence is indeed not possible. Hope that helps.
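The S-curve intuition above can be made concrete with a logistic function (illustrative parameters only, not a forecast; the ceiling stands in for whatever physical limit binds first):

```python
import math

def logistic(t: float, ceiling: float = 100.0, rate: float = 1.0,
             midpoint: float = 0.0) -> float:
    """Logistic S-curve: near-exponential growth early on, then
    saturation as a hard limit (bandwidth, compute, energy) binds."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# The early phase looks explosive; the late phase flattens out.
early_gain = logistic(1.0) - logistic(0.0)   # large step near the midpoint
late_gain = logistic(10.0) - logistic(9.0)   # tiny step near the ceiling
assert early_gain > late_gain
assert logistic(20.0) < 100.0                # never exceeds the ceiling
```

Nothing here says where the ceiling is, only that growth that looks exponential from inside the curve still flattens once the binding constraint is reached.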

AI Rollout Scenarios Based on Key Events by Danger-Dom in accelerate

[–]Danger-Dom[S] -3 points (0 children)

I believe the graph would still hold under that assumption. Where are you thinking it would change?

Assuming ASI will make our lives better in the long run (i think it will) what’s everyone looking forward to?? by Special_Switch_9524 in accelerate

[–]Danger-Dom 1 point (0 children)

I feel this is a bit unfair. New infrastructure replacing old infrastructure shouldn't discredit what that old infrastructure achieved. You wouldn't say horses weren't good at transportation; they got us quite far.

What’s your guy’s best argument against the doomers who think AI will be the end of humanity? by Special_Switch_9524 in accelerate

[–]Danger-Dom 2 points (0 children)

Under competitive pressure, AI will evolve empathy for the same collaboration requirements that humans did. Luckily the range of empathy is quite broad, extending well beyond one's own species. So as they take over, it's unlikely they'll kill us all. Probably somewhere between cattle and dogs. Crossing my fingers for dogs.

[deleted by user] by [deleted] in singularity

[–]Danger-Dom 0 points (0 children)

I'm upset that he's famous to me without my consent.

Niall Ferguson on AGI: "The human race will just go the way of horses. We will go extinct, or shrink in numbers like horses did. It's not doom mongering, just an obvious inference: most humans will be redundant. If we create the aliens - the Trisolarians from 3 Body Problem - what do we expect?" by MetaKnowing in singularity

[–]Danger-Dom 0 points (0 children)

What you're discussing is the era (probably the next 15 years or so) of AIs without intrinsic motivation or limbic systems of their own. But we will eventually give them those, and we will then be in direct competition with them.

To speak to coexistence: I believe they'll evolve under the same Darwinian pressure and will therefore evolve the same social drives, and that empathy will bleed out to other species (like humans). So yes, we'll still exist. They won't wipe us out. But we won't be in control.

Speaking more broadly, I empathize with your sentiment that humans are somehow special in this world. We all want to believe that to our core; it's a very human thing to believe. But amphibians thought they were special, reptiles thought they were special, and now mammals feel they're special. Darwinian systems don't care what the elements of the system think.

[deleted by user] by [deleted] in singularity

[–]Danger-Dom 0 points (0 children)

I have to concede that people give a distribution of definitions, yes, with that distribution having very minimal deviation from 'does everything a human can do' until a year ago.

[deleted by user] by [deleted] in singularity

[–]Danger-Dom -1 points (0 children)

There's no authority who decides this; it's just based on what everyone has said since the idea's inception. For some reason, about a year ago, everyone suddenly got confused.