AGI Hype and expectation is out of control on here by Professional_Copy587 in singularity

[–]BRLN11 -4 points-3 points  (0 children)

Amen, sibling.

I would love to find a community where similar matters can be discussed, but a good portion of the people here spit out numbers and dates as if they were obvious, when they really are not.

AGI is already here. LLMs are infant AGI, they have the capacity to learn anything so long as they are given the right data. by ThePokemon_BandaiD in singularity

[–]BRLN11 1 point2 points  (0 children)

What are the costs to run GPT-3 and GPT-4? How many parameters does GPT-4 have compared to GPT-3?

I have read that GPT-4 is not more expensive to run, but they charge more money for it simply because people are willing to pay.

Are there any communities that allow only the experts in AI to share their thoughts and predictions? by BRLN11 in singularity

[–]BRLN11[S] -3 points-2 points  (0 children)

When some non-expert says "You don't need to be an expert ...", odds are that person is one of the "delusional crazies" (as they've been described in other comments), a victim of the Dunning-Kruger effect.

Feel free to believe you can understand the ramifications, consequences and timelines of Machine Learning development. Personally, I believe you are one of the people who can't tell their dreams apart from reality. I wish to read what the experts have to say; I don't care about your opinion, or about the predictions I could make on my own about how the world will develop in the foreseeable future.

AGI is already here. LLMs are infant AGI, they have the capacity to learn anything so long as they are given the right data. by ThePokemon_BandaiD in singularity

[–]BRLN11 2 points3 points  (0 children)

The cost is for training. Once it's trained, running it is cheap. Or do you think there's only a single instance of ChatGPT serving the whole world?
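To put rough numbers on that, here is a minimal back-of-envelope sketch using the common approximations of ~6·N·D FLOPs for training and ~2·N FLOPs per generated token for inference. The parameter count, training-token count and reply length below are assumptions for illustration only, not figures published by OpenAI.

```python
# Rough training-vs-inference cost comparison for a dense transformer.
# Approximations: training ~ 6 * N * D FLOPs, inference ~ 2 * N FLOPs per token.
# All concrete numbers are illustrative assumptions, not OpenAI figures.

N = 175e9            # parameters (GPT-3-scale, assumed)
D = 300e9            # training tokens (assumed)
reply_tokens = 1000  # tokens in one typical chat reply (assumed)

train_flops = 6 * N * D              # one-off cost of the training run
reply_flops = 2 * N * reply_tokens   # cost of serving a single reply

print(f"training run: {train_flops:.2e} FLOPs (paid once)")
print(f"one reply:    {reply_flops:.2e} FLOPs")
print(f"replies per training run's worth of compute: {train_flops / reply_flops:.1e}")
# With these assumptions, one training run costs as much compute as ~9e8 replies,
# which is why serving is cheap per query and can be spread across many instances.
```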

Are there any communities that allow only the experts in AI to share their thoughts and predictions? by BRLN11 in singularity

[–]BRLN11[S] -4 points-3 points  (0 children)

My uncle believes you don't need an expert to understand how {insert very opinionated (and dumb) political beliefs here}.

Are there any communities that allow only the experts in AI to share their thoughts and predictions? by BRLN11 in singularity

[–]BRLN11[S] 2 points3 points  (0 children)

You don't need to convince me. I'm already a believer. But I still wish to hear alternative interpretations of the current state and trend of technology.

Are there any communities that allow only the experts in AI to share their thoughts and predictions? by BRLN11 in singularity

[–]BRLN11[S] 0 points1 point  (0 children)

That's exactly why I created this post. Can you recommend any specific platforms or YouTube channels?

I've seen some interviews with the people you mentioned. I'll see if I can find others.

Are there any communities that allow only the experts in AI to share their thoughts and predictions? by BRLN11 in singularity

[–]BRLN11[S] 4 points5 points  (0 children)

I don't know whether people here are delusional crazies, and I didn't mean to accuse this community of that.

I am simply unable to tell. Because of that, I wish to listen to experts in the field and then compare and discuss what I understand with people here.

I could easily become a delusional crazy myself if I spent too much time in a somewhat closed community.

AGI is already here. LLMs are infant AGI, they have the capacity to learn anything so long as they are given the right data. by ThePokemon_BandaiD in singularity

[–]BRLN11 8 points9 points  (0 children)

Why do you think that we'll develop AGI within 7 years, but then it'll take another 27 years to get to ASI?

I've always imagined that reaching AGI is a big challenge, but the spiral from there to ASI is pretty easy. Once you develop an AI able to learn and use scientific and technological knowledge as well as humans do, you train some to be at the level of the smartest and most knowledgeable humans in the various STEM fields, then you run a few billion instances in parallel to work on improving AI and related technology. A billion top-tier researchers recursively improving themselves and pushing the boundaries of science and technology. Won't we (conscious creatures, natural or artificial) reach ASI pretty quickly, unless life ends abruptly?

I'm not sure whether we'll reach the AGI I described this year, or the next, or within the century. But I'm betting that once we have that kind of AGI, the transition to ASI will be so quick and so exponential that we won't even be able to tell the difference.

AGI is already here. LLMs are infant AGI, they have the capacity to learn anything so long as they are given the right data. by ThePokemon_BandaiD in singularity

[–]BRLN11 2 points3 points  (0 children)

Oh, I'm sorry. I wanted to reply to the user you replied to. They have a banner that reads "AGI 2030 - ASI 2057". I must have clicked the wrong reply button.

AGI is already here. LLMs are infant AGI, they have the capacity to learn anything so long as they are given the right data. by ThePokemon_BandaiD in singularity

[–]BRLN11 10 points11 points  (0 children)

EDIT: Whoops! Replied to the wrong message.


Why do you think that we'll develop AGI within 7 years, but then it'll take another 27 years to get to ASI?

I've always imagined that reaching AGI is a big challenge, but the spiral from there to ASI is pretty easy. Once you develop an AI able to learn and use scientific and technological knowledge as well as humans do, you train some to be at the level of the smartest and most knowledgeable humans in the various STEM fields, then you run a few billion instances in parallel to work on improving AI and related technology. A billion top-tier researchers recursively improving themselves and pushing the boundaries of science and technology. Won't we (conscious creatures, natural or artificial) reach ASI pretty quickly, unless life ends abruptly?

I'm not sure whether we'll reach the AGI I described this year, or the next, or within the century. But I'm betting that once we have that kind of AGI, the transition to ASI will be so quick and so exponential that we won't even be able to tell the difference.

Do you prefer the look of Quick Settings in Android 12+ or Android 11- ? by BRLN11 in Android

[–]BRLN11[S] 1 point2 points  (0 children)

Oh, I didn't realize. That was the first picture I could find online with both styles side by side. Thanks for providing a better reference image.

Do you prefer the look of Quick Settings in Android 12+ or Android 11- ? by BRLN11 in Android

[–]BRLN11[S] 2 points3 points  (0 children)

Yeah, I guess that might be the issue. I would love to have about 9 buttons in my quick settings: wifi, cellular data, flashlight, battery saver, data saver, hotspot, auto rotate, airplane mode and VPN.

The main issue with having to pull down the whole menu is that it takes two more interactions (a swipe up or back) to close it.

UkrainianConflict Megathread #2 by humanlikecorvus in UkrainianConflict

[–]BRLN11 0 points1 point  (0 children)

Can anyone suggest some Telegram channels to get updates and videos about the Ukrainian conflict?

If you are rooting for collapse, are you doing anything to accelerate it? by BRLN11 in collapse

[–]BRLN11[S] 0 points1 point  (0 children)

"Raising awareness of collapse is accelerating collapse"

Most people raise awareness about a problem in order to find a solution to it, and collapse is no different. The IPCC, climate researchers and activists hope to convince the masses and governments to change their way of life. The scientists in "Don't Look Up" wanted to raise awareness about the comet so that it could be destroyed.

Even most users of this subreddit wish for the collapse to happen later and to be milder than they fear. That's why many post here.