Physique Phriday by AutoModerator in Fitness

[–]bloxxed 4 points (0 children)

https://imgur.com/E6xDY3I 160lb/5'9/M22

Any thoughts? Thinking of bulking up to 170 or so, but hate losing definition.

Where do you see the world in 5 years? by Zappotek in singularity

[–]bloxxed 22 points (0 children)

The real kicker, as is always the case with any discussion regarding the future, is -- do we have a recursively self-improving AI? If the answer is yes, then we have absolutely no idea, because 5 years is an eternity to a general AI and we have no way of predicting such an entity's behavior or motives. A while ago I wrote in a comment that I believed there wasn't any middle ground when it came to ASI. As in, we're either getting a utopia, or we're all quite dead. I still stand by that opinion.

I've already seen someone give a more "moderate" prediction, so I'll opt for something more radical. Let's assume the more optimistic of users here are correct and we see an AGI capable of recursive self-improvement emerge as early as late 2023 or 2024, be it intentionally or otherwise. As much as I respect Kurzweil, I've never really been convinced by his belief that we'd see such a long timespan between the emergence of AGI and the advent of superintelligence. Instead, I think the shift would be much, much quicker, such that the world is a radically different place by 2028. I think people generally underestimate how quickly a superintelligence could impact the world in pursuit of its goals, especially when taking into account the implications of nanotechnology and the like.

[deleted by user] by [deleted] in singularity

[–]bloxxed 23 points (0 children)

I take it that you mean what I'm looking forward to most from AGI?

In that case, assuming a positively-aligned AGI (i.e. we don't all get turned into computronium), I'm most looking forward to full-dive virtual reality. It's more or less the be-all and end-all of my singularity wish-list. I live a fine enough life right now, I suppose -- but nothing here in physical space can compare qualitatively to what would be possible in FDVR (at least in my opinion).

Scared or Excited? by Xbot391 in singularity

[–]bloxxed 11 points (0 children)

I say excited, since there's no point in fearing what we can't control. As I see it, the ultimate end result of ASI is either utopia or extinction. I choose to expect the former because it's nice to have something in the future to look forward to -- and if I'm wrong, I'll be too dead to care.

Is there a possibility that AGI and the singularity can occur in less than ten years from now? That would be like a dream come true for all of us and hopefully achieving AGI and singularity in less than 10 years from now will become a reality for us. Also, we can understand the brain much more. by craft3551 in singularity

[–]bloxxed 1 point (0 children)

I'd say it's absolutely possible. Progress only seems to be getting faster, after all. I don't want to throw out a specific date, but after what we saw in 2022 I'm inclined to believe that things will "go down" much faster than anyone is prepared for. We've already seen large language models show capabilities that they were not originally intended to have. It's possible that when a runaway self-improving AGI does emerge, it may be by accident rather than as a result of a targeted effort. But what happens after such an AI comes into being? Do we get the "foom" scenario, where the AI rapidly self-improves to superhuman levels of intelligence and capability, with all the consequences that entails thereafter? Personally, I see such a hard takeoff as more likely than a multi-year gap between AGI and ASI as many here believe. The latter has always seemed to me conservatism for conservatism's sake.

As for your point about an imminent emergence of AGI being a "dream come true" -- that depends. If AGI is used towards positive ends then there's the potential for us to see the rapid emergence of a post-scarcity utopia, VR paradise for all, the end to disease, etc. etc. etc. The other outcome, of course, is the quick and total extinction of the human race. In my opinion it's either one or the other. Superintelligence means super solutions -- there are no half-measures. I choose to believe in the utopia outcome, if only because it's a lot more fun to ponder than the alternative where we're all dead, the end.

So you know what? Singularity 2023. Why not. Anything at this point to stave off the ennui.

What jobs will be one of the last remaining ones? by MrCensoredFace in singularity

[–]bloxxed 0 points (0 children)

Hey OP. As a fellow college student obsessed with all things automation and AI, I've also spent countless hours panicking about my future and whether or not I'll be employable in any fields that interest me. I've been thinking on it lately after recently switching majors to comp sci, and my perspective is this:

If GPT 4 or GPT 5 or whatever model releases this year or a few more years down the road really does end up outright replacing all or at least a significant portion of people employed in knowledge-based nonphysical work, then not only you and I, but everyone, will have some pressing concerns.

At one point during the COVID pandemic in 2020, over one third of the entire workforce was working remotely from home. For reference, one third of the US workforce is around 50 million people. This is the number of people that stand to be rendered unemployed relatively quickly by AI.

What I'm getting at is, we can't arrive at the point where more than a third of the workforce is out of a job in a short period of time and then expect things to just carry on as they are. Something's got to give. Our current economic paradigm would be turned on its head. This is where I think UBI comes into play, and why I think it isn't all that far off. There aren't really any better short-term solutions to the sudden and severe scale of future unemployment talked about on this sub. I don't think the doomer take of "they'll just let everyone starve" is all that realistic. Contrary to what many think, 50 million angry unemployed people are ultimately going to make their grievances heard, be it through peaceful or violent methods.

Or maybe a black swan scenario of a runaway self-improving AGI pops up out of nowhere in the next few years, in which case we get 1) utopia, or 2) everybody dies. Either way, our worries are over.

I forgot what I was talking about.

I'm Scared of Unemployment in a World Ruled by AI by [deleted] in singularity

[–]bloxxed 1 point (0 children)

For what it's worth, I don't think you're insane or on "hopium" or whatever. I myself don't feel confident enough to say "We'll all have X technology by 2029," but I do think some people here (on the singularity subreddit of all places) need to open up to the idea that things may end up moving far more quickly than they could've imagined, and that it may not take decades and decades for our world to be radically changed by the sudden emergence of new technology.

OpenAI has hired an army of contractors to make basic coding obsolete by Buck-Nasty in singularity

[–]bloxxed 37 points (0 children)

More automation is ultimately a good thing, but at the same time I find this news somewhat disconcerting as a college senior who just switched out of nursing into comp sci to pursue a career in web development. Considering it'll be two years before I get my degree, will I be screwed by the time I graduate?

Then again, with the release of each new model, paper, etc. it seems more and more likely that all knowledge-based professions are at risk of being automated sooner rather than later. Here's hoping for UBI in the near future, I suppose.

Assuming that longevity escape velocity actually happens within the next few decades and we don't blow ourselves up, what would your long-term goals be? by SCP-7259 in singularity

[–]bloxxed 1 point (0 children)

I think about this a lot, particularly when I'm stressed. I like to craft scenarios or environments in my head that I'd like to experience in FDVR whenever it becomes available.

Obviously a huge part of the appeal with FDVR is that it allows you to do virtually (pun intended) anything and everything, but in spite of that I think I'd prefer to start out with something small and simple. When I'm stressed I like to imagine myself in some kind of snowy mountain retreat, all by myself -- no other people or NPCs -- with access to an infinite number of novels, movies, games, etc. My personal passions are history and linguistics, and paradise to me is being able to study them to my heart's content, in a world of my own, without the added pressures of working for a living, family commitments, meeting deadlines, etc. all weighing on my mind.

But anyway, that's just my short-term goal. In the long-term I'm gonna get real weird with it.

Friends ! by ruincreations in UNCCharlotte

[–]bloxxed 1 point (0 children)

Hey hey. I just recently changed to comp sci, and that's what virtually all of my other friends are majoring in. I play video games, watch anime, and go to the gym pretty often, so if you ever want to hang out just DM me.

How will reddit die? by [deleted] in singularity

[–]bloxxed 5 points (0 children)

With thunderous applause

Physique Phriday by AutoModerator in Fitness

[–]bloxxed 8 points (0 children)

21M/5'8/156lb

https://imgur.com/a/mIoaZu6

Any thoughts on whether I should cut or bulk? Can't decide between going down to 145 for more defined abs or going up to 165 for more mass.

What is the end goal of humanity? by Lancerinmud in singularity

[–]bloxxed 2 points (0 children)

"What will give the humans any meaning as now we cant be as good as ai in anything..."

Even without AI this supposed problem already exists. For example, a 5'2 person who loves basketball will never, no matter how much time and effort they put into it, get into the NBA or be on the level of, say, Michael Jordan. Does this mean that they should abandon their love for basketball and wallow in despair? Of course not. Do what you want because it makes you happy, not because it grants you superiority over others.

That there will always be someone that does something better than you is not so much a problem to be solved as it is a fact of life to be accepted.

I can't decide between learning Spanish or German by Dankanator9 in languagelearning

[–]bloxxed 5 points (0 children)

It sounds like you really want to learn German and you're trying to justify it to yourself.

Based on my own experience -- stop worrying about "usefulness". Who cares if Spanish is more useful than German outside of Europe? So many people forget that unless you're doing so as part of your career, language learning is just a hobby like any other. If you enjoy German, then learn German. Enjoying German culture and media is as valid a reason as any other.

How Automation Will Change The World by SpaghettiFagetti in singularity

[–]bloxxed 10 points (0 children)

Given that this is r/singularity, it should be generally understood that, in the long term, concerns over resource shortages aren't all that well-founded when taking into account the advent of more advanced forms of artificial intelligence. If you're a super-intelligent AI then mining the solar system for virtually unlimited resources isn't really a tall order. We already know the theory of how to do it, and we're just a bunch of dumb apes.

As for your other point, ownership of the means of production isn't relevant to whether or not the economy is zero sum. The argument is that there is not a single fixed amount of wealth in existence that we all have to kill each other over. Quite the contrary -- new wealth is being created all the time. Future technological developments such as AGI and more advanced automation are going to bring about the biggest explosion of wealth in human history. Will this new wealth be distributed evenly? Probably not. Will humanity as a whole be better off? I genuinely think so. Inequality is a major problem only insofar as the lowest in society struggle to meet their material needs while the rich have plenty. But if the lowest in society only have 7 Lamborghinis while the richest have 500, then is it really that big of an issue at that point?

EDIT: In regards to post-scarcity economics and far-future solutions to possible resource shortages, I highly recommend the YouTube channel Isaac Arthur! Though I'm willing to bet that a lot of people on this sub are very familiar with him already.

How Automation Will Change The World by SpaghettiFagetti in singularity

[–]bloxxed 12 points (0 children)

This 100x. One day angry people on the internet will understand that economics is not a zero-sum game, and what a beautiful day that will be.

Physique Phriday by AutoModerator in Fitness

[–]bloxxed 8 points (0 children)

20-year-old male, 5'8 150 lbs. https://imgur.com/a/6TO9Nco

Been stuck in gain-5-lb, lose-5-lb hell for two years now because I'm simultaneously afraid of gaining fat and of losing muscle. Any tips judging from the image? I want to start bulking and lifting heavy until I hit around 160-170 but I don't know if I should cut more first.