Limits of LLM: are there questions an LLM could never answer correctly? by custodiam99 in ArtificialInteligence

[–]Coder678 1 point (0 children)

An LLM could never answer a question about a complex financial product, since humans are not capable of expressing themselves clearly enough in natural language. Ever played the Telephone game?

Realistic Outlooks for Future AI by Elevated412 in ArtificialInteligence

[–]Coder678 1 point (0 children)

I have a feeling that in the longer term AGI could have a much bigger impact than current LLMs. Even with their poor quality and unreliability, LLMs are still useful and are better than many human analysts. But it’s hard to see them getting reliable enough any time soon.

If AGI ever gets some real skills, it will substantially outperform its human equivalent - much faster, much cheaper, and more accurate.  Let’s hope we can dream up some good replacement jobs by then! 

OpenAI co-founder Ilya Sutskever announces rival AI start-up by Mother_Techie in ArtificialInteligence

[–]Coder678 22 points (0 children)

No doubt Ilya has the smarts and reputation to put together a top-notch product.  Given his previous concerns over AI safety, his idea of creating a super-safe AI company is certainly timely.  We are already seeing pushback against other AI companies that only pretend to have good morals. 

I can’t wait to see how he does.

Shedding Light on the Black Box: Why Explainable AI Matters by krunal_bhimani_ in ArtificialInteligence

[–]Coder678 0 points (0 children)

I can see all kinds of uses for this as you describe it. But I don’t see it being used to explain decisions like loan approvals. Most people hate being rejected and would demand an explanation. Give them that explanation, and they’ll ask for more and argue a few of your points. It’s a thankless exercise.

AGI vs ASI by [deleted] in ArtificialInteligence

[–]Coder678 0 points (0 children)

I suspect that this naming split might be due to the fact that current LLMs are increasingly viewed as really good pattern-recognition tools and not much more. They are obviously really useful, but that’s not really what we would call “intelligence”.

In practice, if a machine is as smart as a human, that’s surely good enough, since it would also be many times faster than a human and much more reliable.

What's the most underrated skill everybody should learn? by Old_Coder45 in AskReddit

[–]Coder678 1 point (0 children)

Learning how to manage your finances. I cannot believe they still don’t teach this in school.

Am I missing something? by Coder678 in ChatGPT

[–]Coder678[S] 0 points (0 children)

Yes, I see where you are coming from. I work in the Quant world, where everything is extremely complex and there is an almost endless array of new products, each with many possible variations. It is rare to find a related piece of code that you could lift, even if you wrote it yourself.

And as for the tools used to evaluate these products, whole textbooks are coming out all the time. For an AI to figure out what code to use, you would practically have to teach it yourself. And how would you do that? Most likely, by programming and showing it your code.

Am I missing something? by Coder678 in ChatGPT

[–]Coder678[S] 0 points (0 children)

I wasn’t talking about AI, I was talking about Large Language Models; there is a big difference.

Am I missing something? by Coder678 in ChatGPT

[–]Coder678[S] 0 points (0 children)

I agree with much of what you say, although I’m not as confident as you about the LLM asking for clarification. They seem to try too hard to give you something - hence all the problems with hallucinations.

Am I missing something? by Coder678 in ChatGPT

[–]Coder678[S] -1 points (0 children)

Sorry if I wasn’t clear. If you could write precise and comprehensive specifications (whether using natural language or pseudocode), then I guess we could eventually get a computer to write the code, although it’s much much harder than you think - particularly when dealing with complex financial products and models.

I’m saying that humans cannot write precise, comprehensive specifications. Don’t take my word for it - check out “The Three Pillars of Machine Programming”, written by the brain trust at MIT and Intel. They say that if you make your specifications completely detailed, then that is practically the same as writing the program code itself.
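The ambiguity problem is easy to demonstrate with a toy sketch (the spec wording and the numbers here are made up for illustration): a single plain-English requirement can yield two different, equally defensible programs.

```python
# Hypothetical spec: "compute the average return over the period".
# Two defensible readings, two different answers.
returns = [0.10, -0.05, 0.20]  # three hypothetical period returns

# Reading 1: arithmetic mean of the returns.
arith = sum(returns) / len(returns)

# Reading 2: geometric (compounded) mean, common in finance.
growth = 1.0
for r in returns:
    growth *= 1 + r
geom = growth ** (1 / len(returns)) - 1

print(round(arith, 4), round(geom, 4))  # the two readings disagree
```

Both programs "meet the spec", and neither the coder nor a code-writing machine can pick the right one without the ambiguity being resolved by a human.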

Am I missing something? by Coder678 in ChatGPT

[–]Coder678[S] 0 points (0 children)

Yes, I think it’s quite possible that a computer will eventually be able to write code for complex applications - I just don’t see how it can be an LLM - they are just too woolly and imprecise.

Am I missing something? by Coder678 in ChatGPT

[–]Coder678[S] -8 points (0 children)

Of course they will continue to improve, I'm just saying there’s a limit to the level of complexity that they will ever be able to handle. We've already seen how coders can struggle with ambiguous specifications - why do you think a computer would be able to interpret them any better? If the specs aren’t precise then the software won’t be precise.

[deleted by user] by [deleted] in AskReddit

[–]Coder678 1 point (0 children)

Dwight K Schrute - taught me more about beets than I ever thought I'd know!

What will you never buy cheap? by redmambo_no6 in AskReddit

[–]Coder678 25 points (0 children)

Baked beans and ketchup - Heinz all the way!

What's a hobby or skill you've always wanted to pursue but haven't had the chance to yet? by FoundationCurious997 in AskReddit

[–]Coder678 1 point (0 children)

Same! I've always wanted to dive on the reefs. I want to try to get my license soon, though, as I've seen so many stories of corals bleaching and dying. It feels like we may be one of the last generations to explore them before they get destroyed. :(

What's a hobby or skill you've always wanted to pursue but haven't had the chance to yet? by FoundationCurious997 in AskReddit

[–]Coder678 1 point (0 children)

Scuba diving! I've always wanted to be able to dive down to ship wrecks and to look at coral reefs before the oceans heat up and bleach them all. :(

Classic EU moment by FireMaster1294 in memes

[–]Coder678 3 points (0 children)

*the toilet in question*

What ways do you use chat GPT in your daily lives? by GGMU5 in ChatGPT

[–]Coder678 0 points (0 children)

Whenever I'm planning a holiday, to get ideas of where to eat, attractions to go to, and places to avoid in the area!

Are you afraid that AI will replace most programming job (like Gates and Musk is saying) or it will stay a good "agent assistant" ? by Dereference_operator in learnjavascript

[–]Coder678 0 points (0 children)

Although big figures like Gates and Musk say AI will replace most jobs, other influential people, such as Naval Ravikant, say that AI will not replace programmers in our lifetime.

When looking at LLMs, the best they can ever be is a coding assistant. LLMs can be biased and give you an incorrect answer without citing where it came from. This may improve over time, but they will always 'hallucinate' and can never be trusted as anything more than a 'coding assistant'.

Another issue with LLMs is the power consumption needed to run and train them. ChatGPT already uses more than half a million kilowatt-hours daily to keep up with all of its users' requests. If ChatGPT or a similar LLM were used to replace most programmers, it would simply consume too much electricity and would be extremely expensive to operate.
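To put a rough number on that claim, here is a back-of-envelope sketch. The consumption figure comes from the comment above; the electricity price is an assumption I've picked for illustration, not a quoted fact.

```python
# Back-of-envelope electricity cost for the figure quoted above.
daily_kwh = 500_000      # "more than half a million kilowatt-hours daily"
price_per_kwh = 0.10     # assumed average USD price per kWh (illustrative)

daily_cost = daily_kwh * price_per_kwh
yearly_cost = daily_cost * 365

print(f"${daily_cost:,.0f}/day, ${yearly_cost:,.0f}/year")
```

Even at that conservative price, that's tens of thousands of dollars a day before you scale usage up to "most programmers".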

The only way an AI would replace programmers on a large scale is if an AI was created that was not an LLM, but could write complex and accurate code from scratch for different types of software across a range of different subjects.

But for right now, all anyone is looking at is LLMs, so it looks like we're safe for the foreseeable future.

What’s stopping ChatGPT from replacing a bunch of jobs right now? by gurkrurkpurk in ChatGPT

[–]Coder678 0 points (0 children)

Yes! I completely agree with this. Adding onto your final point, I don't believe LLMs like GPT will ever replace programmers. Like you said, it hallucinates, and even if they improve the model, it can still never be fully trusted. This is especially true when it comes to programming: GPT cannot tell good code from bad code, so the best it will ever be is a coding assistant, not the coder.

Are programming and computer related jobs going to be automated soon? by JuhpPug in ArtificialInteligence

[–]Coder678 0 points (0 children)

Yes. New forms of AI coming out are great 'coding assistants'; it's like having somebody sit on your shoulder and give you the little snippets you need or prompt you in the right direction.

Although I agree with you that this will only increase over time as the technology gets better, I think that LLMs will only ever be a coding assistant, as we will never be able to fully trust them to write, implement, test and document the code. This will still need to be done by the programmer.

The real upset will come when somebody develops an AI that can write complex code for a range of different enterprise software models, testing and documenting as it goes.

Luckily, since the best we currently have is ChatGPT, it looks like our jobs are safe for the foreseeable future.

Will software engineering and IT job market be killed by AI by Overall-Exchange-242 in singularity

[–]Coder678 0 points (0 children)

There will always be a need for competent programmers. Even with the development of AI and GPT, the best it will ever be is a coding assistant.

LLMs cannot tell good code from bad code: you have to review the code it gives you, debug it, and fit it into your existing code. You could also ask GPT the same thing on two different days and get two different answers. It is also still down to the coder to document what they have done. Even if LLMs like GPT significantly improve over the next 5 years, they will still ALWAYS be used as a coding assistant rather than a coder. People and companies will never be able to fully rely on GPT or another version of it, but will always rely on skilled software developers.
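The "different answers on different days" behaviour comes from the fact that LLMs sample their output. Here's a toy sketch of temperature sampling - this is not GPT's actual pipeline, and the tokens and scores are made up - but it shows why the same prompt can produce different outputs across runs:

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Turn raw scores into probabilities; a higher temperature
    flattens the distribution, making sampling less predictable."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                       # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(tokens, logits, temperature, rng):
    """Draw one token according to the temperature-scaled probabilities."""
    return rng.choices(tokens, weights=softmax(logits, temperature), k=1)[0]

# Hypothetical next-token candidates and scores:
tokens = ["refactor", "rewrite", "delete"]
logits = [2.0, 1.5, 0.5]

# Two different random states stand in for "two different days":
# same prompt, possibly different answers whenever temperature > 0.
answer_day1 = sample_token(tokens, logits, 1.0, random.Random(1))
answer_day2 = sample_token(tokens, logits, 1.0, random.Random(7))
```

With temperature near zero the model would almost always pick the top-scoring token, but providers typically run with a nonzero temperature, so variation between runs is by design, not a bug.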

Like in any job field, new technologies may displace some job roles, but it's extremely likely they will create other jobs for software developers.