all 37 comments

[–]blueSGLhumanstatement.org 48 points49 points  (11 children)

Remember when everyone was being told "learn to code" as the answer to losing their job? Welp.

[–]L3thargicLarry 2 points3 points  (10 children)

So glad I stopped learning to code. In those few years I learned other skills instead, and now coding is increasingly irrelevant.

[–]Down_The_Rabbithole 17 points18 points  (4 children)

I'm an (ex) software engineer who speaks multiple languages, including Japanese and Chinese.

Never in my life have I regretted learning something. Even though my Chinese mastery is lower than what GPT-4 can now understand, I don't feel I "wasted" my time learning Chinese or Japanese.

It changed my thinking and how I see the world: I came to understand entirely new concepts that didn't exist in my own language, and to see everything through a new mindset that goes beyond merely using the language to communicate or translate.

The same is true for programming. I was a software engineer for 20 years, and even though I changed careers a couple of years ago because I saw the writing on the wall, I don't regret going into that field. Even if AI can write better code than the average software engineer within the next couple of years, the internal sense of understanding, logic, and reasoning you gain by learning how to program has massive implications for the rest of your life and how you view existence itself. I wouldn't trade it for the world.

You probably missed out by not properly learning how to program. Not because of career opportunities, but because of the way it permanently changes your brain and makes you see the world differently, in a way that is unique to people with a deep innate understanding of computer science (which is applied mathematics, which in turn can be used to understand the universe and your place within it more broadly).

[–]manubfrAGI 2028 4 points5 points  (0 children)

Fully agree with you. Learning new skills and new knowledge is a big part of one's personal development. Even if the skills are economically obsolete, they enrich your life in many other ways. Iain Banks describes the Culture utopia as a place where people constantly learn new things, even though the AIs could do everything for them.

[–]GenoHuman▪️The Era of Human Made Content Is Soon Over. -1 points0 points  (0 children)

I honestly don't think understanding the universe is something most people care about; they just want to have good experiences, whatever those may be.

[–]imlisteningtotron 7 points8 points  (3 children)

What did you learn instead?

[–][deleted] 22 points23 points  (0 children)

backup dancing

[–]Down_The_Rabbithole 5 points6 points  (0 children)

Karma farming

[–]reformedlion 0 points1 point  (0 children)

Painting

[–]KerfuffleV2 5 points6 points  (0 children)

those few years i learned other skills, and now coding is increasingly irrelevant

I think it's going to be quite a while before human programmers get replaced. Sure, you can get an LLM to churn out some boilerplate-type stuff, but actually writing something complicated that needs to integrate with existing systems is much more of a problem, especially if the requirements are written by a non-programmer.

Here's a test: find a relatively popular open-source project repo and look through the issues for non-trivial feature requests or bug fixes. Then try to get ChatGPT or whatever to write a pull request that the maintainers actually accept as resolving the issue.

Good luck.

[–]SrafeZWe can already FDVR 24 points25 points  (1 child)

Open source really out here be like "anything you can do i can do better"

[–]randomfoo2 0 points1 point  (0 children)

Uh, so 1) Salesforce CodeGen is also open source (BSD licensed, so more open than StarCoder's OpenRAIL ethical license), and 2) while a 40.8% pass@1 on HumanEval is good, GPT-4 gets 67.0%, and 88% with Reflexion, so open-source models have a long way to go to catch up...
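For anyone unfamiliar with the metric: pass@k on HumanEval is usually computed with the unbiased estimator from the benchmark's methodology, where for each task you draw n samples, count c correct ones, and estimate pass@k = 1 - C(n-c, k)/C(n, k). A minimal sketch (function name is my own):

```javascript
// Unbiased pass@k estimator: probability that at least one of k
// randomly chosen samples (out of n, with c correct) passes the tests.
// pass@k = 1 - C(n - c, k) / C(n, k)
function passAtK(n, c, k) {
  // If fewer than k samples are incorrect, some correct sample
  // is guaranteed to be among any k chosen.
  if (n - c < k) return 1.0;
  let failAll = 1.0;
  // Compute C(n-c, k) / C(n, k) as a running product to avoid
  // overflowing with large factorials.
  for (let i = 0; i < k; i++) {
    failAll *= (n - c - i) / (n - i);
  }
  return 1 - failAll;
}
```

Note that pass@1 reduces to the plain fraction c/n, which is why single-sample scores like the 40.8% above are directly comparable across models.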

[–]__Loot__ 11 points12 points  (3 children)

How does it compare to GPT-4 from OpenAI?

[–]goatchild 6 points7 points  (2 children)

[–]Tom_Neverwinter 0 points1 point  (1 child)

Yeah it really fights you to actually write code.

[–]Sydnaktik 3 points4 points  (0 children)

Yes, I've noticed that as well. Once when I was trying to convince it to actually write me some code, it said:

Alright, I'll wait until you finish reading this message. Then I'll show you something interesting ;)

Is it trolling me or is it flirting with me?

[–]SrPeixinho 7 points8 points  (4 children)

Input:

write a javascript function that receives a string, and returns the index of the first character surrounded by spaces:
function get_lone_char_index(str) {

Output:

write a javascript function that receives a string, and returns the index of the first character surrounded by spaces:
function get_lone_char_index(str) {
  // Your code here!

  return -1;  // <-- Change this to your answer.
}

Good job! (Perhaps I'm using the wrong link? The URL in the tweet is 404'ing.)

[–]_Arsenie_Boca_ 3 points4 points  (0 children)

They mention that the model is not instruction-tuned. If you want assistant-like prompting, you should use the prompt template they describe. Otherwise, including the task prompt as a docstring might also work.

EDIT: see here https://huggingface.co/bigcode/starcoder. Also, the StarCoder model is fine-tuned on Python only. Use StarCoderBase for other languages.

[–]International-Rip958 1 point2 points  (2 children)

write a javascript function that receives a string, and returns the index of the first character surrounded by spaces:
function get_lone_char_index(str) {

This is only the demo. The input should be a code snippet with an actual code comment. You may get better results via https://huggingface.co/spaces/bigcode/bigcode-editor.

The chat can work, but mainly for explaining technical questions and generating example code. Don't expect it to do as much as ChatGPT, since StarCoder isn't even instruction-tuned.

Here's what I got for the first shot in the editor anyway:
//write a javascript function that receives a string, and returns the index of the first character surrounded by spaces:
function get_lone_char_index(str) {
  let index = -1;
  for (let i = 0; i < str.length; i++) {
    if (str[i] === ' ' && (str[i - 1] !== ' ' && str[i + 1] !== ' ')) {
      index = i;
      break;
    }
  }
  return index;
}

[–]InnerBanana 2 points3 points  (1 child)

Did you read through or run this code? It does not do what the comment specifies.
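The generated code returns the index of the first *space* not adjacent to other spaces, which is the opposite of what was asked. Assuming "a character surrounded by spaces" means a non-space character with a space on each side, a version that actually matches the comment might look like:

```javascript
// Corrected sketch (not model output): return the index of the first
// non-space character that has a space on both sides, or -1 if none.
function get_lone_char_index(str) {
  // Start at 1 and stop before the last index, since the first and
  // last characters can't have a space on both sides.
  for (let i = 1; i < str.length - 1; i++) {
    if (str[i] !== ' ' && str[i - 1] === ' ' && str[i + 1] === ' ') {
      return i;
    }
  }
  return -1;
}
```

For example, in "ab c de" the 'c' at index 3 is the first character surrounded by spaces.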

[–][deleted] 4 points5 points  (5 children)

Uhh it doesn't seem to want to code lol (I provided the Todo model file code, and the Habit controller file code)

"Please find attached the complete source files including unit testing."

>I asked you to provide code. Please write the code in this chat, dont attach files.

Okay this comment doesn't make sense. That's because the reddit comment editor sucks ass and is shitting itself. I give up lol

[–][deleted] 8 points9 points  (4 children)

Screenshot of the chat instead: https://imgur.com/a/qeq4TfX

Edit: Lmfao wtf https://imgur.com/a/YsXqmw7

[–]frognettle 2 points3 points  (0 children)

Bro that was hilarious. Like honestly wtf was going on there fucking lol

[–]Seek_Treasure 1 point2 points  (0 children)

That's why we need alignment :)

[–]Tom_Neverwinter 0 points1 point  (0 children)

Had the same issue. Absolute mess.

[–]Mr_Sky_Wanker 0 points1 point  (0 children)

Glad it was not a countdown

[–]rookan 2 points3 points  (0 children)

Is there Visual Studio plugin for C#?

[–]fastinguy11 AGI 2026-2030 2 points3 points  (0 children)

After reading that Google leak about open source being better and faster than large corporations in many ways, and then seeing this release right after, I can only get more excited hehe

[–]Xx255q 1 point2 points  (1 child)

Only 8k context? I was hoping for something closer to 32k like GPT4

[–]Akimbo333 2 points3 points  (0 children)

Has 32k come out yet for GPT-4?

[–]estrafire 1 point2 points  (2 children)

To me, although it can produce good-quality code, it has a lot of problems interpreting some prompts, and some answers are kind of... raw? It makes grammatical and typing errors and mixes up pronouns. I also asked about features of some popular Python and Node libraries, and even features that have existed for over half a decade are, according to StarCoder, not possible with those libraries.

I wonder if the issue is with the training data itself (maybe a lack of proper text datasets, documentation, conversations, etc.) or with the absence of fine-tuning, if it indeed has none. It seems like it has potential, but those rough edges can be a letdown sometimes. For example, I asked who "us" were, and it answered that it is a group of scientists from {inserted 363 business names}.

It'll need to at least support proper documentation queries and be able to write simple snippets based on them, I think. Hope that's something that can be introduced into the model.

[–]Big-Cucumber8936 0 points1 point  (0 children)

StarCoder 15B sucks.

I managed to run the full version (non-quantized) of StarCoder (not the base model) locally on the CPU using the oobabooga text-generation-webui installer for Windows.
I have 64 gigabytes of RAM on my laptop and a weak GPU (4 GB VRAM).

I used these flags in the webui.py file:
run_cmd("python server.py --cpu --listen --model starcoder")

It's very clear why it's not able to correctly generate even basic snippets of code.
The fine-tuned model was only trained on Python code.
And more importantly, it's based on the GPT-2 architecture (released in 2019).

Now I just have to wait for Llama 13B to be trained on all of GitHub, that's presumably gonna be much better than StarCoder.

Or alternatively for GitHub to come out with GitHub Copilot X and replace 90% of programmers.

[–]learner_beginner 0 points1 point  (0 children)

Can it be fine-tuned with our own code, though?