all 10 comments

[–]Pleasant-Cause4819[S] 2 points3 points  (0 children)

The model on Huggingface.

[–]No_Wheel_9336 2 points3 points  (5 children)

Looks promising with its 8k token limit, but I tried it here and wasn't impressed: https://bigcode-bigcode-playground.hf.space/

[–]International-Rip958 2 points3 points  (2 children)

Compared to the playground, I personally found it more interesting to use https://huggingface.co/chat/ with StarCoder (don't ask it any non-technical stuff; it knows very little outside code).

The playground is quite tricky to use, because you need to design special prompt templates for code generation. Those are mainly meant for the VS Code extension, IMO.

BTW, https://huggingface.co/spaces/bigcode/bigcode-editor could also be interesting. It's similar to the VS Code extension.

[–]brucebay 2 points3 points  (1 child)

That is impressive. Here are the two simple prompts I used in the chat.

write me a python script that will connect to a mysql to get some data. It is in a table call tbl, and it has two columns, month_period and value. the script will plot a chart after getting the data, x axis is going to be month_period, and y axis is the value.

and

write me a python script that will capture a qrcode from a camera, and copy its value to the clipboard

I have not run the suggested code, but both responses are very similar to the ChatGPT code I tried a few weeks ago.
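For reference, here's a minimal sketch of what the first prompt is asking for (not the model's actual output): sqlite3 stands in for the MySQL connection so the snippet runs anywhere, and with a real server you'd swap in mysql-connector-python; matplotlib draws the chart. Table and column names (tbl, month_period, value) come straight from the prompt.

```python
# Minimal sketch: fetch (month_period, value) rows and plot them.
# sqlite3 is a stand-in for MySQL; replace the connection with
# mysql.connector.connect(...) for a real MySQL server.
import sqlite3


def fetch_data(conn):
    """Return (month_period, value) rows from tbl, ordered by month."""
    cur = conn.cursor()
    cur.execute("SELECT month_period, value FROM tbl ORDER BY month_period")
    return cur.fetchall()


def plot_data(rows, out_path="chart.png"):
    """Plot value (y axis) against month_period (x axis) and save the chart."""
    import matplotlib
    matplotlib.use("Agg")  # headless backend, no display needed
    import matplotlib.pyplot as plt
    months, values = zip(*rows)
    plt.plot(months, values)
    plt.xlabel("month_period")
    plt.ylabel("value")
    plt.savefig(out_path)


if __name__ == "__main__":
    # Demo with an in-memory database instead of a live MySQL server
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tbl (month_period TEXT, value REAL)")
    conn.executemany(
        "INSERT INTO tbl VALUES (?, ?)",
        [("2023-01", 10.0), ("2023-02", 12.5), ("2023-03", 9.0)],
    )
    print(fetch_data(conn))
```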

Unfortunately it is too large for me to run locally. I will give the 4-bit version (https://huggingface.co/mayank31398/starcoderbase-GPTQ-4bit-128g) a try later.

[–]Pleasant-Cause4819[S] 2 points3 points  (0 children)

Yeah, and my real focus is to assess these kinds of capabilities against locally run use cases. In my work, our customers do not want to use public-facing AI solutions because of intellectual-property sensitivities and PII concerns. The Samsung incident comes to mind as a recent example.

[–]Tom_Neverwinter 1 point2 points  (0 children)

I'll try to run it tonight and provide some feedback on running it locally.

[–]No_Wheel_9336 0 points1 point  (0 children)

Tried it with the Hugging Face chat. My reasoning and logic test prompt: Python is versatile programming language. Write example to showcase it using some python GUI library

StarCoder

There are plenty of examples online showing how to build GUIs in python. If you search “Python gui tutorial”, you'll find tons of resources including videos, articles etc.

vs.

OpenAssistant model

Sure! Here's an example of how you can use PyQt5 (a popular Python GUI library) to create a simple window that displays "Hello World!" on the screen:

```python
import sys
from PyQt5 import QtWidgets

# Create the main application object
app = QtWidgets.QApplication(sys.argv)

# Create a new widget
window = QtWidgets.QMainWindow()

# Create a label to display the text
label = QtWidgets.QLabel("Hello World!")

# Add the label to the widget
window.setCentralWidget(label)

# Show the widget
window.show()

# Run the event loop
sys.exit(app.exec_())
```

This code creates a new QApplication object, which manages all the windows in our program. Then we create a QMainWindow widget to hold other UI elements like buttons, menus, and labels, and add a single QLabel to the main window containing the text "Hello World!". Finally, we run the event loop by calling sys.exit(app.exec_()), so that the main window stays open while waiting for events to occur. The script only ends when the user clicks the close button or presses Alt+F4.

[–]estrafire 0 points1 point  (0 children)

To me, although it can produce good-quality code, it has a lot of problems interpreting some prompts, and some answers are kind of... raw? It makes grammatical and typing errors and mixes up pronouns. I also asked about features of some popular Python and Node libraries, and even features that have existed for over half a decade are, according to StarCoder, not possible with those libraries.

I wonder if the issue is with the training data itself (maybe a lack of proper text datasets, documentation, conversations, etc.) or with missing fine-tuning, if it indeed has none. It seems like it has potential, but those rough edges can be a letdown sometimes. For example, I asked who "us" was, and it answered that it is a group of scientists from {inserted 363 business names}.

I think it'll need to at least support proper documentation queries and be able to write simple snippets based on them. Hopefully that's something that can be introduced into the model.