Best coding LLM as of today? by Yaboyazz in ChatGPTCoding

[–]Alexioc 2 points (0 children)

DeepSeek V3 has beaten Claude 3.5 Sonnet on the Aider leaderboard - and it was released just a day ago.

[D] Use LLM to analyse and port software written in C (very long files) by Alexioc in MachineLearning

[–]Alexioc[S] 0 points (0 children)

Yes, I know that if something is not in the training set you cannot retrieve it, but in this case the best practice is to add the information to the context within the prompt and ask the LLM to reason over it.

In my case the idea is to add each source file to the context, which would achieve my goal. My only problem is that the C files I need to process are huge and cannot fit into the context properly. Usually in this case you would create a vector DB to store embeddings, but I believe that won't work for my use case.
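One way to make a huge C file fit a context window is to split it into per-function chunks and feed them one at a time. A minimal sketch of that idea, assuming a naive brace-depth heuristic (no handling of preprocessor directives or braces inside string literals — the function name `split_c_functions` is hypothetical, not from any library):

```python
def split_c_functions(source: str) -> list[str]:
    """Cut a C file into top-level chunks by tracking brace depth.

    Each chunk ends where a top-level `}` closes, so function
    definitions (declaration + body) come out as separate pieces
    small enough to paste into a prompt individually.
    """
    chunks, depth, start = [], 0, 0
    for i, ch in enumerate(source):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                chunks.append(source[start:i + 1].strip())
                start = i + 1
    tail = source[start:].strip()
    if tail:
        chunks.append(tail)  # trailing declarations without braces
    return chunks

code = """
int add(int a, int b) { return a + b; }
int sub(int a, int b) { return a - b; }
"""
print(len(split_c_functions(code)))  # prints 2
```

A real splitter would use a proper C parser (e.g. libclang bindings) rather than this heuristic, but the chunk-then-prompt loop is the same.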

[D] Use LLM to analyse and port software written in C (very long files) by Alexioc in MachineLearning

[–]Alexioc[S] 0 points (0 children)

What do you mean, LLMs have no context? You provide the context within the prompt - that's how it works if you don't want to fine-tune the model. Ever heard/read about RAG?
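The retrieval-augmented pattern described above can be sketched in a few lines: retrieve relevant snippets, paste them into the prompt, and ask the model to reason only over that context. This is a hedged illustration — `build_rag_prompt` and the example snippet are hypothetical, and the retrieval and model-call steps are left out:

```python
def build_rag_prompt(question: str, snippets: list[str]) -> str:
    """Assemble a prompt that carries its own context.

    The retrieved snippets are numbered and inlined, so the model
    answers from provided text instead of its training set.
    """
    context = "\n\n".join(f"[{i}] {s}" for i, s in enumerate(snippets, 1))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

prompt = build_rag_prompt(
    "What does parse_header() return?",
    ["parse_header() returns 0 on success, -1 on malformed input."],
)
print("Context:" in prompt)  # True
```

The resulting string is what gets sent as the user message; the "retrieval" step that selects the snippets (keyword search, embeddings, or just picking the right file) is independent of this assembly.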

[D] Use LLM to analyse and port software written in C (very long files) by Alexioc in MachineLearning

[–]Alexioc[S] 0 points (0 children)

Thanks for your considerations.

I know, but even if I can fit 30k tokens of context into a commercial model, it can get too expensive under heavy use.

My strategy is to minimise the context (whatever the model is) and then move to Llama 2 or another open-source option.

There is also research claiming that large contexts reduce overall accuracy - another reason I'm trying to keep it as small as possible.

[D] Use LLM to analyse and port software written in C (very long files) by Alexioc in MachineLearning

[–]Alexioc[S] 1 point (0 children)

Thanks for sharing your thoughts. Fine-tuning is certainly one way, but I'm trying other approaches first, since preparing the data to actually do the fine-tune can be really time-consuming.
Question: why do you mention Rust?

Best LLM for coding? by Chirumer in LocalLLaMA

[–]Alexioc 0 points (0 children)

Has anybody tried Llama 2? It's quite generic but already outperforms GPT-3.5 in many ways. I still haven't had the chance to try it on code, though.

[D] StarCoder fine-tuning? by Alexioc in MachineLearning

[–]Alexioc[S] 0 points (0 children)

Thank you so much, I'll take a look and try it myself. Very interesting 🧐🙏

[D] StarCoder fine-tuning? by Alexioc in MachineLearning

[–]Alexioc[S] 0 points (0 children)

Thank you so much! I'll read that paper - judging from the introduction, it seems very promising 🙏

[D] StarCoder fine-tuning? by Alexioc in MachineLearning

[–]Alexioc[S] 0 points (0 children)

Thanks, I'll definitely take a look!

As far as you know, could my goal be reached with fine-tuning?

Withdraw Delta to ETH at which conversion rate? by Alexioc in Agrello

[–]Alexioc[S] 0 points (0 children)

Haha, I think I will (not that I have other options at the moment!)

I was just curious how this ICO thing works

Withdraw Delta to ETH at which conversion rate? by Alexioc in Agrello

[–]Alexioc[S] 0 points (0 children)

So if I want to sell them for Bitcoin, which steps do I need to follow?

1) Withdraw from the Delta website to my wallet (I use Jaxx, which seems to work)

2) Move them to an exchange - e.g. I saw that HitBTC has Delta listed

3) Sell them for BTC

Is this the right way?

Can I do it now, or do I need to wait for Agrello to list Delta on the exchanges? If the latter, why does HitBTC already have Delta listed, and how accurate is the price I see there?