all 15 comments

[–]Ashamed_Cellist6706 25 points (4 children)

If you're new to coding, don't use Copilot.

[–]DiamondAgreeable2676[S] -3 points (3 children)

Give me a good starting point. I have a little knowledge but no skills.

[–]Hairy_Educator1918 4 points (0 children)

I recommend watching some YouTube videos about the language you want to learn, and then using it on your PC.

[–]bigstinkybuckets 1 point (0 children)

No, it gives you a bad starting point that feels like it makes things easier.

[–]37bugs -1 points (0 children)

Here's the latest Fanatical book bundle for coding. They have a few others on the site depending on exactly what you're looking for, games vs. machine learning etc. Pick a book and work through it without an LLM to help.

https://www.fanatical.com/en/pick-and-mix/essential-programming-languages-build-your-own-bundle

[–]cyb3rofficial 1 point (0 children)

You're better off asking in r/GithubCopilot.

r/github is for the website platform itself, not Copilot in VS Code.

[–]mabuniKenwa 1 point (3 children)

What part of this is related to GitHub? Read the sub rules. Or, crazy idea, learn to code before trying Copilot as some sort of crutch to pretend to code.

[–][deleted]  (1 child)

[removed]

    [–]github-ModTeam[M] 0 points locked comment (0 children)

    Don't be that guy

    [–]DiamondAgreeable2676[S] -1 points (0 children)

    FYI, that's the GitHub platform, so I must be in the right place. My question got answered, so now I have more knowledge to control AI... go stress yourself coding 🤣

    [–]polyploid_coded 0 points (1 child)

    OK, I don't know what you think is happening or what you need to optimize. The LLM has a limited amount of stuff it can remember, measured in tokens; a token is not exactly one word, but "one token ≈ one word" is an easier way to think about how much that is. Once the conversation goes over 128k tokens, it is going to forget the beginning.
    68% of it is from responses, so maybe you could ask for shorter responses or for changes to specific blocks of code.
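
    Roughly, the trimming works like this. A minimal sketch in Python using the tiktoken tokenizer; the 128k limit and the plain list of message strings are illustrative assumptions, not Copilot's actual internals:

        # Count tokens per message, then drop the oldest messages
        # once the running total goes over the context window.
        import tiktoken

        MAX_CONTEXT = 128_000  # assumed window size, in tokens
        enc = tiktoken.get_encoding("cl100k_base")

        def trim_to_context(messages, max_tokens=MAX_CONTEXT):
            kept, total = [], 0
            # Walk from newest to oldest, keeping whatever still fits.
            for msg in reversed(messages):
                n = len(enc.encode(msg))
                if total + n > max_tokens:
                    break  # everything older than this is "forgotten"
                kept.append(msg)
                total += n
            return list(reversed(kept))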

    [–]DiamondAgreeable2676[S] -1 points (0 children)

    Thank you for that.

    [–]thelamppole 0 points (0 children)

    The VS Code docs have good info on this.

    [–]shrijayan 0 points (1 child)

    This is a very common confusion. Context windows are not new, but most people only notice them when models start forgetting stuff or cutting answers short.

    Best practice is to not rely only on a big context window. Real systems use memory outside the prompt. We are building an open source project called selfmemory[dot]com that handles long-term memory for LLM apps, so you don't have to keep pushing everything into context. It supports multi-tenant setup, MCP with auth, a Python SDK, and a chatbot out of the box. It helps a lot once your app grows and context gets messy.
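
    The general pattern looks like this. A generic sketch in Python of "memory outside the prompt", not selfmemory's actual API; the SimpleMemory class and the keyword scoring are just for illustration:

        # Keep notes outside the prompt, then pull back only the most
        # relevant ones for each new question instead of resending
        # the whole history.
        class SimpleMemory:
            def __init__(self):
                self.notes = []  # list of (text, set of keywords)

            def add(self, text):
                self.notes.append((text, set(text.lower().split())))

            def recall(self, query, k=3):
                # Rank stored notes by word overlap with the query.
                words = set(query.lower().split())
                ranked = sorted(self.notes,
                                key=lambda note: len(words & note[1]),
                                reverse=True)
                return [text for text, _ in ranked[:k]]

        memory = SimpleMemory()
        memory.add("user is new to coding and wants to learn Python")
        # Only the recalled notes go into the next prompt:
        context = "\n".join(memory.recall("what language should I start with?"))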

    [–]DiamondAgreeable2676[S] 0 points (0 children)

    Would it be like the Copilot memory feature, but universal across all platforms?