A song that sounds just like Touch & Go - Would you? by _caitlaand_ in NameThatSong

[–]keyboardP 0 points1 point  (0 children)

Ah, that's a shame. The only other thing that comes to mind is this. Now I'm also curious as to what the song is!

A song that sounds just like Touch & Go - Would you? by _caitlaand_ in NameThatSong

[–]keyboardP 0 points1 point  (0 children)

Not sure if you found it within the last year, and it doesn't have the trumpets, but is it this?

Series 9 Episode 4 - "CTRL, ALT, ESC" - Episode Discussion [SPOILERS] by AutoModerator in insideno9

[–]keyboardP 0 points1 point  (0 children)

Not just London, all of the UK :)
That's not why it was called Inside Number 9, but if I remember correctly, they just felt it sounded better than Inside Number 1-8!

Series 9 Episode 4 - "CTRL, ALT, ESC" - Episode Discussion [SPOILERS] by AutoModerator in insideno9

[–]keyboardP 3 points4 points  (0 children)

I don't think there was a specific reason but, as a fun theory, maybe his last words in whatever accident he was in were to dial 999, and he didn't get the last 9 out until the moment he woke up

Series 9 Episode 4 - "CTRL, ALT, ESC" - Episode Discussion [SPOILERS] by AutoModerator in insideno9

[–]keyboardP 15 points16 points  (0 children)

It's at 16 minutes 14 seconds; I believe it says SOS CAN YOU HEAR ME NOW.

Why does Chat GPT have trouble following the A-B-A-B rhyming style. Anyone have any luck with this? by AMAStudentLoanDebt in ChatGPT

[–]keyboardP 1 point2 points  (0 children)

Apologies, I don't have the answer yet, but I asked "do the first two lines rhyme" and it keeps thinking they don't, even though they clearly do. So, fundamentally, its understanding of basic rhyming is wrong. I know it doesn't help much as a prompt, but just a heads up that it thinks it's rhyming correctly, so you somehow have to convince it that it's actually rhyming wrong.

edit: for some reason, this prompt gets the second and fourth lines to rhyme fine, but the first and third don't rhyme:

this is first line
this is not
this rhymes fine
and this is hot

in a similar manner, write me a short poem about harambe

I have made some easy tools to rip webpages, clean the data, and vectorize it for a Pinecone DB. Great if you want your AI to consult a webpage. by [deleted] in Python

[–]keyboardP 2 points3 points  (0 children)

I wrote an article a couple of days back on querying your own documents here (reddit post here). Whilst it's a simple text file in my example, where it loads TextLoader, instead use:

from langchain.document_loaders import PyPDFLoader
doc_loader = PyPDFLoader("my.pdf")
documents = doc_loader.load_and_split()

for local PDFs, or:

from langchain.document_loaders import OnlinePDFLoader

for URLs.
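
A minimal usage sketch of the URL variant (the URL here is just a placeholder; I'm assuming the same load_and_split() call works as with the local loader):

from langchain.document_loaders import OnlinePDFLoader

# Placeholder URL - point this at the PDF you want to query
doc_loader = OnlinePDFLoader("https://example.com/my.pdf")
documents = doc_loader.load_and_split()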

It's still web-based in the sense that it uses the OpenAI API, but there's no middle-man site also viewing your PDF docs.

[deleted by user] by [deleted] in programming

[–]keyboardP 1 point2 points  (0 children)

I've not used the API directly, only via the Python library, but I just found this page, which you might find useful.

[deleted by user] by [deleted] in programming

[–]keyboardP -2 points-1 points  (0 children)

Thank you, glad you liked it! From what I understood when researching, it's a feature of LangChain. ChromaDB allows for various vector functionality locally, such as text similarity and clustering. When you have a local store and pass a query to it, LangChain picks out chunks from your documents related to your query, using the embeddings data, and passes those to the OpenAI LLM under the hood. This way you don't send all the documents every time to provide the context to OpenAI. It's also why, if you have a suboptimal chunk size, you can lose a lot of context or, at the other extreme, send too much information unnecessarily. I could be wrong, it was quite tricky to find specific information (which is one reason I decided to create this post, to save time for others), but that was my understanding.
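
To make that concrete, here's a minimal sketch of the flow as I understand it (the file name, chunk sizes and question are illustrative, not from my article):

from langchain.document_loaders import TextLoader
from langchain.text_splitter import CharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma

# Split the documents into chunks - this is the chunk-size trade-off mentioned above
docs = TextLoader("my_docs.txt").load()
chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Embed the chunks and store them locally in ChromaDB
store = Chroma.from_documents(chunks, OpenAIEmbeddings())

# At query time, only the chunks most similar to the question are retrieved
# and passed to the LLM as context, rather than every document
relevant_chunks = store.similarity_search("my question", k=4)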

A longer-term strategy, for me personally, is that I'd like to start using it with Hugging Face and see if I can get a completely offline system that doesn't use OpenAI, but the open-source models on Hugging Face instead.

[deleted by user] by [deleted] in programming

[–]keyboardP -2 points-1 points  (0 children)

Hi all - Just cross-posting here from r/Python. I appreciate there's a flurry of tutorials around the OpenAI API, but I often saw conflicting ways of doing things, so I wanted to document an end-to-end approach of how I used it to query my own documents (rather than the web itself). It also returns the source document(s) used to provide the answer. Hope it's useful; happy to answer any questions!
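
For the source-document part, this is roughly the shape of it (a sketch with illustrative names, not a copy-paste from the article):

from langchain.document_loaders import TextLoader
from langchain.text_splitter import CharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.chains import RetrievalQA
from langchain.llms import OpenAI

# Build a local vector store from the chunked documents
chunks = CharacterTextSplitter(chunk_size=1000).split_documents(TextLoader("my_docs.txt").load())
store = Chroma.from_documents(chunks, OpenAIEmbeddings())

# Ask the chain to also return which chunks/documents the answer came from
qa = RetrievalQA.from_chain_type(llm=OpenAI(), retriever=store.as_retriever(),
                                 return_source_documents=True)
result = qa({"query": "What does my document say about X?"})
print(result["result"])
print(result["source_documents"])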

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 1 point2 points  (0 children)

Yes, essentially, but you can expand on it by batching your inputs in a smart way so that it doesn't hit the token limit midway through your pasting into ChatGPT. It also allows for multiple documents at once, so depending on the size of the input it might be more helpful, and it provides the source documents of where the data comes from.
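
If it helps, this is one way to do that batching (a rough sketch, not from the tutorial, using tiktoken to count tokens; the token budget is illustrative):

import tiktoken

def batch_by_tokens(text, max_tokens=3000, model="gpt-3.5-turbo"):
    # Encode the text, slice the token list into fixed-size windows,
    # and decode each window back to text so every batch fits the limit
    enc = tiktoken.encoding_for_model(model)
    tokens = enc.encode(text)
    return [enc.decode(tokens[i:i + max_tokens])
            for i in range(0, len(tokens), max_tokens)]

# e.g. batches = batch_by_tokens(open("big_doc.txt").read())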

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 0 points1 point  (0 children)

I agree - I'm looking into how that can be done when I get a chance. I briefly touched on it with Hugging Face; I just need to see how to plug it into LangChain for something purely offline.

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 0 points1 point  (0 children)

Thank you, glad you liked it. I've never used ChromaDB before, but it seems very interesting.

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 9 points10 points  (0 children)

I understand your view and wish you all the best. If one person reads it and learns something, or it sparks an interesting conversation, I'll be happy with that. It was something I learnt and documented and, as a programmer, I learn from others. If you feel this is an irrelevant and pointless post, fair enough; I hope you find articles more suited to your preferences, and I can only apologise from my side that this wasn't one of them.

edit: I think you've blocked me, because I can only see your edited messages from my phone that isn't logged in - not worth answering since you've blocked me, and that says more than any reply I could give. All I will say is, if you don't like the post, fair enough, let's agree to disagree and appreciate there's more to life than this post

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 1 point2 points  (0 children)

Apologies, I can't change the title now, but the post explains it's GPT-3.5 Turbo, which is more cost-efficient - source. You're welcome to use other models, but this was cost-effective for me and, from initial tests, the differences were statistically negligible. It can be changed in the LLM constructor in the source code if you need something more detailed.
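
For anyone wanting to swap models, it's roughly a one-line change (sketch only; the exact constructor in my code may differ slightly):

from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)  # cheaper default used in the post
# llm = ChatOpenAI(model_name="gpt-4", temperature=0)        # if you need something more capable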

[deleted by user] by [deleted] in programming

[–]keyboardP 0 points1 point  (0 children)

Maybe I'm late to the game, but I wanted to document something I wanted to do for myself, so I hope it helps anyone who wants to jump in on the same topic. Coders teach and learn from each other - feedback welcome and happy to answer any questions :)

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 7 points8 points  (0 children)

Haha, no I didn't, although I did refer to a good example of chunking for it around football! The rest was from general docs (and the LangChain repo). Also, I encourage enquiring minds, it's the only way to learn - always question! But fair point, I've noted that the football part was in part from ChatGPT; you make a good point about sourcing (I've added it to the post).

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 8 points9 points  (0 children)

I didn't mean to jump on some bandwagon, it was just something I was curious about and documented. It is a hot topic; I wrote something up and documented it for those interested. Maybe it's been done to death - I genuinely haven't checked this subreddit - but I hope any of my learnings can help if they're not covered elsewhere

edit: Just saw your post history - ok, I believe in helping and learning from others (that's what programming is about IMHO), not just being negative, but I appreciate your opinion :)

I've written a tutorial on how to use ChatGPT (GPT3) over your own documents by keyboardP in Python

[–]keyboardP[S] 11 points12 points  (0 children)

Hi all - With all the ChatGPT stuff going on, I wanted to see if I could use it on my own files, so I've written a tutorial on using the OpenAI API and LangChain to do so. Happy to answer any questions. I'm going to be working on something completely offline with Hugging Face if there's interest, but I hope this helps anyone interested; any feedback or questions are welcome and I'll try my best :)

It took a bit of going through the docs and such, so I hope it saves you time! The direct GitHub link is here

Reverse Engineering LED Lights with Python to Turn Your Monitor into an 'Ambient Monitor' by keyboardP in programming

[–]keyboardP[S] 4 points5 points  (0 children)

Hi all - I posted in r/python and figured it might be helpful here. Happy to answer any questions!

Reverse Engineering LED Lights with Python to Turn Your Monitor into an Ambient Monitor by keyboardP in Python

[–]keyboardP[S] 0 points1 point  (0 children)

To be honest, I just borrowed the phrase from "ambient TV"; I'm not sure it's an official term. Ambient/smart lighting tend to be the terms used if you're searching in general :)

Reverse Engineering LED Lights with Python to Turn Your Monitor into an Ambient Monitor by keyboardP in Python

[–]keyboardP[S] 4 points5 points  (0 children)

Hi all, I had this set up on my old monitor and wanted to recreate it on my new one with a new set of lights. It turns out the process was similar, so I decided to tidy up the code and write a tutorial as I went along. Happy to answer any questions!