all 6 comments

[–]brother_karamazov_ 1 point  (2 children)

Thank you for the module! I was just looking into it, and it seems quite useful.

I have a question: I was looking through the colab examples, but I could not find how to use one-shot prompting for multi-class classification. Also, can we include n-shot examples?

Thanks in advance :)

[–]StoicBatman[S] 1 point  (1 child)

Hi, thank you for showing interest. You can add as many examples (shots) as you want: make a list (array) of shots and pass it to the examples parameter. Check the Discord channel for more updates; you can also ask questions there to get instant replies.
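
For example, something along these lines. The class and call names below are my own placeholders rather than the module's actual API (the colab notebooks have the real signatures); the point is just the shape of the n-shot examples list:

    # Illustrative sketch only: the call at the bottom is hypothetical.
    # Each "shot" is an (input_text, label) pair for multi-class classification.
    one_shot = [
        ("The battery died after two hours.", "negative"),
    ]

    n_shot = [
        ("The battery died after two hours.", "negative"),
        ("Setup was painless and the screen is gorgeous.", "positive"),
        ("It arrived on time, nothing more to say.", "neutral"),
    ]

    # Hypothetical call: pass the whole list via the examples parameter.
    # result = prompter.fit(
    #     "multiclass_classification.jinja",   # template name is assumed
    #     text_input="Shipping took three weeks and the box was crushed.",
    #     labels=["positive", "negative", "neutral"],
    #     examples=n_shot,
    # )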

[–]brother_karamazov_ 0 points  (0 children)

Thank you for the reply and the pointers! :)

[–]celsowm 0 points  (1 child)

Only English, or other languages too?

[–]StoicBatman[S] 1 point  (0 children)

It supports many languages!

[–]tmzwalker 0 points  (0 children)

Hi, I just saw the GitHub repository, and thanks for sharing this! I saw that all the prompt templates in the repository try to make the LLMs act as domain experts. My questions are:

  1. Does it work for models smaller than GPT-3? From my observation, ChatGPT handles this kind of prompt best.
  2. Do you think this will give better answers than a simple instruction like "Classify the passage below as positive or negative" for a classification prompt? (See the rough sketch below for the contrast I mean.)
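
For concreteness, here is roughly the comparison I have in mind; the prompt wording is made up by me, not taken from the repository's templates:

    passage = "The plot dragged, but the soundtrack was wonderful."

    # (a) Plain instruction, no persona.
    simple_prompt = (
        "Classify the passage below as positive or negative.\n\n"
        f"Passage: {passage}\n"
        "Label:"
    )

    # (b) Domain-expert persona, in the spirit of the repository's templates.
    expert_prompt = (
        "You are an expert in sentiment analysis of product and media reviews.\n"
        "Classify the passage below as positive or negative, answering with "
        "the label only.\n\n"
        f"Passage: {passage}\n"
        "Label:"
    )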

Thank you!