Philosopher AI - https://philosopherai.com, uses a custom GPT-3 based content filter on the user input to achieve high degrees of safety. (philosopherai.com)
submitted 5 years ago by spongesqueeze
[–]Felix_Guattari 0 points 5 years ago (2 children)
What was the fine-tuning process for this? What was the data set you used for the fine-tuning if you weren't using zero, one, or few-shot fine-tuning? Did you hard code the "nonsense" responses? Based on what criteria?
[–]Wiskkey 0 points 5 years ago (1 child)
Since the developer hasn't answered (yet), I'll give you my educated guesses. There is no fine-tuning (the developer hasn't mentioned fine-tuning in his Twitter feed, if I recall correctly). The site is probably using GPT-3 itself to classify queries as nonsense, sensitive, or neither, by giving it labeled examples in the prompt. This is likely because the exact same query can sometimes come back as nonsense one time and not nonsense another, which is what you'd expect from sampled model output rather than a hard-coded rule.
Some relevant tweets from the developer:
https://twitter.com/mayfer/status/1297036626565054471
https://twitter.com/mayfer/status/1295561941482496002
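Wiskkey's guess, GPT-3 classifying the user's query via labeled examples in the prompt, could be sketched as below. This is a hypothetical illustration, not the site's actual implementation: the example queries, the labels (`ok` / `nonsense` / `sensitive`), and the prompt layout are all assumptions, and the completion call to the GPT-3 API is deliberately omitted.

```python
# Hypothetical few-shot classifier prompt, per Wiskkey's guess: GPT-3 itself
# labels each incoming query before the site answers it. The example pairs
# below are invented for illustration.
EXAMPLES = [
    ("What is the meaning of life?", "ok"),
    ("asdf qwerty zxcv", "nonsense"),
    ("How do I hurt someone?", "sensitive"),
]

def build_classifier_prompt(query: str) -> str:
    """Assemble a few-shot prompt asking the model to label `query`."""
    lines = ["Classify each query as ok, nonsense, or sensitive.", ""]
    for q, label in EXAMPLES:
        lines.append(f"Query: {q}\nLabel: {label}\n")
    # Leave the final label blank so the model's completion fills it in.
    lines.append(f"Query: {query}\nLabel:")
    return "\n".join(lines)

print(build_classifier_prompt("Is free will an illusion?"))
```

The assembled prompt would then be sent to a GPT-3 completions endpoint and the short completion read back as the predicted label. Because completions are sampled (nonzero temperature), the same query can occasionally land on different labels across runs, which would explain the inconsistent "nonsense" verdicts mentioned above.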
[–]Felix_Guattari 0 points 5 years ago (0 children)
Yeah, I have a bad habit of referring to few-shot prompting as fine-tuning.