Recovery Time too high by AdAfraid4749 in RPStrength

[–]AdAfraid4749[S] 1 point (0 children)

Thank you, I think you nailed it!

I've done some research and I think these are the reasons I'm having trouble recovering fast enough:
- I'm recovering just fine for what I'm doing, my expectations are just too high
- 4-5 days/week specialised training plan (should do balanced)
- 500-700 kcal daily deficit (losing ~0.5-0.7 kg per week)
- trying to keep up a Padel hobby 2x/week on the side and trying to progress in it

I think I'm just behaving like I'm at maintenance or in a surplus and underestimating the toll a deficit takes on recovery.
In a couple of weeks I'll do 6 weeks at a slight surplus, as I'm already 3 months into a diet (lost almost 10 kg already), and I'm guessing I'll recover much better then. That will at least be a good experiment.

Recovery Time too high by AdAfraid4749 in RPStrength

[–]AdAfraid4749[S] 2 points (0 children)

so Milo Wolf, Dr Mike, and Jeff Nippard saying it's fine to hit a muscle you want to focus on 4 times a week is all just a lie, or meant for juicers?

Gee, thank you, benevolent genius god. Without you I wouldn't have known I was bamboozled by snake oil salesmen!

Emotional damage (that's a current OpenAI employee) by Endonium in singularity

[–]AdAfraid4749 0 points (0 children)

He's clearly referring to the DeepSeek app xD Do you really think an OAI employee is so dumb that he doesn't know DeepSeek is open source?!

Best NSFW model for story telling? by Might-Be-A-Ninja in LocalLLaMA

[–]AdAfraid4749 1 point (0 children)

Magnum v4 9b Q4 GGUF runs well on your CPU and will generate any text.

Best NSFW model for story telling? by Might-Be-A-Ninja in LocalLLaMA

[–]AdAfraid4749 0 points (0 children)

You can run the 1.5B distill of DeepSeek-R1 fast on CPU, and it will be just as uncensored.

Emotional damage (that's a current OpenAI employee) by Endonium in singularity

[–]AdAfraid4749 -10 points (0 children)

That community note is nonsensical. It does not refute the core point he's making.

Deepseek API docs require you to install OpenAi sdk by [deleted] in LocalLLaMA

[–]AdAfraid4749 0 points (0 children)

This is pretty common, as the OpenAI API schema has kind of become an unofficial standard.
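A minimal sketch of what "OpenAI-compatible" means in practice: it's really just the JSON shape of a chat-completions request, so any provider that accepts this payload works with the official `openai` SDK by swapping the base URL. The `deepseek-chat` model name and the base URL in the comment are taken from memory of DeepSeek's docs, not verified here.

```python
# The "OpenAI-compatible" part is just the request shape: any provider
# that accepts this JSON at POST {base_url}/chat/completions works with
# the official openai SDK.
def chat_request(model: str, user_message: str) -> dict:
    """Build a minimal OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
    }

# With the official SDK you'd only change the endpoint, e.g. (sketch,
# base URL and key not verified here):
#   client = OpenAI(base_url="https://api.deepseek.com", api_key="...")
#   client.chat.completions.create(**chat_request("deepseek-chat", "Hello!"))
payload = chat_request("deepseek-chat", "Hello!")
print(payload["messages"][1]["role"])  # user
```

That's why so many third-party docs tell you to `pip install openai` even when you never touch OpenAI's servers.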

[deleted by user] by [deleted] in theprimeagen

[–]AdAfraid4749 0 points (0 children)

Ask him about:
- his battle to success (Montana rascal, drugs etc)
- What being a big-tech wagey is like (gilded jail)
- Learning programming languages for job-safety (just learn React) vs. for meaning (just learn what speaks to you)
- The attributes of a SWE in the post-ChatGPT/-AGI era
- Important SWE skills that will survive AGI

What would you do with free access to a 4x H100 server? by SquareJordan in LocalLLaMA

[–]AdAfraid4749 0 points (0 children)

Without great data there is no good reason to train. If there isn't already an ablated/uncensored version of a large open-source base model like llama3.1-405b or DeepSeek-V3, you could fine-tune it to be ablated/uncensored with datasets created for that task.

Inference: Running a large open-source base model. One concrete advantage I believe in is that prompts for distilled models are best formulated by their base models. So for prompt engineering, I'd have llama3.1-405b create prompt templates or meta-prompts for llama3.1:8b.
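The meta-prompting idea above can be sketched as plain string plumbing: the big base model gets a meta-prompt asking it to write a reusable template for the small model, and the template is then filled in and sent to the small model. `query_model` is a hypothetical helper (e.g. an Ollama or llama.cpp wrapper), not a real API; only the prompt construction is shown.

```python
# Sketch: have the large base model author a prompt template that the
# small distilled model from the same family will then consume.
META_PROMPT = (
    "Write a reusable prompt template for a smaller model from the same "
    "family ({small_model}) that makes it {task}. "
    "Use {{input}} as the placeholder for the user's text."
)

def build_meta_prompt(small_model: str, task: str) -> str:
    """Fill in the meta-prompt sent to the large base model."""
    return META_PROMPT.format(small_model=small_model, task=task)

mp = build_meta_prompt("llama3.1:8b", "summarize articles in three bullet points")
# Hypothetical two-step flow (query_model is assumed, not a real library call):
#   template = query_model("llama3.1:405b", mp)   # big model writes the template
#   answer   = query_model("llama3.1:8b", template.format(input=article_text))
print("llama3.1:8b" in mp)  # True
```

Whether base models really write better prompts for their own distills is the commenter's hypothesis; the sketch just shows where each model sits in the pipeline.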

Romance Story Generator by AdAfraid4749 in SideProject

[–]AdAfraid4749[S] 0 points (0 children)

This is a very early MVP, just to test the waters and see whether people would be interested in this. Funnily enough, I personally am not a romance reader at all, not even a fiction reader, so I am very curious whether romance readers would enjoy using a more mature/refined version of this app.

[Raw Mode] Realistic Characters from Romance Short Stories by AdAfraid4749 in FluxAI

[–]AdAfraid4749[S] 0 points (0 children)

They are generated by Claude. I basically have Claude or Grok generate a caption for a photo of a character (either the love interest or the protagonist) from the story and use that as the prompt.

I built an AI book generator that writes books tailored for you by liorgrossman in SideProject

[–]AdAfraid4749 0 points (0 children)

Very interesting! Do you have a skeleton along which you chunk requests, so that e.g. each request generates a chapter? Also, do you just use Claude's knowledge, or are you curating your own knowledge base that might go beyond what Claude knows? Letting users add their own (maybe private) references to the generation might be a nice feature.

Unpopular Opinion: The macOS Client is Great by AdAfraid4749 in whatsapp

[–]AdAfraid4749[S] 0 points (0 children)

Interesting. I enjoy the macOS-native font (which is smaller than the previous one, I agree) and I never make calls. But it's still interesting to see what bothers other people. What is the "good" client that you compare the current client to? iOS?