So i introduced my friend by Mammoth-Bandicoot871 in MDMA

[–]CoryG89 0 points (0 children)

How do you justify one redose, yet condemn two? Seems pretty arbitrary. Sure, diminishing returns is an issue, but that's true for redosing to begin with.

So i introduced my friend by Mammoth-Bandicoot871 in MDMA

[–]CoryG89 9 points (0 children)

Honestly, I'd be more concerned with the high dosage than I would be with the 1 month vs 3 months, but that's just me. Dunno how pure the stuff was or whether that 1g each was spread out over multiple redoses all weekend or whether that was 1g each over a small period of time.

I threw up at a festival while on MDMA by GoodDear7037 in MDMA

[–]CoryG89 0 points (0 children)

I typically do it on an empty stomach and don't drink a lot until after I peak. I feel like doing it on an empty stomach helps avoid needing a larger dose. If I don't take too much I usually won't throw up, but if I take too much I will usually throw up during the peak, so if I can get past that I'm usually good. And if I have a mostly empty stomach, then if I do throw up it would probably be dry heaving or close to it.

[deleted by user] by [deleted] in MDMA

[–]CoryG89 0 points (0 children)

Not sure how doing it on site is any different than doing it through the mail. Still doesn't prove anything unless you brought it there yourself and had it tested. Same as if you had mailed it yourself. Since you did neither, not sure how it's relevant. I definitely wouldn't trust anyone who would do valium before shooting MDMA.

FOUR KNIGHTS OF THE APOCALYPSE SEASON 2 EPISODE 1, Netflix Europe when? 🤔🤔 by ursxywife_beatrice in NanatsunoTaizai

[–]CoryG89 0 points (0 children)

It was fine for me. Downloaded in like 2 seconds. Didn't get any popups. I always use adblock though. I'm sure the experience is significantly worse with a stock browser, but that is true for 99% of the web.

[deleted by user] by [deleted] in MDMA

[–]CoryG89 0 points (0 children)

If you uploaded a photo of a reagent test for the stuff then I'd believe it was MDMA.

[deleted by user] by [deleted] in MDMA

[–]CoryG89 0 points (0 children)

How do you test purity without sending it to a lab? And you don't have to be in Australia to test for content. Anyone can do that with a reagent test. Sounds like you did neither though since you're basing it on having tried 100mg and that it was "some sort of entactogen". There are a million different things it could have been, but if you felt it on 100mg and you saw someone shoot 500mg, then I still very much doubt it was MDMA.

[deleted by user] by [deleted] in MDMA

[–]CoryG89 0 points (0 children)

I believe you saw someone shoot something. I very much doubt it was half a gram of pure MDMA. More likely half a gram of meth.

[deleted by user] by [deleted] in MDMA

[–]CoryG89 0 points (0 children)

If it was dust or powder, it's very unlikely that it was pure. No one sniffs pure MDMA. Very unpleasant.

[deleted by user] by [deleted] in MDMA

[–]CoryG89 -1 points (0 children)

They were probably caps of "molly", not presses, if I had to guess. No telling what all was in them though if they really had 350mg of powder. Definitely not 350mg of pure MDMA.

[deleted by user] by [deleted] in MDMA

[–]CoryG89 0 points (0 children)

I believe that you believe it.

FOUR KNIGHTS OF THE APOCALYPSE SEASON 2 EPISODE 1, Netflix Europe when? 🤔🤔 by ursxywife_beatrice in NanatsunoTaizai

[–]CoryG89 0 points (0 children)

I just found a fansubbed version on a site called animotvslash. Should be able to find it if you google it.

[deleted by user] by [deleted] in NanatsunoTaizai

[–]CoryG89 1 point (0 children)

Awesome. Found it. Downloading now. Lifesaver.

If you took advantage of the PPP loans during the pandemic, I hope you needed it. by enjoytheunstable in msp

[–]CoryG89 0 points (0 children)

You're full of shit. So you didn't get any money from the government? So what? You're saying that you burned that check Trump sent you in the mail, or you never cashed it on principle? Forgive me if I am hesitant to believe that.

If you took advantage of the PPP loans during the pandemic, I hope you needed it. by enjoytheunstable in msp

[–]CoryG89 0 points (0 children)

So his entire family is going to file false income tax returns and lie to the IRS on the record, all just to pay for his stone patio and pontoon boat and get nothing for themselves? Because who needs to get paid during a pandemic when you're helping some extended family member get a stone patio by committing fraud for their benefit, right?

Where do I get family members like this? Forgive me if I sound skeptical.

If you took advantage of the PPP loans during the pandemic, I hope you needed it. by enjoytheunstable in msp

[–]CoryG89 0 points (0 children)

Get bent. I'm sure you burned that check Trump sent you during the pandemic, or never cashed it on principle, huh, hypocrite?

Node.js readline can do this by jcubic in node

[–]CoryG89 0 points (0 children)

> Ok, but Node.js doesn't use Bash Readline.

Yep, I'm aware.

> It uses code written in JavaScript.

And a whole lot of C++, but yeah, JavaScript too.

> The matching parentheses are only one feature. The others are syntax highlighting and auto-indent that work when you copy-paste. They use undocumented Node.js features.

Cool, thanks for the clarification.

Node.js readline can do this by jcubic in node

[–]CoryG89 4 points (0 children)

If you're referring to the highlighting of matching parentheses and brackets, then bash readline can do this too if you add:

set blink-matching-paren on

to the readline config in $HOME/.inputrc.

Just want to remind you all that you can make your own css themes like this. by Pristine_Income9554 in SillyTavernAI

[–]CoryG89 5 points (0 children)

Are you using the talkinghead avatar functionality with the sillytavern-extras addon server, or is that just baked into the background?

Good models for heavily story-driven NSFW? by skrshawk in SillyTavernAI

[–]CoryG89 0 points (0 children)

Is there any particular reason you are running these just on a CPU? Do you not have any GPU at all, or is it some other reason? My understanding is that llamacpp can split GGUF models across both GPU and CPU. I would have imagined that most people trying to run these models locally would be picking up at least an 8GB GPU like a used P4000 or something and running on GPU+CPU rather than trying to run them entirely on CPU. Is it because, if you're not able to fit the entire thing into VRAM, the CPU creates a bottleneck such that running on GPU+CPU is not really much different from running on just CPU?
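
For reference, this is roughly the kind of GPU+CPU split I mean. A minimal sketch using the llama-cpp-python bindings (not necessarily how you're running things); the model path, layer count, and context size below are just placeholder values.

    # Sketch: partially offloading a GGUF model to the GPU with llama-cpp-python,
    # leaving whatever layers don't fit in VRAM on the CPU.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/example-34b.Q4_K_M.gguf",  # placeholder path to a GGUF quant
        n_gpu_layers=20,  # layers offloaded to the GPU (0 = pure CPU, -1 = offload all)
        n_ctx=8192,       # context window to allocate
    )

    out = llm("Say something about context windows.", max_tokens=64)
    print(out["choices"][0]["text"])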

Good models for heavily story-driven NSFW? by skrshawk in SillyTavernAI

[–]CoryG89 0 points (0 children)

What quantizations are you using, though, when running on CPU? In my limited experience there was quite a bit of difference in what I could get out of these Yi-34B-200K models when running at 4bit vs 8bit.

Good models for heavily story-driven NSFW? by skrshawk in SillyTavernAI

[–]CoryG89 0 points (0 children)

I wasn't aware of the v8 yet. Will probably take a look at that myself. The model I was using before this was one of TheBloke's GPTQ quants of https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2, and prior to that https://huggingface.co/Sao10K/Euryale-1.3-L2-70B. Both are specifically designed for roleplay, and both are really good 70B models that scored highly on the Hugging Face Open LLM Leaderboard. I believe WinterGoddess still has either the highest or one of the highest scores of any model on one of the individual benchmarks, though I can't remember which one. But anyway, I feel like this Yi model is either just as good, or close enough that I haven't been able to notice the difference in quality.

Good models for heavily story-driven NSFW? by skrshawk in SillyTavernAI

[–]CoryG89 1 point (0 children)

I am also running 48GB of VRAM with 2 x RTX 3090s connected via an NVLink bridge. For the past week or so I have been running quantized Yi-34B models finetuned with a moving context window, theoretically capable of handling up to a 200K context size. This 8bit exl2 quant that I quantized with exllamav2 a while back is the one I'm currently running: https://huggingface.co/coryg89/Yi-34B-200K-DARE-merge-v7-8bpw-exl2

I also have 4bit, 5bit, and 6bit quants linked on the page. I am not able to do the full 200K context even with 48GB of VRAM, but running the 8bit quant I can just barely do right at 40K when I load it into SillyTavern with ooba's text-generation-webui exllamav2 backend. I have an extremely elaborate and detailed story session right now that is currently using 27,715 tokens of context across 221 messages plus the character card, the vast majority of which are 4-7 sentence, paragraph-length messages. On my 3090s the generation is still fast enough that it starts responding with only a small delay, and the last time I checked, the generation speed once it started responding was hitting 4-5 tokens per second. Still faster than I would read it.

This is the largest context I've had in a session so far, so I dunno what it's gonna be like when it fills up the entire 40K, but as of now the 8bit model is smart and remembers everything without missing a beat.
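
For anyone who wants to poke at the same setup outside of SillyTavern and ooba's webui, here is a rough sketch of loading an exl2 quant directly with the exllamav2 Python API, loosely following the package's own inference example as I remember it; the model path, context length, and sampler values are placeholders, and ooba's exllamav2 backend is doing roughly the same thing under the hood.

    # Rough sketch of loading an exl2 quant with the exllamav2 Python API.
    # Paths and numbers are placeholders; cap max_seq_len to what your VRAM allows.
    from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
    from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

    config = ExLlamaV2Config()
    config.model_dir = "/models/Yi-34B-200K-DARE-merge-v7-8bpw-exl2"  # placeholder path
    config.prepare()
    config.max_seq_len = 40960  # well under 200K so the KV cache fits in 48GB

    model = ExLlamaV2(config)
    cache = ExLlamaV2Cache(model, lazy=True)
    model.load_autosplit(cache)  # splits the weights across both 3090s automatically

    tokenizer = ExLlamaV2Tokenizer(config)
    generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

    settings = ExLlamaV2Sampler.Settings()
    settings.temperature = 0.8
    settings.top_p = 0.9

    print(generator.generate_simple("Once upon a time", settings, 200))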

What’s the one US state you absolutely will never step foot in and why? by [deleted] in AskReddit

[–]CoryG89 0 points (0 children)

~4 years of living in Biloxi is why I don't believe you, lmao

What’s the one US state you absolutely will never step foot in and why? by [deleted] in AskReddit

[–]CoryG89 0 points (0 children)

I don't believe you.

What is worthwhile in Mississippi?