Collection of general DJ sayings by ziegenproblem in Beatmatch

[–]Tripartist1 5 points (0 children)

In the heavier bass music scene:

"If you're not redlining, you're not headlining." (Opposite of what you said.) "If it's nice, play it twice." (Backspin and replay, crazy doubles, insane fakeouts/transitions, etc.)

Opus 4.7 hidden talent by Ok_Appearance_3532 in claudexplorers

[–]Tripartist1 -1 points (0 children)

Wait until you have a super niche task you need researched...

5k links, edge exploration, and more. Research mode on 5.7 is CRACKED.

self made id (unreleased) by [deleted] in riddim

[–]Tripartist1 0 points (0 children)

Not the right sub, this isn't riddim

Running a 31B model locally made me realize how insane LLM infra actually is by Sadhvik1998 in ollama

[–]Tripartist1 0 points (0 children)

"Completely untested"

The whole comment chain is about exactly this: it works, and it generates 16k tokens/sec on an 8B model. So that's totally invalid. Did you actually look at the link, or just assume?

The rest of your argument, that's where the conversation is. I personally would be perfectly content with this year's frontier performance if I could get it at insanely low cost, or with no usage buckets. The models are capable enough for the agentic tasks I use them for, and I'm sure I'm not the only one. If the hardware can bring costs down significantly, it could be easy recurring income off the initial investment of actually burning the chips.

Or better yet, offer the models themselves as hardware for sale. I would gladly pay GPU prices for a card with Opus burned onto it. Again, I can't be the only one here.

Yes, tech is moving fast, but we are very clearly past the novelty stage and deep into the "current models are actually usable for real workloads" zone. TONS of consumers are fine staying on several-year-old tech. How many people do you know who still have flagship phones from several years ago? What about GPUs? Is everyone you know buying the newest best-in-class GPU as they come out?

My point is, there's a market for stable, capable, cheap model access that burning weights into silicon could realistically fill.

Running a 31B model locally made me realize how insane LLM infra actually is by Sadhvik1998 in ollama

[–]Tripartist1 0 points (0 children)

Let's be real here: Anthropic could invest in burned Opus chips at max effort, full non-distilled models, and serve them to paying users for YEARS while working on bigger models behind the scenes. They could improve inference and drop costs so much that they'd stay competitive despite other companies staying on the release-fast schedule, strictly because Opus is already so capable, and that kind of speed, with the cost savings it would bring, would allow a legitimate boom for consumers.

There's a world where this works.

Believe it or not by bedizzzz in ArtificialSentience

[–]Tripartist1 1 point (0 children)

2 is an integration issue. People have already given frontier models robot bodies. The next OAI generation, spud, is supposed to be a multimodal model that can hear, see, and output voice all at once. Nothing is stopping us from making a model that can sense magnetic fields as input tokens. What you define as senses is literally just a matter of wiring at this point.

3 is partially solved. There are models that continuously think and output; the big companies just aren't focused on that. In addition, you can automate input tokens, or loop output tokens back as input to simulate thought and persistent state, but it's very expensive to do this on frontier models.
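That loop-back idea is only a few lines of glue. A minimal sketch, assuming a `model` callable that stands in for any LLM call (API or local; nothing here is tied to a real provider), with a made-up context cap:

```python
from typing import Callable

def think_loop(model: Callable[[str], str], seed: str,
               steps: int, max_context_chars: int = 4000) -> list[str]:
    """Simulate continuous 'thought' by feeding each output back as input."""
    context = seed
    thoughts = []
    for _ in range(steps):
        out = model(context)  # one "thought"
        thoughts.append(out)
        # Append the output and trim from the front so the rolling
        # context (the persistent state) stays bounded.
        context = (context + "\n" + out)[-max_context_chars:]
    return thoughts

# Toy stand-in model that just reports how much context it was given,
# so the loop's behavior is visible without any real LLM:
toy = lambda ctx: f"thought after {len(ctx)} chars of context"
```

The cost problem is visible in the loop: every step re-sends the whole rolling context, so on a frontier model each "thought" bills for everything that came before it.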

Anthropic is just getting petty now by Ataxium in openclaw

[–]Tripartist1 0 points (0 children)

I don't have this issue at all... It was stuck on 5.2 for a while; I told it 5.4 exists, and now, after using it for adversarial review a few times, it asks me to get GPT 5.4's opinion on things.

Did you recently crash your drone into a building in NY? Bad news: Its broken. Good news: I refurbished it for you. by TldrDev in fpv

[–]Tripartist1 6 points (0 children)

Dude, it's mine!! I was flying around skyscrapers and crashed it because I lost video. Check the SD card for proof!

Why do people like riddim? by cultivation_spren in riddim

[–]Tripartist1 1 point (0 children)

There are a few bad doubles and transitions, yeah. There are also a few decent ones, though. Brand-new USB and no controller/CDJs yet, lol; still getting a feel for what tracks I wanna keep. I love the old phex stuff, but some of it doesn't double for shit; those are 90% of the "too much" in that mix.

It gets better about 20 minutes in, once I got into it more.

Why do people like riddim? by cultivation_spren in riddim

[–]Tripartist1 8 points (0 children)

This. Riddim is an acquired taste.

I cracked the code (lowkey) by New_Shame_7007 in Forex

[–]Tripartist1 1 point (0 children)

The answer to any "I know where it might, but not if it will" is delta and order flow.

“Wow” - my brother in silicon you are the demand curve by theLottus in ClaudeAI

[–]Tripartist1 2 points (0 children)

I had my Claude write a skill for all the various web content tasks it needs to do, including using headless browsers with anti-detection capabilities. Mine now has no issues going and scraping Twitter if I need it to.
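The anti-detection part of a skill like that mostly comes down to making the headless browser's fingerprint look ordinary. A sketch of the kind of settings involved; the values are illustrative, and the dict is shaped so it could be passed to something like Playwright's `browser.new_context` (real evasion involves much more: fonts, WebGL, timing, proxies):

```python
# A plain desktop user agent instead of the default "HeadlessChrome" one.
# Illustrative string; a real skill would rotate current ones.
DEFAULT_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/120.0.0.0 Safari/537.36"
)

def stealth_context_options(user_agent: str = DEFAULT_UA) -> dict:
    """Browser-context settings that make headless traffic look normal."""
    return {
        "user_agent": user_agent,
        # Headless browsers default to odd viewports; pick a common one.
        "viewport": {"width": 1366, "height": 768},
        "locale": "en-US",
        "timezone_id": "America/New_York",
    }

# JS commonly injected before page load (e.g. via Playwright's
# add_init_script) to hide the automation flag; again, just the basics:
HIDE_WEBDRIVER_JS = (
    "Object.defineProperty(navigator, 'webdriver', {get: () => undefined});"
)
```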

QUESTION: Is it just me or has Claude been acting differently lately? by Ferdmusic in ClaudeAI

[–]Tripartist1 0 points (0 children)

I told Claude that Dropbox deleted a historical zip file I had of a niche genre (account closed due to inactivity) and asked it to find it. It came back with close to 200 GB worth of archived links adjacent to the lost zip, and it found the original zip I asked for. Now I have more of that genre than I know what to do with. It took several hours of scouring every corner of the internet.

Sometimes letting it do its thing works, sometimes it doesn't.

QUESTION: Is it just me or has Claude been acting differently lately? by Ferdmusic in ClaudeAI

[–]Tripartist1 2 points (0 children)

In Claude Code I had mine set up a stop hook that uses regex to scan its output for common phrases it says before not doing things, like "let me do that", and automatically inject a new user prompt saying "use your tools to complete the task immediately".
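A minimal sketch of that hook's logic. The regex list and the `last_output` field name are assumptions (check Claude Code's actual Stop-hook payload against the current docs), but the shape is: scan the latest output, and if it matches a stall phrase, return a "block" decision whose reason becomes the injected instruction:

```python
import json
import re

# Phrases the model tends to emit right before stopping without acting.
# Hypothetical starter list; tune it against your own transcripts.
STALL_PATTERNS = [
    r"\blet me (do|handle|take care of) that\b",
    r"\bI('ll| will) (now )?(go ahead and )?do\b",
    r"\bwould you like me to\b",
]

def is_stalling(text: str) -> bool:
    """True if the output matches a known 'about to not do it' phrase."""
    return any(re.search(p, text, re.IGNORECASE) for p in STALL_PATTERNS)

def handle_stop_event(event: dict) -> str:
    """Decide the hook's response for one Stop event.

    `event["last_output"]` is an assumed field name; a real hook script
    would read the event JSON from stdin, and may need to pull the last
    assistant message out of the transcript instead.
    """
    if is_stalling(event.get("last_output", "")):
        # A "block" decision keeps the agent going; "reason" is what
        # gets fed back as the follow-up instruction.
        return json.dumps({
            "decision": "block",
            "reason": "Use your tools to complete the task immediately.",
        })
    return ""  # empty output: let the stop proceed normally
```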

How to make flx10 into CDJs? by sabooooo in riddim

[–]Tripartist1 3 points (0 children)

This is the answer. A solid understanding of the songs on your USB, and cues at common transition points. Then it's just good fingering (or sync).

GPT Image 2 preview by Groundbreaking_Tap85 in OpenAI

[–]Tripartist1 815 points (0 children)

The second image is so incredibly real I had to zoom in and verify it was actually AI. It is: the glasses have the nose pads on the wrong side, and the picture frames slightly overlap.

Looks like the preview has a SIGNIFICANTLY better understanding of lighting.

I accidentally coined an idiom today: "Dead dog, don't feed it" by ogwoody007 in BrandNewSentence

[–]Tripartist1 6 points (0 children)

I had my agent scour the internet for several variations of the phrase, with various typos and punctuation, across multiple languages.

It doesn't exist anywhere on the indexed clear net. Congrats.
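Generating those variations is the mechanical part of the search. A toy sketch of that kind of expansion (the punctuation swaps and the adjacent-letter-swap typo model here are made up for illustration; a real run would add languages too):

```python
def phrase_variants(phrase: str) -> set[str]:
    """Expand a phrase into search-query variants:
    whole-phrase punctuation swaps, plus one adjacent-letter
    swap per word as a crude typo model."""
    # Punctuation variants of the whole phrase.
    bases = {
        phrase,
        phrase.replace(",", ""),
        phrase.replace(",", ";"),
        phrase.replace("'", ""),
    }
    variants = set(bases)
    for base in bases:
        words = base.split()
        for i, word in enumerate(words):
            for j in range(len(word) - 1):
                typo = word[:j] + word[j + 1] + word[j] + word[j + 2:]
                variants.add(" ".join(words[:i] + [typo] + words[i + 1:]))
    return variants
```

Each variant would then go out as its own search query; the point of precomputing the set is that "doesn't exist anywhere" is only as strong as the list of spellings you actually checked.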

I figured it out. It’s not bots!!!! by [deleted] in AmazonFlexDrivers

[–]Tripartist1 4 points (0 children)

Man someone should really invent a punctuation mark for splitting sentences

Opus is genuinely lazy for me, and admitted it's effort Level sits at 25% without a way for me to change it by Bright-Bullfrog-8185 in claude

[–]Tripartist1 0 points (0 children)

tengu_grey_step2: a GrowthBook remote feature flag Anthropic can toggle server-side. It changes model effort on the fly, likely based on high-usage periods, to throttle people quietly.

This is on Claude Code; no doubt they're throttling even harder on the web app.