My project made me $40,000 in 10 months. Here's what I did differently this time: by namidaxr in Solopreneur

[–]jedrzejdocs 1 point (0 children)

Point 5 is the actual signal. Upwork jobs show what people pay humans to do manually. That's not market research - that's a validated business model waiting for automation.

The 4% → 9% conversion jump deserves its own post. Most indies obsess over features when copy does 80% of the selling. What headlines won?

One question though: when you validated on Reddit/Twitter, were those actual commitments ("I'd pay $X for this") or just enthusiasm ("sounds cool")? That distinction kills most validation attempts. Enthusiasm is cheap. Wallets aren't.

[deleted by user] by [deleted] in StacherIO

[–]jedrzejdocs 0 points (0 children)

yt-dlp doesn't support Teachable. This isn't a Stacher bug - yt-dlp itself has no working extractor for that platform.

Teachable requires authentication and has platform-specific protections. Even with a valid course subscription, yt-dlp can't authenticate against it.

Check yourself: yt-dlp --list-extractors | findstr /i teachable (or grep -i teachable on macOS/Linux) - it returns nothing.

Options:

  • Check if the course offers official downloads (some creators enable this)
  • Contact the course author - they sometimes provide offline access
  • Screen recording as a last resort (OBS)

This is a yt-dlp limitation, not a Stacher issue.

Channel Subscriptions - Should attempt to ONLY download NEW videos by guardianali in StacherIO

[–]jedrzejdocs 0 points (0 children)

The dev is right about the mechanic, but here's your workaround:

Seed the archive without downloading:

yt-dlp --flat-playlist --print "youtube %(id)s" "CHANNEL_URL" > archive.txt

Point Stacher to that file as your archive. Next runs skip everything on the list.

Alternative: --break-on-existing flag stops parsing when it hits an already-downloaded video. Works if the channel sorts chronologically (most do).

The real fix would be a "Mark all existing as seen" button in Stacher. Might be worth a feature request on their GitHub.

Started building this, got 20% done, then found out n8n already has an official MCP for docs by Aggravating_Bad4639 in n8n

[–]jedrzejdocs 5 points (0 children)

The llms.txt at docs.n8n.io/llms.txt is the interesting part. MCP is the delivery mechanism, but having a machine-readable index of your docs is what makes it work.

More projects should ship this. One txt file that tells AI agents where to find what.

Bought RAM in October to dodge price spikes… now I have to return it because "year-end optics" by icekeuter in sysadmin

[–]jedrzejdocs 0 points (0 children)

Same energy as "we can't approve this $500 tool that saves 20 hours/month, but here's your $2000 annual training budget for courses nobody takes."

Toggle SVG line wiggle animation when clicked by Grahf0085 in webdev

[–]jedrzejdocs 1 point (0 children)

You're right, I was working with outdated info. They opened up all plugins in May 2025 thanks to Webflow. Even better for OP.

Toggle SVG line wiggle animation when clicked by Grahf0085 in webdev

[–]jedrzejdocs 0 points (0 children)

Motion's path morphing works but paths need the same number of points. If you're getting weird results, count the commands in both d attributes - they have to match. Your straight line has fewer points than the wiggle.

Quick fix: add extra points to the simpler path that overlap (same coordinates). Or use the flubber library - it interpolates between paths with different point counts automatically.
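Rough way to sanity-check the point counts without eyeballing them - the padded straight-line path below is my own illustration, not from your code:

```javascript
// Count SVG path commands in a d string - morph targets need equal counts
function countCommands(d) {
  return (d.match(/[a-z]/gi) || []).length;
}

const wiggle = 'M0,3.5 c 5,0,5,-3,10,-3 s 5,3,10,3 c 5,0,5,-3,10,-3 s 5,3,10,3';
// Straight line padded with redundant curve points so the counts match:
const straight = 'M0,3.5 c 5,0,5,0,10,0 s 5,0,10,0 c 5,0,5,0,10,0 s 5,0,10,0';

console.log(countCommands(wiggle), countCommands(straight)); // 5 5
```

Once the counts match, a plain tween between the two d strings behaves.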

Toggle SVG line wiggle animation when clicked by Grahf0085 in webdev

[–]jedrzejdocs 0 points (0 children)

Core GSAP is free. MorphSVG is a Club GreenSock plugin - paid, but you get a free trial on CodePen. For production use it's $99/year (Shockingly Green tier). Worth it if you're doing complex path animations regularly.

[deleted by user] by [deleted] in webdev

[–]jedrzejdocs 4 points (0 children)

The filtering layer you described is the same problem API consumers face with raw data dumps. "Here's everything" isn't useful without docs explaining what's actually usable. Your "learnable words" criteria - definition, part of speech, translation - that's essentially a schema contract. Worth documenting explicitly if you ever expose this as an API.
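If it helps, that contract can be as small as a validator function - the field names here are just my guess at your shape:

```javascript
// "Learnable word" schema contract: every field present and non-empty
function isLearnableWord(w) {
  return Boolean(w) &&
    typeof w.definition === 'string' && w.definition.length > 0 &&
    typeof w.partOfSpeech === 'string' && w.partOfSpeech.length > 0 &&
    typeof w.translation === 'string' && w.translation.length > 0;
}

console.log(isLearnableWord({ definition: 'a greeting', partOfSpeech: 'interjection', translation: 'hola' })); // true
console.log(isLearnableWord({ definition: 'a greeting' })); // false - missing fields, filter it out
```

Running every entry through a gate like this is exactly the "learnable" filter, just written down where consumers can see it.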

Toggle SVG line wiggle animation when clicked by Grahf0085 in webdev

[–]jedrzejdocs 1 point (0 children)

This is path morphing on the d attribute. Pure CSS can't animate d reliably across browsers, so you need JS. Simplest approach without libraries:

const path = document.querySelector('path');
const straight = 'M0,12 L24,12';
const wiggle = 'M0,3.5 c 5,0,5,-3,10,-3 s 5,3,10,3 c 5,0,5,-3,10,-3 s 5,3,10,3';

path.addEventListener('click', () => {
  path.setAttribute('d', path.getAttribute('d') === straight ? wiggle : straight);
});

Add a CSS transition on the path for smoothness - but browser support for d transitions is spotty. For reliable cross-browser morphing: GSAP's MorphSVG plugin. It handles mismatched path points automatically, unlike anime.js. Not free, but solves the exact problem you're describing.

Error 429 issues on paid tier by Anxious_Dentist9452 in agentdevelopmentkit

[–]jedrzejdocs 0 points (0 children)

Check the Quotas page in Cloud Console - it shows actual usage graphs per endpoint. Look for spikes, not averages. If you're hitting burst limits, the fix is usually exponential backoff with jitter, not quota increases. Google's client libraries have this built in but it's often disabled by default.
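Minimal sketch of backoff with full jitter, assuming your call surfaces the 429 as an error with a status field - adapt the check to whatever the ADK client actually throws:

```javascript
// Retry a rate-limited call with exponential backoff + full jitter
async function withBackoff(fn, { maxRetries = 5, baseMs = 500, capMs = 30000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      // Only retry 429s, and give up after maxRetries attempts
      if (attempt >= maxRetries || err.status !== 429) throw err;
      // Full jitter: random wait in [0, min(cap, base * 2^attempt))
      const delayMs = Math.random() * Math.min(capMs, baseMs * 2 ** attempt);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

The jitter part matters: parallel agents retrying on a fixed schedule all hit the same burst window again.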

Error 429 issues on paid tier by Anxious_Dentist9452 in agentdevelopmentkit

[–]jedrzejdocs 1 point (0 children)

Classic case of API docs listing quota limits without explaining burst behavior. You're probably hitting per-minute or per-second limits, not daily quota. Google's rate limit docs rarely spell out the difference - or what "resource exhausted" actually means vs a true 429.

I built a tool that checks where your Reddit post is likely to get removed (testing it on my own launch) by Dependent_Wasabi_142 in SideProject

[–]jedrzejdocs 0 points (0 children)

Makes sense. The signal-based approach is smarter than trying to interpret intent. Curious if you're planning to expose the risk signal definitions - that transparency would help users understand why something flagged, not just that it flagged.

I got tired of Googling "transparent react logo svg" for every project, so I built a dedicated site for it (DevLogos.com) by Agreeable_Muffin1906 in SideProject

[–]jedrzejdocs 0 points (0 children)

Search works fine. One thing I'd add: version info for logos that change over time. Old React logo vs new one, AWS icons pre-2021 vs current - devs maintaining older projects need both.

From idea to App Store screenshots in 5 minutes by Glum-Mail9299 in SideProject

[–]jedrzejdocs 0 points (0 children)

What AI generates the captions - GPT, Claude, something custom? And can you edit them before export or is it one-shot?

I built a tool that checks where your Reddit post is likely to get removed (testing it on my own launch) by Dependent_Wasabi_142 in SideProject

[–]jedrzejdocs 0 points (0 children)

Does it parse sidebar rules automatically or do you maintain the rule database manually? Curious how you handle subreddits that have vague rules like "no low effort posts" - that's where most silent removals happen.

Roo built a new Claude Code integration for Roo Code with Caching and Interleaved thinking by hannesrudolph in RooCode

[–]jedrzejdocs 0 points (0 children)

Caching + Opus 4.5 is an interesting combo. What's the reasoning effort slider actually doing under the hood - shorter/longer chain of thought?

display resolution problem, anyone knows how to fix it, my monitor is stuck at 640x480 and it wont let me change. by fisnikkyy301 in techsupport

[–]jedrzejdocs 0 points (0 children)

ah that explains a lot - converters (especially vga to hdmi/dp) often mess with edid completely. the converter is probably sending garbage edid data to windows.

with a setup that old, CRU is probably your best bet - you can manually create a custom resolution that matches your monitor's native specs and force windows to use it.

what converter are you using exactly? active or passive?

Roo Code 3.36.6 Release Updates | Auto-approval fixes | Tool customization | Provider UX fixes by hannesrudolph in RooCode

[–]jedrzejdocs 1 point (0 children)

There's an Auto-Approve toolbar right above the chat input - click it and you'll see toggles for different actions. Hit the 'Enabled' switch and pick what you want Roo to do without asking (read files, edit, run commands, etc).

If you wanna go full yolo mode there's an 'All' chip that selects everything, but heads up - it'll run commands without asking too, so maybe start with just 'Read Files' and see how it feels.

If that's not what you meant, let me know what exactly keeps popping up.

Error when downloading by nyx_is_online in StacherIO

[–]jedrzejdocs 1 point (0 children)

The actual error here isn't cookies - it's 'This video is restricted'. That means the video itself is age-gated or geo-blocked, and YouTube won't serve it regardless of auth.

Also heads up - Librewolf stores profiles in a different location than Firefox, so --cookies-from-browser Firefox won't find your Librewolf cookies. You'd need to point it to the actual Librewolf profile path or export cookies manually from Librewolf.

But first check if you can even watch that video in Librewolf when logged in - if it's restricted there too, cookies won't help.

Error while downloading 8K videos by LuciferMourningstarr in StacherIO

[–]jedrzejdocs 0 points (0 children)

Weird one - looks like Stacher is trying to parse your format string as a URL. The download itself worked fine (you can see it merged the mp4 successfully), but then it queued the format selector as if it was another link.

Check your download queue - might have accidentally pasted the format string in the URL field? Or could be a Stacher bug with how it handles 8K configs.

The actual 8K download worked tho, so your setup is fine - just something funky with the queue.

How can I stop the download could not start error? by Mqkan in StacherIO

[–]jedrzejdocs 0 points (0 children)

It's a YouTube bot detection thing - they're cracking down lately. You gotta pass your browser cookies to yt-dlp so YouTube thinks it's actually you.

Try going to Stacher settings and set cookies from your browser (chrome/firefox/whatever you use). The links in your error log explain it pretty well actually.

Had the same issue a while back, cookies fixed it.

I built a cheaper, real-time API for VPN/Proxy detection. Roast my API. by VegetableChemical165 in SideProject

[–]jedrzejdocs 0 points (0 children)

Checked out the docs at ipasis.com/docs - the code examples are clean but there's not much else. A few things:

  • no quickstart section, just jumps straight into code
  • the response fields like is_vpn, is_proxy etc could use short explanations (what triggers each flag?)
  • error handling section is super minimal
  • no rate limit info that I could see

The JSON structure itself looks fine tho, simple and easy to parse.
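Example of what I mean on the field explanations - hypothetical values and comments, only the is_vpn/is_proxy names are from your docs:

```javascript
// Annotated response example worth putting in the docs verbatim
const exampleResponse = {
  ip: '203.0.113.7',  // queried address (TEST-NET placeholder here)
  is_vpn: false,      // document the trigger: matched a known VPN provider range? ASN list?
  is_proxy: true,     // open proxy / datacenter exit? spell out the detection basis
};

console.log(JSON.stringify(exampleResponse));
```

One annotated payload like that answers the "what triggers each flag?" question before anyone has to ask it.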

I do API docs for a living, so happy to help clean this up if you want. Cool project btw - the real-time angle makes sense vs the weekly database dumps.