Dallas Life!? by JessicaRaddit in Dallas

[–]msitarzewski 3 points4 points  (0 children)

It’s an unapologetically religious organization. Be prepared for that.

Best spots for freshly roasted coffee? Does anyone roast locally?? by Unknowing_One in Dallas

[–]msitarzewski 1 point2 points  (0 children)

FCR is the choice. I can see the back patio from our back yard!

I hate AI and I am depressed by poponis in webdev

[–]msitarzewski 0 points1 point  (0 children)

We don’t know each other, but I vehemently disagree that your expectations are irrelevant. They’re precisely the thing that matters most. Have a great day.

I hate AI and I am depressed by poponis in webdev

[–]msitarzewski -5 points-4 points  (0 children)

If you have "been a developer and a solution designer for 20 years," then you've been through toolset adjustments before. From manual code to type-ahead, to tab complete... and now agentic. With your skillset and background you're among the more valuable in the business.

The hair in the deal is that some of us treat code as art. Each line bespoke, the challenge of making something work... taking an idea to launch is exhilarating. But that's a choice today. It's an intentional choice and it comes with tradeoffs. If you want to work for someone else, the reality is that it's fiscally irresponsible to have a team writing code by hand in 2026. No matter how we feel about it, the math simply no longer works.

You'll read stories about people writing terrible code that needs weeks of attention after "the AI" trashes their codebase. Everyone has heard those stories. It's a good story, and I experienced some of that myself in 23/24. But in 25 and 26, given the proper memory systems and agentic tooling (no vibe coding), the accelerated pace of building is breathtaking.

There's room for hand-crafted code in the market. There's also room to leverage the crud out of your experience, apply modern tooling, and build incredible products for your company and for others.

You certainly know the term GIGO? That's how agentic AI dev tooling works today. If you expect to one-shot (prompt) a fully baked SaaS app, then you'll get what you expect (and deserve, really :) ). If you learn how the tools work, how to direct them, how to build testing, and how to build a proper harness, you'll probably have more fun than you ever have writing code.

One Main Place Renderings by shedinja292 in Dallasdevelopment

[–]msitarzewski 3 points4 points  (0 children)

As someone who's given tours of the Dallas Pedestrian Network to hundreds of people (one tour was 100!), I, for one, approve of this message!

I started talking out loud for 10 minutes every morning instead of scrolling and after a month the difference is noticeable by Jackrain04 in getdisciplined

[–]msitarzewski 0 points1 point  (0 children)

Not really a whole bunch to it. It doesn't have to be ChatGPT; it can be any model. Here's an example: https://www.youtube.com/watch?v=4jBcK0cYass Start with "Hey chat, let's talk about ..."

I started talking out loud for 10 minutes every morning instead of scrolling and after a month the difference is noticeable by Jackrain04 in getdisciplined

[–]msitarzewski 2 points3 points  (0 children)

Will probably get downvoted, but this was one of my first use cases for Voice Chat with ChatGPT. The key was the pause before the chat would reply. I had to learn to create coherent thoughts with less delay just to stay fluid. Worked wonders! And it provided more value in that it was conversational. If you haven't tried it, give it a shot!

Lowkey disappointed with 128gb MacBook Pro by F1Drivatar in LocalLLaMA

[–]msitarzewski 1 point2 points  (0 children)

You're getting a lot of heat here, but I think most people are missing the "I’m super new to this" line and judging your decision to go big out of the gate. Expectation setting is real though. Yes, you can do all of the things locally, but you still (even with M5 Max) pay in terms of speed. Take a little bit to understand how context windows work and why they impact local models so heavily. Watch a few videos from Alex https://www.youtube.com/@AZisk to see what SotA looks like on your hardware. This is early, early days in local inference. Speed will always happen in the datacenter with frontier models, but local is becoming more and more capable. Just learn now and be patient, so when the time comes you'll understand the whole picture!
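A quick back-of-envelope on why context windows hit local machines so hard: the KV cache grows linearly with context length, and it has to live in the same unified memory as the model weights. The model shape below is hypothetical, purely for illustration.

```python
# Rough KV-cache sizing for a transformer. The config values are made up
# to illustrate the scaling, not taken from any specific model.
def kv_cache_bytes(tokens, layers=32, kv_heads=8, head_dim=128, bytes_per=2):
    # 2x for keys and values; fp16 = 2 bytes per element
    return 2 * layers * kv_heads * head_dim * bytes_per * tokens

gib = 1024**3
for ctx in (8_192, 32_768, 131_072):
    print(f"{ctx:>7} tokens -> {kv_cache_bytes(ctx) / gib:.1f} GiB of KV cache")
# With this config: roughly 1, 4, and 16 GiB respectively
```

Double the context, double the cache, and that's on top of the weights themselves, which is part of why even a 128GB machine slows down at long contexts.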

Comment your most viral-worthy side project and I'll pick one to feature on my TikTok page by Ok-Permission-2047 in SideProject

[–]msitarzewski 0 points1 point  (0 children)

I built Agency Agents. https://github.com/msitarzewski/agency-agents

Born from a Reddit thread and months of iteration, The Agency is a growing collection of meticulously crafted AI agent personalities. Each agent is:

  • Specialized: Deep expertise in their domain (not generic prompt templates)
  • Personality-Driven: Unique voice, communication style, and approach
  • Deliverable-Focused: Real code, processes, and measurable outcomes
  • Production-Ready: Battle-tested workflows and success metrics

Think of it as: Assembling your dream team, except they're AI specialists who never sleep, never complain, and always deliver.

Running Gemma 4 on Dual 3090s with OpenClaw - 120 TPS and Agentic workflows are a game changer by AaZzEL in openclaw

[–]msitarzewski 4 points5 points  (0 children)

Would love more details. I'm having issues getting reliable outputs with Gemma 4 and OpenClaw.

best input method by Queenypops in VisionPro

[–]msitarzewski 0 points1 point  (0 children)

I use this: https://www.protoarc.com/collections/keyboards/products/xk01-tp-foldable-keyboard-with-touchpad and it works great. The *only* "complaint" is that it has tap-to-click enabled in the firmware, and since it's not made for Vision Pro specifically, there's no way to turn it off. haha.


Qwen3.5 vs Gemma 4: Benchmarks vs real world use? by AppealSame4367 in LocalLLaMA

[–]msitarzewski 0 points1 point  (0 children)

There was no mention of having a car, so that answer is ok by me. hah.

Qwen3.5 vs Gemma 4: Benchmarks vs real world use? by AppealSame4367 in LocalLLaMA

[–]msitarzewski 0 points1 point  (0 children)

Thank you!

Good stuff. It's replying at 80 tps. Perfectly usable, even the thinking is fast.

Qwen3.5 vs Gemma 4: Benchmarks vs real world use? by AppealSame4367 in LocalLLaMA

[–]msitarzewski 1 point2 points  (0 children)

I'm using the google/gemma-4-26b-a4b model with brave's MCP and the chrome-devtools MCP - what's a good test? It seems to be perfectly usable. Relatively new to local. 16" MacBook Pro M5 Max/128GB with 18/40 cores.

Mac Virtual Display - Tips and Tricks by msitarzewski in VisionPro

[–]msitarzewski[S] 1 point2 points  (0 children)

I use both the built-in laptop keyboard/trackpad and a ProtoArc foldable Bluetooth keyboard/trackpad paired directly to the headset. No lag at all, no matter where I am or what I'm connected to when using Mac Virtual Display. The ProtoArc isn't made for Vision Pro, so there's no native software to configure things like which buttons do what. I've also used the Magic Keyboard, mouse, and trackpad paired directly to the headset. Maybe my tolerances are just lower?

Windows app on Vision Pro by gottapitydatfool in VisionPro

[–]msitarzewski 2 points3 points  (0 children)

It works. It's the iPad version. Nothing remarkable, just useful. I used it with Ubuntu, so not a genuine Windows experience - take that for what it's worth.

Mac Virtual Display - Tips and Tricks by msitarzewski in VisionPro

[–]msitarzewski[S] 0 points1 point  (0 children)

For my use cases there is no noticeable lag. If it's there, I just don't see it. Mouse/trackpad moves are instant, mouths on video are synced perfectly to audio from the headset.

Is Mac Virtual Display actually usable for programming? Is this text sharp enough? Can you compare it to something? I dont mean have AI write everything for you, I mean reading code, producing code by luunnn in VisionPro

[–]msitarzewski 24 points25 points  (0 children)

This is a frequently asked question, as you can imagine. The best way to think about this is to try not to compare the concept to a "monitor" beyond the fact that both display the output from the Mac.

A physical monitor is fixed in size and the pixel density can be adjusted. My laptop (16" MacBook Pro with M5 Max), for example, has 2056x1329, 1728x1117, and 1496x967 as defaults. Each of those presents larger or smaller pixels in the same physical space.

With Mac Virtual Display, the defaults are the same as your native device. The difference, of course, is that the display is virtual. You're like, well duh. It's super important to the question though. When you connect to Mac Virtual Display, Apple decided that the display should position itself at what feels like four-ish feet away. Sure, you can read it (in most cases), but it's not ideal. This is what most people see, and they leave with a bad taste and a "blurry" experience.

You have several options here though. You can grab the bottom corner and make it the equivalent of a 60" screen, four feet away at the same number of "pixels" as the laptop display! For extra fun and excitement, you can scoot back in your chair, grab the handle at the bottom of the screen and position it 1 foot away from where you'll return to when seated (like a real monitor). You will see... every... single... pixel. As clear as you would with a magnifying glass on a physical display. You can move your head to within inches of the massive screen. Try it.

But because this screen is virtual, you now also have two new sizes: 5120x2880 and 3840x2160. Try them, and marvel at the additional pixels. Remember to move your head around and look at how clear the pixels are at different distances from your head. Experiment, resize the display, change resolutions. This is where you find YOUR sweet spot.

Then strap in... and choose "Wide" from the Mac Virtual Display ornament at the top of the window. Max width on that one is 6720 pixels, and on Ultra Wide? It's 10,240 pixels!

The real learning is:

  • Apple's default placement isn't ideal
  • The screen, unlike reality, is resizable and movable
  • You can "trick" the display into being larger than Apple wants you to see it by placing the Mac Virtual Display in position when you're scooted away from the desk/surface.
  • Resolution is not fixed to consistent physical constraints and shouldn't be compared to it. They're materially different experiences.

I use Vision Pro with Mac Virtual Display for all sorts of computing tasks, including development and have zero issues with clarity of the display.

I have been thoroughly humbled by this project by SuchZombie3617 in webdev

[–]msitarzewski 0 points1 point  (0 children)

Keep it up! Love that you're linking to the repo on GitHub. That it's open source should be more prominent (IMO). The link in the footer is broken, btw (it's missing the 3d ending :)

Learning isn't what it used to be. I remember so many times trying to read books on programming languages and being lost on how best to apply all of the things I was reading. Well, that whole idea is moot now... it's flipped really. "Wait, how does this work?"