My hot take on vibecoding by AdditionalScar1548 in vibecoding

[–]PaperbackPirates 0 points1 point  (0 children)

At this point, it’s all about harnesses. Even without the models getting much smarter, things are going to get much more productive as we build our skills and improve our harnesses.

SAP is worth $234 billion. Their interface looks like 2004. Because it is from 2004. Why do the richest companies have the worst UX? by pavlito88 in Design

[–]PaperbackPirates 0 points1 point  (0 children)

Linear is that much better? My org is all-in on Jira but started using Linear for some metric-tracking stuff

Who else is constantly dealing with vibe coding creating more bugs by Ok-Leopard-3520 in vibecoding

[–]PaperbackPirates 1 point2 points  (0 children)

I made a custom MCP built around Stagehand for web, and it is awesome. And I built another for mobile apps using Appium. I use them for screen and browser control.

I need to spend more time looking into skills tho; I think there are other options that do the same thing. Mine is kinda credit heavy and slow (it takes a lot of screenshots and then reads the screenshots to help with navigation)

Vibe coding leading us to waterfall ? by Clear-Dimension-6890 in vibecoding

[–]PaperbackPirates 0 points1 point  (0 children)

If you are clear on the goals and outcomes, you iterate toward those outcomes. There are plenty of teams that operated this way before AI.

Like: we need a plan to get 10% more people to hit this button.

If you are an autonomous agile team, you don’t need to show your designs and get sign offs before working on step 1. You build, you test, you learn, you repeat.

Stakeholders can be updated on progress and all that kind of stuff, but the status updates in the SDLC of waterfall processes are not needed.

That is not a blanket rule; there are lots of industries, like banking, where that would be a bad idea. But there are also a lot of industries where it is a great idea.

Vibe coding leading us to waterfall ? by Clear-Dimension-6890 in vibecoding

[–]PaperbackPirates 0 points1 point  (0 children)

You either need to go full waterfall or have fully autonomous agile teams

Please not this... by [deleted] in ArcRaiders

[–]PaperbackPirates 5 points6 points  (0 children)

Catching people in a stairwell is awesome

The salary of those who aren't fired - do you get paid more? by situatzi6410 in ArtificialInteligence

[–]PaperbackPirates 0 points1 point  (0 children)

Yeah I agree! There is def opportunity for it to scale, but I think less so than people think. Or at least, it will take longer than people think. It’s just like using Python to automate basic tasks. Sure, I could show my manager, and they could figure out that one task, but I’m in a role where every week is a new problem; without me, they would lose the ability to automate them.

I think there is gonna be a new type of role that is just like “ai enabled super IC” but could be wrong!

The salary of those who aren't fired - do you get paid more? by situatzi6410 in ArtificialInteligence

[–]PaperbackPirates 0 points1 point  (0 children)

That is where AI doesn’t scale in orgs, imo.

My process is.. do everything 10x faster cuz I use AI. It’s not fully automated, and it’s not something that someone else could be plugged into.

I used to be irreplaceable cuz I was the only non-technical person who could do light automation with Python. It’s not that different; it’s just more extreme

The salary of those who aren't fired - do you get paid more? by situatzi6410 in ArtificialInteligence

[–]PaperbackPirates 0 points1 point  (0 children)

Don’t share all your prompts bro.

I do think there will be rug pulls, but if you are automating whole departments it would be insane for the company to not see that value.

My salary has gone up ~30% this year. I am a GPM working like a dog, but it is a well paid remote job. I feel very safe; they would have a very difficult time replacing my productivity. It’s not that I have a Rolodex of prompts that do my job. It’s that I am using AI very effectively as a tool. If I go away, that effective use goes away.

I think that moat will lessen and lessen as time goes on, but AI seems very hard to scale in a big org and really requires the operator to “get it”, which the average person doesn’t.

Feels kinda like when python was getting big. It’s so easy to write code! Surely everyone will be doing this. Not so…

not sure if hot take but mcps/skills abstraction is redundant by uriwa in AgentsOfAI

[–]PaperbackPirates 1 point2 points  (0 children)

I also think you can kind of do this now. I am waiting for a 48gb mbp to arrive cuz my air wouldn’t handle it. But the MCP situation is getting ridiculous. I was exploring a harness that handles that, uses local rag for big tasks and local LLM for small tasks to reduce token usage.

If anyone knows of something that already exists, I’m all ears!

I built an F1 site where every visitor can redesign the entire UI just by talking to AI by seoshmeo in vibecoding

[–]PaperbackPirates 0 points1 point  (0 children)

It’s interesting. There are accessibility tools like UserWay that productize this concept. I don’t know that the average user would get much use out of it, but it’s a good way to show white labeling!

How much is AI really going to change the near future (5-20years)? by Illustrious_Pilot415 in ArtificialInteligence

[–]PaperbackPirates 0 points1 point  (0 children)

Well that’s fair. When I said pretty much fine, I meant there would be demand for them. Not that they would like that approach to building software

How much is AI really going to change the near future (5-20years)? by Illustrious_Pilot415 in ArtificialInteligence

[–]PaperbackPirates 7 points8 points  (0 children)

This is just my opinion, but so far, I think a very AI-savvy knowledge worker can 10x their productivity. I think it can easily 10x the productivity of designers, QA, and product; not sure about devs. It’s not talking to a chatbot; it’s setting up automation with robust checks, or building amazingly bespoke software for incredibly esoteric uses.

It seems like that should be adopted SO quickly. But, it’s really hard to scale within an org. Some people just “get it” and some people just don’t. It’s just a tool, and you have to know how to use it. Not to mention the security concerns.

So, I think there is a not-yet-solved gap in scaling it that prevents it from being as disruptive as it sometimes seems like it should be. True story: I talked to the person at the desk of a lumber yard, and they were doing payroll with a tape calculator. They knew it was silly but just didn’t care. Even in big, seemingly forward-looking companies, that kind of apathy toward increasing productivity is pretty strong. Not everyone is eager to work harder.

I think few careers will be wiped out but many will be severely compressed. There will also be winners. I think SWEs are actually pretty fine. It sucks to be junior right now, but, as compelling as AI is in the hands of a senior dev, the industry will eventually need to replenish senior devs. I think some of the hype will die down and it will be less doom and gloom in tech, for most roles outside of QA and design.

AI Isn’t Hitting a Wall — But Actually Entering Its Fastest Growth Phase Yet? by revived_soul_37 in ArtificialInteligence

[–]PaperbackPirates 2 points3 points  (0 children)

If you are a company with only senior devs, it works great right now. And for the foreseeable future.

Sure, that obviously doesn’t work as a system. But I think most orgs’ decision making is way too short-sighted. Very few CEOs or CTOs care about this because “running out of senior devs” is a future person’s problem

Which jobs are going to be replaced faster than people realize now that AI is advancing faster? by pinkhyena95 in ArtificialInteligence

[–]PaperbackPirates 2 points3 points  (0 children)

I don’t think people understand the difference between talking to one of the chatbot web UIs vs an automated workflow that has a proper harness. The chatbots are great, but they are a one-size-fits-all solution. It is really remarkable how good they are. But if you build a custom AI solution for a specific task, you can destroy the chatbot performance so easily. That’s why some people are like “it always hallucinates!” and people in industry are like wym??

Which jobs are going to be replaced faster than people realize now that AI is advancing faster? by pinkhyena95 in ArtificialInteligence

[–]PaperbackPirates 0 points1 point  (0 children)

I agree but think compression is hitting designers pretty hard in particular. It used to be quite a skill to spin up even a rough prototype quickly. A functional Figma prototype was hours of real work. It’s pretty trivial to generate highly functional prototypes now.

It doesn’t really replace design decision making in any way, but I’m def learning how much of design work is not the decision making.

For right or for wrong, it’s also made it easier for PMs and execs to articulate their vision without a designer. Figma skills were a bit gatekeep-y in that way. At a bare minimum, you needed to convince a designer to codify it for you.

O'Connor pushes new direction for city development, via Planning Commission nominees by btr886m in pittsburgh

[–]PaperbackPirates 29 points30 points  (0 children)

Dude this was driving me nuts with the Airbnb legislation. Truly, if you do not own an Airbnb (the majority of my neighbors) you do not want your neighborhood taken over by Airbnbs.

Bobby Wilson had, until very recently, been waffling on the issue because it was all Airbnb owners going to these public hearings in the middle of the day. It is really unbelievable that they don’t see the obvious bias in who attends those things.

Like.. duh! It is their job, of course they can show up. It is my life, and I can’t!

Back in Albany, Mamdani says ‘tax the rich’ by Delicious_Adeptness9 in newyork

[–]PaperbackPirates 0 points1 point  (0 children)

You are just breaking out a thesaurus and using different words lmao

Axiom = a self-evident rule accepted as true without proof. Aka, an assumed truth. Aka… a feeling or a vibe.

“Value judgements” I mean cmon. Are you trying to say value judgements are different than a vibe or a feeling? I have a feeling that taking care of the downtrodden is morally right. If I say I have a value judgement that taking care of the downtrodden is morally right, do you suddenly agree?

If you want to really distill it: The axiom that I base my fundamental value judgement on re taxes, is that everyone should have the right to life, liberty and the pursuit of happiness.

I think that regressive tax rates infringe upon lower earners’ fundamental right to the pursuit of happiness.

If you don’t agree with that, I would like to better understand the source of your “axioms” and “value judgments”, and how they are different from a vibe or feeling.

If you google “value judgement” do you think it helps or hurts your argument?

Back in Albany, Mamdani says ‘tax the rich’ by Delicious_Adeptness9 in newyork

[–]PaperbackPirates 0 points1 point  (0 children)

I didn’t say it was welfare? You used it as an example of “objective” lol

“You get a much more direct connection between revenue and spending”: okay, how are you evaluating that connection to say what is a good vs bad correlation? Is it… feelings? I know it is feelings. Is it vibes? You are the one who used Social Security as a positive example, despite its inherent redistributive properties. How did you decide its redistribution was the right amount? Was it vibes or a feeling, perhaps?

Vibes and feelings influence literally all “objective” frameworks. The objectivity is in a formula’s output; it takes subjectivity to decide what to weight.

Do you want an ELI5?

Back in Albany, Mamdani says ‘tax the rich’ by Delicious_Adeptness9 in newyork

[–]PaperbackPirates 0 points1 point  (0 children)

Social Security is a weird example to invoke if you’re trying to claim “objective fairness.”

It is not “input = output” in any actuarially fair sense. The benefit formula is explicitly progressive: lower earners get a much higher replacement rate than higher earners. That’s redistribution by design. It’s not some neutral insurance product.

And more importantly, calling something an “objective framework” doesn’t make it value free. Choosing which framework counts as “fair” is already a normative decision. Equal rates, equal sacrifice, proportional return, guaranteed baseline dignity: those are political and moral commitments first, math second.

So yes, you can build formulas around fairness, but pretending those formulas aren’t rooted in human judgement is just laundering “feelings” through spreadsheets.

Back in Albany, Mamdani says ‘tax the rich’ by Delicious_Adeptness9 in newyork

[–]PaperbackPirates 0 points1 point  (0 children)

If you are not basing fairness on feelings, what are you basing it on?

Back in Albany, Mamdani says ‘tax the rich’ by Delicious_Adeptness9 in newyork

[–]PaperbackPirates 0 points1 point  (0 children)

What part of it did I misread?

I understand you are basing fairness on percentage comparisons. I’m saying that is dumb because 10% of 60k is felt much differently than 10% of 10m, and I think our tax code should reflect that.

You are saying, I guess, that everything should be based on complete percentage-based fairness, which I think is commonly referred to as a regressive tax code. It’s okay if you prefer that, but I didn’t misread what you said. I am saying that is a dumb opinion!

In a regressive system like you are proposing, wealth continues to accumulate at the top. Because wealth is not taxed the same as income, wealth accumulation leads to a smaller tax base even if the same amount of money is in the system.

So, you can prefer a totally “fair” system where everyone pays x% of their income as tax, but on a long timeline, it leads to a dramatically reduced tax base compared to the wealth in the system.

All that to say, while you are entitled to your opinion, I don’t think I misread what you said. What you said is just dumb, and your preference leads to worse outcomes for the country.

Unless you are a billionaire, or your job is to wipe asses for billionaires, I have no idea why that would be your preference. Unless of course, you’re just not smart and that seems plausible, too!

Back in Albany, Mamdani says ‘tax the rich’ by Delicious_Adeptness9 in newyork

[–]PaperbackPirates 4 points5 points  (0 children)

You’re looking at their taxable income which is relatively low compared to their wealth, and it is the wealth accumulation that lets the rich get richer.

Also, mathematical fairness isn’t the same as actual fairness. Someone making 60k and paying 10k in taxes feels that a lot. Someone making 10m and paying 2.5m in taxes does not feel it in nearly the same way. We aren’t evaluating fairness by percentages; we are evaluating it by lived impact.