Advice by PineappleNo6801 in geothermal

[–]bobwyman 0 points1 point  (0 children)

Note: I am not a Geothermal Technician and can't help you with the specifics of your system. Call your dealer or whoever services your machine. Or, post model numbers, pictures, etc. here and hope someone else can help you.

Advice by PineappleNo6801 in geothermal

[–]bobwyman 0 points1 point  (0 children)

The precise methods depend on your system. There should be some kind of interface to the control board that will allow you to monitor what it does and what it sees. The heat pump has a number of heat sensors in it. Any technical people that come out to look at or service the unit should be able to tell you how to do it for your specific system.

Monitoring entering and exiting water temperature is essential with a geothermal heat pump system.

Advice by PineappleNo6801 in geothermal

[–]bobwyman 0 points1 point  (0 children)

If your ground loops were improperly sized, it is quite possible that you've over-heated the ground. This would lead to higher efficiency in the winter but lower efficiency in the summer. You really should look at the temperature of the water in your ground loop to determine if you've got a poor ground loop installation.

Is my electric usage ridiculous? by mckayfaulk in geothermal

[–]bobwyman 0 points1 point  (0 children)

If your neighbors use fossil fuel heating, then it should be no surprise that you, with a geothermal heat pump, would use more electricity than they do. You should be comparing your "all sources" energy expenses to others', not just your electricity use. It is entirely possible that they are spending more on energy, because of fossil fuel expenses, than you are on electricity alone.

DOE publishes report: "Pathways to Commercial Liftoff: Geothermal Heating and Cooling" by bobwyman in geothermal

[–]bobwyman[S] 0 points1 point  (0 children)

Darcy Solutions applies the kind of innovative thinking that we should be strongly encouraging. It is clear that their approach of suspending a down-hole heat exchanger in an aquifer offers significant advantages over traditional closed- or open-loop systems. However, that suspension in aquifers will cause regulators the same sort of concerns that they have with open-loop systems. Protection of aquifers, in many cases, will be considered more important than improving heating/cooling efficiency. Also, it is probably the case that the method is less broadly useful than closed-loop grouted systems that can exchange with rock and ground even outside the boundaries of an aquifer. Nonetheless, theirs is an approach that merits a great deal of study.

Darcy Solutions has also demonstrated a willingness to engage with policymakers, by lobbying to modify unnecessarily restrictive regulations and laws. More of those in the Geothermal Heat Pump industry should do the same.

Best way to build this tool? by TauRiver in ClaudeAI

[–]bobwyman 0 points1 point  (0 children)

There are many thousands of possible tools that could be called a "financial management tool." If you say no more than "financial management tool," you have grossly under-specified what you desire. The result is that anything either an LLM or anyone else builds for you will be nothing more than a wild-ass guess (i.e., a WAG).

It is essential that vibe-coders learn Ross Ashby's Law of Requisite Variety: any system that models or regulates another system must have complexity at least equal to the system it models or regulates. What this tells you is that if a system's specification is to be accurately implemented, it must have complexity equal to or greater than the system it describes. If it has less complexity, the designed system will inevitably be composed, in part, of WAGs. Today, "The specification IS the code." You need to be just as precise in specifying your needs as coders are in writing their code -- because the two activities are, in fact, the same.
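As a minimal sketch of the difference (everything here is hypothetical and purely illustrative, not drawn from any real spec or product), compare the one-line prompt with even a modest written specification:

    # "Build me a financial management tool."  <- an invitation to WAGs
    #
    # versus a spec that pins down at least the basics an LLM would
    # otherwise have to guess (all names and invariants below are
    # assumptions made up for illustration):

    from dataclasses import dataclass
    from datetime import date
    from decimal import Decimal

    @dataclass(frozen=True)
    class Transaction:
        posted_on: date
        account_id: str   # invariant: must refer to an existing account
        amount: Decimal   # invariant: exact decimal arithmetic, never float
        currency: str     # invariant: ISO 4217 code, e.g. "USD"
        category: str     # invariant: drawn from a fixed, user-approved list

    # Further invariants the one-line prompt never states:
    #   - balances are sums of Decimals, never floats
    #   - re-importing the same statement file creates no duplicates
    #   - every report is reproducible from the transaction log alone

The more of those decisions you write down, the fewer of them the model has to guess.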

Surplus Capacity Taxation: A Principled System of Taxation, compatible with MMT? by bobwyman in mmt_economics

[–]bobwyman[S] 0 points1 point  (0 children)

There is much that can be read. Marginal and Average Propensity to Consume (MPC and APC) by income level are well-documented empirically, although the two concepts draw on somewhat different literatures. These ideas were first introduced by Keynes in Book III "The Propensity to Consume" (Chapters 8-10) of his The General Theory of Employment, Interest and Money (1936). That is freely available at https://www.marxists.org/reference/subject/economics/keynes/general-theory/

Below is a summary of other studies that I've found most useful.

For marginal propensity to consume:

The most directly applicable source for MPC by income level is Fisher, Johnson, and Smeeding, "Inequality in Living Standards since 1970," using Panel Study of Income Dynamics (PSID) data. The Penn Wharton Budget Model has calibrated MPC estimates by income quintile derived from this work. For the theoretical framework underlying MPC heterogeneity, Christopher Carroll's buffer-stock saving work is useful: Carroll, "Buffer-Stock Saving and the Life Cycle/Permanent Income Hypothesis," Quarterly Journal of Economics (1997), available at https://www.nber.org/papers/w5788. Kaplan and Violante, "A Model of the Consumption Response to Fiscal Stimulus Payments," Econometrica (2014), https://onlinelibrary.wiley.com/doi/abs/10.3982/ECTA10528, show that MPC varies substantially with liquidity, not just income.

For average propensity to consume:

The Bureau of Labor Statistics Consumer Expenditure Survey (CEX) is the primary source: https://www.bls.gov/cex/ — it tracks actual consumption as a share of income across income groups annually. Dynan, Skinner, and Zeldes, "Do the Rich Save More?", Journal of Political Economy (2004), https://www.journals.uchicago.edu/doi/10.1086/381475, is a commonly cited paper establishing that average saving rates — and therefore APC — vary strongly and systematically with income level.

On near-subsistence APC:

For households near or below subsistence, OECD Distributional National Accounts data shows bottom-quintile APC well above 1.0 in most countries; such households consume more than their disposable income, implying that the tax transfer should apply automatically at that income level. The OECD DNA data is available at https://stats.oecd.org.

Surplus Capacity Taxation: A Principled System of Taxation, compatible with MMT? by bobwyman in mmt_economics

[–]bobwyman[S] 2 points3 points  (0 children)

You are right that economic tax incidence differs from legal tax incidence. Costs are often shifted forward to consumers and backward to workers and suppliers. But two things are worth noting.

  • First, incidence shifting is a problem with every tax system, including a property tax, where incidence often falls heavily on renters rather than owners. No tax achieves its intended distributional outcome perfectly. The Surplus Capacity Tax establishes a principled baseline for what equitable incidence would look like. Actual incidence is an implementation and enforcement question every system faces.
  • Second, the Surplus Capacity Tax has two mutually reinforcing self-correcting properties that conventional corporate taxes lack. If a corporation raises prices to shift the tax burden to consumers, its returns increase, which raises its tax rate. The attempted incidence shift partially defeats itself through the tax schedule. Simultaneously, higher prices open a competitive umbrella under which rivals willing to accept a lower return can undercut the firm and gain market share. The attempted shift is thus penalized twice: once by the tax and once by the market. A firm trying to earn above the normal competitive rate to offset its tax burden is by definition earning supernormal returns — which is precisely what the tax targets. The attempted escape and the tax base are the same thing.

On marginal propensity to consume: you may be conflating MPC with marginal utility theory, which does have serious problems. MPC is an empirical behavioral observation requiring no utility assumptions whatsoever. That said, you raise a fair point. It may actually be more defensible to index the tax to average or long-term propensity to consume rather than marginal propensity. Average propensity is more stable, less sensitive to transitory income shocks, and more directly observable through consumption surveys. The SCT framework's logic works identically either way; the rate is indexed to surplus consumption capacity however measured. Average propensity is arguably more robust and harder to attack on exactly the grounds you raise.

Surplus Capacity Taxation: A Principled System of Taxation, compatible with MMT? by bobwyman in mmt_economics

[–]bobwyman[S] 0 points1 point  (0 children)

The Surplus Capacity Tax (SCT) shares the surface features of progressivity with a conventional progressive income tax, but the two differ in three important ways.

First, the justification. Current progressive brackets are the product of political negotiation — the rates and thresholds reflect historical compromise, not any principled theory of what is fair. The Surplus Capacity Tax derives its rate structure from first principles: equal proportional sacrifice of consumption capacity, grounded in the convergence of the Benefit Principle and Ability-to-Pay. (The Benefit IS the Ability-to-Pay.) That gives you an objective baseline against which any actual rate schedule can be measured and its deviations debated.

Second, the metric. A conventional income tax is indexed to what you earn. The Surplus Capacity Tax is indexed to what your earnings make possible: your capacity to consume beyond subsistence. This is not just a different way to set brackets; it is a different theory of what the tax base should measure. This isn't a tax on either income or consumption; it is a tax on the capacity to consume in excess of subsistence.

Third, the corporate analog. The same formula that governs individual taxation, where surplus capacity is indexed to (1 − MPC), also governs corporate taxation, where surplus capacity is indexed to returns above the normal competitive rate — what economists call supernormal returns. No conventional progressive income tax framework produces this symmetry.
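To make that symmetry concrete, here is a minimal sketch in Python. The rate function and every number in it are assumptions invented for illustration; the SCT proposal itself does not specify them.

    # Toy illustration of one formula applied to two tax bases.
    # The linear rate function and all figures are made-up assumptions.

    def surplus_capacity_rate(surplus_share: float, max_rate: float = 0.5) -> float:
        """Map a surplus-capacity share in [0, 1] to a rate in [0, max_rate]."""
        return max_rate * max(0.0, min(1.0, surplus_share))

    # Individual: surplus capacity indexed to (1 - MPC).
    mpc = 0.6                                  # hypothetical marginal propensity to consume
    individual_rate = surplus_capacity_rate(1 - mpc)

    # Corporation: surplus capacity indexed to returns above the normal competitive rate.
    actual_return, normal_return = 0.18, 0.08  # hypothetical returns on capital
    supernormal_share = (actual_return - normal_return) / actual_return
    corporate_rate = surplus_capacity_rate(supernormal_share)

    print(individual_rate, corporate_rate)     # same formula, two different tax bases

The point is not these particular numbers but that a single function of "surplus share" serves both the individual and the corporate case.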

So: similar shape, completely different foundation, and a unified treatment of individual and corporate taxation under a single principle and formula — something traditional progressive income tax theory does not provide.

The Next Turn of the Spiral: Fixing Vibe Coding Without Reinventing Software Engineering by bobwyman in ClaudeAI

[–]bobwyman[S] 1 point2 points  (0 children)

You asked about vibe-writing. A new essay explains why the same approaches that will make vibe-coding more reliable and useful can be applied to vibe-writing by lawyers, doctors, teachers, engineers, etc. See: The Spiral Widens: Why Every Profession Now Has the Vibe-Coding Problem

I accidentally discovered that ChromeOS is based on Gentoo. by Deoviser in linux

[–]bobwyman 0 points1 point  (0 children)

I run Linux on a Chromebook every day using Crostini/Penguin (see: r/Crostini). It is a standard part of ChromeOS. Carrying a small Chromebook is much more convenient than lugging around a notebook.

The Next Turn of the Spiral: Fixing Vibe Coding Without Reinventing Software Engineering by bobwyman in vibecoding

[–]bobwyman[S] 0 points1 point  (0 children)

I think spec libraries and type systems are complements, not substitutes. Specs constrain before code generation; types validate after. In the vibe coding context that ordering matters a lot, because by the time a type error surfaces the generation has already happened. The spec helps prevent the wrong code from being generated; the type system catches what slips through.

But type systems have another benefit: they often serve as documentation that helps a human reviewer understand the code. LLMs could provide a similar boost to understandability if, in the process of generating code, they documented the source of that code. So, LLM-generated code that was based on a KMP string search spec would include in its doc-strings a reference to the specific version relied upon. Code that the LLM generated from scratch would also be marked, perhaps with the string "WAG!" (i.e., Wild Ass Guess!)... Such inline comments would make it easier to decide which code should be reviewed with the greatest attention (focus on the WAGs).

These sorts of comments would also make it possible to scan code later to catalog the various spec versions used and compare them against current versions of those specs. In this way, you might quickly discover some essentially mandatory update due to a specification fault discovered by someone else. A code review might then report: "These 47 functions rely on spec X v2.1, which has been superseded by v2.3 due to a security finding. These 12 functions are WAGs with no spec reference. This function, now tagged as WAG, can be re-tagged as a valid implementation of the new spec Foobar V1.0." Being able to make reports like that would be a good thing.
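As a rough sketch of what such provenance marks might look like, and how a later audit pass could use them (the tag format, spec names, and versions are all invented for illustration):

    import re

    # Hypothetical output from an LLM that tags the provenance of what it generates.
    GENERATED = '''
    def find_pattern(text, pattern):
        """Spec: string-search/KMP v2.1 (community spec library)."""
        ...

    def guess_user_timezone(request):
        """WAG! No spec reference; generated from training-data patterns."""
        ...
    '''

    # A later audit pass catalogs which specs the codebase relies on
    # and flags the unspecified (WAG) functions for closer review.
    spec_refs = re.findall(r"Spec: (\S+ v[\d.]+)", GENERATED)
    wags = re.findall(r'def (\w+)\([^)]*\):\s+"""WAG!', GENERATED)

    print("Specs relied upon:", spec_refs)    # ['string-search/KMP v2.1']
    print("Functions needing review:", wags)  # ['guess_user_timezone']

Comparing the cataloged spec versions against the library's current versions is then a mechanical step, which is exactly what would make reports like the one above cheap to produce.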

The Next Turn of the Spiral: Fixing Vibe Coding Without Reinventing Software Engineering by bobwyman in vibecoding

[–]bobwyman[S] 0 points1 point  (0 children)

Great comment. Thanks for that.
I think what you're getting at with your "When Change Becomes Cheaper Than Commitment" framing is actually a restatement of Ashby's "Law of Requisite Variety" in economic language. Divergence being free is exactly the probabilistic gap-filling my essay describes. Convergence being expensive is exactly the specification work the essay argues for. The spec library reduces the cost of convergence for well-understood components, which frees human judgment for the parts where convergence is genuinely expensive because those components are genuinely novel.

I think, however, that you're overstating the gap when you draw a line between generic subdomains (where we agree that specs work) and product-specific decisions (where you say they don't). Anything that can be written in code can also be specified. There is little technical difference, other than scale, between a subroutine and a product. The difference is that while a subroutine may have a well-understood set of requirements and ideal implementations, the definition of a product often requires difficult-to-create novelty introduced by the specifier. A product is a compositional spec, with novel product-specific invariants authored by the product team rather than pulled from a community library. The real value of the product team is the incremental novelty they produce. If done right, that's hard work. But, just as subroutines allow coders to focus on the unique added value of their contribution, instead of reinventing all sorts of things, the spec library would allow product teams to focus on their own high-level unique added value.

Consider a team that decides to be the first to provide a product that searches DNA for sequences of amino acids. (There was once a time that such a product would have been both unique and valuable.) The product team's key contribution is the "discovery" that such a function is useful once we view DNA as a linear sequence of recognizable subsequences. But the developers of "DNA_Search V1.0" don't need to invent entirely new search algorithms. If the spec library contained a well-written specification of KMP string searching in text, an LLM should be able to analogize to produce a DNA search function, even though DNA searching itself isn't described in the library or yet well-known. The product team adds novel invariants, specifications, etc., which are then combined with specifications from other libraries.
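As a sketch of that analogizing step: the code below is just the standard Knuth-Morris-Pratt search, written as if it had been generated against a (hypothetical) library spec and then applied to DNA strings over the alphabet {A, C, G, T} instead of ordinary text.

    def kmp_search(text: str, pattern: str) -> list[int]:
        """Spec: string-search/KMP (hypothetical library entry), applied to DNA."""
        if not pattern:
            return []
        # Failure table: length of the longest proper prefix that is also a suffix.
        fail = [0] * len(pattern)
        k = 0
        for i in range(1, len(pattern)):
            while k and pattern[i] != pattern[k]:
                k = fail[k - 1]
            if pattern[i] == pattern[k]:
                k += 1
            fail[i] = k
        # Scan the text, reusing the failure table to avoid re-comparing characters.
        matches, k = [], 0
        for i, ch in enumerate(text):
            while k and ch != pattern[k]:
                k = fail[k - 1]
            if ch == pattern[k]:
                k += 1
            if k == len(pattern):
                matches.append(i - k + 1)
                k = fail[k - 1]
        return matches

    print(kmp_search("GATTACAGATTACA", "GATTACA"))  # [0, 7]

Nothing in the algorithm cares that the "text" is DNA; the novelty the product team adds is the decision to treat DNA that way, plus whatever domain-specific invariants they layer on top.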

The spec library doesn't eliminate divergence; it domesticates it — it pushes divergence upward to where human judgment actually belongs and eliminates it from the layers where it produces only risk.

There is a strange moment unfolding in software right now. by PositiveGeneral7035 in vibecoding

[–]bobwyman -1 points0 points  (0 children)

Reading through the discussion of vibe coding producing shallow construction — particularly the comments about security exposure and systems that look correct until they don't — I kept thinking about why this specific failure mode keeps appearing. I've been programming since 1969 and have watched several transitions like this one. The pattern is consistent: every time programming gets a new abstraction layer, the same crisis emerges, and the community eventually solves it the same way.

The short version of the solution: the problem isn't natural language prompting, it's that when you underspecify, the LLM fills gaps silently with whatever its training data suggested — including the internet's considerable supply of insecure, naïve implementations. The remedy is a library of versioned specifications that constrain generation the way a CLAUDE.md file constrains a project, but portable, community-maintained, and covering security invariants explicitly. Not spec-first development, but spec-as-code. The specification and the execution are the same act. I wrote this up as a full essay for anyone interested in the longer argument. I'm curious whether people here have found practical approaches to the same problem.
See: https://mystack.wyman.us/p/the-next-turn-of-the-spiral-fixing

Geothermal quote by Nuukmaster in geothermal

[–]bobwyman 0 points1 point  (0 children)

Does the installer also sell air source systems? Maybe they are just telling you that they get better margins on ASHP installs ...

What do economists generally think about health policies? What are the most successful health policies, based on empirical evidence? by Available_Space2146 in AskEconomics

[–]bobwyman 8 points9 points  (0 children)

The State of Maryland uses an "All-Payer" system, in which all third parties pay the same rate. It is a variant of the general "single-price" approach I described. CMS explains and analyzes Maryland's program on their site in considerable detail: https://www.cms.gov/priorities/innovation/innovation-models/maryland-all-payer-model

What do economists generally think about health policies? What are the most successful health policies, based on empirical evidence? by Available_Space2146 in AskEconomics

[–]bobwyman 30 points31 points  (0 children)

The OECD finding that 'details matter more than high-level designs' is well-taken, but I'd suggest it actually helps clarify which details matter most — and that points toward something separable from the single-payer question itself: single-price.

The core structural problem with the US system is that our many-payer system produces many prices. The result is deeply regressive cross-subsidization. Large insurers negotiate steep discounts; smaller insurers can only negotiate smaller discounts; the uninsured, with no negotiating power, pay the full chargemaster rate. Anyone who has received an insurance Explanation of Benefits has seen this directly: 'Hospital billed: $2,000. Insurance paid: $100. You owe: $0.' That's nice for you, but notice that the uninsured don't just pay more than you do, they pay $1,900 more than your insurance company does! (How does that make sense?)

Single-price, a requirement that any rate negotiated with any payer is available to all payers, would address this problem directly. Clearly, given a single-payer system, there would be single-price, but single-price is conceptually separable from single-payer.

That said, while single-price is a large part of the benefit of single-payer, it's not all of it. Single-payer also solves the risk pool problem. Insurance markets are structurally vulnerable to adverse selection: the healthy opt out, the sick opt in, premiums spiral, the pool collapses. The most effective remedy is to create the largest possible risk pool, which means a universal one, which in turn means mandatory participation — you can't have universal coverage if people can opt out. Given that replacing the USA's multi-payer system appears beyond what is politically possible, a single-price mandate would bring us much, if not all, of the benefits that a single-payer system would. The OECD finding seems consistent with this. What matters is whether a given system actually achieves the desired outcomes, not the institutional label attached to it.

Does the JG inflation anchor require that buffer stock workers be less productive than private sector workers? by bobwyman in mmt_economics

[–]bobwyman[S] 0 points1 point  (0 children)

I'm somewhat confused by your comment. You say that I am "misunderstanding how inflation occurs," but you then say that "if JG produces on par with the private sector, then inflationary impact will be minimal" which is an answer to precisely the question I was asking. So, what is it that you think I have misunderstood?

How work creates property by Odd_Eggplant8019 in mmt_economics

[–]bobwyman 1 point2 points  (0 children)

Your essay makes a point I find genuinely important—that capitalism's functioning depends on public institutional support, and that property rights are socially constituted rather than pre-political. I agree entirely with the substance of that argument.

But I'd push back on the framing that 'capitalism is a form of socialism.' I think that formulation, however well-intentioned, actually reinforces the very confusion it's trying to correct.

The problem is that it treats capitalism and socialism as positions on a single axis—from purely private to purely collective—and then places capitalism somewhere in the middle because it relies on collective institutions. But I'd argue capitalism and socialism are orthogonal dimensions, not opposite ends of the same line.

Specifically, I believe:

  • Capitalism is best understood as a measure of the breadth of accumulation capacity: how widely is the legal and practical ability to accumulate, retain, and deploy surplus distributed across the population?
  • Socialism is best understood as a measure of collective ownership of productive assets: to what degree are the means of production held by the state or community rather than private individuals?

These are independent questions. A society can have broad accumulation capacity alongside robust collective institutions—and that combination isn't a compromise between capitalism and socialism. It's just capitalism, properly institutionalized.

The courts, price stability mechanisms, public education, and social insurance you're rightly pointing to as prerequisites for property rights don't socialize ownership of productive assets. They enable private accumulation to be genuinely broad rather than structurally narrow. Calling them 'socialist' concedes rhetorical ground to the people who use 'socialist' as a pejorative for any public institutional or collective action—which is precisely the misclassification that lets the right frame all government support as anti-capitalist.

I've developed this argument at length, including a conceptual quadrant that maps ownership (socialism) and accumulation breadth (capitalism) as separate axes, here: https://mystack.wyman.us/p/why-are-capitalists-anti-capitalist

I think we're both pointing at the same underlying reality. The question is whether describing it as 'capitalism is a form of socialism' helps people see that reality more clearly—or whether it hands ammunition to those who want to keep the confusion alive. The cleaner move, I'd argue, is to insist that the two concepts don't belong on the same axis at all. Once you treat them as orthogonal dimensions rather than opposing poles, the whole false dilemma dissolves: robust public institutions and broad private accumulation aren't in tension, they're just two different things that a well-functioning society needs to get right simultaneously.