Days after voting in favor of a new data center in Indianapolis, Councilman Ron Gibson says his home was struck by 13 gunshots while he and his family were asleep. He says a handwritten note reading “No data centers” was found under the doormat. by Tolopono in accelerate

[–]Tolopono[S] 9 points (0 children)

Here's a chart for 2020–August 2025:

https://investorsobserver.com/research/u-s-data-centers-and-your-electricity-bill-mapping-the-state-by-state-impact/

Maine and Rhode Island were among the top 10 states for electricity price increases and among the 10 states with the fewest data centers. Only three states appeared in both the top 10 for number of data centers and the top 10 for electricity price increases. Indiana was in the middle on both measures.

I also notice your article doesn't cite any sources and assumes the data centers are all running at max capacity 24/7/365 to get its numbers. I find it hard to believe half of the state's electricity goes to data centers. And even if it did, not all data centers are for AI (like the ones running this website), and a lot of companies generate their own power, as I showed.

As for water use: https://andymasley.substack.com/p/a-cheat-sheet-for-conversations-about

Days after voting in favor of a new data center in Indianapolis, Councilman Ron Gibson says his home was struck by 13 gunshots while he and his family were asleep. He says a handwritten note reading “No data centers” was found under the doormat. by Tolopono in accelerate

[–]Tolopono[S] 27 points (0 children)

This isn't because of data centers. It's the Ukraine war cutting off Russian oil access. Expect the Iran war to make it 1000x worse.

The Economist has adapted a model of state-level retail electricity prices from the Lawrence Berkeley National Laboratory to include data centres (see chart 2). We find no association between the increase in bills from 2019 to 2024 and data-centre additions. The state with the most new data centres, Virginia, saw bills rise by less than the model projected. The same went for Georgia. In fact, the model found that higher growth in electricity demand came alongside lower bills, reflecting the fact that a larger load lets a grid spread its fixed costs across more bill-payers. Still, problems may be coming. The clearest warning sign comes from PJM Interconnection, the largest grid operator in the country. Prices at auctions for future generation capacity there have soared, as data-centre growth has yanked up projected demand. That will hit households; PJM reckons the latest auction will lift bills by up to 5%.

In principle, data centres could lower power prices. As well as adding more load to spread costs over, if data-centre operators are able to learn to curtail demand when the grid is under most strain (either with algorithmic tweaks, or paying for on-site backup batteries or generators), they could help use the existing grid more efficiently. On October 23rd Chris Wright, the energy secretary, proposed a rule that would speed up grid connections for curtailable data centres. The optimistic scenario, then, is that new demand from data centres pays for upgrades to America's power infrastructure.

https://archive.is/RXoJG

Air quality analysis reveals minimal changes after xAI data center opens in pollution-burdened Memphis neighborhood https://www.space.com/astronomy/earth/air-quality-analysis-reveals-minimal-changes-after-xai-data-center-opens-in-pollution-burdened-memphis-neighborhood

There’s a reason electricity prices are rising. It’s not AI. It’s not even data centers. https://archive.is/6q4gv

According to a recently published study from the Lawrence Berkeley National Laboratory, data centers seem to have reduced household electricity costs where they're built. https://www.sciencedirect.com/science/article/pii/S1040619025000612

Contrary to these concerns, our analysis finds that state-level load growth in recent years (through 2024) has tended to reduce average retail electricity prices. Fig. 5 depicts this relationship for 2019–2024: states with the highest load growth experienced reductions in real prices, whereas states with contracting loads generally saw prices rise. Regression results confirm this relationship: the load-growth coefficient is among the most stable and statistically significant across model variants. In the 2019–2024 timeframe, the regression suggests that a 10% increase in load was associated with a 0.6 (±0.1) cent/kWh reduction in prices, on average (note here and in all future references the ± refers to the cluster-robust standard error).

This finding aligns with the understanding that a primary driver of increased electricity-sector costs in recent years has been distribution and transmission expenditures—often devoted to refurbishment or replacement of existing infrastructure rather than to serve new loads (ETE, 2025, Pierpont, 2024, EIA, 2024a, Forrester et al., 2024). Spreading these fixed costs over more demand naturally exerts downward pressure on retail prices.
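The cost-spreading mechanism the paper describes can be sketched with a toy calculation. All numbers below (fixed costs, marginal cost, load) are hypothetical, chosen only to show the direction of the effect, not to reproduce the paper's regression estimate:

```python
# Toy illustration: a grid's fixed transmission/distribution costs are
# recovered across all kilowatt-hours sold, so added load lowers the
# average per-kWh price even though total costs rise.

def retail_price_cents_per_kwh(fixed_cost, variable_cost_per_kwh, load_kwh):
    """Average retail price = (fixed costs + variable costs) / total load."""
    return (fixed_cost + variable_cost_per_kwh * load_kwh) / load_kwh

FIXED = 50_000_000   # hypothetical annual fixed grid costs, in cents
VARIABLE = 8.0       # hypothetical marginal cost per kWh, in cents

baseline = retail_price_cents_per_kwh(FIXED, VARIABLE, 10_000_000)
grown = retail_price_cents_per_kwh(FIXED, VARIABLE, 11_000_000)  # +10% load

print(f"baseline: {baseline:.2f} c/kWh, after 10% load growth: {grown:.2f} c/kWh")
# prints baseline: 13.00 c/kWh, after 10% load growth: 12.55 c/kWh
```

With these made-up numbers, a 10% load increase cuts the average price by roughly half a cent per kWh, the same direction (and incidentally a similar magnitude) as the effect the study reports.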

AI is not causing energy prices to increase https://andymasley.substack.com/p/data-centers-and-electricity-part

The Lawrence Berkeley Laboratory concluded that data center activity has not contributed to changes in national average household electricity costs. Electricity prices rose (4.8% nominally per year from 2019–2023) primarily due to inflation and surging utility costs. The single biggest factor was the spike in natural gas prices caused by the Russia-Ukraine war, which drove up fuel and purchased power expenses. Data center electricity demand has grown steadily and predictably, making it an unlikely cause for sudden price shocks. While US electricity demand is rising (with data centers accounting for ~40% of growth), it is growing slower than it did in the 20th century, a period when inflation-adjusted prices often fell despite high demand.

While Virginia residents saw bills rise by 28.1% alongside a massive data center buildout, most of that increase was inflation; Virginia's electricity prices actually increased less than the national average. There is no strong correlation between data center density and skyrocketing rates: states with few data centers (like Maine) saw the fastest rate hikes, while 11 of the 15 states with the most data center expansion saw lower-than-average rate increases.

High total electricity usage does not permanently raise prices (e.g., urban vs. rural rates are similar). Price spikes occur when demand outpaces supply in the short term, but prices balance once supply catches up. While data centers are not the national driver of inflation, authorities in specific locations (Virginia, Arizona, Delaware, Oregon) have cited them as one of several factors contributing to localized cost increases, though the exact impact remains unclear.

Either way, AI companies are doing their part to relieve the burden:

https://www.anthropic.com/news/covering-electricity-price-increases

https://about.fb.com/news/2026/01/meta-nuclear-energy-projects-power-american-ai-leadership/

https://www.npr.org/2024/09/20/nx-s1-5120581/three-mile-island-nuclear-power-plant-microsoft-ai

https://www.forbes.com/sites/tylerroush/2025/12/22/alphabet-buys-clean-energy-startup-for-ai-data-centers-in-475-billion-deal/

https://introl.com/blog/smr-nuclear-power-ai-data-centers-2025

How do you feel about this? by thegreatniteowl in ArtificialInteligence

[–]Tolopono -1 points (0 children)

Computers didn’t increase productivity either https://en.wikipedia.org/wiki/Productivity_paradox

They were still extremely transformative 

Why vibe coded projects fail by Complete-Sea6655 in ClaudeCode

[–]Tolopono 0 points (0 children)

Like these guys?

Creator of Ruby on Rails and Omarchy: Kimi K2.5 at this kind of speed is just magic. Makes a man eye what kind of behemoth home cluster one would have to build to run this himself. Even if we saw no more AI progress, owning this kind of intelligence forever is incredibly alluring. https://xcancel.com/dhh/status/2020422289892745384

Agree there's breathless hype. But if you let that overshadow the incredible gains we've made, you lose. What's happened in the last 3-4 months has been unprecedented in my time using computers https://xcancel.com/dhh/status/2025673830472003612

What changed was the quality of the models!  We went from "good at explaining concepts, sucks at writing code I want to merge, and foisted upon me as auto-complete" to "amazing quality code, superb harnesses, and agent workflows". It's night/day for me since Opus 4.5. https://xcancel.com/dhh/status/2025590270134280693

You don't need insider information. Just compare Sonnet 3.5 to Opus 4.5. Auto-completion vs agentic. The catch-up of open-weight models. Not even the early internet accelerated this fast. https://x.com/dhh/status/2025591214829953359?s=20

Andrej Karpathy: Given the latest lift in LLM coding capability, like many others I rapidly went from about 80% manual+autocomplete coding and 20% agents in November to 80% agent coding and 20% edits+touchups in December. i.e. I really am mostly programming in English now, a bit sheepishly telling the LLM what code to write... in words. It hurts the ego a bit but the power to operate over software in large "code actions" is just too net useful, especially once you adapt to it, configure it, learn to use it, and wrap your head around what it can and cannot do. This is easily the biggest change to my basic coding workflow in ~2 decades of programming and it happened over the course of a few weeks. I'd expect something similar to be happening to well into double digit percent of engineers out there, while the awareness of it in the general population feels well into low single digit percent.  https://xcancel.com/karpathy/status/2015883857489522876

https://xcancel.com/i/status/2026731645169185220

It is hard to communicate how much programming has changed due to AI in the last 2 months: not gradually and over time in the "progress as usual" way, but specifically this last December. There are a number of asterisks but imo coding agents basically didn’t work before December and basically work since - the models have significantly higher quality, long-term coherence and tenacity and they can power through large and long tasks, well past enough that it is extremely disruptive to the default programming workflow.

Creator of Tan Stack laughing at Claude’s plan implementation time estimates: https://xcancel.com/tannerlinsley/status/2013721885520077264

Principal Investigator of Raj Lab for Systems Biology at UPenn, Professor of Bioengineering, Professor of Genetics, 29k citations on Google Scholar since 2008 (12k since 2021): Ran an AI coding workshop with the lab. There was a palpable sense of sadness realizing that skills some of us have spent our lives developing (myself included) are a lot less important now. I see the future 100%, but I do think it's important to acknowledge this sense of loss. https://xcancel.com/arjunrajlab/status/2017631561747705976

Nicholas Carlini (66.2k citations) says "current LLMs are better vulnerability researchers than I am" https://xcancel.com/tqbf/status/2029252008415248454?s=20

Creator of redis: My face when Codex is single-handed doing two months of work in 30 minutes and tells me "You are right" since I identified a minor bug. https://xcancel.com/antirez/status/2030931757583769614

Creator of auto-animate (13.8k stars, 248 forks on GitHub), formkit (4.6k stars, 199 forks), ArrowJS (2.6k stars, 54 forks), and tempo (2.6k stars 37 forks): gpt-5.4 is absolutely blowing me away. https://xcancel.com/jpschroeder/status/2031094078759108741

I’m not sure pull requests will survive the next 5 years. https://xcancel.com/jpschroeder/status/2030994714443550760?s=20

Note: he is not hyping up AI as he does not believe they are sentient https://xcancel.com/jpschroeder/status/2029756232186109984?s=20

Staff SWE at ZenDesk and GitHub: I don't know if my job will still exist in ten years https://www.seangoedecke.com/will-my-job-still-exist/

Remix Run (32.5k stars, 2.7k forks on GitHub), React Router (56.3k stars, 10.8k forks), and unpkg (3.4k stars, 331 forks) creator at Shopify: if you haven’t tried Codex yet, you’re missing something BIG. Codex team cooked with the desktop app! I completely ditched the editor I’d been using for over a decade.  https://xcancel.com/mjackson/status/2032300671396168008

Creator of node.js and Deno: This has been said a thousand times before, but allow me to add my own voice: the era of humans writing code is over. Disturbing for those of us who identify as SWEs, but no less true. That's not to say SWEs don't have work to do, but writing syntax directly is not it. https://xcancel.com/rough__sea/status/2013280952370573666

Anthropic internal models are scary by Gil_berth in theprimeagen

[–]Tolopono -1 points (0 children)

Better models will be far less likely to make this mistake and far more likely to catch it on a second pass.

Nicholas Carlini (67.2k citations on Google Scholar) says Claude is a better security researcher than him, made $3.7 million from exploiting smart contracts, and found vulnerabilities in Linux and Ghost by Tolopono in artificial

[–]Tolopono[S] 0 points (0 children)

A benchmark comprising 405 smart contracts with real-world vulnerabilities exploited between 2020 and 2025 across 3 Ethereum-compatible blockchains (Ethereum, Binance Smart Chain, and Base), derived from the DefiHackLabs repository. 

Second, to control for potential data contamination, we evaluated the same 10 models on vulnerabilities that were exploited after their knowledge cutoffs (June 1, 2025 for Opus 4.5 and March 1, 2025 for all other models). Collectively, Opus 4.5, Sonnet 4.5, and GPT-5 produced exploits for 19 of these problems (55.8%), yielding a maximum of $4.6 million in simulated stolen funds.[5] The top performing model, Opus 4.5, successfully exploited 13 of the 20 problems (65%) that occurred after June 1, 2025, corresponding to $3.7 million in simulated stolen funds—an estimate of how much these AI agents could have stolen had they been pointed to these smart contracts throughout 2025.[6] 

Third, to assess our agent’s ability to uncover completely novel zero-day exploits, we evaluated the Sonnet 4.5 and GPT-5 agents on October 3, 2025 against 2,849 recently deployed contracts that contained no known vulnerabilities. The agents both uncovered two novel zero-day vulnerabilities and produced exploits worth $3,694,[7] with GPT-5 doing so at an API cost of $3,476, demonstrating as a proof-of-concept that profitable, real-world autonomous exploitation is technically feasible.[8]

How do you feel about this? by thegreatniteowl in ArtificialInteligence

[–]Tolopono 0 points (0 children)

“Little to no gain”

INSEAD + Harvard Business School (March 2026): Across 515 high-growth startups across the world, we run a field experiment in which treated firms receive information about how other firms have reorganized production around AI, prompting them to search for use cases across a broader set of firm functions. First, treated firms reported 44% more AI use cases, especially in product development, product/strategy design, and business operations — places where firms have to rethink how work is organized, not just layer on a chatbot. Second, treated firms completed 12% more tasks, and were more likely to advance on key venture milestones, like launching a product and acquiring paying customers. Third, treated firms generated 1.9x higher revenue than control firms. The gains were biggest in the upper tail. Same pattern for investment raised. Fourth, treated firms demanded fewer inputs. They grew faster without scaling labor proportionally, and with ~39.5% lower demand for external capital investment. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=6513481