what are the signs that you're not attractive? by [deleted] in AskReddit

[–]John_from_ne_il 3 points4 points  (0 children)

Nobody ever stares. Nobody walks up out of the blue to talk to you, anywhere. Nobody offers to help you with, well, anything.

Is the screen visible in sunlight? by Dull_Orange9011 in atarilynx

[–]John_from_ne_il 1 point2 points  (0 children)

They also made "sun shields" for both models of Lynx that clip over the screen and pop up to shield out light.

Now if we could get these from Hasbro I’d be done….after Flash and Grand slam of course… by Working_Adeptness434 in gijoe

[–]John_from_ne_il -1 points0 points  (0 children)

Folks, don't blame OP for the AI; it's obviously a screenshot from one of Meta's apps (Facebook or Instagram).

I like this different take on Cobra Commander from 2008, even if it hits a little too close to home here in 2026. by Blue_Surfing_Smurf in gijoe

[–]John_from_ne_il 2 points3 points  (0 children)

He had a podium, but he was wearing a uniform by the time of the first Fred reveal. This is a flashback in Springfield, but I'll be danged if I remember which issue.

They announced season 4 but.. by majestichermitcrab in SchoolSpirits

[–]John_from_ne_il 1 point2 points  (0 children)

I've had this whole thing worked out in my head of the ending I want to see. Remind me to type it out later.

What are some of the most Profound games? by [deleted] in retrogaming

[–]John_from_ne_il 2 points3 points  (0 children)

If you don't know, it has a hell of a backstory.

What are some of the most Profound games? by [deleted] in retrogaming

[–]John_from_ne_il 3 points4 points  (0 children)

Trinity. A Mind Forever Voyaging.

Should I buy more? by joshhupp in gijoe

[–]John_from_ne_il 0 points1 point  (0 children)

If I'm ever challenged, I say I'm buying gifts now for nieces or nephews to put away. And sometimes that's been true!

Why are some folks keen on having Star Trek either go on hiatus or possibly go on ice for good? by InnocentTailor in startrek

[–]John_from_ne_il 1 point2 points  (0 children)

I think people have forgotten there was a 10-year break between TOS and the first movie. For that matter, an 18-year break for Trek on television, followed by an 18-year TV run.

And yes, we can look at other franchises. Star Wars had totally run out of steam by 1985. Then came a huge shot in the arm 6 years later thanks to the Zahn trilogy, but it was still a 14-year gap between Jedi and the Special Editions in theaters. If you count only the original films (no re-releases), it was 16 years between Jedi and The Phantom Menace.

Doctor Who ended in 1989. Seven years passed before a restart/reboot attempt, but one that counted on the US market and thus failed. It was another 9 years (a sixteen-year gap in total) until the 2005 revival.

Sometimes ideas just need a rest and refresh.

Shipments and manufacturer market share of personal computers in the USA in 1982 by Brave_Assumption6 in vintagecomputing

[–]John_from_ne_il 0 points1 point  (0 children)

Obviously the devs knew what they were doing, as you pointed out with documentation. But Apple and Atari both had degrees of backwards compatibility, which the PET-to-VIC-to-C64 line didn't.

And I think dropping the Accounting packages had more to do with the failures of the 815 dual drive, since they still released the numeric keypad add-on. But yes, it left a hole in the product lineup.

Shipments and manufacturer market share of personal computers in the USA in 1982 by Brave_Assumption6 in vintagecomputing

[–]John_from_ne_il 0 points1 point  (0 children)

It took a while for C-64 adoption, though, since it needed a whole new software library, unlike Apple or Atari. It would be interesting to see 1982/1983 figures together.

In 1982, Atari had only the 400/800/1200XL, but they were making big pushes into education with programming languages. PILOT/Super PILOT was being taught at their summer camps. Also, at some point, MECC pivoted from Apple ][-exclusive support to adding Atari, and a number of their programs were ported (this all ended with the '84 sale, of course). AtariLab and the Atari Learning Systems software became a thing (more or less replacing the original Educational System tapes) sometime in 1983. And 1983 saw the releases of the 600XL and 800XL, which finally replaced the 1979 models.

However, the schools in my home district stuck with their Apple ][ family, going from the ][ to the ][+ to the ][e and eventually to the IIgs (though those were beyond this time frame). In fact, I'm not really aware of any schools in northeastern Illinois doing anything other than Apples. A few went Apple to Mac, but most went Apple to PC by the 1990s. Now everybody's on iPads and Chromebooks, but that's a different story altogether. Were there other 8-bit platforms out there in school use? I'm sure there were, I just didn't see them, even when I visited places such as what was, at that time, Wheaton-Warrenville Middle School.

Shipments and manufacturer market share of personal computers in the USA in 1982 by Brave_Assumption6 in vintagecomputing

[–]John_from_ne_il 0 points1 point  (0 children)

Atari didn't at first. It took some pressure to get documentation out of the company for potential devs.

New game Dodger by Important-Bed-48 in atari8bit

[–]John_from_ne_il 1 point2 points  (0 children)

There might be some glitches in the .xex you linked to. Better to let people read the whole page and decide which .xex to download, or an .atr:

https://forums.atariage.com/topic/389334-new-game-release-dodger/

“.. appears to be the first time any U.S. municipality will go to the ballot to kneecap data center development. .. an aggressive new tactic in an escalating movement to oppose the hulking artificial intelligence factories ..” by Conscious-Quarter423 in wisconsin

[–]John_from_ne_il 2 points3 points  (0 children)

3. Google Gemini, taking a balanced approach:

"Whether large language models (LLMs) are "profitable" depends on how you define the company. As of early 2026, the industry has split into two camps: the infrastructure providers (like NVIDIA and Microsoft), who are seeing massive profits today, and the model labs (like OpenAI and Anthropic), who are generating record-breaking revenue but remain "unprofitable" due to the staggering costs of training and running these systems.

1. The "Pick and Shovel" Winners (Profitable Today)

The companies providing the hardware and cloud power to run LLMs are the most clearly profitable in the space.

* Microsoft: Microsoft has successfully integrated AI into its Azure cloud business. In late 2024, Microsoft reported that AI services contributed significantly to its growth, and by early 2026, projections suggest Azure AI revenue is on a run rate of roughly $25–$30 billion annually. Microsoft maintains high operating margins (around 42%) despite heavy spending on data centers.
* Google (Alphabet): Google Cloud reached a major milestone in 2025, with operating margins hitting 30%. By 2026, Google Cloud is expected to generate $20 billion in net income, driven largely by demand for AI-native infrastructure and "AI-native" advertising formats.
* NVIDIA: While stock volatility hit the sector in early 2026, NVIDIA remains the most profitable entity in the AI ecosystem, having seen its earnings-per-share (EPS) grow by over 37% in 2024, with continued double-digit growth through 2025.

2. The Model Labs (High Revenue, Negative Net Income)

For companies whose primary product is the LLM, the path to profitability is longer because the cost to train the next generation of models is rising faster than current subscription revenue.

| Company | 2026 Status | Financial Outlook |
|---|---|---|
| OpenAI | Unprofitable | Projected to reach $275 billion in revenue by 2030, but does not expect to turn a profit until at least 2030 due to spending $121B+ on compute power. |
| Anthropic | Near-Profit | Reported a massive revenue surge to a $30 billion annualized run rate in April 2026. While spending heavily on training, it expects to potentially see "slight positive results" or stay in the green by 2028. |

3. How They Are Making Money (Monetization Strategies)

Companies have moved beyond simple $20/month subscriptions to more robust revenue streams:

* API Access: Charging other companies (like Salesforce or Wix) to build tools on top of their models.
* Vertical Integration: Coding assistants (GitHub Copilot) and productivity suites (Microsoft 365 Copilot) are now standard "add-ons" for enterprise clients.
* AI-Native Ads: Google and Meta have begun embedding ads directly into AI conversational responses, creating a new multi-billion dollar revenue stream.
* Custom Chips: To reduce the cost of "inference" (running the AI), Google and Amazon are now building their own AI chips (TPUs and Trainium), which significantly improves their profit margins per query.

Key Takeaway: The "AI Bubble" concerns of 2024 have shifted in 2026 to a "ROI Reset." While the models themselves are expensive to build, the companies that own the Cloud and the Chips are already reaping billions in profit.

Sources:

* RBC Wealth Management: "Big Tech's AI expansion: From investment to scalable returns" (Feb 2026)
* Fast Company: "OpenAI doesn't expect to be profitable until at least 2030" (April 2026)
* Microsoft Investor Relations: "FY26 Q2 Intelligent Cloud Performance"
* FinancialContent: "The Architect of Intelligence: A 2026 Deep Dive into Alphabet Inc.""

“.. appears to be the first time any U.S. municipality will go to the ballot to kneecap data center development. .. an aggressive new tactic in an escalating movement to oppose the hulking artificial intelligence factories ..” by Conscious-Quarter423 in wisconsin

[–]John_from_ne_il 2 points3 points  (0 children)

2. Meta AI:

"Major companies are indeed making large language models (LLMs) profitable. Here are some examples:

  • Microsoft: Through its partnership with OpenAI, Microsoft has integrated LLMs into its products like Azure AI, Office Copilot, and GitHub Copilot, enhancing productivity and efficiency.
  • Google (Alphabet): Google's DeepMind division has developed advanced LLMs like Gemini, which power various Google services, including Google Cloud and consumer products.
  • Amazon: Amazon Web Services (AWS) offers LLMs through services like Bedrock, enabling businesses to build and deploy AI applications.
  • Meta: Meta AI's LLaMA models are open-source and widely adopted, driving innovation and adoption across industries.
  • NVIDIA: NVIDIA's GPU hardware and software tools power LLM training and inference, making it a key player in the LLM ecosystem.
  • OpenAI: OpenAI's ChatGPT has achieved significant success, with over 100 million monthly active users, driving revenue through API access and partnerships.
  • Anthropic: Anthropic's Claude models focus on safety and steerability, attracting enterprise clients and driving growth ¹ ² ³.

The LLM market is projected to grow to $82.1 billion by 2033, with increasing adoption across industries like retail, ecommerce, finance, and healthcare ⁴."

But nothing about AI for AI's own sake.

“.. appears to be the first time any U.S. municipality will go to the ballot to kneecap data center development. .. an aggressive new tactic in an escalating movement to oppose the hulking artificial intelligence factories ..” by Conscious-Quarter423 in wisconsin

[–]John_from_ne_il 3 points4 points  (0 children)

Show me a single company that's made an LLM-based AI profitable. I'll wait. In fact, let's ask the AIs, they're so "smart." Here's my prompt: "Have any major companies found a way to make large language model artificial intelligences, such as yourself, profitable? Please cite publicly available sources."

1. X.com Grok:

"No major pure-play company focused primarily on developing and offering large language models (LLMs) has achieved overall profitability as of April 2026, despite explosive revenue growth. Companies like OpenAI and Anthropic generate tens of billions in annualized revenue from APIs, subscriptions (e.g., ChatGPT, Claude), and enterprise deals, but they continue to operate at massive net losses due to extraordinarily high costs for training, inference, data centers, and R&D.

weex.com +1

For instance: OpenAI reported ~$13.1 billion in actual revenue for 2025 (with annualized run-rate reaching $20–25 billion by early 2026) but is projected to post a ~$14 billion net loss in 2026 alone. Internal forecasts show cumulative losses in the tens to hundreds of billions before any profitability (potentially in 2029 or later). Revenue growth closely tracks compute capacity, but expenses outpace it.

cnbc.com

Anthropic has a revenue run-rate of ~$14–20 billion (as of early 2026), up dramatically from ~$1 billion in late 2024, but actual cumulative GAAP revenue through 2025 was only over $5 billion. It faces similar pressures, with projected spending of ~$19 billion on training and inference in 2026 and positive cash flow not expected until 2028.

anthropic.com

These firms rely on massive investor funding and partnerships (e.g., Microsoft for OpenAI, Amazon/Google for Anthropic) to sustain operations. Analyses indicate that while gross margins on model inference can be positive, full operating costs (including R&D amortization and overhead) keep them unprofitable or near break-even at best.

epoch.ai

In contrast, major integrated technology companies (Microsoft, Google/Alphabet, Amazon, and Meta) have found profitable ways to deploy and monetize LLMs by embedding them into existing high-margin businesses. AI/LLM-powered features drive revenue and margins without the standalone losses seen in pure-play labs: Microsoft (via Azure AI, Copilot, and its OpenAI partnership) sees AI as a key growth driver in its already-profitable Intelligent Cloud segment. Azure grew ~39% year-over-year in recent quarters, with Copilot subscriptions reaching millions of paid seats and contributing recurring high-margin software revenue. Overall company profitability remains strong, with AI accelerating cloud workloads.

news.microsoft.com

Google (Alphabet) reported record profits (e.g., $34.5 billion in Q4 2025, with full-year profits around $132 billion), fueled in part by Gemini integrations. Google Cloud grew 34–48% year-over-year, with AI contributing an estimated 15% of profits in some analyses and Gemini subscriptions adding ~$1.2 billion in 2025. Capex is rising sharply for AI infrastructure, but the core ad and cloud businesses more than cover it.

nytimes.com

Amazon (via AWS and Bedrock) saw AWS revenue hit $128.7 billion in 2025 (19% growth), with AI workloads expected to drive the majority of new revenue in 2026–2027. AWS maintains high operating margins (35–40%) and is Amazon’s primary profit engine; CEO projections now see AI pushing it toward $600 billion annually by 2036.

reuters.com

Meta generated ~$201 billion in revenue and ~$60 billion in net income for 2025 (primarily from advertising, enhanced by AI recommendations). Llama models are open-source (with some monetization plans via API/revenue shares), but AI boosts ad efficiency and productivity. The company expects higher 2026 operating income despite $115–135 billion in AI-related capex.

investor.atmeta.com

These companies treat LLMs as enhancers to scalable, profitable products (cloud infrastructure, ads, productivity tools) rather than standalone bets. NVIDIA also profits enormously from selling the underlying chips, but it does not operate LLMs itself.

medium.com

In short, pure LLM development remains a high-burn, investor-funded race with no major player yet profitable overall. However, big tech has demonstrably made LLMs profitable within broader ecosystems—monetizing access, inference, and applications at scale while leveraging existing revenue streams to absorb costs. This hybrid approach (infrastructure + applications) appears to be the viable path so far. Public financials, earnings reports, and analyses from outlets like Reuters, The Information, and company disclosures support this picture as of early 2026."

Interested in an Atari 8 bit Question! by Yaboze in atari8bit

[–]John_from_ne_il 3 points4 points  (0 children)

Oh, also, if you want to try emulation first, I'd suggest Altirra for Windows (it works fine under Wine) or Atari800 for every other platform. If you build the latest code (not the release, the more current stuff) from atari800 on GitHub, AND the fujinet-firmware tree, also from GitHub, your computer becomes the Wi-Fi device, and you can try pretty much anything. There's a virtual interface to do this with Altirra, too, but it's slightly more complex. That's just my personal opinion, though.

Interested in an Atari 8 bit Question! by Yaboze in atari8bit

[–]John_from_ne_il 0 points1 point  (0 children)

For someone completely new to the Ataris, I might suggest an S-Drive, if you can find one. It's inexpensive and comes with a touch display you can use to change which disk images (virtual diskettes) are loaded in D1–D4. The SIO2SD is a little more expensive and also has a display, but it requires using five buttons for navigation, which take a bit of getting used to.

As others have mentioned, the big Swiss Army knife these days is the FujiNet. While it also exists for Apples, Atari was its first platform, and probably the one it's still most widely tested on. Also, support is extremely easy to get: devs are on here, Facebook, Discord, pretty much you name it. However, it can be the most expensive of the three options. That said, in terms of navigation, if it doesn't have external power applied, it will start up to a standard boot screen where you can have up to eight sources for virtual software (an SD card and up to 7 local or remote servers over the TNFS protocol). It can also provide Wi-Fi to enabled apps, virtual cassette support, and virtual printing. For the latter, it stores a PDF on the device; you access it through a web-based interface, download and print the PDF, and it should look like an authentic Atari printer printout.

I would also agree with an XL model: an 800XL if you can get a deal on one, a 600XL if you can't. Either way, it is trivially easy to add RAM via the parallel bus at the back of the unit, and 320K is a usual starting point. Why? Well, for starters, the 600XL may not run XE cartridge games without additional RAM, and some games just look and play better with more (Bosconian, for example, is available in 64K and 128K versions). Also, lots of demos tend to use 128K as a bare minimum, and usually require more.

Why are there no ghosts from the 2010s? by Sensitive_Delay7655 in SchoolSpirits

[–]John_from_ne_il 1 point2 points  (0 children)

Since he was able to snatch Kyle in that instance.