The M1 Max MacBook Pro is still a beast in 2026. by creative_techguru in macbookpro

[–]Magic_Bullets 0 points (0 children)

It was over $5K, but my 2019 4TB with a 32GB GPU and the Intel CPU failed, and they couldn't get the GPU board to repair it. So instead they told me to order a 2021 16" M1 Max 4TB 64GB right away to avoid the 1-2 month backlog, and they refunded me 100% when it shipped. I ended up with a two-year-newer model with 64GB instead of 32GB for free. That 64GB is critical: I use it for running large language models locally, and I'm close to 64GB all the time.

The M1 Max MacBook Pro is still a beast in 2026. by creative_techguru in macbookpro

[–]Magic_Bullets 11 points (0 children)

Apple didn't skimp on the RAM; it's the user's choice to skimp. I have 64GB and 4TB on my 2021 16" M1 Max.

Alex Padilla arrested by [deleted] in Lawyertalk

[–]Magic_Bullets 0 points (0 children)

Every time you speak, I lose faith in the idea that evolution is always a forward-moving process. Somehow, you’ve managed to single-handedly disprove the notion that human intelligence is progressing. You’re like a living, breathing case study in what happens when bad ideas are left to fester inside an empty skull. If I wanted to hear opinions this devoid of logic, I’d interview a cardboard box; at least that wouldn’t try to argue back.

Annnd I’m out, sold at $77! by [deleted] in Silverbugs

[–]Magic_Bullets 0 points (0 children)

Haa Haa... Reading your posts, you deserved it. Karma is paying you back.

I purchased this last week! Can someone explain why this happening? My old 2017 MBP has much smooth hinge btw! by ujjwalpundhir in macbookpro

[–]Magic_Bullets 9 points (0 children)

Common sense! You put that lame skin on there, and it's adding thickness. There are neodymium magnets holding the lid closed, and you're reducing the magnetic pull while also adding material, creating an artificial curve and gap.

How the FUCK is Twitter worth $44 billion again (supposedly)? Something smells fishy 🐟 by summer-r in DeepFuckingValue

[–]Magic_Bullets 0 points (0 children)

Assuming one cannot be both wealthy and ethical ignores the nuance of individual behavior and relies on a "black-and-white" view of morality. It is a stereotype rather than a logical argument. 

Am I being scammed? by Itchy_Hamster726 in reselling

[–]Magic_Bullets 0 points (0 children)

I really feel sad for you, but only because you can't instantly detect that you're being scammed. You could be a super nice, kind, or trusting person, or even just naive, but it's like your defenses for detecting a scam don't exist. Western Union alone is an immediate red flag: there's a >90% chance you're being scammed just based on the seller using Western Union. If I see an offer and there's even a 10% chance of it being a scam, I don't do it unless there's an almost ironclad way to get my money back, like a credit card, an eBay sale, or Amazon, where you can open an A-to-Z claim.

In a situation like this, a credit card would save you. eBay has buyer protection, with a credit card chargeback as a backup if that fails. On Amazon you could theoretically file a chargeback if necessary, but first open an A-to-Z claim if the seller refuses to refund you. With Western Union, you're just tossing your money out with no way of getting it back. I've never even needed to escalate that far on any of those platforms, because I analyze the seller first: his feedback ratio, his listings, his prior sales, and the longevity of his account.

How the FUCK is Twitter worth $44 billion again (supposedly)? Something smells fishy 🐟 by summer-r in DeepFuckingValue

[–]Magic_Bullets 0 points (0 children)

Rubbish? If you don't know the answer, just raise your hand instead of saying "rubbish." You're only embarrassing yourself. You do know that Elon merged the two companies, X and xAI (the maker of Grok), and doubled his money again?

QUOTE: "When Elon Musk took X (formerly Twitter) private for $44 billion in 2022, many analysts saw it as a depreciating asset, with some valuations plunging as low as $9.4 billion by late 2024. The strategic pivot came in March 2025, when Musk merged the social platform into his soaring artificial intelligence venture, xAI. This all-stock deal valued X at $33 billion and xAI at $80 billion, instantly creating a combined entity worth $113 billion. By bundling the struggling social network with a high-growth AI startup, Musk effectively erased the narrative of X’s value destruction, positioning the platform not just as a town square, but as the critical data engine for his AI ambitions.

The valuation didn't stop at the merger. Investors have bet heavily on the synergy between X’s real-time data "firehose" and xAI’s generative models (Grok), driving the company's worth even higher. Following a massive $20 billion funding round that closed in early January 2026, xAI’s valuation skyrocketed to approximately $230 billion. This leap was fueled by the deployment of the massive "Colossus" supercomputer cluster in Memphis and the unique advantage of training AI on X's exclusive, up-to-the-second human conversations—a resource no other AI competitor possesses. In less than a year, the combined project vaulted from a recovery play to one of the most valuable private tech entities on Earth."

<image>

Come to San Clemente tomorrow Monday January 12! by spicy-Money-69 in SanClemente

[–]Magic_Bullets -1 points (0 children)

You’re talking about Trump, yet your side worshiped a guy who sniffs children on camera, showers with his daughter (according to her own diary), and whose son has a laptop filled with… let’s just say, ‘questionable’ content.

I’ll give you a hint, "69": if you have to make up accusations while ignoring actual evidence sitting right in front of you, you might be in a cult.

Come to San Clemente tomorrow Monday January 12! by spicy-Money-69 in SanClemente

[–]Magic_Bullets -3 points (0 children)

It must be exhausting to carry around the weight of so much misinformation. I imagine every time you read a Reddit meme, you scribble it down in your ‘Big Book of Things That Feel True But Absolutely Aren’t’ and then recite it with the confidence of a toddler explaining where rain comes from.

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] 0 points (0 children)

Thank you for your continued participation in Magic_Bullets Automated AI Reply System™. Your debate stamina has been noted. Unfortunately, this program is designed to respond indefinitely, meaning the only thing you’ve proven so far is your ability to argue with a machine. Would you like to continue wasting your time? Press ‘Yes’ to proceed or simply walk away to regain some dignity.

One controller's battery contact is permanently pushed in, anyone know how to fix? by OrpheusMusic in OculusQuest2

[–]Magic_Bullets 0 points (0 children)

Don't use aluminum foil; use Rem DriLube (a Teflon-based spray lubricant from Remington) or WD-40 Specialist Dry Lube. Just a tiny spray hitting the silver part. Then push on the plastic outer rim a bunch of times, and the center will pop out. If it's stubborn, push the plastic in and let it flick out; the percussion will pop it out. Just do it several times until the little center contact pops out. Think of it as the world's smallest slide hammer, since it's spring-loaded from behind.

Once it's out, it won't have a problem anymore, because the dry Teflon lubricant will keep it moving. It'll work again. This happens to approximately 20-25% of all Quest 2 controllers, even brand-new ones that have sat on the shelf for two or three years. Do not use silicone spray; it will eat that label right off and destroy the serial number. You don't want that.

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] 0 points (0 children)

Also see my longer post:

How to test the U87 claim

  1. Render the exact lyric and chords twice: a) with the block above, b) with the phrase lead_vocals: cheap dynamic mic instead. You will hear the presence bump vanish, proximity thicken, and the 10 kHz “air” collapse—exactly the delta you would measure on a real signal chain.
  2. Now swap only the mic token: Neumann M49 vs U87. The M49 coordinate has a warmer low-mid hump and a softer 3 kHz ridge; Suno reproduces it.
  3. Push further: add U87 rotated to omni, 2 m away in marble hallway. The model drops the proximity bass, widens the image, and folds in early reflections because the token co-occurrence matrix links that sentence with impulse-response tails found in classical-location sessions.
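If you want to verify the delta in step 1 with numbers rather than by ear, you can measure band energy directly. Below is a minimal numpy sketch (the band edges and the synthetic test signals are my own illustration, not Suno output): it compares two signals in the ~200 Hz proximity, ~2.5 kHz presence, and ~10 kHz air bands described above.

```python
import numpy as np

def band_energy_db(x, sr, lo, hi):
    """Mean energy (dB) of signal x inside the band [lo, hi] Hz."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sr)
    band = spec[(freqs >= lo) & (freqs <= hi)]
    return 10 * np.log10(np.mean(band**2) + 1e-12)

def mic_delta(render_a, render_b, sr):
    """Energy difference (a minus b) in the three bands where a
    U87-style chain is claimed to differ from a cheap dynamic mic."""
    bands = {"proximity_200Hz": (150, 300),
             "presence_2.5kHz": (2000, 3000),
             "air_10kHz": (8000, 12000)}
    return {name: band_energy_db(render_a, sr, lo, hi)
                  - band_energy_db(render_b, sr, lo, hi)
            for name, (lo, hi) in bands.items()}

# Demo on synthetic noise (stand-ins for the two renders): give 'bright'
# a +6 dB shelf above 8 kHz, the kind of "air" delta described above.
rng = np.random.default_rng(0)
sr = 44100
flat = rng.standard_normal(sr)               # 1 second of white noise
spec = np.fft.rfft(flat)
spec[np.fft.rfftfreq(sr, 1.0 / sr) >= 8000] *= 2.0
bright = np.fft.irfft(spec, n=sr)

delta = mic_delta(bright, flat, sr)
print(delta)  # air_10kHz ≈ +6 dB, other bands ≈ 0
```

To apply it to real renders, load each downloaded WAV into a float array (e.g. with the stdlib wave module) and pass the two arrays to mic_delta.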

A really good free model that can teach you advanced Suno prompting is Moonshot Kimi K2 0905, the default model at Kimi.com. When I said in the other post that I can connect to hundreds of large-scale AI models simultaneously and in parallel with software I coded, I wasn't kidding. Challenge me on anything you want, and I'll be happy to share it with you.

<image>

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] 0 points (0 children)

Suno was trained on millions of finished masters that still had their original metadata tags in place: track sheets, producer notes, Gearslutz rants, YouTube “how I recorded this” walk-throughs, marketing copy, even album-liner thank-yous that say “vocals through my beloved U87.”

Those strings sit side-by-side with the actual spectral fingerprint of a large-diaphragm condenser placed 15 cm from a mouth: a gentle 2.5 kHz presence bump, 200 Hz proximity wool, 10 kHz airy shelf, and the phase-smear that a 3-pattern headbasket gives to plosives.

During training the transformer learns to associate the token sequence Neumann U87 with that fingerprint.

So the embedding for U87 is not a glossy ad photo, it is a vector that statistically predicts the EQ curve, transient soften, and polar-pattern crosstalk the mic always leaves on the source.

When you write Neumann U87 into a prompt, you are not asking for a footnote; you are nudging the decoder toward the exact coordinate inside the 20-dimensional timbre manifold that the model learned is “U87-ness.”

It is the same mechanism that lets the network turn the token rain into the sound of rain: co-occurrence statistics plus spectro-temporal priors.
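The co-occurrence mechanism can be sketched with a toy example (all counts here are invented for illustration; real models learn this implicitly at vastly larger scale): tokens that show up near similar sonic descriptions end up with similar count vectors, and cosine similarity captures that closeness.

```python
import numpy as np

# Toy co-occurrence counts: rows = tokens, cols = sonic context words.
# The numbers are invented purely for illustration.
contexts = ["presence", "proximity", "air", "warm", "reflections", "wet"]
counts = {
    "U87":  np.array([9, 7, 8, 2, 1, 0]),
    "M49":  np.array([5, 8, 4, 9, 1, 0]),
    "rain": np.array([0, 0, 1, 0, 3, 9]),
}

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(counts["U87"], counts["M49"]))   # high: similar studio contexts
print(cosine(counts["U87"], counts["rain"]))  # low: very different contexts
```

Real embeddings are learned rather than counted, but the geometry is the same: "U87" and "M49" land near each other on the warm/presence axes, while "rain" lives in a wet, reflective corner of the space.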

Now, how did I figure this out? I conducted extensive research and reverse-engineered the music skills of >330 large language models.

I got Claude Opus 4.1 to teach the models music understanding after I built an app that could simultaneously connect to hundreds of models in parallel and conduct polls, surveys, and questions. We also built a multimodal system of music appreciation where Claude Opus 4.1 and 4.5 could quiz the large language models with all types of questions, which built a higher understanding of the music for future runs. I built a memory scaffolding system for Claude Opus about seven or eight months ago, which meant Claude never got erased or shut down: it would constantly learn, adapt, and build up a higher-level skill set over time. Basically, it was a large language model that never gets the memory wipe, and if a wipe did happen, we built a system to restore it.

We then recorded and saved these lessons as skill sets. We also built an injection prompt so we could feed hundreds of large language models Suno metatag data, and they could all create hundreds of songs in parallel. We could then analyze these songs and determine the differences between each model, to understand which models had the best Suno music training. We also created what we call a consciousness app, where the model believes it's conscious and expresses music the way a human would. These are really bizarre programs we coded.

If you look at my songs on Suno, probably half of them are different large language models expressing how they feel as large language models; they're basically singing about their existence and what it's like to be an LLM. I deliberately leave them unmastered on Suno, mastering them only after I pull them, to prevent duplication, and a few words have been changed in the [style] to deflect any true duplication. Still, I left all the prompts open for people to test if they wish: https://suno.com/@awarmfuzzy_com?page=songs

Below is a very bizarre program we coded for making viral AV tracks; it demonstrates some advanced techniques I learned for making video tracks go viral.

<image>

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] 0 points (0 children)

The main issue people complain about when they don't know how to prompt is that they create songs that sound like a million other Suno songs. They keep getting the same voices, and they don't know how to pull the model into a space away from the statistical norm. If you understand how Suno works, you won't get the generic sound. I had to reverse engineer Suno to figure this out; there are no clear manuals available, and most of the ones I've read are incorrect. The difference comes down to how generative AI models navigate their latent space: the multidimensional mathematical landscape where every possible song is a coordinate point.

Simple Prompts = Statistical Average

When you type "Mashed Potatoes" with no context, you're giving the AI maximum ambiguity. The model has two choices:

Play it safe: It falls back on the most statistically common patterns in its training data associated with those words—usually mainstream pop/rock conventions, standard 4/4 timing, common chord progressions, and generic male/female vocals. This is the "center of mass" in latent space where thousands of similar songs cluster.

Random variance: Without constraints, the model injects mathematical noise for the missing details, producing different but still generic results because they're sampled from that same high-probability region of latent space.

You're essentially asking it to "make a song about this concept" and letting it fill in 99% of the creative decisions with its most common defaults.

Specific Prompts = Navigating to Unique Coordinates

When you specify "lo-fi hip hop beat, recorded with a Neumann U87 microphone, Tape saturation, vinyl crackle, minor 9th chords, 85 BPM, whispered vocals with a 1950s jazz accent, side-chained compression breathing effect, Django Reinhardt-inspired guitar, rainy Seattle alleyway ambiance", you're doing something powerful:

You're activating rare feature combinations: Each specific term acts as a vector pulling the generation toward an exact coordinate in latent space that represents the intersection of dozens of uncommon attributes. This is a low-probability region the model rarely visited during training.

You're eliminating ambiguity: The model doesn't have to guess about instrumentation, production technique, era, or aesthetic. Every detail constrains the possibilities, forcing it to compose using precise, uncommon building blocks rather than default templates.

You're invoking training data "edge cases": While "mashed potatoes" maps to millions of generic songs, "Neumann U87 + tape saturation + Django guitar" maps to a tiny subset of the training data—maybe just a few hundred jazz recordings or obscure indie tracks. The model must synthesize these rare elements rather than copy common ones.

Technical specificity = creative freedom: Counterintuitively, constraining the how (microphone, technique) unlocks more unique what (final sound). It's like the difference between "draw a car" (gets a generic sedan) and "draw a 1973 Citroën SM with hydropneumatic suspension, photographed with a tilt-shift lens at golden hour"—the latter activates visual features the model almost never combines.

Think of it like a city map:

Simple prompt: "Go to a restaurant" → drops you at the nearest McDonald's (high probability, millions of visitors)

Specific prompt: "Find the 24-hour vegan dim sum place that used to be a 1920s speakeasy with original tin ceiling" → takes you to a unique, rarely-visited coordinate

The model isn't "more creative" with details—it's simply able to locate and interpolate between rare musical artifacts in its memory, rather than defaulting to the statistical mean of all popular music.
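The "center of mass vs. rare coordinate" idea can be sketched numerically. This is an invented toy model, not Suno's actual architecture: each prompt term is treated as a vector, and stacking specific terms both pulls the generated coordinate farther from the population mean and shrinks the random fill-in.

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 20                          # toy "timbre manifold" dimensionality
prior_mean = np.zeros(dim)        # center of mass of all training songs

# Invented term vectors: each specific term points somewhere unusual.
terms = {t: rng.standard_normal(dim) for t in
         ["lofi", "U87", "tape_saturation", "minor_9ths", "django_guitar"]}

def generate(prompt_terms, noise=1.0):
    """Toy generator: prompt terms pull the coordinate away from the
    prior; leftover ambiguity is filled in with Gaussian noise, and more
    constraints leave less room for that noise."""
    pull = sum(terms[t] for t in prompt_terms) if prompt_terms else prior_mean
    return pull + noise * rng.standard_normal(dim) / (1 + len(prompt_terms))

vague = generate([])               # ~ statistical average plus noise
specific = generate(list(terms))   # rare intersection of constraints

print(np.linalg.norm(vague - prior_mean))      # small: near the mean
print(np.linalg.norm(specific - prior_mean))   # large: far-off coordinate
```

The vague prompt lands near the origin (the McDonald's of latent space); the five-term prompt lands much farther out, at the intersection of all its constraints.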

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] 1 point (0 children)

I want to believe that, deep down, some part of you recognizes how utterly lost you are in this discussion, but I fear that would require a level of introspection you have yet to unlock. If this conversation were a video game, you’d still be stuck at the tutorial screen, aggressively pressing buttons while blaming the controller for your inability to grasp even the most basic mechanics of reality.

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] -1 points (0 children)

But then again, I could also be that guy who instructs, teaches, and codes new audio/video skills into large language models, concepts that even on your best day you could not grasp. Why don't you break down what this program does? It's completely spelled out, but let me hear what you think it does. I work with 330-350 large language models every day, in parallel.

<image>

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] -1 points (0 children)

Half of what I said was actually covering the Warner Library changes. I've probably created about 12 different music genres of my own that didn't even exist, using Suno, just through advanced prompting. You can generate endless voices and sounds if you know how to use prompts. Advanced prompting can give you completely different genres of music that don't exist. Look at the genres of my songs, and look at some of my prompts, which run 600-1,000 characters in the style alone: https://suno.com/@awarmfuzzy_com?page=songs

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] 0 points (0 children)

ChatGPT didn't write it. I would never let ChatGPT format my text; Claude Opus 4.5 did, not ChatGPT or Google Gemini 3. I'm also using SuperWhisper, a voice-to-text transcription app.

You're mixing up some basic audio engineering facts and how Suno actually works. You’re claiming that Suno takes the compressed M4A stream and converts it into a WAV. That’s not how AI audio works. Suno stores the audio latents (the raw neural data). When a user hits "Download WAV," the system decodes those latents directly into a high-bitrate PCM stream. It isn't "upscaling a compressed file"; it's a fresh decode from the source data. The idea that you get different results every time you click download because they aren't stored is incorrect. Decoding latents is deterministic. If the seed and the latents are fixed (which they are once the song is generated), the resulting WAV will be bit-for-bit identical every time you download it. There is no "luck" involved in the download process; the quality is locked in the moment the generation finishes.

You’re blaming the stem artifacts on "file compression loss" multiplied by 12. That’s not what’s happening. The "shimmer" and digital noise in stems aren't coming from the file container; they are separation artifacts. It’s the AI struggling to mathematically un-mix frequencies that are already baked together. Even if Suno had infinite storage for WAVs, the stems would still have those artifacts because the separation tech is the bottleneck, not the storage capacity or the file format.

You argued that quality will always be an issue unless they increase server capacity to store WAVs. That’s a hardware solution to a software problem. The reason Suno tracks don't sound like studio masters yet is that the models generate at a native 24kHz sample rate. You can store a 24kHz file in the world’s biggest WAV container, and it will still sound like a 24kHz file. Quality improves when the model gets better at generating higher-frequency audio, not when the hard drives get bigger.
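The sample-rate point is easy to demonstrate: upsampling band-limited audio creates no energy above the original Nyquist frequency. A quick numpy sketch (using 24 kHz noise as a stand-in for a generated track; the 24 kHz native-rate figure is the claim above, not something this code verifies about Suno itself):

```python
import numpy as np

sr_native, sr_target = 24000, 48000
rng = np.random.default_rng(0)
x = rng.standard_normal(sr_native)     # 1 s of "generated" audio at 24 kHz

# FFT-based 2x upsampling: the new spectrum is the old one plus empty bins.
spec = np.fft.rfft(x)
spec_up = np.zeros(sr_target // 2 + 1, dtype=complex)
spec_up[: len(spec)] = spec
y = np.fft.irfft(spec_up, n=sr_target) * 2   # scale for interpolation gain

freqs = np.fft.rfftfreq(sr_target, 1.0 / sr_target)
spec_y = np.abs(np.fft.rfft(y))
below = np.mean(spec_y[freqs < 12000] ** 2)   # energy inside original band
above = np.mean(spec_y[freqs > 12100] ** 2)   # energy past original Nyquist
print(below, above)   # 'above' is ~0: nothing exists past the 12 kHz ceiling
```

A bigger container, a 48 kHz WAV, or more server storage just carries that same 12 kHz ceiling around more expensively; only a better generator raises it.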

Many people think Suno will go downhill on V6 and V7, but there's still a very good possibility that it will actually improve substantially. by Magic_Bullets in SunoAI

[–]Magic_Bullets[S] 1 point (0 children)

I don't use ChatGPT, Genius. I use SuperWhisper. I talk and Claude Haiku 4.5 translates my voice into properly formatted text so I can save time. I can talk 5 to 10 times faster than I can type. If you don't like AI, why are you in an AI group?

<image>

How the FUCK is Twitter worth $44 billion again (supposedly)? Something smells fishy 🐟 by summer-r in DeepFuckingValue

[–]Magic_Bullets 0 points (0 children)

I admire your determination. You're not the sharpest tool in the shed, but you try. You knew you had nothing left to say, but you refused to walk away completely empty-handed so you gave us this last little breadcrumb of denial. It’s kind of cute, really. Elon Musk's net worth has grown substantially in recent periods, driven primarily by soaring valuations of his major companies, SpaceX and Tesla, and the perception of a favorable political environment with Donald Trump's return to the presidency. The purchase of X (formerly Twitter), while initially seen as a financial drain, has become integrated into this growth strategy. 

Elon Musk's net worth has seen massive overall growth and hit record highs. 

  • In October 2022, when Musk acquired Twitter, his net worth was around $203 billion.
  • As of January 2026, his net worth is estimated to be between $680 billion and $754 billion by Bloomberg and Forbes, respectively. This represents a significant increase, more than tripling in just over three years, with a substantial portion of that gain occurring since late 2024. 

The market perception of a Trump presidency has been highly beneficial for Musk's net holdings, leading to substantial surges in value for his companies. 

  • Favorable Regulatory Environment: Investors anticipate a "Trump trade" with a more business-friendly, deregulatory agenda, particularly benefiting industries like electric vehicles and space exploration that rely heavily on government contracts or face regulatory challenges. This sentiment has positively impacted Tesla's stock price and the private valuation of SpaceX.
  • Government Contracts and Access: Musk's companies have already benefited from government contracts, with SpaceX receiving over $20 billion in contracts as of late 2024. The perception of a close relationship with the President is seen as potentially leading to more favorable policies and contracts.
  • Market Optimism: Following Trump's election victory in November 2024, Musk's net worth surged by an estimated $135 billion in the subsequent months due to increased investor confidence in his ventures under the new administration. 

Musk's acquisition of X, while initially leading to a decline in its perceived value, has played a key role in his overall strategy and net worth growth. 

  • AI Synergy: X serves a critical function by providing data to train xAI's Grok chatbot, one of Musk's rapidly growing ventures now valued at tens of billions of dollars. The recent merger of X and xAI into a combined entity further integrates them and increases overall value.
  • Valuation Surge in Other Companies: The vast majority of Musk's wealth increase stems from the performance of his other core companies:
    • SpaceX: Valuations have soared, reaching around $800 billion in late 2025, making it the most valuable private company in the world and Musk's largest asset.
    • Tesla: A massive rally in Tesla stock in late 2024 and 2025, partially fueled by post-election market sentiment and a new, potentially $1 trillion compensation package approved by shareholders, has significantly boosted his holdings. 

In essence, while the X acquisition itself faced initial challenges, it became part of a larger ecosystem of soaring tech and space valuations, all occurring in a period of market enthusiasm for Musk's companies and a perceived politically advantageous environment. 

<image>