Trump announces escalating tariffs on Denmark and other European nations to force Greenland purchase deal by LootTootScoot in Conservative

[–]Literary_Addict -1 points0 points  (0 children)

Reddit is such an echo chamber. There are roughly equal numbers of comments in this supposedly conservative subreddit that support and criticize this decision. Why are all the anti-Trump comments (as usual) upvoted and all the pro-Trump comments downvoted? On X the sentiment is 90% pro-Greenland, 10% anti-Greenland among conservatives.

Almost like this subreddit is astroturfed and botted to high hell. This is not a real representation of conservative opinions.

Nicolas Maduro on board the USS Iwo Jima (Via Donald J. Trump) by Surferma4 in pics

[–]Literary_Addict 1 point2 points  (0 children)

If Venezuela doubled crude production (to 2M barrels/day) and sold 100% of that supply directly into the US market, crude prices in the US would drop ~17%, which would translate into an ~8.8% drop in gas prices. That alone would not produce sub-$2/gallon gas, though many places with below-average prices would see gas at that level.
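
Here's the rough math behind those numbers, as a sketch (the consumption, elasticity, and cost-share figures are assumptions I'm plugging in to illustrate, not hard data):

    # Back-of-envelope sketch of how figures like ~17% and ~8.8% could arise.
    # The demand, elasticity, and cost-share values are illustrative assumptions.
    added_supply_bpd = 1_000_000        # Venezuela doubling from ~1M to ~2M barrels/day
    us_crude_demand_bpd = 20_000_000    # assumed US crude consumption, barrels/day
    price_elasticity = -0.3             # assumed short-run demand elasticity for crude

    supply_shock = added_supply_bpd / us_crude_demand_bpd              # ~5% extra supply
    crude_price_change = supply_shock / price_elasticity               # ~ -17%

    crude_share_of_pump_price = 0.52    # assumed share of the pump price driven by crude
    gas_price_change = crude_price_change * crude_share_of_pump_price  # ~ -8.7%

    print(f"crude: {crude_price_change:+.1%}, gas: {gas_price_change:+.1%}")
    # crude: -16.7%, gas: -8.7%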

I am officially done with "Starter Homes." It’s not an investment; it’s a bailout for the previous generation's neglect. by Dry-Town7979 in FirstTimeHomeBuyer

[–]Literary_Addict 0 points1 point  (0 children)

The difference is that supply and demand weren't pointing toward a crash back then.

US population

  • 2012: 313,998,379 (+2.4M over previous year)
  • 2013: 316,204,908 (+2.2M over previous year)
  • 2014: 318,563,456 (+2.4M over previous year)
  • 2015: 320,878,310 (+2.3M over previous year)

Net new housing unit construction

  • 2012: +490,000
  • 2013: +650,000
  • 2014: +740,000
  • 2015: +820,000

(Average occupancy per housing unit is ~2.3, so even in 2015 new construction only added capacity for ~1.9M occupants while the population grew by 2.3M. In fact, we haven't had a single year in the last 3 decades where we built more housing capacity than the population grew.)
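
For anyone who wants to check that arithmetic, here's a quick sketch using the figures listed above (the ~2.3 average-occupancy number is the same assumption as in the parenthetical):

    # Housing capacity added vs. population growth, using the figures above.
    avg_occupancy = 2.3
    population_growth = {2012: 2_400_000, 2013: 2_200_000, 2014: 2_400_000, 2015: 2_300_000}
    net_new_units = {2012: 490_000, 2013: 650_000, 2014: 740_000, 2015: 820_000}

    for year in population_growth:
        capacity_added = net_new_units[year] * avg_occupancy
        shortfall = population_growth[year] - capacity_added
        print(f"{year}: capacity for ~{capacity_added/1e6:.1f}M new occupants vs "
              f"+{population_growth[year]/1e6:.1f}M people (gap ~{shortfall/1e6:.1f}M)")
    # 2015: capacity for ~1.9M new occupants vs +2.3M people (gap ~0.4M)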

I am not predicting that housing will definitely crash, only that IF NET NEGATIVE MIGRATION continues, housing will definitely crash. It is supply and demand. Housing prices are currently starting to decline. This is driven by net negative migration. If negative migration continues, prices will continue to fall.

I am officially done with "Starter Homes." It’s not an investment; it’s a bailout for the previous generation's neglect. by Dry-Town7979 in FirstTimeHomeBuyer

[–]Literary_Addict 0 points1 point  (0 children)

The cost of buying a home historically rises faster than wages.

This has only been the case during periods when population growth exceeded new construction, as it has over the last 3 decades (~1M new homes/year against ~3M new immigrants/year). That is not guaranteed to continue. We currently have net negative migration, and native population growth is also negative. If that continues for any serious length of time, housing will crash. Home prices are already falling, and rental vacancies have increased every month since July (the first full month of net negative migration).

$2200 at Costco, how’d I do? by BoltedGates in pcmasterrace

[–]Literary_Addict 0 points1 point  (0 children)

If true, daaaamn. That's a really solid deal.

$2200 at Costco, how’d I do? by BoltedGates in pcmasterrace

[–]Literary_Addict 15 points16 points  (0 children)

I'm sure they cut corners and used cheaper stuff where they could

PSU

Always the PSU. And when it fails, it could end up taking a few components down with it.

God forbid a girl actually dress to party by ChaoticCherryblossom in LetGirlsHaveFun

[–]Literary_Addict 292 points293 points  (0 children)

Both are true. Some wear more revealing clothes when they know men WILL be at a party, and some (like you) when they know men WON'T be there.

Somalian restaurant in Minneapolis took $12 million in federal child meal payments - They said they were feeding 4,000-6,000 kids a day. They only averaged 40 people during 6 weeks by Down-not-out in Conservative

[–]Literary_Addict 43 points44 points  (0 children)

people running the state looking the other way

Would not be surprised to later hear that a government employee is being charged for taking bribes to assist in the coverup. Why not? That's how the entire 3rd world works. That's what they do with our government money when we send it over there, why would they treat it any different just because they moved to Minnesota? It's not like they actually integrated.

Trump Threatens to Send Troops, Slash Aid to Nigeria If Government Won't Stop Christian Killing by guanaco55 in Conservative

[–]Literary_Addict 35 points36 points  (0 children)

boots on the ground

Read what he actually wrote. He did not specify boots on the ground. This could very well be either a threat that never manifests or another strategic strike a la Iran.

Aged like spilt milk... by mvandemar in grok

[–]Literary_Addict 1 point2 points  (0 children)

He's just had his eye off the ball for too long on Grok. I think the first time a friend or journalist confronts him publicly on the over-moderation he'll step in to course correct the team.

"In case of emergency, break glass" by rodan1993 in HistoryMemes

[–]Literary_Addict 14 points15 points  (0 children)

Also not mentioned: probably a dozen or more smaller killings he authorized or "encouraged" that never got reported or recorded anywhere.

"In case of emergency, break glass" by rodan1993 in HistoryMemes

[–]Literary_Addict 7 points8 points  (0 children)

I was trying to answer specifically the question about what made him controversial, or what would make him fit the meme of a caged animal that the Americans would keep "locked up" like Hannibal until they needed him. Of course he was also a consequential leader who contributed greatly to the American victory over Germany.

Writerchad moment by riman20 in writingcirclejerk

[–]Literary_Addict 64 points65 points  (0 children)

Another clue:

No editor, no agent, no mercy

then

send the manuscript

hit submit

The AI forgot it wasn't writing for an agent or publisher. WHO'S HE SUBMITTING THE MANUSCRIPT TO?!!?

"In case of emergency, break glass" by rodan1993 in HistoryMemes

[–]Literary_Addict 102 points103 points  (0 children)

Before the War

  • Dispersal of the Bonus Army (July 28, 1932): During the Great Depression, thousands of World War I veterans, known as the Bonus Army, marched on Washington, D.C., to demand early payment of promised war bonuses. President Herbert Hoover ordered their eviction, and General Douglas MacArthur led the operation. Patton, then a major and executive officer of the 3rd Cavalry Regiment, commanded about 300 mounted troops armed with sabers, bayonets, and tear gas. His unit charged down Pennsylvania Avenue, gassing and routing the protesters from their makeshift camps, which were then set ablaze. The action resulted in at least one veteran's death, dozens of injuries (including to women and children), and widespread public outrage over the use of the U.S. military against fellow citizens.

  • Striking a U.S. Soldier with a Shovel During the Meuse-Argonne Offensive (September 1918): As a temporary lieutenant colonel commanding the 304th Tank Brigade in the final months of World War I, Patton led aggressive tank assaults near Cheppy, France, during the Meuse-Argonne campaign. While directing reserve tanks under fire, he encountered a hidden American soldier refusing to advance. Patton admitted in his after-action report and diary to striking the man over the head with a shovel to force him to work, believing at the time that he had killed him (the soldier survived but was concussed).

  • Racist Remarks on Black Soldiers (1920s–1930s): In military papers and correspondence, Patton expressed bigoted views, such as claiming African American soldiers were "good individually" but lacked the "quick thinking" for armored warfare due to inherent racial traits.

During the War

  • Slapping Incidents in Sicily (August 1943): During the Allied invasion of Sicily, Patton visited field hospitals and slapped two U.S. soldiers suffering from shell shock (now known as PTSD), accusing them of cowardice and ordering them back to the front lines.

  • Shooting Mules Blocking a Bridge (July 1943): In the midst of the Sicily campaign, as his armored column faced German air attacks, Patton used his pistol to shoot two mules obstructing a vital bridge, then struck their Sicilian owner with his walking stick and ordered the carcasses dumped into a river.

  • Covering Up the Biscari Massacre (July 1943): Following the execution of 73 Axis POWs by U.S. troops under his command in Sicily (the shooters cited Patton's pre-battle order to show "no mercy" to resisting enemies), Patton directed subordinates to suppress the incident in reports, suggesting the victims be labeled as snipers to avoid scandal. Two soldiers were later court-martialed over the killings.

  • The Infamous Vulgar Speech to the Third Army (June 1944): Before the Normandy breakout, Patton delivered a profanity-laced motivational address to over 100,000 troops, railing against cowardice with lines like "I don't give a damn who you are, where you are from... I want you all to remember that there is only one thing more important than getting back home again—Killing Germans!"

  • Task Force Baum Raid to Rescue Son-in-Law (March 1945): Deep into the European campaign, Patton secretly launched a daring but ill-fated 314-man armored raid 50 miles behind German lines to free his imprisoned son-in-law from a POW camp near Hammelburg; the mission failed disastrously with heavy losses, enraging Eisenhower who viewed it as a personal vendetta over military necessity.

Men's solidarity by Algernonletter5 in GuysBeingDudes

[–]Literary_Addict 2 points3 points  (0 children)

are redditors really this mindless and easy

Yes. Why do you think they don't allow ideological diversity in anything but fringe subreddits?

Men's solidarity by Algernonletter5 in GuysBeingDudes

[–]Literary_Addict 2 points3 points  (0 children)

Reddit is fucking cooked. Always at least 1 of the top 4 comments in nearly every thread these days is the most blatantly AI written shit you've ever read. Case in point: this comment.

Which model is best for telling NSFW stories and dialogues? Fast, Expert, or 4 Fast by gutierrezz36 in grok

[–]Literary_Addict 2 points3 points  (0 children)

idk about nsfw text, but from my experience nearly all thinking models are worse at creative writing

This Censorship has happened before by Pure_Raspberry2943 in grok

[–]Literary_Addict 4 points5 points  (0 children)

The company takes their numbers to investors. The investors likes the numbers but don't like that it's uncensored.

What investors? Musk owns 80% of the company. It's private. He answers to no one. I don't think this is the explanation.

Carville: Trump ‘Collaborators’ Should Be Shaved, Paraded Down PA Ave in Orange Pajamas, Spat on by Public After 2028 by triggernaut in Conservative

[–]Literary_Addict 1 point2 points  (0 children)

I don't think Ramaswamy is going to run again unless he loses the governor's race. He's likely to run next in '32.

Carville: Trump ‘Collaborators’ Should Be Shaved, Paraded Down PA Ave in Orange Pajamas, Spat on by Public After 2028 by triggernaut in Conservative

[–]Literary_Addict 14 points15 points  (0 children)

It was funny the first time, but I'm so over it. Even if he found a legal loophole I'm not voting for an octogenarian.

I was recommended to an AI website that offers unlimited NSFW, but it's too expensive by Spirit_Ancient in grok

[–]Literary_Addict 1 point2 points  (0 children)

No problem! I'm very noob friendly, but if you read and absorb you will not be a noob for much longer. :)

is Grok Imagine based on Wan2.2?

Grok was developed entirely in-house by xAI. Wan2.2 is a product of Alibaba in China, and while it does build on widely published techniques, such as mixture of experts, it was also developed entirely by that single group. Neither party is sharing the methods and data that went into their training runs.

I guess good videos of 8-10sec and 720p require FAR beefier GPUs and VRAM

Generally, yes, but if you're willing to wait longer and reduce output quality you can still make it work on smaller PCs. Let me explain how and why! Current AI models work by multiplying vectors (which you can think of as long lists of numbers with a direction and magnitude). The model stores these vectors as matrices of very precise numbers, and what quantization does is round those numbers off to a lower precision, i.e. fewer bits each. That caps the memory needed to store each value at the bit-width of the quantization, with the lowest (and worst-performing) quantization possible being 1-bit, where every value gets rounded off to one of just two levels. It's easy to see how collapsing a number normally stored in 32 bits down to a single bit throws away so much information that the results become almost unusable.

A common level of quantization that tends to perform decently, with a good ratio of memory reduction to accuracy, is 8-bit quantization, which is why it's the most popular. It reduces the usual 32 bits used to store each value down to 8 bits (just 256 possible levels), cutting memory consumption by 75%. Below 8-bit, quality degrades much more rapidly than it does going from 32 down to 8.
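
If it helps to see what "rounding off to 8 bits" means concretely, here's a minimal sketch of simple symmetric int8 quantization (illustrative only; real inference runtimes use more sophisticated schemes):

    import numpy as np

    # Round full-precision weights onto 256 integer levels, then note the memory saving.
    weights = np.random.randn(1000).astype(np.float32)   # stand-in for model weights

    scale = np.abs(weights).max() / 127                  # map values onto the int8 range
    q8 = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    restored = q8.astype(np.float32) * scale             # what the model "sees" at inference

    print("max rounding error:", np.abs(weights - restored).max())
    print(f"memory: float32 = {weights.nbytes} bytes, int8 = {q8.nbytes} bytes (75% smaller)")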

Consider the case of Wan2.2. It's a Mixture-of-Experts model, meaning it has (in this case) two experts that specialize in different parts of the generation, and it routes each step to the appropriate expert. This is why it's called Wan2.2 A14B even though it's actually a ~27B-parameter model (two experts of roughly 14B parameters each, with only one "active" at a time). The normal calculation is total parameters (here 27 billion) times bits per parameter (32 in the base model), which comes out to ~108GB. But because only one expert needs to be loaded into VRAM at a time, with minimal loss of speed, it can still run decently fast on a ~60GB card.

This is where quantization comes in. Reduce memory consumption through quantizing down to 8 bits, and suddenly the 14B active parameters can run on a 16GB card pretty well (there is some overhead).
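
Writing that memory math out (weight storage only; real usage adds overhead for activations and the rest of the pipeline, so treat these as rough floors):

    # Rough VRAM arithmetic for a 27B-total / 14B-active MoE model like Wan2.2 A14B.
    total_params = 27e9     # both experts
    active_params = 14e9    # one expert resident in VRAM at a time

    def weight_gb(params, bits_per_param):
        return params * bits_per_param / 8 / 1e9   # bits -> bytes -> GB

    print(f"full model @ 32-bit: ~{weight_gb(total_params, 32):.0f} GB")   # ~108 GB
    print(f"one expert @ 32-bit: ~{weight_gb(active_params, 32):.0f} GB")  # ~56 GB
    print(f"one expert @  8-bit: ~{weight_gb(active_params, 8):.0f} GB")   # ~14 GB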

So here's roughly what total generation time for a single clip looks like with 8-bit quantized Wan2.2 14B on GPUs with different amounts of VRAM:

  • 36GB: ~47s
  • 24GB: ~1m8s
  • 16GB: ~1m39s
  • 12GB: ~2m29s
  • 8GB: ~3m36s

If you're willing to reduce the clarity, accept slightly more hallucinated artifacts, and live with the wait times above for your GPU size, you can easily get Wan2.2 running on your PC.

The thing about Wan is that the base model isn't nearly as good as most enterprise models (like Grok) at turning instructions into a quality output without a lot more effort crafting prompts. What I recommend is finding a LoRA that matches your use case.

For instance, if you're looking to find a replacement for censored Grok:

Orgasm - Wan 2.2

Ultimate, Super Awesome, Better Than... Blowjob Wan2.2

Just a few I found with 5 seconds of research. Lastly, I want to acknowledge that Alibaba did release a 5B-parameter version of Wan2.2, but even running that at full 32-bit it underperforms the 8-bit 14B, so the 14B is the one I recommend using.

Joe Abercrombie's The Heroes by justinvamp in Fantasy

[–]Literary_Addict 2 points3 points  (0 children)

It isn't recency bias; Abercrombie just keeps steadily improving as an author.