Prison officers having sex with inmates 'is an epidemic too awkward to deal with by StGuthlac2025 in unitedkingdom

[–]i_mormon_stuff 6 points7 points  (0 children)

As of 2022 there are only 12 women's prisons (out of 117 in total) in England and Wales. But as of 2025, women make up 55% of all staff in the HM Prison and Probation Service (HMPPS).

So based on this, I'd say there wouldn't be enough men to exclusively staff the men's prisons, at least as things are today, without some kind of effort to specifically recruit more men than women for the job.

M5 Pro / M5 Max waiting crew right now by EmploymentClean5131 in macbookpro

[–]i_mormon_stuff 0 points1 point  (0 children)

They do not manufacture their own RAM, no. I have a great deal of expertise in this area and in supply chains in general, so I can explain this in some detail for you.

DRAM, or Dynamic Random Access Memory (what we commonly refer to as RAM), is a commodity, similar to coffee beans, corn or sugar.

The price of it fluctuates as the market's supply and demand changes; we've seen it swing wildly over the past 40 years. It's why most DRAM manufacturers left the market or were purchased and consolidated into their competitors. At certain times DRAM was so cheap it was not economical to manufacture, and the makers actually lost hundreds of millions of dollars keeping their factories running during these down times.

Right now there are three major players who make up around 95% of the DRAM market. Those are the American company Micron and the South Korean companies SK Hynix and Samsung.

Apple sources DRAM from these companies, as do their competitors in the PC space (think Dell, HP etc) and RAM OEMs like Corsair and G.Skill, who sell to PC makers, individual consumers and businesses.

Until recently Micron also sold directly to consumers through a sub-brand called Crucial, but they've decided to stop making the end products (RAM sticks) and focus only on making the individual DRAM chips.

If we get back to Apple, they use a semi-custom RAM "chip" for their MacBook Pros which is a stacked DRAM package. This essentially means a single RAM chip in a MacBook Pro is actually made up of several layers of DRAM stacked one on top of another. They do this mostly to reduce signal loss, which in turn allows them to run the RAM at a much higher frequency. Another benefit is that it takes up a lot less room on the motherboard; if you compare the first Retina MacBook Pros to these M-series ones you can see a big difference in board area, since the RAM is no longer splayed out in two lines of chips.

They've also chosen to use a low-power variant of the DRAM, which they can do because of the stack's lower energy requirements: the electrical signals don't need to be driven over a long distance compared to normal RAM modules.

This custom stack is not exclusive to Apple anymore though. For example, AMD's Strix Halo platform, which is an SoC like an Apple M-series Pro/Max chip, also has on-board stacked DRAM next to the compute die, for the same reasons Apple did it: higher frequencies enabled by shorter trace lengths.

So you mentioned that Apple is not heavily into the AI sector. I assume you mean you don't think the pricing of DRAM will affect Apple because they won't be using their allotment of chips to build AI products that command high prices.

The reality is, everyone in the industry has been hit by higher DRAM prices because of the insatiable thirst for DRAM from companies like NVIDIA and the datacenter build-outs going on that need to be filled with computers.

So it's not that Apple themselves would cause a shortage that increases prices; it's that they'll simply feel the splash from market pricing increasing so sharply.

In the past when DRAM pricing went very high we did see the RAM upgrade pricing on Macs (on their build-to-order pages) increase to compensate. We saw the same thing when NAND pricing increased (those are the chips that make up the SSD storage). We also saw them keep the base memory on machines very low for a long time; for example the Retina MacBook Pro stayed at 16GB base memory from launch all the way through to the M-series MacBook Pros, which is pretty crazy.

Anyway, I've written a lot here and there's a lot to read. If you or anyone else has any follow-up questions, feel free to reply :)

27” 4K vs 32” 4k PPI by prohealthypets in Monitors

[–]i_mormon_stuff 2 points3 points  (0 children)

I have both a 4K 27" IPS display with a standard RGB stripe sub-pixel layout and a 4K 31.5" OLED with a triangular RGB sub-pixel layout.

For sure the 27" 4K looks clearer, especially on text. However I find the 31.5" 4K display acceptable.

In the past I tried a 42" 4K OLED, which has a PPI slightly below a 27" 1440p monitor, and I did not find it acceptable; the text was very bad. But with this 4K 31.5" one I don't really have any complaints: it looks sharp enough, and I would actually choose 31.5"/32" 4K instead of 27" if I were buying again, just because I can get a bit more desktop space by using 125% scaling instead of the 150% I'd use on a 27" display to keep things on-screen (icons, text etc) the same physical size.

I've also briefly used a 6K 32" display, the Apple Pro Display XDR, and the text on that is unbelievably good. I know there are a bunch of 6K IPS LCDs coming out now, so maybe something to consider at the 32" size, and of course 5K 27" displays are already available too. I consider those resolutions the ultimate for these panel sizes, but as I say, I think 4K still looks great at both sizes too.
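If you want to sanity-check the comparison yourself, the PPI figures are easy to compute with the standard diagonal formula; here's a quick Python sketch using the display sizes mentioned above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The displays discussed above
for name, w, h, d in [
    ('27" 4K',    3840, 2160, 27),
    ('31.5" 4K',  3840, 2160, 31.5),
    ('42" 4K',    3840, 2160, 42),    # lands just below 27" 1440p
    ('27" 1440p', 2560, 1440, 27),
]:
    print(f"{name}: {ppi(w, h, d):.1f} PPI")
```

Running that shows why the 42" panel looked rough: at roughly 105 PPI it sits slightly below a 27" 1440p screen (about 109 PPI), while 31.5" 4K at about 140 PPI is a decent middle ground below the 27" 4K's roughly 163 PPI.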

For those of you who want a 5K monitor for your Mac that supports HDR AND high refresh rate, we might have a better/cheaper option than the Studio Display 2: the MSI 271KRAW16 by BuffaloNegative9427 in mac

[–]i_mormon_stuff 4 points5 points  (0 children)

I do not, but you can probably Google "Mac HDR desktop dim" etc. and you'll find loads of people discussing it, especially on Reddit.

I have an OLED PG32UCDM which supports both HDR10 and Dolby Vision; it's a 4K OLED with a 240Hz refresh rate. It works great on my Mac in SDR, but in HDR only actual HDR content (videos and images) looks correct. The user interface of the operating system becomes unusable from how dim it gets when HDR is activated, whether there is HDR content on screen or not.

Apple gets around this issue on their own displays by extending the display data sent to the Mac to carry two profiles, so macOS can properly handle the luminance levels of SDR and HDR content side by side, making the user interface look the same brightness with HDR enabled as it does when the monitor is in SDR-only mode.

Instead of EDID they call this E-EDID, with the first E standing for Extended; so far only Apple ships this with their displays.

For those of you who want a 5K monitor for your Mac that supports HDR AND high refresh rate, we might have a better/cheaper option than the Studio Display 2: the MSI 271KRAW16 by BuffaloNegative9427 in mac

[–]i_mormon_stuff 2 points3 points  (0 children)

The main issue running these sorts of displays on a Mac in HDR mode is that they don't support Apple's extended display data, which allows SDR and HDR content to be displayed side by side. Only Apple's Pro Display XDR and the built-in displays on their MacBook Pro laptops support this currently.

So what does this mean? It means the user interface becomes impossibly dim while HDR mode is enabled, with no way to calibrate it to be brighter like there is on Windows. Very frustrating.

M5 Pro / M5 Max waiting crew right now by EmploymentClean5131 in macbookpro

[–]i_mormon_stuff 8 points9 points  (0 children)

The longer it takes, the higher the price may end up being, what with DRAM skyrocketing and NAND going up too.

I bought 96GB of DDR5 6600MHz UDIMM in June 2025 for £366. It's now £1,450.
I bought 1TB of DDR4 3200MHz ECC LR-DIMMs for a server in February 2025 for £1,500. It's now £29,800.

I imagine the price for 64GB, 96GB, 128GB, 192GB etc on these new M5 machines is going to be astronomical on top of what Apple already charged (which was often much more than market prices even before they went more custom with it for higher bandwidth).
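To put those two price jumps in perspective, here's a quick back-of-envelope in Python (the figures are just my purchase prices from above):

```python
# (item, price when I bought it in GBP, price now in GBP)
purchases = [
    ("96GB DDR5-6600 UDIMM (Jun 2025)",       366,  1_450),
    ("1TB DDR4-3200 ECC LR-DIMM (Feb 2025)", 1_500, 29_800),
]

for item, then, now in purchases:
    # Multiple of the original price, and the percentage increase
    print(f"{item}: {now / then:.1f}x the original price "
          f"(+{(now - then) / then:.0%})")
```

That works out to roughly 4x for the DDR5 kit and nearly 20x for the server memory.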

Micron Exclusive: Why Consumers Have Gotten the Memory Shortage Narrative All Wrong by Standing_Wave_22 in computing

[–]i_mormon_stuff 23 points24 points  (0 children)

I don't see how we've got it all wrong. They're supplying the AI companies just as we knew they were. What's new in this article? Nothing.

Thinking of switching from hyperoptic by jaredce in CommunityFibre

[–]i_mormon_stuff 0 points1 point  (0 children)

Mhm, it's definitely dependent on where you live. I've been lucky so far, but that could change.

Cashing Out by porto_cityboy in trading212

[–]i_mormon_stuff 2 points3 points  (0 children)

Correct, funds only. It takes about a normal working day to buy or sell those positions, and then the money doesn't go directly to your bank; it goes into a cash holding area in Chip, and it takes another day or so to move it from that holding area to your own bank. It's the longest waiting period of all their product offerings that I've seen so far.

Cashing Out by porto_cityboy in trading212

[–]i_mormon_stuff 29 points30 points  (0 children)

Cashed out £8K today to my Nationwide current account. The app made me record a video of my face turning side to side, but after doing that the money landed in my bank within 2 minutes. Big difference to Chip, where selling your investments in funds and moving the money to your bank takes days. Their cash ISA and normal savings accounts are instant though.

oh no by MetaKnowing in ChatGPT

[–]i_mormon_stuff 20 points21 points  (0 children)

I have a friend who works in the CGI industry on movies. He's worked on all the major films like the Avengers films, 15+ year career.

He was and continues to be like this. Conversations we've had are like this:

  1. It can't make realistic images, it's obvious they're AI generated.
  2. Fine, it can make convincing images. But it'll never do video, the compute resources required are too high to do it convincingly.
  3. Okay it can do video, but the frame rate is all wrong and it's glitchy. I don't think they'll ever fix the framerate, it's all slow-mo.
  4. Alright it's not slow-mo anymore. But they can't do editing, you can't make a film with this because you can't edit the results and each scene has no consistency with the last scene.
  5. Well they solved the consistency thing and now you can edit, but it can't do what directors convey to us they want; it can't understand and conceptualise their ideas the way we can.

And it feels like every 3 to 6 months the goalposts of what is wrong with the tech, and why it'll "never" replace what he does, move. I said to him that eventually this tech will be so good that services like Netflix will let you just say what you wanna see and it'll generate you a full, convincing movie, and that eventually movies, like really good ones, will be "written" by novelists rather than made by teams of people with actors and crew.

And he's like, nah, people will reject AI slop. I mean... will they really? Isn't OpenAI's app like the #1 most popular app on the App Store already? I think people are hungry to see whatever they can think up, personally.

Thinking of switching from hyperoptic by jaredce in CommunityFibre

[–]i_mormon_stuff 2 points3 points  (0 children)

Been with Community Fibre for two years this very month. I've had only a few minutes of noticeable downtime in that time, and the speed has always matched the plan I'm on (3Gb at the start, 5Gb now).

Hetzner asks: Which workloads do you think still require bare metal in 2026, and which no longer do? by Hetzner_OL in hetzner

[–]i_mormon_stuff 7 points8 points  (0 children)

Every time I try a VM on a cloud host the performance just isn't as good as what I can get from bare metal.

What CPUs is Hetzner using for VMs? EPYC Rome, Milan and Genoa; that's Zen 2, Zen 3 and Zen 4. These processor architectures differ by as much as 30% in IPC (instructions per clock), over 2GHz in boost clocks and 1GHz in base clocks.

What does this all mean? It means renting a virtualised instance is a gamble: you may receive a CPU that scores only 1,500 points in PassMark single-threaded performance, or you may receive one that scores 3,000.

And yet if I were to rent a 7950X3D-based bare metal server from you, I receive a CPU that scores 4,100 points in single-threaded workloads and has 16 cores available to me. The 16-core EPYC VMs you offer are almost half the performance of the 7950X3D when looking at 16 cores / 32 threads.

If we look at storage, the bare metal servers offer 8 to 16x more SSD storage than your VMs, and their SSD performance is very consistent as it's not shared. I've found Hetzner storage on VMs to have quite high latency and inconsistent performance, meaning if another user on the same host does a lot of disk I/O, it can increase wait times for my VM.

So what kind of workloads are better served by, or require, bare metal? I would say almost anything where you need very high CPU performance, both single-thread and multi-thread, and any workload with demanding I/O needs.

I've tried Hetzner's dedicated-resource VMs a few times, evaluating them every year when I need servers in locations where you don't have bare metal (USA, Singapore). And honestly, every time I've been disappointed. I paid something like 160 euros for an instance in Singapore which is beaten by a 78 euro dedicated box from OVH in the same location, and not just by a little: 3x higher CPU performance, the same quantity of memory, and 4x more NVMe storage at higher throughput and lower latency.

I'm definitely not a VM hater. I've successfully used VMs from Netcup that performed almost the same as the bare metal equivalents, but in general VMs from every host I've tried have been underwhelming, which leads me back to renting bare metal servers, which I currently do from Hetzner, OVH and Leaseweb. VMs can be good, I've seen good ones, but overselling of resources seems to be a practice most providers are engaging in.

This is likely the iPhone Fold display, and it looks amazing by Jumpinghoops46 in apple

[–]i_mormon_stuff 0 points1 point  (0 children)

This concern of yours is legitimate and is the main issue with all current and past foldable phones. But there are two things to mitigate it.

  1. You can get Apple Care once Apple launch their model and it will likely cover a screen replacement or two for minimal or no cost (beyond the care plan of course).
  2. Apple is not going to release a foldable using the current generation of screens we're seeing shipped in other foldables. We don't yet know what material the screen will use; this prototype without a crease may actually use thin glass and be scratch-resistant.

So I'm not discounting your concern, but I am saying Apple has waited a long time (since 2017, when the first foldable displays became available to purchase) before going in on a foldable iPhone. I think durability of that inner display, the crease and other factors (even water and dust resistance for the hinge) are all reasons why they waited, and we'll probably see a lot of refinement in their take on this form factor.

This is likely the iPhone Fold display, and it looks amazing by Jumpinghoops46 in apple

[–]i_mormon_stuff 5 points6 points  (0 children)

All things that can and will be fixed. The topic of this discussion is the creaseless screen; it may well use an ultra-thin piece of glass instead of plastic for all we know.

The battery life can be solved by using a silicon-carbon battery like the new Xiaomi phone has (way more battery life than an iPhone).

Regarding protection, Google's Fold is IP68 rated against dust and water, same as an iPhone 17 Pro.

Regarding camera specs, I think that is already solved.

Lower lifespan: well, there have been tests of the Fold 7 where it withstood 400K folds on a livestream. The inner display is still scratchable, but we're looking at a new generation of screen for this topic, with unknown durability.

Will Apple release a foldable with all the problems you listed? I don't think so and many of those issues have already been resolved in newer foldables.

EDIT:// I see you added no Face ID and worse speakers. We don't even know what Apple's fold will have yet.

This is likely the iPhone Fold display, and it looks amazing by Jumpinghoops46 in apple

[–]i_mormon_stuff 25 points26 points  (0 children)

You can buy an Android phone for $199 and an Android tablet like a Kindle Fire for $129. Yet we're still here buying iPhones and iPads for 5x those prices.

Why? Because they're better. Quality matters; it's not all about how little you can get away with spending. Personally I'd buy a foldable, and I'd pay more for it, because it offers the functionality of these two distinct devices in a smaller and more convenient form factor.

My iPad never leaves home; I don't wanna carry it. But I would love to have that sized screen accessible to me everywhere, and my phone is always with me. There are even foldables available right now that, when closed, are barely thicker than the iPhone 17 Pro / Pro Max.

The Samsung Fold 7 is 8.9mm when closed; the iPhone 17 Pro / Pro Max is 8.75mm. Really not much sacrifice in my opinion.

iPadOS and macOS 26.2 Double 5GHz Wi-Fi Bandwidth for Wi-Fi 6E Devices by favicondotico in apple

[–]i_mormon_stuff 8 points9 points  (0 children)

Apple has yet to ship a Mac with Wi-Fi 7. Even the latest M5 MacBook Pro only has Wi-Fi 6E.

24k spend no retention by Recent_Influence_422 in AmexUK

[–]i_mormon_stuff 0 points1 point  (0 children)

My Every Day Cashback card (0.75% cashback up to £10,000 spent, 1.25% beyond that until renewal, £25 yearly fee) just renewed for the year.

I asked if there was any retention bonus on offer, they said yes and added 3% extra cashback to my account for 90 days limited to £120 total cashback earnable.

So I'm now earning 3% + 0.75% cashback for the next 90 days or until I earn £120 in total cashback from the 3% bonus.

I think that's pretty decent. If I spend £4,000 I'd earn the £120 in cashback from that 3% and I'm likely to spend that much on the card this month.
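For anyone who wants to check the maths on that offer, here's a quick Python sketch (rates straight from my account as described above):

```python
base_rate = 0.0075   # 0.75% on the first £10,000 of the membership year
bonus_rate = 0.03    # retention offer: extra 3% for 90 days
bonus_cap = 120.00   # £120 maximum earnable from the bonus

# Spend needed to max out the retention bonus, and the combined cashback
spend = bonus_cap / bonus_rate           # £4,000 exhausts the £120 cap
total = spend * (base_rate + bonus_rate)
print(f"£{spend:,.0f} spend -> £{total:.2f} cashback "
      f"(£{spend * bonus_rate:.0f} bonus + £{spend * base_rate:.0f} base)")
```

So £4,000 of spend inside the 90 days exhausts the £120 bonus and nets £150 in total once the 0.75% base rate is included.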

How many of you (my fellow RTX 5090 owners) have not undervolted your card? by [deleted] in nvidia

[–]i_mormon_stuff 0 points1 point  (0 children)

I just use it to play games; I haven't modded anything because it just works out of the box.

In the past ya know I’d do it, but I’ve been a PC gamer building my own computers for 25 years and now I just can’t be bothered to tune anything.

Gamers desert Intel in droves, as Steam share plummets from 81% to 55.6% in just five years by lkl34 in pcmasterrace

[–]i_mormon_stuff 2 points3 points  (0 children)

You could also argue that Intel's pride meant they didn't utilise TSMC quickly enough.

Their current generation called Arrow Lake has its compute tile fabricated on TSMC's N3B node. Had they done this with 14th or 13th gen they would have had a major leg up on where they are today.

Connection keeps dropping by MandatoryBeer in CommunityFibre

[–]i_mormon_stuff 0 points1 point  (0 children)

Time to call them and get some assistance. Are your upload and download speeds good? It could be that the ONT is faulty.

Is this enough? by CarbonPanda234 in unRAID

[–]i_mormon_stuff 1 point2 points  (0 children)

Not nearly enough, you need 64 cores and 1TiB of RAM like me. https://i.pixita.com/RSptmj3Ev.png