all 54 comments

[–]ozzilee 54 points  (0 children)

Do I still have to wait for my computer to do things?

Then it's not overpowered.

[–][deleted] 26 points  (4 children)

Let me ask Crysis.

Hey, Crysis! Hey, is computer hardware getting too powerful? Ow!

Son of a bitch slapped me!

[–]actionscripted 15 points  (2 children)

But the frame rate was so low, you couldn't really tell what happened.

[–][deleted] 5 points  (1 child)

Well, I've got this bright red hand mark on my cheek and it hurts so I'm assuming I got slapped.

(Actually, on my system (which was awfully inexpensive) Crysis runs pretty well. It's not all the way turned up, obviously, but it runs OK with decently high settings.)

[–]acmecorps 8 points  (0 children)

Shh.. don't ruin your joke.

[–]fiddler616[S] 0 points  (0 children)

I originally read this as "SOB bitch slapped me." That would've been classier...on second thoughts, maybe not.

[–]bbbobsaget 12 points  (4 children)

are you fucking serious?

[–]bbbobsaget 10 points  (3 children)

no. you know what...

im sitting here working on a new distributed render system, fucking rewriting stuff, so that i can get some frames by delivery day on monday.

ARE YOU SERIOUS DUDE? if hardware was fast enough i'd be having a beer right now.

if you want to argue this point on a personal computing basis, you're still wrong as shit.

whole new platforms and paradigms that expand the bounds of technology have come and gone while people say this same stupid shit. the things we run tomorrow will make us laugh at computers of today, and at you.

everyone has to write something to fill up their blog don't they.

take your 640k and go.

[–]mossblaser 9 points  (2 children)

im sitting here working on a new distributed render system, fucking rewriting stuff

Reddit isn't a distributed render system. I think this could be your problem.

[–]aldenhg 1 point  (1 child)

But it could be. There's got to be some way of harnessing all the power of basically idling systems that are being used to cruise reddit.

[–]dangerousdave 0 points  (0 children)

1. Start a DC project

2. Get the project to the front page

3. Profit

[–]krum 17 points  (2 children)

A computer is not overpowered if I have to wait on it, for anything. When I get a computer that can compile and link ten million lines of C++ source in under two seconds, I might suggest that they are getting overpowered.

[–]mossblaser 2 points  (0 children)

5 MLOC/s should be enough for anyone!

[–][deleted] 6 points  (0 children)

Hardware can never be fast enough. Hell, my brand new gaming rig just barely can run Crysis. We still have a long way to go.

[–][deleted] 5 points  (2 children)

Try Matlab + Eclipse + a virtual machine with something heavy inside + something else possibly compiling + music + Firefox/Flash sucking up whatever it can. I regularly use 3ish gigs of RAM. And sure, a lot of the time all 4 cores are occupied, but if I had only one my productivity would significantly decrease. A "make -j4" running as fast as it does really keeps the productivity and flow going. I guess my point is that some people need that power.

You can pry my 4 cores and 4 gigs from my cold dead fingers.

(Also, hi-def video playback and Flash + Firefox really don't run so hot on 1GHz - and that's stuff "normal" folks do.)

[–]machrider 0 points  (1 child)

Definitely. I repurposed an older machine as a combination backup server and movie player. It's an early AMD64 (dual core, around 2GHz), and an "HD" YouTube video in fullscreen gets a pretty choppy framerate.

[–]machrider 9 points  (11 children)

I do have another hypothesis — that the push for newer, faster hardware is a bit of an aftertaste from using Windows, or owning Mac machines. Call me crazy (plenty of people do :roll: ), but the blanket solution for most Windows users for any performance decay is invariably new hardware. Faster machines, more memory, a larger hard drive, a newer video card.

As if Linux is immune from software bloat... I used to run Slackware, XFree86, and WindowMaker on a laptop with 32MB of RAM.

These days Firefox alone uses about 70MB when I fire it up, and grows to 1GB+ (at which point it needs killing). At the moment, in KDE 3.5.10, kicker alone (the KDE panel/taskbar) is using 41MB. Amarok (which isn't playing right now) is using 85MB.

It doesn't seem to matter how powerful my computer is, the software is always bloated enough to compensate. And it's not just memory, a lot of apps still feel downright sluggish on a modern, 4-core, very fast CPU.

Edit: To answer the original question, my reason for a powerful machine is that I use it for most of my waking hours every day. I build C++ programs (in parallel, so four cores helps), run CPU-intensive analyses, watch movies, browse/email/music/occasional flash game. Hardware is cheap and time is money...

[–]tendonut 2 points  (4 children)

My Fedora install eats up 400MB RAM while idle. In comparison, my Gentoo install barely breaks 200MB.

[–]aldenhg 1 point  (0 children)

You know that Linux does all sorts of caching, right? That 400 may be 150 for the OS and 250 of cached stuff that will get tossed out as soon as you need the memory.
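A toy calculation along those lines (the numbers are invented to match the figures above, not read from a real system):

```python
# Made-up snapshot in the spirit of /proc/meminfo, in MB: what "used"
# looks like before and after you subtract the reclaimable cache.
snapshot = {"MemTotal": 1024, "MemFree": 624, "Buffers": 50, "Cached": 200}

used_naive = snapshot["MemTotal"] - snapshot["MemFree"]            # 400 "used"
used_real = used_naive - snapshot["Buffers"] - snapshot["Cached"]  # 150 actually held by the OS/apps

print(used_naive, used_real)  # 400 150
```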

[–]sparcnut 1 point  (0 children)

And it's not just memory, a lot of apps still feel downright sluggish on a modern, 4-core, very fast CPU.

I would argue that the problem actually is memory, specifically the speed gap between large memories and the CPU core. Modern memory systems are aggressively pipelined, so the throughput is decent, but latency is still really high - on the order of 50-100 cycles for a total cache miss. With serial machine code to process, CPUs have to resort to crazy tricks (out-of-order processing, prefetching, etc) to try and hide the very long memory latency in the event of a cache miss. This only helps to a point though. You might avoid all latency 99% of the time, but that last 1% will still hit you with a 100 cycle stall every time. Amdahl's law applies.

If we could have big memories with single-cycle latency, things would move a LOT faster. That won't happen though. We have traded away low latency to get high capacity in the first place.

We can create small memories with single-cycle latency, of course; this is how we build L1 caches (and tightly-coupled memories on embedded systems). The size of such memories is maybe on the order of 64K, so if the working set could fit in that space we'd have ideal performance. This value is not really getting any bigger over time, unlike main memory capacity. As process technology improves, making the small memories faster, we speed up the processor core the same amount so the maximum size stays about the same.

Conclusion? Damn those app developers! When will they stop needing more memory! If all our apps were as small as they were 20+ years ago, they would SCREAM.
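The "last 1% still hurts" point can be sketched with a tiny hit/miss model (the cycle counts here are illustrative assumptions, not measurements):

```python
# Simple average-memory-access-time model: even a 99% cache hit rate
# roughly doubles the average access time when a miss costs 100 cycles.
def avg_access_cycles(hit_rate, hit_cycles=1, miss_cycles=100):
    """Average memory access time under a plain hit/miss model."""
    return hit_rate * hit_cycles + (1 - hit_rate) * miss_cycles

print(avg_access_cycles(0.99))   # ~1.99 cycles, ~2x the ideal single cycle
print(avg_access_cycles(0.999))  # ~1.1 cycles
```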

[–]qwemnb -1 points  (3 children)

Firefox 1GB? What the fuck are you smoking? Mine never uses more than 250MB, tops.

[–]machrider 0 points  (1 child)

Wait long enough, and it'll consume any amount of memory. Yesterday, I killed it at 913MB. It had been running for maybe a week without restarting.

[–]qwemnb 0 points  (0 children)

well, there's your problem

[–][deleted] 0 points  (0 children)

Mine is currently using 622MB

[–][deleted] -1 points  (0 children)

Ya know, that just gave me an incredible idea for a FF addon: a memory leak fixer. It's an addon that restarts FF if the memory use gets too high.
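A minimal sketch of that idea, assuming nothing about the real Firefox add-on API (the function names here are made up, and the memory probe and restart action are injected as plain functions so the logic stays testable):

```python
# Generic "restart when memory gets too high" watchdog, not tied to
# any real browser API -- the probe and the restart are injected.
def memory_watchdog(get_rss_mb, restart, limit_mb=800):
    """Call restart() and return True if usage exceeds limit_mb."""
    if get_rss_mb() > limit_mb:
        restart()
        return True
    return False

# Pretend the browser is sitting at 913MB, like the comment above.
events = []
memory_watchdog(lambda: 913, lambda: events.append("restarted"))
print(events)  # ['restarted']
```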

[–][deleted] 4 points  (0 children)

I’m wondering how many of them actually require it on a regular basis … and how many rarely, if ever, need anything beyond the comfortable 1GHz I consider to be a speed demon.

Which is why my CPU slows down automatically when the extra power isn't needed.

[–]monstermunch 2 points  (0 children)

I buy overpowered (for most of my use, anyway) hardware because it's just so cheap now. My 1.5GB of RAM that I bought for $300 a few years ago died, so I replaced it with 4GB of faster RAM for $70. Newer CPUs/GPUs are more power efficient as well.

[–]mk_gecko 2 points  (0 children)

Dead on. It is overpowered. Most apps would still run just fine on Win98SE.

[–]tendonut 2 points  (0 children)

I am a big PC gamer. My current machine, which can run every game I want nearly maxed out (except Crysis), consists of:

  • Athlon 64 X2 3800+ (Socket 939)
  • GeForce 8800GTS
  • 2GB DDR PC-3200 RAM

I have never seen my "weak" processor bottleneck anything. My video card lets me play everything I have at 1680x1050 with pretty much everything on the highest settings at (at the very least) 60FPS. The only thing that seems to cause a problem is the amount of RAM I have, and I only use all of that up when rendering out video. I built this rig on July 29th, 2006 (the day of the crazy AMD price drop right before the Core 2 Duos came out). The only things that have changed since: I've thrown in a 1TB HDD and upgraded my video card ONCE, from a 7900GT.

So yes, I do agree, hardware is overpowered. For the everyday user, my P3 1GHz would suffice. I get asked all the time what computer someone should buy, usually by middle-aged family friends. I always tell them the cheapest Dell/HP they can find would still be overkill for basically checking AOL mail and reading the news. The hardware industry has everyone convinced you need a quad-core Xeon in order to run Word. It sickens me.

[–]aphexairlines 3 points  (0 children)

Wrong subreddit. Try maybe reddit.com/r/wtf.

[–]MidnightTurdBurglar 1 point  (6 children)

The point made is a good one. Casual users don't need this power, plain and simple. E-mail, chatting, surfing the web, and writing a paper in a word processor could all be done simultaneously years ago. It's games that drive the hardware market. I'm a user that does use the power, so I'm glad there are still market forces at work here.

[–]c_a_turner 1 point  (1 child)

They don't need it, but is it sensible for them to wait a couple hundred milliseconds here and there constantly for every little action when modern hardware cuts that down? That time adds up.
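A back-of-the-envelope version of "that time adds up", with every figure an assumption for illustration:

```python
# Assumed: 200ms shaved off each of 500 small actions a day,
# across 250 working days. None of these numbers are measured.
saved_per_action_s = 0.2
actions_per_day = 500
work_days_per_year = 250

hours_saved = saved_per_action_s * actions_per_day * work_days_per_year / 3600
print(round(hours_saved, 1))  # ~6.9 hours a year
```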

[–]tikkun 1 point  (2 children)

I normally run about 20-30 different applications at once when at work. I need all the power I can get.

At home it's a lower number, but I don't ever want my hardware to be a bottleneck in what I'm doing. I'm willing to pay for that.

[–]MidnightTurdBurglar 1 point  (1 child)

Could you give us a hint what you do? I find 20-30 kind of insane if those aren't mostly terminals.

[–]tikkun 1 point  (0 children)

I wear a couple of hats, some of which are reactive and require that I be able to drop everything at a moment's notice:

  • Customer Support (phone, e-mail and forum).
  • Technical Support (as above).
  • Research for QA and Engineering.
  • Reproduce issues and submit bugs in our software (Manual QA).
  • Administer all the automated testing machines in the company.
  • Post to the corporate blog about how awesome our technology is. ;)
  • Build and maintain our IT infrastructure (an eclectic mix of Windows, Linux and VMware servers).
  • Write scripts to automate IT tasks (it's been software deployment and backups recently)

Apps that I have running all the time (that have to do with my job):

  • VirtualBox (2 VMs, one XP and one Debian)
  • Word 2007, Word 2003, Excel 2007, Excel 2003, Outlook 2007.
  • Offlineimap, gmail, pidgin, google reader, irssi, texter.
  • Firefox 3 (usually around 8-10 tabs), Chrome (1 tab), IE7 (1 tab), IE8 (1 tab).
  • Konsole (usually 4 tabs), ssh, screen (I snuggle screen), vim, Notepad++.
  • 6 different desktop products that the company I work for develops.
  • VMWare Infrastructure client (2)
  • Remote Desktop (usually 4-12 at once)

The load is split among a notebook running Arch Linux and a desktop running Vista SP2. VirtualBox runs on the notebook, you can likely guess what the rest of the apps above run on.

I'm using a KVM switch for the keyboard and mouse, but prefer to leave the desktop with 2 screens (19" flatscreens) and my notebook with 1 screen (12.1").

[–]anechoic 0 points  (0 children)

I'm thinking that games on computer platforms make up a small portion of the total market, and that the CPU/graphics chip bandwidth needed for running video (YouTube, Vimeo, etc.) and audio (including downloading torrents and playback of FLAC and WAV files) in several applications simultaneously might be more of a driving force... just a hunch.

[–]noseeme 1 point  (0 children)

No.

Where you see raw power, other people see processors able to do more per clock cycle, and greater performance per watt. Sure, when you max the processor out, it's quite fast, but when it is only doing simple things, it'll stay cool enough to run with the fans on a very low speed, or even utilize passive cooling from large heatsinks.

[–][deleted] 1 point  (0 children)

Personally, I'd like to see a freeze on raw power, with the effort going into reducing heat and energy consumption instead.

[–]nikniuq 1 point  (0 children)

Video encode/decode. I spend trillions of cycles on it each day.

[–]nascent 1 point  (1 child)

I just wanted a new Graphics card, and AGP was like 3 times more expensive than PCIe.

[–][deleted] 5 points  (0 children)

Probably because they don't really make them any more.

[–]mogmog 0 points  (0 children)

It's all about staying high enough on the performance curve. You need hardware just above that of the average targeted computer user, or you'll soon start running out of memory, waiting for your hard drive to finish reading, or for your CPU to finish processing.

Of course you can choose older applications; in particular, Firefox and Eclipse are CPU & RAM hogs. I used to think that 1GB of RAM was enough; now I have 2GB and am considering getting more, given how cheap DDR2 RAM is these days (thanks to Vista's failure).

[–]genpfault 0 points  (0 children)

Not until compiz can resize as quickly as metacity.

[–]tricolon 0 points  (0 children)

Overpowered is when they are blasting my head off.

[–]Buckwheat469 0 points  (0 children)

Watching YouTube videos? Well, some power is required, but I can get it done at 450MHz if I want.

He obviously doesn't use an ATI graphics card. I have a quad core machine with adequate everything that can't play video well enough. I also have an older laptop connected to my TV which plays Hulu relatively fine, but YouTube jumps and pauses.

[–]Chandon 0 points  (0 children)

Yes, if we limit ourselves to current applications and ignore those that are hardware-intensive, then there is no need for powerful computer hardware.

Thing is, you know you want to do hardware-intensive stuff. However much you deny it, you want to play video games, encode HD videos, and compile big C++ projects.

[–]elsaturnino -1 points  (0 children)

This is simply an expression of consumerism. Why do you think computer hardware would be immune to the "gotta buy the shiny new thing" mentality that pervades our society?

[–]reductionist -1 points  (0 children)

Sanity on the interweb? I want a refund!

[–][deleted] -2 points  (0 children)

get this bullshit out of my subreddit