Here, for reference, is the link which has irritated me and tickled my rant-bone: http://michaelbluejay.com/electricity/computers-questions.html#turnoff
I stumbled there searching for a wattage issue with Seagate drives, and I poked around. Exasperated by the drivel there, I find myself... typing my own!
So, how should one make sense of a line like this?
"Keeping your computer on constantly means it's running three times longer than normal."
Normal for whom?
People who turn on their computers once a week to email with the grandkids?
Or Google's server farm?
I live about an hour's drive from anything; quite normal where I am. Some people are within walking distance of fast food restaurants. Very normal, for them. Pretty subjective.
I bet there are places where it's somewhat abnormal to have deer jump out in front of you while you're driving; not here, where it's abnormal if you *haven't* seen a couple playing kamikaze missile as you try to dodge their attempts to embed a hoof or two in your hood.
Of course, the statements on this fellow's page just get more moronic as he warms up into it. My favorite:
"The computer will become obsolete long before you wear it out, no matter how often you cycle it."
He then defines "obsolete" to mean around two years.
I'm typing this on a system I built back in '01.
It's a venerable old ECS K7S5A; I finally maxed out the memory to a single gig a few years ago. (DDR, mind you, though the board itself also can take SDRAM.)
(I've always loved boards that are forward upgradeable; I've still got one capable of using both SDRAM and EDO; I've not yet been able to bring myself to dispose of it.)
Back to this system: I've had to replace the power supply in it, as well as a bad memory module. I also upgraded the Athlon XP 1300+ to one a few revisions higher.
All I use this system for is programming (compiled languages, like C and asm), web browsing (20+ windows open in Chrome right now), and web work -- when creating and testing web pages, Firefox, Opera, and IE are, obviously, all open at once as well.
Then there's PuTTY, and then there's Paintshop, and let's not forget all those Explorer windows that people who actually use their computers tend to attract like langoliers. (Yes, sadly, I'm running XP; I had Linux on here back in '02, but it ran so well and fast I felt I was spoiling myself, so I put a Microsoft® OS back on it.)
Let's see, I also do video capture, and... you get the point. This br'er computer, it be used.
I'm typing this in 2009. I've been using this system for eight years now.
The system next to it is just as ancient: a P4 from around '00, on one of the reference boards for the 845 chipset.
Then there's my venerable Linux router: a celery 300 overclocked to 450MHz, with 192MB of memory... on a FIC motherboard, of all things. It's been rock solid since '98 or so. It had a short stint running Win2k, but we don't like to talk about such things in polite company, or, for that matter, here. (My Linux system wishes to insert here that the week it had Windows ME on it was, like, TOTALLY groovy. I didn't know computers could have acid trips, either.)
I do have one newer system; I haven't finished building it yet. When I do, it'll be my primary box for a good many years, no doubt.
I take great issue with anyone asserting computers are good for a mere 2 years, and then... only if you use them 'normally'.
I'm using a system which, at this point, is nearly a decade old and still running great, despite hardly ever being turned off.
It was only a few _weeks_ ago that I finally threw away some 386 boards. I know people who are still doing nothing but browsing the web and emailing their grandkids with Windows 95 on PIIs, and not only are they fine doing so, they wouldn't know why they'd need anything else.
I realize the author of the above-linked FAQ did tech/troubleshooting support for Apple; thus, his ignorance of computer equipment that lasts longer than a few years is quite understandable, as is his apparent inability to grasp this rather simple concept:
It takes a lot of energy and resources to build more computers.
Seriously. I'm not joking; it's a sad fact of our dimension, but computers don't simply materialize when you wish for one. I speak from first-hand experience, here.
Before Santa and the Easter Bunny bring them to you, someone still has to build them.
Again, building computers takes a lot of resources: we have to dig up the raw materials, process them, transport them, and manufacture them into something usable (having paid people to sit under horrendous fluorescent lights to design those usable things), etc. All of which, naturally, takes energy.
Much more energy AND resources than simply using your computer for, say, one more year...
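To make that concrete, here's a rough back-of-envelope comparison. Every figure below is an assumption chosen for illustration (the embodied-energy estimate and the idle wattage are hypothetical round numbers, not measurements of any particular machine), but it shows the shape of the argument:

```python
# Rough, illustrative comparison: the energy embodied in manufacturing a
# desktop vs. the energy consumed by one extra year of always-on use.
# ALL figures are assumptions for the sketch, not measurements.

EMBODIED_KWH = 1500.0   # assumed energy cost to manufacture a desktop PC
IDLE_WATTS = 80.0       # assumed average draw of an always-on desktop
HOURS_PER_YEAR = 24 * 365

# watt-hours -> kilowatt-hours
one_more_year_kwh = IDLE_WATTS * HOURS_PER_YEAR / 1000.0
ratio = EMBODIED_KWH / one_more_year_kwh

print(f"One extra year of always-on use: {one_more_year_kwh:.0f} kWh")
print(f"Manufacturing a replacement:     {EMBODIED_KWH:.0f} kWh")
print(f"Build vs. one more year of use:  {ratio:.1f}x")
```

Under these assumptions, building a replacement costs roughly twice the energy of simply running the old box for another year -- and that's before counting the mining, transport, and disposal side of the ledger.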
So, dwelling a bit more on the friendly neighborhood ex-Apple Support twit and his "let's focus on REALLY saving energy" drivel, here are my thoughts and responses to his garbage advice:
- I never turn off my systems, and I don't recommend that anyone who heavily uses their systems do so, either. I'm a fan of hibernation where appropriate, as well as the built-in power-management functions of newer processors, boards, and even peripherals.
- I know I'm incurring the wrath of Murphy and his cruddy law by saying this, but the myriad 30GB (ancient!) to 100GB hard drives in the systems I mentioned above are still going strong after all this time and heavy use. (I now have a NAS with 4x 250GB drives as well. It doesn't get turned off, either.)
- I did REAL tech support for years, FIXING computers, not 'troubleshooting them over the phone'. (And that's COMPUTERS, not 'Apples'... apparently, Apples can't be fixed, but become obsolete within two years of purchase.) Of the thousands of systems I dealt with, the ones that failed earliest were the ones that were power-cycled regularly. Granted, some failures were heat deaths (from having a couple of household pets, or their equivalent in lint and fur, sucked into the chassis) or other weirdness. Most, however, had failing components where the usage model involved being cycled nightly. The most common dead part was the PSU; expensive PSUs can fail somewhat randomly, and the cheap ones seem to die if you look at them funny -- turning them on and off is just asking for trouble. Some computer brands, like eMachines, where the motherboard fails because it's made of junk components, shouldn't ever be turned on in the first place, and certainly never turned off if you do manage to get one to boot. People who buy a $300 system from Walmart shouldn't expect better, if they're realistic.
- The only 'name brand' systems I own are my laptops. Buying quality parts and building a system means it'll last many more years than the rebranded crap from places like Dell, HP/Compaq, Acer/Gateway/eMachines, and, apparently, Apple.
- Buying quality parts and putting together a system which won't be obsolete/dead after two years means you've reduced your carbon footprint and made the earth feel better, as it didn't have to be further stripped of resources and energy to aid in your ongoing war of one-upmanship with The Joneses. Isn't that nice?
Yes, I know: it's too expensive to do this right now.
Yes, I know, there are still a lot of issues to be resolved.
However, until the Sun goes supernova, we can happily use as much solar energy from it as we want, and, until the Earth's rotation is slowed by whatever Hollywood disaster du jour wins out for the job, we can harness wind as well.
Because, really, the solution isn't to use LESS energy; the solution is to give us mass consumers access to more, and more affordable, energy.
And, of course, to perhaps buy more reliable computers.
Lastly, I wanted to address this particular FAQ answer:
"Any energy use has to be weighed against the benefit we expect to get from it. Personally, I don't feel the alleged gains from distributed computing projects are worth the energy it requires and the pollution it creates. Take a look at the projects: One involves searching for extraterrestrial life. Of course it would be kind of ironic if we actually found some aliens and our contact with them was necessarily brief because climate change wiped out life on our planet shortly after we found them."
Alleged gains, huh? As for finding aliens -- they're not so much lost as most likely hiding from idiots who'd want to make them turn off their spaceships to conserve energy.
Disregarding the very real results of password cracking via distributed computing: frankly, just because you don't choose to look for intelligent life somewhere in the universe, or don't find protein folding or cancer research useful, doesn't mean other people can't waste the energy they're paying for as they see fit, whether there are any 'actual' gains from doing so or not.
I know fighting on the interwebs is quite akin to competing in the Special Olympics (no offense, if you happen to be a contender...), but I have a real problem with someone pushing his agenda on neophyte computer users who might not know better: it IS okay to leave their systems on, and they could seriously shorten their computers' useful lives by following the advice of some wannabe ex-Apple support troubleshooter with a god complex regarding everyone else's power usage.
Well! I do feel rather better, after this rant.
I've done did my part, a'fighting the good fight against the ignorance and doubleplus-ungood untruths of the drooling fruit-damaged troglodytes, with their disposable systems and crazy, red-eyed stares glaring at my rolling power meter with hostile reproach.
Remember, kids: don't believe everything you read on the interwebs. If you ever want to talk about computer reliability in the real world, and don't know any hackers (or are scared of their fascination with trains), then simply find yourself an active COBOL programmer supporting a '70s-era metal hulk gathering dust in your bank's basement. If you can wake the fellow up, he'll fill you with some Truth™ -- but only if you trounce not upon his punch cards.