It could be as old as 15 years… If someone bought a specced-out i7 laptop in 2009, they may have upgraded it to 16GB at some point. Seems realistic enough.
Please stop. I’m only in my 30s but you’re making me feel like I’m 80. To me, old is a 386 with 4MB of RAM, a 40MB hard drive, Windows 3.1, and a turbo button. Audio was limited to a single-channel square wave courtesy of the PC speaker, because sound cards were expensive.
Or if you want to really talk old in the personal computing realm, then we’ll have to start bringing up companies like Commodore, Atari, and Radio Shack. But their computers were before my time.
Well, personal computing just moved faster back then. Today, a decent computer from 10 years ago (2014) is perfectly usable for most people (especially with an SSD). But in 2010, a top-of-the-line computer from 2000 was basically garbage. And a computer from 1990 was practically ancient history by the year 2000.
The PC market has just plateaued for everyday use. We see incremental performance improvements for enthusiasts/professionals and little more than power-draw improvements for everyone else.
For several years we didn’t even see those. When AMD wasn’t competitive, Intel didn’t do shit to improve their performance. Between Sandy Bridge (2011) and Kaby Lake (2016), you’d get so little performance uplift that there wasn’t really any point in upgrading. Coffee Lake on desktop (2017) and Whiskey Lake on laptops (2018) are when they actually started doing… anything, really.
Now we at least get those incremental performance improvements again, but IMO they’re not worth upgrading CPUs for more often than every 5 or so years on desktop. You get way more from a graphics card upgrade, and if you’re not pushing max fps at 1080p, the improvements from a new CPU will be pretty hard to feel.