The stagnation of the computer market

Half-Saint

Active member
AmiBayer
Joined
Sep 2, 2013
Posts
2,982
Country
Slovenia
Region
Ljubljana
Just finished reading an article about computer sales going down. Am I the only one who thinks the computer market is stagnant? When the first Nehalem processors came out, tech sites predicted we'd soon be looking at 8-core processors as standard. Today, 7 years later, most people are still running dual-core processors in their computers. I'm not sure whether the stagnation is the result of Intel's unofficial monopoly or whether something else is to blame, but the fact is we haven't had any major improvements since Nehalem! Sure, they churn out a new processor family every two years or so, but apart from shrinking the die and making the CPU and integrated GPU some 10-15% faster than the previous generation, nothing really changes! I'd still be running an i3-2100 if I didn't have to merge two PCs into one after moving the office home. I think even a Sandy Bridge Pentium with 4GB of RAM would be fine for most average users (non-gamers) who use the computer for surfing/Facebook/e-mail/YouTube.

The graphics cards are the same story! AMD just released their Radeon 300 series cards and they're basically using 5-year-old chips! While they're not blatant rebrands, they give you no good reason to upgrade.

On the other hand, we have display manufacturers pushing 4K and talking about 8K resolutions. I'm at a loss as to what kind of hardware it would take to actually run that :p
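
Just running the raw pixel maths (taking 60 fps as a purely illustrative target, nothing more scientific than that):

```python
# Raw pixel arithmetic for common resolutions; 60 fps is just an
# assumed target for the sake of comparison.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}
FPS = 60
BASE = 1920 * 1080  # use 1080p as the reference point

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:7s}: {pixels / 1e6:5.1f} Mpixels/frame, "
          f"{pixels * FPS / 1e9:.2f} Gpixels/s at {FPS} fps, "
          f"{pixels / BASE:4.1f}x the pixels of 1080p")
```

8K works out at roughly 16x the pixels of 1080p per frame, so no wonder the GPUs to drive it properly don't really exist yet.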
 

SaviorX

VCCUG
Banned
Joined
Jul 12, 2010
Posts
1,863
Country
Canada
Region
Calgary, Alberta
The desktop market seems to have steadily shrunk as smartphones and tablets do more of the work desktops used to do...

Most 'younger' people I know have tablets and phones and no computer... or one computer for the household... but 1 or 2 portable devices each...
 

Matt3o

New member
Joined
Jul 1, 2015
Posts
272
Country
Italy
Region
Florence
Also, we are reaching a physical limit in die miniaturisation; we'll probably need to switch to a completely new technology to see a real breakthrough.
 

stephenfalken

Member
Joined
Jan 12, 2014
Posts
572
Country
UK
Region
Lancashire
As Matt3o said, there is a physical limit to the number of transistors that can be squeezed onto a die; the last figure I read from Intel was over 4,000,000,000! Base clock speed increases began to slow down around the 3GHz mark because of this.

Not counting pipelining etc., the next step was to physically double the number of CPU cores, then double again, and so on. Since then clock speeds have crept up too, to 4-4.8GHz, but I feel they are reaching their physical limits now: if the die size is increased, so is the time it takes to keep the caches coherent between cores, eliminating most of any speed gain.

I don't believe it's anything to do with a monopoly; I agree it's down to current technology limitations. We always want to do more and do it quicker!

Regards
 

RichyV

A4000 lives once more! A1200 bits are going...
AmiBayer
Blogger
Joined
Feb 5, 2013
Posts
1,171
Country
UK
Region
Gloucestershire
I think this is mainly software-led.

Unless or until there is more multi-core-capable software (that we, you and I, routinely use) available, there's no point. CPUs can certainly be manufactured with 16 cores (or more), but what's the point if the software will only use 1 or 2 (or, very infrequently, 4)?
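
Just to put rough numbers on that: the usual back-of-the-envelope here is Amdahl's law. The parallel fractions below are purely illustrative assumptions, but they show how little extra cores buy you when most of a program is effectively single-threaded:

```python
# Amdahl's law: overall speedup from N cores when only a fraction `p`
# of the work can run in parallel. The fractions chosen below are
# illustrative assumptions, not measurements of any real application.
def amdahl_speedup(p: float, cores: int) -> float:
    serial = 1.0 - p
    return 1.0 / (serial + p / cores)

for p in (0.25, 0.50, 0.90):
    for cores in (2, 4, 8, 16):
        print(f"parallel fraction {p:.2f}, {cores:2d} cores -> "
              f"{amdahl_speedup(p, cores):.2f}x speedup")
```

With only half the work parallelisable, even 16 cores gets you under a 2x speedup, which is really the 'what's the point' argument in numbers.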

For mass-crunching of data, GPUs are more capable these days, so that's where a lot of the research has been aimed. 'New generation' graphics cards are still rolled out on a much more regular basis than CPUs, which seem to just get the 'refresh' treatment of a few tweaks here and there every 12 to 18 months.

It's simply a question of 'need'. We've pretty much reached the maximum useful speed of systems as a whole (no point making a CPU with cores that sit idle, or that run at speeds out of all proportion to the throughput of the rest of the system).

Servers and, even more so, supercomputing aside, of course...

All IMHO of course... :)
 

Jameson

Active member
AmiBayer
Blogger
Joined
Jan 19, 2015
Posts
834
Country
Australia
Region
Queensland
I think you will find the law of "diminishing returns" comes into play as well. This is especially noticeable with gaming, and most directly with consoles.

For those not familiar with the term, the law of diminishing returns states, simply, that as you keep increasing an input to production, the output increases by less than the proportional increase in the input.

This was developed for economics/industry etc, but I have found it fits INCREDIBLY well in computing.

For instance... let's talk colour palettes.

1 bit = 2 colours, usually black and white. Let's use this as the base.
2 bits = 4 colours, usually black and white plus two primary colours, let's say red and green. To an observer this is a huge difference. Before you only had monochrome, now you have colour (of a fashion)! It is night and day, and all it took was 1 extra bit of palette information.
3 bits = 8 colours. Now as well as black and white you can have all of the primary colours and some secondary colours, including one grey allowing simple shading. The difference between 2 bits and 3 bits is not quite as huge as going from black and white to basic colour, but it is still a big leap.
4 bits = 16 colours. Carefully chosen, and with clever dithering and sufficient resolution, you can now display photo images that look quite decent.
5 bits = 32 colours. Allows better shading... but not a big noticeable difference from 16 colours. So let's start doubling the colour bit depth instead.

8 bits = 256 colours. A big step up from 16, but not as big as you would think. Try dithering a nice image in 16 well-chosen palette colours, then do the same in 256. There is a noticeable difference... it looks twice as good... but it does not look SIXTEEN times as good even though it is using 16 times as many colours.
16 bits = 65,536 colours. 256 times as many colours as 256-colour mode... yet only twice or maybe three times as nice (subjectively) to look at...
24 bits = 16,777,216 colours. 256 times as many colours as 65,536-colour mode... yet only very marginally nicer to look at.

This is diminishing returns. It works with most things. Audio. Resolutions etc.
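
For what it's worth, the raw counts behind the list above are just 2^bits; the part you actually care about, how much nicer it looks, is exactly the part that doesn't scale with them:

```python
# Colour counts per palette bit depth: each step multiplies the number
# of colours enormously, while (as argued above) the perceived
# improvement keeps shrinking. The counts are simply 2 ** bits.
prev = None
for bits in (1, 2, 3, 4, 5, 8, 16, 24):
    colours = 2 ** bits
    jump = f" ({colours // prev}x the previous count)" if prev else ""
    print(f"{bits:2d} bits -> {colours:>10,} colours{jump}")
    prev = colours
```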

This is why the PS2/Xbox/Dreamcast/GameCube era of consoles looked FAR FAR better (for polygonal games etc.) than the PS1/3DO/Saturn era... the 360/PS3 era looked considerably nicer than the previous one, though definitely not night and day... and I have yet to see anything on the now two-year-old PS4 and Xbox One that looks CONSIDERABLY better than the previous generation.

Again, diminishing returns.

Of course, this also applies to computer hardware - cpus, ram, gpus etc. As technology increases, the apparent return will become steadily less.
 

SaviorX

VCCUG
Banned
Joined
Jul 12, 2010
Posts
1,863
Country
Canada
Region
Calgary, Alberta
Y'know... that is precisely the reason I haven't yet upgraded my 3.0GHz 4-core CPU...

Back in 2007 or so I upgraded from an AM2 1.5GHz dual-core to an AM2+ 2.1GHz triple-core... the difference was fairly substantial for most things I was doing, particularly video re-compression and PAR2 generation, but for most of the software coming out as well... particularly games...

Then in about 2012 I upgraded again from the 3-core to the AM3 3.0GHz quad... which again was a big step up for many of the things I was doing, and along with memory and video card upgrades over that period I was able to play many of the new games at a decent level of quality, and many of my older games at the higher quality settings...

I have been toying with the idea of going to an AM3+ 3.5GHz 6-core, as they are not too expensive these days (and as my board supports it, it's a simple drop-in upgrade)... but I haven't bothered with any of the newer games that might see some benefit (GTA, Far Cry, the soon-to-be-released Fallout 4)... and I can still run those at somewhat lower quality if I want to anyway.

Video re-compression and any PAR2 creation I choose to do would obviously see the benefit of a 50%+ more powerful CPU... but that is about all...

And it doesn't really feel necessary to put the money into it, when the money could go to my other 'hobbies' :p

The 3GHz 4-core DOES feel like about the absolute minimum I would want in a CPU at the moment, however...

But I also don't see the need for, or even the point of, the 4.7GHz 8-core that my board WILL support...
 

Powerpie5000

Hardware Junkie™
AmiBayer
Joined
Mar 5, 2009
Posts
2,653
Country
UK
Region
Lancashire
AMD just released their Radeon 300 series cards and they're basically using 5-year-old chips! While they're not blatant rebrands, they give you no good reason to upgrade.

I wouldn't go as far as saying they're based on 5-year-old tech, as even the oldest GCN-based cards were released at the end of 2011. The 390 & 390X are tweaked versions of the 290 & 290X, which were released near the end of 2013. If you have an older 7950, 7970, R9 280X or similar/lower, then an upgrade to a higher-end 300 series card is totally worth it ;). The PCs in my home use a GTX 980, a GTX 980 Ti and even a little GTX 650... I've not seen any reason to buy another AMD GPU yet, and I'm glad I went with the GTX 980 Ti instead of the Fury X for my main gaming PC :cool:.

I'd be very interested in how the new AMD Zen CPUs turn out in 2016... It's about time Intel had some real competition! AMD Zen could even persuade me to upgrade from my i7 4790K if it's any good :).
 

Vyncynt

Member
Joined
Mar 8, 2013
Posts
94
Country
USA
Region
Alabama
And don't forget that with every new generation, the operating system manufacturers are also adding more and more features to bog down the CPUs.
 

SaviorX

VCCUG
Banned
Joined
Jul 12, 2010
Posts
1,863
Country
Canada
Region
Calgary, Alberta
The OS requirements going from DOS/Win3.1 to 95 to 98 to XP were far greater than going from XP -> Vista -> 7 -> 8 -> 10; you really needed to upgrade at each step... CPU... possibly hard drive... almost certainly RAM...

A 'top of the line' Windows 95 system might run 98, but was very unlikely to be good enough for XP, whereas a top-of-the-line XP system (when XP was released, that is) would probably be good enough to run 10 (although probably only just). (In 2001 an absolute top-of-the-line system would have had something like a P4 at 2GHz.)

The system requirements for 7, 8 and 10 are virtually the same, and well beneath the capabilities of any new system sold today...

In many respects the hardware that is coming out now is pretty much overkill for most people's needs... save perhaps for top-end gaming.

And in my case... I am still on an AM3+ CPU... literally... what... 2 generations old for AMD? And with 16GB of RAM (only at 1333MHz, though with this board I could go to 1833), a semi-decent video card... with a small upgrade in CPU power I would presumably be good for... another 4 years anyway, and probably 6... 8 or more if I went to the 4.7GHz 8-core...

Running 'apps' on top of an OS is a different matter...
 

Half-Saint

Active member
AmiBayer
Joined
Sep 2, 2013
Posts
2,982
Country
Slovenia
Region
Ljubljana
I'd be very interested in how the new AMD Zen CPUs turn out in 2016... It's about time Intel had some real competition! AMD Zen could even persuade me to upgrade from my i7 4790K if it's any good :).

The way things are going, you'll be able to run that i7 4790K for a long time ;-) As for me, I'm well beyond upgrading for the sake of upgrading. I still like toying with hardware; it's just that my priorities have somewhat changed over the past 15 years.
 

edd_jedi

Active member
Joined
Apr 9, 2010
Posts
1,259
Country
United Kingdom
Region
London
IMO it's not the hardware that hasn't changed, it's the software (apart from games). As mentioned, what do most people use computers for? Web browsing and email. Any computer made in the last decade can comfortably do both of those things. I bought a top-of-the-range i7 in 2009; this year I sold it and bought an i5 laptop. Why? Because in the six years I owned the i7, nothing I did ever pushed it, and I decided a small laptop would be more practical as I clearly don't need the power.

So I don't think it's the hardware that's plateaued, it's the software. When current-generation processors can already do everything 99% of users need at a decent speed, why bother upgrading? I have a <£500 laptop that boots in 10 seconds and does everything I need already. What will drive future hardware is future software.
 

Half-Saint

Active member
AmiBayer
Joined
Sep 2, 2013
Posts
2,982
Country
Slovenia
Region
Ljubljana
AMD just released their Radeon 300 series cards and they're basically using 5-year-old chips! While they're not blatant rebrands, they give you no good reason to upgrade.

I wouldn't go as far as saying they're based on 5-year-old tech, as even the oldest GCN-based cards were released at the end of 2011. The 390 & 390X are tweaked versions of the 290 & 290X, which were released near the end of 2013. If you have an older 7950, 7970, R9 280X or similar/lower, then an upgrade to a higher-end 300 series card is totally worth it ;). The PCs in my home use a GTX 980, a GTX 980 Ti and even a little GTX 650... I've not seen any reason to buy another AMD GPU yet, and I'm glad I went with the GTX 980 Ti instead of the Fury X for my main gaming PC :cool:.

Yeah, you're right, it's not 5 years old, but I wasn't far off. As you say, an R7 370 is a tweaked R9 270, which in turn is a tweaked 7850. You don't normally just tweak an existing product a little, add more memory and call it something else. I still stand by my statement that we're basically being sold old tech (albeit tweaked) under new model names.
 

SaviorX

VCCUG
Banned
Joined
Jul 12, 2010
Posts
1,863
Country
Canada
Region
Calgary, Alberta
No, but they did enjoy selling us slightly tweaked CD-ROM drives for years...

1x...2x...3x....4x...6x...8x...10x...12x...14x...16x...18x...... :LOL:

And I'm also quite glad that a 19" monitor is now a 19" monitor, and not a 19" monitor (18" viewable).
 

Bastich

Active member
Joined
Apr 13, 2011
Posts
1,930
Country
United Kingdom
Region
Staffordshire
@Jameson I have worked in games and other video-related industries and your 16/32-bit colour comparison is very flawed. 16-bit colour produces huge bands when drawing gradients, whereas 24-bit produces totally smooth gradients. 24-bit colour covers basically the entire colour range visible to the human eye; 16-bit is way off. Bear in mind 16-bit is set up as R5 G6 B5, and a 5-bit red channel is very granular.
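
To make the R5 G6 B5 point concrete, here's a small round-trip sketch (plain Python, nothing clever, and the expansion back to 8 bits is simple bit replication) showing a full 256-step red ramp collapsing to 32 levels once it has been through 16-bit:

```python
# Quantise an 8-bit-per-channel colour down to RGB565 and back, then
# count how many distinct red values survive out of an original 256.
def to_rgb565(r, g, b):
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(p):
    r = (p >> 11) & 0x1F
    g = (p >> 5) & 0x3F
    b = p & 0x1F
    # expand back to 8 bits per channel by replicating the top bits
    return (r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2)

ramp = [from_rgb565(to_rgb565(r, 0, 0))[0] for r in range(256)]
print("distinct red levels after the 565 round trip:", len(set(ramp)))  # 32
```

Those 32 surviving levels are the bands you see in a 16-bit gradient.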
 

BLTCON0

Math inside
AmiBayer
Joined
May 7, 2011
Posts
2,221
Country
Hellas (Greece)
Region
Chania, Crete
@Jameson I have worked in games and other video-related industries and your 16/32-bit colour comparison is very flawed. 16-bit colour produces huge bands when drawing gradients, whereas 24-bit produces totally smooth gradients. 24-bit colour covers basically the entire colour range visible to the human eye; 16-bit is way off. Bear in mind 16-bit is set up as R5 G6 B5, and a 5-bit red channel is very granular.

It comes down to resolution too, though.

I'll use 15 bit (5-5-5) instead of 16 bit so the grey levels work out evenly:

With 15 bits (5-5-5 RGB) there are exactly 2^5 = 32 levels of grey shades, from 0-0-0 (black) to 31-31-31 (white).
With 24 bits (8-8-8 RGB) there are exactly 2^8 = 256 levels of grey shades, from 0-0-0 (black) to 255-255-255 (white).

So suppose that in a photo with unlimited theoretical resolution, which spans the whole viewport, there's a section with a gradient from total black to total white.
That section might horizontally span 25% of the screen. Now if the viewport's resolution is 1600 x 1200 we're talking about 400 pixels, so the full range of 256 shades can be taken advantage of. But if it's 800 x 600 we're talking about 200 pixels, and the full gamut of course can't be squeezed in there. It's still better than just 32 shades, though.

Now, in practice I'd say my 25% is pretty generous - I'd expect most full-range areas to cover less on average.

So I'd say that as long as the resolution stays around the 800 x 600 level, rendering a photo at 16 bit really won't look much different from rendering it at 24 bit.

But as the resolution goes up, 24 bit becomes mandatory to fully utilise the extra pixels.
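
Here's the same argument as a quick sketch (the 25% gradient span is my own generous assumption from above, and the resolutions are just examples):

```python
# Pixels available per grey step for a gradient spanning 25% of the
# screen width, comparing 5-bit (32 shades) and 8-bit (256 shades) grey.
GRADIENT_SPAN = 0.25  # fraction of the screen width the gradient covers

for width in (800, 1600, 3840, 7680):
    span_px = int(width * GRADIENT_SPAN)
    for shades in (32, 256):
        px_per_step = span_px / shades
        note = ("every shade gets at least one pixel" if px_per_step >= 1
                else "more shades than pixels, the extra precision is wasted")
        print(f"{width:4d} px wide: {span_px:4d} px gradient, {shades:3d} shades "
              f"-> {px_per_step:5.2f} px per step ({note})")
```

At 800 pixels wide the 256 grey levels simply don't fit into the gradient, but from 1600 upwards they do, and at 4K/8K widths each step gets several pixels, which is exactly where 24 bit starts to matter.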
 