I'd argue that this case is somewhat petty.
Keep in mind that the argument here is SPECIFICALLY against the 6-Core and 8-Core claims... AMD quite correctly point out that the Compound Performance difference between 8-Core (Dual Execution Pipeline) vs. 8-Core (Single Execution Pipeline) is ~80%. BUT, and this is important to keep in mind, we are talking about 4 Modules and a COMPOUND Performance loss.
This means that comparing 1 Module (2-Core Bulldozer) vs. a 2-Core Phenom II... Clock-for-Clock, the Phenom II is better when in Dual-Core (1 Module) operation.
Yet here's the thing... keep in mind what Bulldozer was strictly designed to resolve, aside from the obvious support for Features that Intel decided to release as Point-Release-Incompatible variants at the last minute to prevent Phenom II from also supporting them.
Primarily it was to resolve the Diminishing Returns in Multi-Core / Multi-Thread Performance, which essentially made 8+ Cores, while entirely possible to produce, absolutely pointless as a Product.
Why? Because you have Diminished Performance per Core, which means:
(Phenom II scaling on the Left, Bulldozer on the Right)
1-Core = 1.00x / 1.00x
2-Core = 1.97x / 1.80x
3-Core = 2.91x / 2.80x
4-Core = 3.84x / 3.80x
5-Core = 4.73x / 4.60x
6-Core = 5.59x / 5.60x
7-Core = 6.44x / 6.40x
8-Core = 7.26x / 7.40x
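To make the diminishing-returns point concrete, here's a minimal Python sketch that converts the compound speed-up figures above into per-core scaling efficiency (speed-up divided by core count). The `efficiency` helper and the variable names are mine for illustration; the only data is the numbers quoted in the table itself.

```python
# Compound speed-up figures as quoted above; these are the only inputs.
phenom_ii = [1.00, 1.97, 2.91, 3.84, 4.73, 5.59, 6.44, 7.26]
bulldozer = [1.00, 1.80, 2.80, 3.80, 4.60, 5.60, 6.40, 7.40]

def efficiency(speedups):
    """Fraction of ideal linear scaling achieved at each core count."""
    return [s / n for n, s in enumerate(speedups, start=1)]

for cores, (p, b) in enumerate(zip(efficiency(phenom_ii),
                                   efficiency(bulldozer)), start=1):
    print(f"{cores}-Core: Phenom II {p:.1%} vs. Bulldozer {b:.1%}")
```

Run it and you'll see both designs remain above 90% of ideal linear scaling even at 8 Cores, which is exactly the point: the differential between them only shows up at the margins.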
Now keep in mind this is without Core Balancing, which is to say you're manually responsible for Cache Management.
This means if you over-saturate the LDS... the above figures are the "Worst Case". Typically speaking, the Cache will never be fully saturated, and thus you can actually get VERY close to 1.00x Scaling.
I mean we're basically talking about Edge Case Scenarios where this differential will actually cause an issue... though it would be noticeable in Gaming, where we're talking about Lightly Threaded Workloads, or in Modern Multi-Threaded Games vs. Ryzen (for example), which has since improved this even further... but also, as a note, Ryzen uses a 4-Core-per-Module approach instead of a 2-Core Module.
Mind, they actually increased the LDS Cache to match the Maximum Possible Instruction Executions; as opposed to Bulldozer, where (for obvious reasons) they saved cost by reducing this to a "Common" Max rather than the Hardware Max.
They did the same with GCN, hence why it has MUCH Lower Cache than the ISA indicates it should have; only Vega actually has the Full Cache that the Architecture was designed with. And let me ask you... does it showcase any real difference in real-world performance?
The answer is... not really.
In any case... even if performance isn't the same as what is potentially possible from 6/8-Cores... that doesn't change the fact that they are still 6 and 8 PHYSICAL Execution Core Units.
Gimped or not... they are there... they can be physically seen and counted.
I mean it's like the argument "My Motherboard doesn't have an 8-Phase VRM"... well, does it physically have 8 VRM Phases on it? It doesn't matter if it's in a 6+2 or 4+4 Configuration, as it's not being advertised as an 8-Phase SoC VRM; it's merely being advertised as 8 Phases (Total).
You might not like that being the case, and sure, you could argue there should be Advertising / Specification Regulations requiring that a Company MUST be Verbose about how these are set up, for Consumer Information.
Still, the Company in question is not lying about or misrepresenting their product.
I mean the Consoles are Advertised as having an 8-Core CPU... except they're not a Native 8-Core, but instead 2x 4-Core Modules.
On top of this, the Xbox One only has 6 Cores actually available for Games to use; the other 2 are Reserved for the OS.
Is that False Advertising? It DOES still have 8 Cores Physically Present; just because not all of them are usable doesn't mean the Company can't boast as such.
There is a certain amount of Caveat Emptor (Buyer Beware) on the Consumer's part... and AMD had a point that, if Consumers KNEW this was how the CPU was designed but still purchased it, then simply because someone later chooses to bring a Class Action, you can't just jump on-board with it merely because you were disappointed with the performance of the product.
It would be like me suing Samsung because my Display doesn't support HDR despite supporting 10-bit and 12-bit Colour... which for the most part isn't supported by basically anything, so arguably these are "Pointless" Features outside of a few exceptions.
And sure, I did purchase the display BECAUSE it supported said Higher Colour Bit-Depths, which were advertised.
Yet at no point did it state it supported HDR; and it's not Samsung's fault that HDR/WCG are only supported via *very* specific formats by the major companies, as opposed to via Native (Custom) Output.
This case as a whole is ridiculous, and I hope AMD do win it... because it would set a bad precedent that, simply because a consumer isn't getting the IDEAL they believe they should, an IHV is responsible.
Particularly in the case of AMD; unlike Motherboard OEMs... they weren't being deliberately dishonest to make the product sound better than it was.
They were quite open about how the Architecture Worked.
The ISA has been public for almost as long as the product has been on the market.
Tech Reviews have done Deep Dives into the Architecture (and used a lot of AMD's slides from when it was first introduced to showcase the new Architecture)... and while the implications of said changes might not have been obvious, the information was still freely available for any Consumer / Developer to investigate further and be more informed about their purchase.
Welcome to the USA, home of 100000000000 lawyers and the right to file frivolous cases. Plus remember, this suit was filed in California, the most liberal, backwards state in the union, anywhere else besides New York it'd have been thrown out and the plaintiffs laughed out of the room.