Why don't GPU manufacturers use detachable GDDR?
Sure, the whole board would have to be redesigned, but no new technology would be required; you would just have to route the leads out to a socket somewhere on the board. It could be implemented in one generation and then reused from there on. A modular system like this would allow for multiple things.
1. Customers would have to purchase both a GPU and GDDR for it, so the cost in the first generation would be about the same. In the following generation, however, they could reuse the GDDR from their previous graphics card and buy cards without memory pre-installed, making the upgrade cheaper overall. This ultimately makes it more affordable for the consumer.
2. If the GDDR dies, your whole GPU isn't bricked; you just replace the memory.
3. Custom amounts of GDDR rather than factory-specified. (If you want to run larger textures, just buy another stick of GDDR.)
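The savings argument in point 1 can be sketched with some quick arithmetic. All the prices below are made-up illustrative figures (not real market data), just to show how reusing the memory across generations would shift the cost:

```python
# Hypothetical two-generation cost comparison for the modular-GDDR idea.
# All prices are invented for illustration only.

CARD_WITH_GDDR = 500     # conventional card, memory soldered on
CARD_WITHOUT_GDDR = 400  # modular card sold bare
GDDR_STICK = 100         # detachable memory module, bought once

def conventional_cost(generations):
    """Buy a full card (GPU + soldered memory) every generation."""
    return CARD_WITH_GDDR * generations

def modular_cost(generations):
    """Buy the memory once, then reuse it with each new bare card."""
    return CARD_WITHOUT_GDDR * generations + GDDR_STICK

for gens in (1, 2, 3):
    print(f"gen {gens}: conventional {conventional_cost(gens)}, "
          f"modular {modular_cost(gens)}")
```

With these assumed numbers the first generation costs the same either way (500 vs 400 + 100), and every generation after that the modular buyer saves the price of the memory stick.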
The first company to utilise this technology would probably see an increase in sales that year (people would see it as "new" and "revolutionary"), and it could also claim the best bang for the buck, since its card would in fact be cheaper than the rival company's card in the same range (with "but you also have to buy GDDR for it" in small print).
The following year the technology wouldn't be as new, but since people bought in the previous year and already own those sticks, buying a GPU from this vendor would still be the most economical route. Patent it first and you could also block the rival company from using it.
So what is the reason this isn't being done? Is it simply more profitable for vendors not to?