I imagine the proper way of doing it, for backwards compatibility, would be to have the video driver intercept requests for a 16-bit screen: when a program asks for 16 bits, the driver keeps the display at 32 bits (or 24, or whatever) and widens each pixel by padding the least significant bits of each color component. Programs would think they had switched the display to 16 bits, when actually everything stays at the higher depth the whole time. Unless, of course, it is actually easy and trouble-free to change the bit depth each time a program wants a different one. Something like the sketch below is what I have in mind.
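Here is a rough sketch in C of the widening step (function names are made up for illustration; this is not any particular driver's code). It expands an RGB565 pixel to XRGB8888. Pure zero-padding of the low bits would just be `r5 << 3`, but a common refinement is to replicate the top bits into the padding so full intensity (0x1F) maps to 0xFF instead of 0xF8:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: widen one RGB565 pixel to XRGB8888 by padding
 * the low bits of each component. */
static uint32_t rgb565_to_xrgb8888(uint16_t px)
{
    uint32_t r5 = (px >> 11) & 0x1F;
    uint32_t g6 = (px >> 5)  & 0x3F;
    uint32_t b5 =  px        & 0x1F;

    /* Shift up to 8 bits, then replicate the top bits into the new
     * least significant bits so 0x1F -> 0xFF rather than 0xF8. */
    uint32_t r8 = (r5 << 3) | (r5 >> 2);
    uint32_t g8 = (g6 << 2) | (g6 >> 4);
    uint32_t b8 = (b5 << 3) | (b5 >> 2);

    return (r8 << 16) | (g8 << 8) | b8;
}

/* Convert the 16-bit framebuffer the program thinks it is rendering
 * into the 32-bit buffer the display actually scans out. */
void blit_565_to_8888(const uint16_t *src, uint32_t *dst, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = rgb565_to_xrgb8888(src[i]);
}
```

The driver would run a conversion like this on every frame of a "16-bit" surface, so the program never notices the display itself never left 32 bits.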
 
