Since when was a gigabyte 1,000,000,000 bytes?
Never. I looked this up for something unrelated like five minutes ago.
EDIT: It's actually a little more nuanced than that. A gigabyte (GB) is indeed 1,000,000,000 bytes. The power-of-two unit, 1,073,741,824 bytes, is technically called a "gibibyte" (GiB).
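For anyone who wants the arithmetic spelled out, a quick sketch in plain Python (none of this comes from the project's code):

```python
# Decimal (SI) vs. binary (IEC) interpretations of "giga"
GIGABYTE = 10**9   # 1,000,000,000 bytes  (GB, SI prefix)
GIBIBYTE = 2**30   # 1,073,741,824 bytes  (GiB, IEC prefix)

# The binary unit is roughly 7.4% larger at this scale, which is why
# a "500 GB" drive shows up as about 465 units when counted in GiB.
print(GIBIBYTE / GIGABYTE)         # 1.073741824
print(500 * GIGABYTE / GIBIBYTE)   # ~465.66
```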
I think Engineering/Programming will use the powers-of-two GB/gigabyte, Marketing will use the powers-of-ten/"SI prefix" GB/gigabyte, and the end user will buy based on marketing and get what programming put in. Nobody really ought to care, because THE ACTUAL PERCENTAGE OF DISK USED IS STILL THE SAME (as the OP of this issue may not have grasped), and I will side with engineering because COMPUTERS ARE IN BASE 2.
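To be concrete about the percentage point, a minimal sketch (the 256 GB drive and the 50% usage are made-up numbers for illustration): whichever unit you divide by cancels out.

```python
# Hypothetical drive: the raw byte counts are what actually matter.
total_bytes = 256_000_000_000
used_bytes = 128_000_000_000

GB = 10**9    # decimal gigabyte
GiB = 2**30   # binary gibibyte

# The unit divides out of the ratio, so the percentage is identical either way.
print(used_bytes / total_bytes)                   # 0.5
print((used_bytes / GB) / (total_bytes / GB))     # 0.5
print((used_bytes / GiB) / (total_bytes / GiB))   # 0.5
```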
Plus, most of the Internet measures file sizes in powers of two, so you'd just be overestimating everyone else's files.
Seriously, though, file sizes do go straight down to the bit/byte level, and it's much easier to count bytes in powers of two than in powers of ten.
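For reference, here's roughly what the two formatters look like side by side — a sketch of the general technique, not the code this file manager actually uses:

```python
def format_binary(n_bytes: int) -> str:
    """Render a byte count with binary (IEC) prefixes: KiB, MiB, GiB, ..."""
    units = ["B", "KiB", "MiB", "GiB", "TiB", "PiB"]
    size = float(n_bytes)
    for unit in units:
        if size < 1024 or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= 1024

def format_decimal(n_bytes: int) -> str:
    """Render the same count with decimal (SI) prefixes: kB, MB, GB, ..."""
    units = ["B", "kB", "MB", "GB", "TB", "PB"]
    size = float(n_bytes)
    for unit in units:
        if size < 1000 or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= 1000

print(format_binary(1_073_741_824))   # 1.0 GiB
print(format_decimal(1_073_741_824))  # 1.1 GB
```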
*sigh* This entire debate exists because someone decided to retcon SI onto a working powers-of-two system. Even though the prefixes were recycled, it's not, AFAIK, an SI unit, so the people who came up with it get to define how it works, for better or for worse, and they picked powers of two.
But if you write a gigabyte in binary, it has only ones and zeroes. Maybe we should write something to display THAT instead of the apparently-offensive current rendition. </snark>
Seriously, though, neither of the two groups is wrong, depending on what definition you want to ascribe to gigabyte and/or GB, but only one of them controls the OS.