Standards seem not to exist in the computer world; or, where they do exist elsewhere, the computer industry has decided to destroy them.
For instance, the computer industry has trashed SI prefixes (as used by the metric system) such as kilo and mega. In proper SI usage, kilo means one thousand and mega means one million, however you write your numbers (i.e. kilo means one times ten to the third power, exactly one thousand, not merely a number with a one in the fourth most significant digit position of whatever base you happen to be counting in).
One accepted use of kilobyte does not mean one thousand bytes; likewise, one use of megabyte does not equal one million bytes. So terms such as 12 kB and 34 MB have essentially undefined values, unless whoever writes the value down also states which variation of the multiplier they have used (SI or otherwise), which they rarely do.
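The ambiguity can be shown with a little arithmetic. This is a small illustrative sketch in Python (the multiplier tables and the `byte_count` helper are mine, for demonstration only): the same "12 kB" label names two different byte counts depending on which multiplier the writer meant.

```python
# Two competing interpretations of the same prefixes:
SI = {"k": 10**3, "M": 10**6, "G": 10**9}        # proper SI: powers of ten
BINARY = {"k": 2**10, "M": 2**20, "G": 2**30}    # common computer usage: powers of two

def byte_count(value, prefix, multipliers):
    """Expand a prefixed quantity using the chosen multiplier table."""
    return value * multipliers[prefix]

print(byte_count(12, "k", SI))      # 12000
print(byte_count(12, "k", BINARY))  # 12288
print(byte_count(34, "M", SI))      # 34000000
print(byte_count(34, "M", BINARY))  # 35651584
```

A difference of a few percent per prefix step, and it compounds: at the giga level the two readings disagree by over seven percent, which is exactly the gap people notice between a hard drive's advertised and reported capacity.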
They're inconsistently used, too. Not all hard drive manufacturers use the same form of mega, for example; some even swap between the alternatives throughout their own documentation.
There are some computer nitwits who stupidly argue that standards hold them back, limiting their ability to do things. Yet I'm sure they appreciate standards when it comes to cars running on standard fuels and not needing special roads for each model; or not needing ten different telephones to ring people who use different companies than theirs; or any number of other standard arrangements that people rely on to be able to do things together.