How about feet of IBM punch cards?
A 1 foot tall stack holds 1,647,360 bits of data if all 80 columns are used. If only 72 columns are used for data then it’s 1,482,624 bits of data and the remaining columns can be used to number each card so they can be put back in order after the stack is dropped.
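The arithmetic checks out under the usual assumptions (143 cards to the inch of stack and 12 punch rows per column, which are the standard IBM 80-column card figures, not stated in the comment itself). A quick sketch:

```python
# Rough check of the punch-card figures above. Assumes the common
# IBM 80-column card numbers: 143 cards per inch of stack thickness
# and 12 punch rows per column.
CARDS_PER_INCH = 143
ROWS_PER_COLUMN = 12

cards_per_foot = CARDS_PER_INCH * 12  # 1716 cards in a 1-foot stack
bits_all_80 = cards_per_foot * 80 * ROWS_PER_COLUMN
bits_data_72 = cards_per_foot * 72 * ROWS_PER_COLUMN

print(bits_all_80)   # 1647360
print(bits_data_72)  # 1482624
```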
THIS is what I’m talking about!
I like this because the number of bits in a stack can vary depending on whose foot you use to measure, or the thickness of the card stock.
IBM standard cards are one 48th of a barleycorn thick. I believe IBM measured from the 1932 Iowa Reference Barleycorn, now kept in the vault inside Mt Rushmore.
No, those are not metric, they just borrowed some prefixes, although it’s not like metric designers invented those anyways.
We should be using KB, MB, GB, and TB. Also we should adopt the entire International System of Units and stop with the shit we use. The army uses metric. Why can’t the rest of the population?
This is a joke post. That’s why it’s in the programmer humor community
Oh damn. Well then I rescind my statement. We should obviously use a Base-50 system. One bit for each state.
TiB
One tebibyte equals 2^40 or 1,099,511,627,776 bytes.
What makes that more intuitive than any of the others?
K/M/G/T/P = decimal prefixes. K is 1000. M is 1,000,000. etc.
Ki/Mi/Gi/Ti/Pi = binary prefixes. Ki is 2¹⁰ (1024), Mi is 2²⁰ (1,048,576), etc.
It’s a disambiguation of the previous system where we would use KB to interchangeably mean 1000 or 1024 depending on context.
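A minimal sketch of the two prefix tables described above (the helper function is made up for illustration, not any library API):

```python
# Decimal (SI) vs binary (IEC) prefixes, as described above.
DECIMAL = {"KB": 10**3, "MB": 10**6, "GB": 10**9, "TB": 10**12}
BINARY = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30, "TiB": 2**40}

def as_unit(n_bytes, unit):
    """Express a byte count in the given decimal or binary unit."""
    factor = DECIMAL.get(unit) or BINARY[unit]
    return n_bytes / factor

one_tb_drive = 10**12  # a "1 TB" drive as advertised
print(as_unit(one_tb_drive, "TB"))   # 1.0
print(as_unit(one_tb_drive, "TiB"))  # ~0.9095
```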
I thought you wanted it to be more american
Yeah, American stuff makes sense unlike the metric system which is completely unintuitive /s
This whole post is meant to be a joke. The metric prefixes are perfectly understandable even if they’re technically off the decimal benchmarks by a handful of bytes
Metric is intuitive, but also shit. Just because you have 10 fingers doesn’t mean you should formulate a measurement system out of it. In fact if you actually give a shit about intuitiveness you’d go back to the American system which is roughly base 12 and therefore easier for division and manual estimations.
1 sperm is 37.5MB.
If it’s for American context then you mean 1 baby
Every sperm is sacred.
As all your other measurements are based on the subjective measures of random people, I’d suggest using the number of digits of pi a senior can remember in the time before a new school shooting happens as a base unit, like a Bit. Then just multiply by a random amount for bigger sizes and prefix the name with random presidents’ names.
Yet another person who doesn’t understand why the metric system sucks. Americans (fuck yea) use only useful and descriptive units, so obviously MiB, KiB, GiB, etc., because who cares what the closest rounded tens digit is? The computer world deals in bits.
I use Kb, Mb, Gb, in my world (networking). And MiB GiB and TiB when I want to know the actual size something is.
When I interned in a NOC I referred to bandwidth in GiB/s once or twice. The looks on the senior engineers’ faces were priceless.
Why use metric? Because the fact that 1440KiB is 1.41MiB is annoying.
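The annoyance being complained about is just a division by 1024:

```python
# 1440 KiB expressed in MiB: divide by 1024 (1 MiB = 1024 KiB).
print(1440 / 1024)  # 1.40625, i.e. the "1.41 MiB" above
```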
It doesn’t make it better; it’s just much more convenient when you’re working in a base-10 digit system. There are lots of times when the advantages of an alternative unit system outweigh that convenience.
It’s a funny thing that so many people are emotionally attached to unit systems. It’s a tool; use the best one for the job.
The best tool for the job isn’t ever metric ironically enough.
M$ already fucked that up for everybody calling GiB GB.
Cut to a younger me looking at HDDs in Walmart, wondering why the fuck they were using much higher numbers than what the drive actually had. That’s when I learned the difference and started to grow my hatred for advertising bullshit.
We can use bits instead of bytes. That way it can look 8x bigger than it really is and have no real bearing to modern computing.
I would suggest:
- 1KB = storage capacity of 1 kg of 1.44 floppy disks.
- 1MB = storage capacity of 0.0106 mile of CD drives.
- 1GB = storage capacity of 1 good computer in the 2000s.
- 1TB = storage capacity of 1 truck of GB (see above)
1 kg
(͡•_ ͡• )
Don’t you mean one pound, abbreviated lb?
Naw, it’s actually one Kinda Gallon; a Kinda Gallon of course referring to the average of the masses of a gallon of water, a gallon of beer, and a gallon of whiskey.
I know you’re joking, but that first KB definition makes me grind my teeth!
1.44 floppy disks can store, well, 1.44 MEGAbytes. So how can 1 kg of floppy disks store just 1 KB?
Thank you for your compliment. I love it. The floppy disk is 1.44 non-freedom MB, not 0.015264 miles of CD drives.
lol
1 bible = 69 porn clips = 420 feet (unrelated to the other measurement)
Size of an uncompressed image of the Washington Crossing the Delaware painting = 1 Yankee
12 Yankees in a Doodle
60 Doodles in an Ounce (entirely unrelated to the volume or weight usage of ounce)
60 Doodles in a Dandy
giggity
That’s too straightforward. It should be 113 Doodles in a Dandy. And 73 Dandies in a Macaroni.
How many Macaronis in a Handy though? I’d say 1776.
… I’ll see myself out.
4 Macaronis in a bit of an ounce.
8 Macaronis in a full ounce.
Maybe it’s the ~~number of men in the boat~~ number of dandies in a macaroni
Congrats, in my almost year on Lemmy, this is the best comment I’ve seen!
Make sure to make the specific term “Computer Ounce”, or co. oz.
Better yet, just use “cooz” as the “common unit”
Then it’s proportioned following fluid ounce measurements from there. e.g. “coc” (computer cup) is 16 coozes.
Ayyy, I’m in COLORADO so this would be great.
I second this. It makes total sense: computer memory is a volume to be filled with data. They don’t call parts of a hard drive “volumes” for nothing.
Sampled at what resolution, though? It’s a physical painting and the true, atomic-scale resolution would make this whole system useless.
May I suggest the entire constitution in ASCII (American Standard Code for Information Interchange) instead? Bonus points if any future amendments change the whole system.
A milebyte is 5280 bytes
Most people would use “word”, “half-word”, “quarter-word” etc, but the Anglophiles insist on “tuppit”, “ternary piece”, “span” and “chunk” (that’s 5 bits, or 12 old bits).
KiB, MiB, GiB etc are more clear. It makes a big difference especially 1TB vs 1TiB.
The American way would probably be still using the units you listed but still meaning 1024, just to be confusing.
The difference really needs to be enforced.
My ram is in GiB but advertised in GB ???
Your RAM is in GiB and GB. You can measure it either way you prefer. If you prefer big numbers, you can say you have 137,438,953,472 bits of RAM
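For reference, that big number corresponds to 16 GiB of RAM (an assumed size, since the commenter didn’t state one):

```python
# 16 GiB expressed in bits: 16 * 2**30 bytes, 8 bits per byte.
print(16 * 2**30 * 8)  # 137438953472
```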
Pretty sure the commenter above meant that their RAM was advertised as X GiB but they only got X GB; substitute X with 4/8/16/your amount
As far as I know, RAM only comes in GiB sizes. There is some overhead that reduces the amount you see in the OS, though. But that complaint is valid for storage devices if you don’t know the units and expect the TB/GB on the box to match the numbers in Windows
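The storage-device discrepancy is easy to reproduce: the box counts in decimal bytes, while Windows divides by 2^30 and still prints “GB”. A sketch:

```python
# Why a "1 TB" drive shows as roughly 931 "GB" in Windows:
advertised = 1 * 10**12     # decimal bytes, as printed on the box
shown = advertised / 2**30  # Windows divides by 2**30 (GiB)...
print(round(shown, 1))      # ...but labels the result "GB": 931.3
```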
MigaBytes?
MiB = mebibyte
> The American way would probably be still using the units you listed but still meaning 1024, just to be confusing.
American here. This is actually the proper way. KB is 1024 bytes. MB is 1024 KB. The terms were invented and used like that for decades.
Moving to ‘proper metric’ where KB is 1000 bytes was a scam invented by storage manufacturers to pretend to have bigger hard drives.
And then inventing the KiB prefixes was a soft-bellied capitulation by Europeans to those storage manufacturers.
Real hackers still use Kilo/Mega/Giga/Tera prefixes while still thinking in powers of 2. If we accept XiB, we admit that the scummy storage vendors have won.
Note: I’ll also accept that I’m an idiot American and therefore my opinion is stupid and invalid, but I stand by it.
Absolutely. I started with computers in 1981; for me 1K is 1024 bytes and always will be. 1000 bytes is a scam
Kilo comes from Greek and has meant 1000 for thousands of years. If you want 2^10 to be represented using Greek prefixes, it had better involve “deca” and “di”. Kilo (and di) would be usable for roughly 1.071508607186267 x 10^301 bytes. KB was wrong when it was invented, but at least they were only wrong for decades.
Computers have ruled the planet for longer than the Greeks ever did. The history lesson is appreciated, but we’re living in the future, now, and the future is digital.
Calling 1048576 bytes an “American megabyte” might be technically wrong, but it’s still slightly less goofy-looking than the more conventional “MiB” notation. I wish you good luck in making it the new standard.
No the correct way is to use the proper fucking metric standard. Use Mi or Gi if you need it. We have computers that can divide large numbers now. We don’t need bit shifting.
The metric standard is to measure information in bits.
Bytes are a non-metric unit. Not a power-of-ten multiple of the metric base unit for information, the bit.
If you’re writing “1 million bytes” and not “8 million bits” then you’re not using metric.
If you aren’t using metric then the metric prefix definitions don’t apply.
There is plenty of precedent for the prefixes used in metric to refer to something other than an exact power of 1000 when not combined with a metric base unit. A microcomputer is not one one-thousandth of a computer. One thousand microscopes do not add up to one scope. Megastructures are not exactly one million times the size of ordinary structures. Etc.
Finally: This isn’t primarily about bit shifting, it’s about computers being based on binary representation and the fact that memory addresses are stored and communicated using whole numbers of bits, which naturally leads to memory sizes (for entire memory devices or smaller structures) which are powers of two. Though the fact that no one is going to do something as idiotic as introducing an expensive and completely unnecessary division by a power of ten for every memory access just so you can have 1000-byte MMU pages rather than 4096 also plays a part.
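The page-size point can be made concrete: with a power-of-two page size, splitting an address into page number and offset is a shift and a mask, whereas a 1000-byte page would need an actual division and modulo on every access. A sketch, assuming the 4096-byte pages mentioned above:

```python
PAGE_SHIFT = 12  # 4096-byte pages: 4096 == 2**12
addr = 0x12345

page = addr >> PAGE_SHIFT                # just drops the low 12 bits
offset = addr & ((1 << PAGE_SHIFT) - 1)  # just keeps the low 12 bits

assert (page << PAGE_SHIFT) | offset == addr
print(hex(page), hex(offset))            # 0x12 0x345
```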
Or maybe metric should measure in Hartleys
The metric system is fascist. It was invented by aristocratic elitist control freaks. It is arbitrary and totalitarian.
“The colorfulness and descriptiveness of the imperial system is due to the fact that it is rooted in imagery and analogies that make intuitive sense.”
I’ll save my own rant until after I’ve seen the zombies froth.
> If you aren’t using metric then the metric prefix definitions don’t apply.
Yes it does wtf?
This is such a weird take to me. We don’t even colloquially discuss computer storage in terms of 1000.
The Greek terms were used from the beginning of computing, and the new terms kibi and mebi (etc.) were only added in 1998 when members of the IEC got upset. But despite that, most personal computers still report in the binary way. Decimal is only used on boxes as a marketing term.
> most personal computers still report in the binary way.
Which ones?
Windows reports using binary and continues to use the Greek terms. Windows is still the holder of largest market share for PC operating systems.
Hey, how is “bit shifting” different from division? (The answer may surprise you.)
Bit shifting only works if you wanna divide by a power of 2.
interesting, so does the computer have a special “base 10” ALU that somehow implements division without bit shifting?
In general integer division is implemented using a form of long division, in binary. There is no base-10 arithmetic involved. It’s a relatively expensive operation which usually requires multiple clock cycles to complete, whereas dividing by a power of two (“bit shifting”) is trivial and can be done in hardware simply by routing the signals appropriately, without any logic gates.
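For contrast, here is a sketch of restoring (shift-and-subtract) binary long division, roughly the multi-step procedure a hardware divider runs, next to the single shift that suffices for a power-of-two divisor:

```python
def long_divide(dividend, divisor):
    """Binary long division: one shift-and-compare step per bit."""
    quotient, remainder = 0, 0
    for i in reversed(range(dividend.bit_length())):
        # Bring down the next bit of the dividend.
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        quotient <<= 1
        if remainder >= divisor:  # subtract when the divisor fits
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

print(long_divide(1000, 3))  # (333, 1) -- many steps
print(1000 >> 2)             # 250 -- dividing by 4 is one shift
```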
> In general integer division is implemented using a form of long division, in binary.
The point of my comment is that division in binary IS bit shifting. There is no other way to do it if you want the real answer. You can estimate, you can round, but the computational method of division is done via bit shifting of binary expansions of numbers in an ALU.