And mathematicians divide by multiplying!
In formal definitions of arithmetic, division is often defined via multiplication: as a simplified example with real numbers, because a ÷ 2 is the same as a × 0.5, if your axioms support multiplication you’ll get division out of them for free.
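A minimal sketch of that equivalence in Python (my own toy example, not anything from the thread); the equality is exact here only because 0.5 is a power of two in floating point:

```python
# Dividing by 2 and multiplying by 0.5 (the multiplicative inverse of 2)
# agree for real (here: floating-point) numbers.
for a in [1.0, 3.0, -7.5, 123.456]:
    assert a / 2 == a * 0.5
print("a / 2 == a * 0.5 held for every value tested")
```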
Mathematicians also subtract by adding, with the same logic as with division.
Right. The cells are dividing in half, which in math form is 1/0.5 = 2. Dividing by one half is the same thing as multiplying by 2, and division in general is really just a visually simplified way of multiplying by a reciprocal.

Any time you divide by a fraction between 0 and 1, you will necessarily end up with a larger number, because the reciprocal you’re really multiplying by is greater than 1, which turns the division back into multiplication, much in the same way that a negative × a negative = a positive. If that makes sense.
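The cell example in numbers, as a tiny Python sketch along the same lines:

```python
# Dividing by a fraction between 0 and 1 multiplies by its reciprocal,
# so the result is larger than the number you started with.
cells = 1.0
print(cells / 0.5)        # 2.0: one cell "dividing in half" yields two
print(cells * (1 / 0.5))  # 2.0: the same thing written as multiplication
```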
A mathematician would not be bothered by this. A high schooler taking Algebra I might be, though, if you phrased it the same way this post did.
if your axioms support multiplication you’ll get division out of them for free

this is true… except when it isn’t.
https://en.wikipedia.org/wiki/Ring_(mathematics)

In mathematics, rings are algebraic structures that generalize fields: multiplication need not be commutative and multiplicative inverses need not exist.
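To make the exception concrete, a small Python sketch (my own illustration) using the ring of integers mod 4: multiplication is always defined there, but 2 has no multiplicative inverse, so “dividing by 2” is not:

```python
# In the ring Z/4Z you can always multiply, but 2 has no multiplicative
# inverse: no x in {0, 1, 2, 3} satisfies 2 * x ≡ 1 (mod 4).
n = 4
inverses_of_two = [x for x in range(n) if (2 * x) % n == 1]
print(inverses_of_two)  # [] -- so "dividing by 2" is undefined in Z/4Z
```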
Yeah I should maybe just have written

if your axioms support multiplication you’ll get division out of them for free*

*certain terms and conditions may apply. Limited availability in some structures, North Korea, and Iran. Known to the state of California to cause cancer or reproductive toxicity
Cells: 🫣🫨😢
a/b is the unique solution x to a = bx, if a solution exists. This definition is used for integers, rationals, real and complex numbers.
Defining a/b as a * (1/b) makes sense if you’re learning arithmetic, but logically it’s more contrived, as you then need to define 1/b as the unique solution x to bx = 1, if one exists, which is essentially the first definition.
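A toy Python sketch of that first definition applied in Z/nZ; the divide helper and its name are my own, not anything standard:

```python
def divide(a, b, n):
    """a/b in Z/nZ: the unique x with a ≡ b*x (mod n), if such an x exists."""
    solutions = [x for x in range(n) if (b * x) % n == a % n]
    return solutions[0] if len(solutions) == 1 else None

print(divide(1, 3, 7))  # 5, since 3 * 5 = 15 ≡ 1 (mod 7)
print(divide(1, 2, 4))  # None: no solution, 1/2 does not exist in Z/4Z
print(divide(2, 2, 4))  # None: x = 1 and x = 3 both work, so no *unique* solution
```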
The example was just to illustrate the idea, not to define division exactly like that.
That’s me, a degree-holding full-time computer scientist, just learning arithmetic, I guess.
Bonus question: what even is subtraction? I’m 99% sure it doesn’t exist, since I’ve never used it; I only ever use addition.
It’s just addition wearing a trench coat, fake beard and glasses
Addition by the additive inverse.
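In symbols, a minimal LaTeX restatement of that definition:

```latex
a - b \;:=\; a + (-b),
\qquad \text{where } -b \text{ is the unique solution } x \text{ of } b + x = 0.
```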
Now you just replaced one incalculable thing with a different incalculable thing.
Eh?
Computers don’t subtract, and you can’t just add a negative: a computer can’t interpret a negative number by itself, it can only store a flag that says the number is negative. You need a couple of addition tricks to subtract two numbers so that the computer only ever has to add. It’s addition all the way down.
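One common way this plays out in hardware is two’s complement; here is a minimal 8-bit sketch in Python, with names of my own choosing:

```python
# Subtraction using only addition: two's complement on 8-bit values.
# The "negative" of b is (~b + 1): flip the bits, then add one.
MASK = 0xFF  # keep every result within 8 bits

def sub_by_adding(a, b):
    neg_b = (~b + 1) & MASK    # two's-complement negation of b
    return (a + neg_b) & MASK  # the adder is the only arithmetic used

print(sub_by_adding(7, 5))  # 2
print(sub_by_adding(5, 7))  # 254, which reads as -2 in 8-bit two's complement
```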
What does this have to do with computers?