Updated: Jul 22, 2021
As I have stated in previous posts, there is, in my opinion, a major contradiction or flaw in the multiplicative inverse axiom that we rigorously use to help define division in mathematics. This is why we can't divide by zero (1/0 does not equal 0/1, because 1/0 is undefined or infinity, depending on who you ask). Here is how I can best explain why it is undefined.
I read a lot of posts by people innocently asking why we can't divide by zero, or what the result of 1/0 is. Some people go to great lengths trying to explain it, and it is the same explanation I hear time and time again, but nobody seems to notice the flaw right under their nose. A typical explanation goes like this: if I have a loaf of bread and need to divide it amongst some number of groups, then each group gets an equal share of the bread (1 loaf of bread divided amongst 2 groups means each group gets 0.5, or 50%, of the loaf: 1/2 = 0.5). This works in almost every mathematical scenario except when the denominator is zero. There is the contradiction, in my opinion, because if I have a loaf of bread and am to divide it amongst zero groups, wouldn't I still have the whole loaf of bread? The answer can't be 1, because there are zero groups to divide it amongst. It also can't be 1 because of the multiplicative inverse axiom itself (1/0 does not equal 1 because of how we write, say, and conduct long division, which shows the operation). The answer can't be infinity either, unless we are saying, again using long division, that 0 goes into 1 infinitely many times. That contradicts how we write and say it when we read it as it is. For example, we say 1 divided by 2 equals 0.5, but in long division we write it by showing that 2 divides equally into 1 exactly 0.5 times.
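The sharing explanation above can be sketched in a few lines of Python (the function name `share` is just my own label for the standard model); running it shows exactly where that model stops working:

```python
def share(loaf, groups):
    """Standard 'sharing' model of division: split `loaf` evenly
    among `groups` groups; each group's portion is loaf / groups."""
    return loaf / groups

print(share(1, 2))       # 0.5 -- each of 2 groups gets half the loaf

try:
    share(1, 0)          # zero groups: the sharing model breaks down
except ZeroDivisionError as e:
    print("undefined:", e)
```

Python simply refuses the zero-groups case rather than answering 1 or infinity, which is the standard position this post is questioning.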
Another reason I don't believe the answer is infinity is that there isn't an infinite amount of bread; the bread is finite. I believe this is proof that we are explaining division incorrectly. I am not disputing that we have developed a great way of equating a fraction to a decimal. I totally agree that 1/2 of something equals 50% of something. That is all that appears to be happening, in my opinion. I believe the argument over infinity, which has gone on for over a millennium, stems from a complete misunderstanding of division itself. As far as I can tell, no better way of defining division than the one we have today has been found, until now.
Think about how simple my new theory is. If I have something and I cut it once, I will then have 2 pieces at 1/2 of their original size, and so on (1 cut 1 time = (2/1)*(1/2)). This is of course a purely ideal cut; in real life there would be some loss or remainder due to the cut. My other posts go into detail explaining this as well. The beauty of this is that it is defined with a zero denominator (1 cut 0 = (1)*(1)). People smarter than me argue with me about the multiplicative inverse axiom, but it's right there in front of them. The result of multiplying the terms on the right-hand side of the equation is the multiplicative inverse, and it also equals the numerator on the left-hand side of the equation. Look at 2 cut 1: 2 things, each cut equally once. Here the result is 2 cut 1 = 4 * 1/2 (see my other posts for a description). 4 * 1/2 = 2, and 2 is the numerator on the left-hand side of the equation. This continues for all numbers, and beautiful patterns emerge.
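A minimal Python sketch of this "cut" operation, as I read it from the examples above (the function name `cut` and the pair-of-values return are my own illustrative choices, not anything fixed by the post):

```python
from fractions import Fraction

def cut(things, cuts):
    """Sketch of the post's 'cut' operation: cutting `things` objects
    `cuts` times each yields things*(cuts+1) pieces, each piece being
    1/(cuts+1) of its original size."""
    pieces = things * (cuts + 1)
    size = Fraction(1, cuts + 1)
    return pieces, size

# 1 cut 1  ->  2 pieces at 1/2 size, i.e. (2/1)*(1/2)
print(cut(1, 1))   # (2, Fraction(1, 2))

# 1 cut 0  ->  defined: 1 whole piece at full size, (1)*(1)
print(cut(1, 0))   # (1, Fraction(1, 1))

# 2 cut 1  ->  4 pieces at 1/2 size; 4 * 1/2 recovers the 2 we started with
pieces, size = cut(2, 1)
assert pieces * size == 2
```

Note that the zero case needs no special handling: `cut(n, 0)` just returns n whole pieces at full size, which is the defined-at-zero behavior the paragraph describes.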
I am not saying that the rules of multiplication need to change either; they still apply. I almost called this theory multiplivision or divisplication, since it seems to me that both operations are taking place at the same time. Currently, multiplying means adding groups, but to me it also means to grow in number. Dividing something, in my mind, means to cut something. Both are happening in my theory: the number of pieces grows while the size of each piece is divided into smaller and smaller sizes.
Another rule that I feel we break when it comes to division is the rule that any number divided by itself is 1. This applies to every number except zero, simply because we say that division by zero is undefined and that zero could go into zero an infinite number of times, or once. It's another paradox that exists because of the rules of division. We are limiting our ability to think outside the box. We also say that whatever you do to one side of an equation you must do to the other, yet when we want to change a decimal to a percentage we multiply only one side by 100 and not the other.
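The exception described above is easy to demonstrate: in Python, every nonzero value I try satisfies x / x == 1, while 0 / 0 is simply refused rather than given either of the candidate answers (1 or infinity):

```python
# The rule in question: any number divided by itself should be 1.
for x in [3, -7, 0.25, 1000000]:
    assert x / x == 1

# ...except zero, where the rule silently stops applying:
try:
    0 / 0
except ZeroDivisionError:
    print("0 / 0 is left undefined")
```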