Algebra errors. Why they are wrong, and why they make me sad

Is $\cfrac{1}{4} + \cfrac{2}{3}$ equal to $\cfrac{3}{7}$? No, of course not. But why do so many students, even at university level, end up adding numerators to numerators and denominators to denominators to form these absurdities? I’m not even that annoyed by it; I’m more intrigued, because it doesn’t seem to make any sense at all. Well, of course it does in a way: you don’t know what you’re supposed to do (or can do), so you make a ‘reasonable guess’ as to what the expression could look like and you run with it. After all, you’ve never been asked to derive or motivate these things (at least not in years). But when these things show up while I’m correcting a test at university level, it gets me all depressed. The student in question had applied to an engineering school (in chemistry), but he’s so far behind on basic algebra that he’ll either never make it or will go through hell. Well, well: zero points and onto the pile… Granted, I only ever saw this particular error twice while grading tests, but other variants of universal linearity, such as $\int fg \, dx = \int f \, dx \int g \, dx$ and $(f/g)' = f'/g'$, were frequent.
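The integral version of universal linearity is easy to refute with one concrete function. Here is a quick sketch in Python (the `integrate` helper is my own midpoint-rule approximation, introduced just for this check): with $f(x) = g(x) = x$ on $[0, 1]$, the left side is $\int x^2 \, dx = 1/3$ while the right side is $(1/2)(1/2) = 1/4$.

```python
def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: x
g = lambda x: x

# The two sides of the bogus rule ∫fg dx = (∫f dx)(∫g dx):
product_integral = integrate(lambda x: f(x) * g(x), 0, 1)   # ≈ 1/3
integral_product = integrate(f, 0, 1) * integrate(g, 0, 1)  # ≈ 1/4
print(product_integral, integral_product)
```

One counterexample is all it takes: the rule fails already for the simplest non-constant function.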

But if we focus on this particular issue of adding numerators and denominators (besides it being kind of strange), is it at least something? Let’s call these alternate rules of algebra the broken system, naively defined by $\cfrac{a}{b} \oplus \cfrac{c}{d} = \cfrac{a + c}{b + d},\qquad \cfrac{a}{b} \otimes \cfrac{c}{d} = \cfrac{ac}{bd}$

and let’s see what goes wrong. Even at a brief glance we see that it’s not consistent with standard arithmetic, but what is so much worse is that it isn’t even internally consistent! Sometimes absurd errors can be related to real mathematical concepts. One of my favourites, for example, is when someone accidentally writes $(a + b)^2 = a^2 + b^2$

because this is actually true if you work in an anticommutative algebra, such as a Clifford algebra, where the property $ab = -ba$ leads to the annihilation of the cross-terms. But as for the broken system: adding denominators implies that if we are to have a zero (a number which does nothing when added, and we have to have one!) we’re forced to use $0/0$, since $\cfrac{a}{b} \oplus \cfrac{z_1}{z_2} = \cfrac{a + z_1}{b + z_2}$ can equal $\cfrac{a}{b}$ for every $a/b$ only when $z_1 = z_2 = 0$. And that’s just wrong, because once we allow this abomination into our algebra nothing remains well defined: it clashes with the reducibility property of fractions, that is, the fact that $2/4 = 1/2$, which is encapsulated in $\cfrac{a}{b} = \cfrac{c}{d} \Leftrightarrow ad = bc$
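The clash can be made concrete with a few lines of Python (the names `oplus` and `equiv`, and the pair representation, are my own sketch of the broken system, not anything standard): the broken addition disagrees with real arithmetic, its only possible zero is $0/0$, and under the cross-multiplication test $0/0$ is "equal" to every fraction at once.

```python
from fractions import Fraction

def oplus(p, q):
    """The broken 'addition': add numerators and denominators."""
    (a, b), (c, d) = p, q
    return (a + c, b + d)

def equiv(p, q):
    """Standard fraction equality: a/b = c/d  iff  ad = bc."""
    (a, b), (c, d) = p, q
    return a * d == b * c

# It disagrees with real arithmetic: 1/4 "plus" 2/3 gives 3/7, not 11/12.
print(oplus((1, 4), (2, 3)))            # (3, 7)
print(Fraction(1, 4) + Fraction(2, 3))  # 11/12

# The only additive identity is (0, 0): it leaves every pair unchanged.
print(oplus((5, 9), (0, 0)))            # (5, 9)

# But under cross-multiplication, 0/0 is "equal" to everything:
# 0*d == 0*c holds for all c, d.
print(equiv((0, 0), (1, 3)), equiv((0, 0), (3, 2)))  # True True
```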

If zero denominators are allowed, then suddenly all numbers end up being equal, $0/0 = 1/3 = 3/2 = 9/7 = a/b = \dots$, and they do absolutely nothing when we combine them. That doesn’t seem right… Well, maybe the reducibility property (the equivalence relation) needs to go (‘I never really liked it anyway’, says the student), but then $1/2$ ends up being distinct from $2/4$, and that just gives us new problems, because suddenly you can’t find inverses anymore, since $\cfrac{1}{2} \otimes \cfrac{2}{1} = \cfrac{2}{2} \neq \cfrac{1}{1}$
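The failure of inverses is mechanical to verify. A small sketch (again with my own names; `otimes` is the broken system’s multiplication, which happens to match the real rule $\frac{a}{b}\cdot\frac{c}{d} = \frac{ac}{bd}$): once pairs are equal only when literally identical, nothing multiplies $(1, 2)$ back to the identity $(1, 1)$, because $(1 \cdot c, 2 \cdot d) = (1, 1)$ would force $2d = 1$, impossible for integers.

```python
def otimes(p, q):
    """The broken system's multiplication: multiply across."""
    (a, b), (c, d) = p, q
    return (a * c, b * d)

# With no equivalence relation, pairs are equal only when identical,
# so the natural candidate for an inverse of 1/2 fails:
print(otimes((1, 2), (2, 1)))            # (2, 2), not the identity (1, 1)

# Exhaustive check over small positive pairs: nothing inverts (1, 2).
candidates = [(c, d) for c in range(1, 50) for d in range(1, 50)
              if otimes((1, 2), (c, d)) == (1, 1)]
print(candidates)  # []
```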

No matter how you twist and turn and redefine, you seem to just end up with more problems than you started with. The broken system can’t be made even remotely similar to numbers (satisfying the real number axioms). It’s just broken.

I find errors like universal linearity more depressing than annoying, because to me they represent the divorce of mathematical symbols from their origin as descriptions of real things. Symbols are still abstract things, different in nature from whatever we wish to equate them with in the real world, but the “axioms” and rules we use are not arbitrary. An axiom such as $a + b = b + a$ is not there ‘just because’; it always has real intuition behind it. Pointing to the cutting of a cake or the geometric construction of proportions is not a proof of why adding fractions works the way it does, but it’s a damn good indicator of why it ought to work that way.

Anyone who looks at three pie charts knows that you can’t add numerators to denominators. It is apparent! Moving on to explain how addition of fractions should be done is of course trickier, because of the issue of imperfect metaphors, which I think I want to return to in a future post. But the cake metaphor works just fine as a model of why you ought to put things over a common denominator. Metaphors are tools for understanding; they’re not the key to understanding. Generalization and abstraction are the key, but without metaphors and physical representations of mathematical concepts there is no motivation for what we squiggle on paper.

3 thoughts on “Algebra errors. Why they are wrong, and why they make me sad”

1. howardat58

One of the troubles with fractions is the “… Understand addition and subtraction of fractions as joining and separating parts referring to the same whole.” and the whole load of stuff about fractions as being identifiable as points on the number line (CCSS stuff).
My view is that fractions should be seen straight away as operators just like whole numbers, and so fractions ARE numbers. As in “Half of that apple” and “Two of those apples”, or more arithmetically “Half of 20”, “A quarter of 24”, and so on.
Then “Half of 20” + “One fifth of 20” is 10 + 4, or 14, and so fourteen 20ths of 20, or seven tenths of 20.
A very common mistake is to introduce and push the abstract symbolism before the ideas are clear. This happens with the + and – signs, with fractions, negative numbers, functions, calculus ………
When I see “cat” a whole load of stuff appears in my head. I get the abstraction.
When kids see y = mx + c they see whyequalsemxplusc, they don’t get the meaning, or any meaning.

• seriouscephalopod

Now, I’m not an American, but from what I’ve read the ideas contained in the CCSS are completely reasonable, and the number line has proven to be a conceptually powerful device: though it does not by itself teach you anything about algebra, it serves the precise purpose of unifying seemingly disparate ideas, which is precisely what math is about. The original purpose of the number line, when it was first used in the 19th century, was to encode the concept of total order ($<$) directly into the representation, and studies (in Sweden) show that people who are introduced to the number line have a better sense of unity and an easier time absorbing the idea that $-1$ is an object and not just an operation (the act of subtraction).

Now, you can definitely screw up education by introducing abstraction without proper motivation, but so long as you emphasize that math is man-made you can free students from the impression that it's arbitrary. Numbers aren't given their properties by a higher authority; we give them their properties because they are reasonable, and so they are made consistent with the ideas we have already decided are reasonable. In that framework, motivated abstraction is a tool and not a hindrance.

• howardat58

I agree with all of this. My problem with the number line is that it is letting in measurement by the back door. The logical progression is: I need to know how long is that piece of wood. I take an agreed or arbitrary length as unit and sooner or later I find that there are pieces of wood whose lengths are not a multiple of the unit length. Hence fractional lengths, or fractions of the unit length. I now construct a ruler, and have a number line, at least for non-negative numbers. This is OK as for most measurements there is a natural zero, the smallest measure. Now I see numbers on the ruler, and it does not take too long to see that all measurements can be arranged in this same way. I have a Number Line. Do it this way with kids and they will understand the abstractions involved. And not overlooking the notation for fractions being meaningful: 1/2 is one of the two equal parts that the unit can be split or divided into, for example, half an inch, so 1/2 is 1 divided by 2, just as we happily write 12/3 is 4.