# Mistakes give rise to Problems - 3

Algebra Level 2

We know that $a\times (b+c) = (a\times b) + (a\times c)$. This property says that "multiplication distributes over addition".
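For example, with $a = 2$, $b = 3$, $c = 4$, both sides agree:

$$2\times (3+4) = 2\times 7 = 14 = 6 + 8 = (2\times 3) + (2\times 4)$$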

But if you try to distribute multiplication over multiplication instead, you get the following FALSE property: $a\times (b\times c) = (a\times b) \times (a\times c)$

If you do it, it'll be a mistake!
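To see why, try $a = 2$, $b = 3$, $c = 4$:

$$2\times (3\times 4) = 24 \quad \text{but} \quad (2\times 3)\times (2\times 4) = 6\times 8 = 48$$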

But for how many distinct real values of $a$ (independent of $b$ and $c$, which can be anything except $0$) is the above "false" property always true?
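Equivalently: for how many real $a$ does $a\times (b\times c) = (a\times b)\times (a\times c)$, i.e. $abc = a^2 bc$, hold for every nonzero $b$ and $c$?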
