As the Propositional Logic wiki explains, propositions are treated as atomic units. Such propositions can be denoted by letters such as \(P\) and \(Q\), which have a value of TRUE or FALSE. Expressions can be built up using the three primary logical operations AND, OR, and NOT, commonly represented by the symbols \(\wedge\), \(\vee\), and \(\neg\) (or \(\sim\)). Such expressions can be related by, or even include, conditionals and biconditionals, represented by the symbols \(\implies\) and \(\iff\). So, for example, we have the expression
\[P \wedge Q \implies P,\]
which is a logical identity, "decomposing a conjunction", meaning that if \(P \wedge Q\) is true, then \(P\) is true. We can have something more complicated,
\[(P \iff Q) \iff \big((P \implies Q) \wedge (Q \implies P)\big),\]
which in fact is the "definition of the biconditional", \(\iff\) being the biconditional symbol. The Propositional Logic wiki explains this in more detail, and, in practice, one is expected to make use of such logical identities to prove whether a given expression is true. This can be a cumbersome exercise for one not familiar with working this way.
It seems much like algebra, so is there a way to work these things out algebraically? Yes, sort of. First of all, every proposition and expression necessarily has a value of either TRUE or FALSE. We can use the numeric values \(1\) and \(0\) to mean the same thing (although other schemes are possible), and for this wiki, lowercase letters shall denote propositions and expressions that have numeric values. We can immediately see that AND and NOT are simply
\[P \wedge Q \approx pq, \qquad \neg P \approx 1 - p,\]
where the symbol \(\approx\) means similar, or homologous, operations. The OR operation, however, is a bit more cumbersome:
\[P \vee Q \approx p + q - pq.\]
We round this out with the conditional and biconditional:
\[P \implies Q \approx 1 - p + pq, \qquad P \iff Q \approx 1 - p - q + 2pq.\]
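These correspondences are easy to spot-check by brute force over all truth assignments. Here is a minimal sketch in Python; the function names are illustrative, and \(1\)/\(0\) stand in for TRUE/FALSE as above.

```python
# Algebraic ("arithmetic") equivalents of the logical operations,
# checked against Python's own Boolean operators over all inputs.

def AND(p, q):    return p * q
def NOT(p):       return 1 - p
def OR(p, q):     return p + q - p * q
def COND(p, q):   return 1 - p + p * q            # P implies Q
def BICOND(p, q): return 1 - p - q + 2 * p * q    # P iff Q

for p in (0, 1):
    for q in (0, 1):
        assert AND(p, q) == int(p and q)
        assert NOT(p) == int(not p)
        assert OR(p, q) == int(p or q)
        assert COND(p, q) == int((not p) or q)
        assert BICOND(p, q) == int(p == q)
print("all algebraic equivalents match their truth tables")
```

Since exhaustive checking over \(\{0, 1\}\) inputs covers every row of the truth table, passing these assertions is a complete verification.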
Now, let's try to put this into practice with, say, Modus Ponens. Is it true?
\[\big((P \implies Q) \wedge P\big) \implies Q\]
Using the algebraic equivalents given above, we can work out the corresponding algebraic expression:
\[1 - (p - p^2 + p^2 q) + (p - p^2 + p^2 q)q = 1 - p + p^2 - p^2 q + pq - p^2 q + p^2 q^2.\]
Here, however, we depart from the usual algebraic convention. Every proposition and expression always has a value of either TRUE or FALSE, either \(1\) or \(0\). Since \(0^n = 0\) and \(1^n = 1\), every exponent in the algebraic expression can be reduced to \(1\), and we're left with
\[1 - p + p - pq + pq - pq + pq = 1,\]
which means the expression is always true, and is therefore a logical identity.
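The same conclusion can be reached numerically: over \(\{0, 1\}\), exponent reduction is automatic, so evaluating the unreduced expression at every input suffices. A small sketch (the names `cond` and `modus_ponens` are illustrative):

```python
# Check Modus Ponens, ((P -> Q) AND P) -> Q, via its algebraic form.

def cond(x, y):
    # algebraic conditional: x -> y
    return 1 - x + x * y

def modus_ponens(p, q):
    # ((p -> q) AND p) -> q, with AND as multiplication
    return cond(cond(p, q) * p, q)

# The polynomial collapses to 1 for every truth assignment.
assert all(modus_ponens(p, q) == 1 for p in (0, 1) for q in (0, 1))
print("Modus Ponens is a logical identity")
```

An expression is a logical identity exactly when its algebraic form evaluates to \(1\) on every input, which is what the exhaustive check confirms.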
While this "trick" can be a handy way of double-checking logical identities or working out complicated combinations of logic gates (see Logic Gates), this approach also helps one understand that even an "if ..., then ..." statement can be represented by a logic gate, as can a biconditional. This puts all of these constructs on an equal footing, so that conditionals and biconditionals are not fundamentally different from the logical operations.
Example Question 1
Find the equivalent algebraic expression for the logical expressions on both sides of the biconditional.
Note that, when worked out, the algebraic form yields the same truth table as the logical expressions on either side of the biconditional, if we say that \(\text{TRUE} = 1\) and \(\text{FALSE} = 0\).
Also note that had the problem asked for the equivalent algebraic expression for the entire logical expression, including the biconditional, the result would have simply been \(1\), as it is a well-known logical identity.
Helpful note: given an arbitrary truth table, say, of two variables \(P\) and \(Q\), the logical expression yielding this truth table can easily be found by performing an OR operation over all the instances of \(1\) in the table, each one being an AND expression based on its location, each of which is \(P \wedge Q\), \(P \wedge \neg Q\), \(\neg P \wedge Q\), or \(\neg P \wedge \neg Q\). So, for instance, for the biconditional, defined by a truth table where if \(P\) and \(Q\) are both \(1\) or both \(0\) it returns \(1\), and otherwise it returns \(0\), to generate the logical expression we simply write out
\[(P \wedge Q) \vee (\neg P \wedge \neg Q).\]
This might look different from the expression used above for the same thing, but in fact both are logically equivalent, and both have the same simplest algebraic expression, which is \(1 - p - q + 2pq\).
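That equivalence can itself be confirmed algebraically, by translating the OR-of-ANDs form into arithmetic and comparing it with the biconditional's polynomial on every input. A quick sketch:

```python
# Check that (P AND Q) OR (NOT P AND NOT Q) has the same values
# as the polynomial 1 - p - q + 2pq over all truth assignments.

def or_(a, b):
    # algebraic OR
    return a + b - a * b

for p in (0, 1):
    for q in (0, 1):
        dnf = or_(p * q, (1 - p) * (1 - q))   # the OR-of-ANDs form
        assert dnf == 1 - p - q + 2 * p * q
print("both forms of the biconditional agree")
```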
This method of finding logical expressions for arbitrary truth tables can be generalized for any number of variables.
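The generalized procedure is mechanical enough to automate: collect one AND-term per row of the table that returns \(1\), then OR the terms together. Below is a sketch; the function name, string notation (`~`, `&`, `|`), and variable labels are all illustrative choices, not standard notation.

```python
from itertools import product

def dnf_from_truth_table(table, names):
    """Build an OR-of-ANDs expression (as a string) from a truth table.

    table maps tuples of 0/1 inputs to a 0/1 output;
    names labels each input variable.
    """
    terms = []
    for inputs, value in table.items():
        if value == 1:  # one AND-term per row that returns TRUE
            lits = [n if v else f"~{n}" for n, v in zip(names, inputs)]
            terms.append("(" + " & ".join(lits) + ")")
    return " | ".join(terms) if terms else "0"

# The biconditional's truth table: TRUE exactly when P and Q agree.
bicond = {(p, q): int(p == q) for p, q in product((0, 1), repeat=2)}
print(dnf_from_truth_table(bicond, ["P", "Q"]))
# prints: (~P & ~Q) | (P & Q)
```

Because the rows are enumerated in a fixed order, the biconditional comes out as the NOT-NOT term followed by the AND term; any permutation of the terms is logically equivalent.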
Here are algebraic equivalents for commonly used logic gates:

AND: \(pq\)
OR: \(p + q - pq\)
NOT: \(1 - p\)
NAND: \(1 - pq\)
NOR: \(1 - p - q + pq\)
XOR: \(p + q - 2pq\)
XNOR: \(1 - p - q + 2pq\)
so that it can be said that a biconditional is logically the same as the XNOR gate. Meanwhile, a conditional doesn't correspond to any of the classic logic gates, as it is asymmetrical with respect to its two inputs, having the algebraic equivalent \(1 - p + pq\).
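As with the earlier identities, each gate polynomial can be verified exhaustively against the usual gate definitions. A minimal sketch using Python's bitwise operators on \(0\)/\(1\) values:

```python
# Verify the gate polynomials against standard gate behavior.
# Each entry pairs the algebraic form with the usual definition.

gates = {
    "AND":  (lambda p, q: p * q,                 lambda p, q: p & q),
    "OR":   (lambda p, q: p + q - p * q,         lambda p, q: p | q),
    "NAND": (lambda p, q: 1 - p * q,             lambda p, q: 1 - (p & q)),
    "NOR":  (lambda p, q: 1 - p - q + p * q,     lambda p, q: 1 - (p | q)),
    "XOR":  (lambda p, q: p + q - 2 * p * q,     lambda p, q: p ^ q),
    "XNOR": (lambda p, q: 1 - p - q + 2 * p * q, lambda p, q: 1 - (p ^ q)),
}

for name, (algebraic, gate) in gates.items():
    for p in (0, 1):
        for q in (0, 1):
            assert algebraic(p, q) == gate(p, q), name
print("all gate polynomials check out")
```

Note that XNOR's polynomial is exactly the biconditional's, and XOR's is its complement, \(1\) minus the XNOR polynomial, mirroring the relationship between the gates themselves.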