From: Charlie-Boo on

Let A, B and C represent the following:

A |- Provability: “The following is provable.”
B ~ Negation: “The following is false.”
C (aX) For All: “For all values of the variable, the following is
true.”

Then each string over A, B and C is a wff - and a very useful wff at
that, e.g.:

1. [] P
2. A |-P
3. B ~P
4. C (aX)P(X)
5. AA |- |- P
6. AB |- ~P
7. AC |-(aX)P(X)
8. BA ~|-P
9. BB ~~P
10. BC ~(aX)P(X)
11. CA (aX)|-P(X)
12. CB (aX)~P(X)
13. CC (aX)(aY)P(X,Y)
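The decoding above is mechanical, so it can be sketched in a few lines. Here is a small illustration (my own, not part of the original post) that turns any ABC string into its wff reading, generating a fresh variable for each C as in example 13:

```python
# Sketch: decode a string over {A, B, C} into its wff reading, applied
# to a placeholder predicate P. Each C binds a fresh variable, so
# "CC" reads as (aX)(aY)P(X,Y), matching example 13.

def decode(s):
    """Turn an ABC string into the wff it abbreviates."""
    names = iter("XYZUVW")   # fresh variable names for each C
    vars_used = []
    prefix = ""
    for ch in s:
        if ch == "A":
            prefix += "|-"           # provability
        elif ch == "B":
            prefix += "~"            # negation
        elif ch == "C":
            v = next(names)          # For All binds a fresh variable
            vars_used.append(v)
            prefix += "(a%s)" % v
        else:
            raise ValueError("not an ABC string: %r" % s)
    body = "P" if not vars_used else "P(%s)" % ",".join(vars_used)
    return prefix + body
```

Running it on the strings above reproduces the table: decode("") is P, decode("BC") is ~(aX)P(X), decode("CC") is (aX)(aY)P(X,Y).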

What are the relationships between pairs of these wffs? With such a
simple structure, writing Rules of Inference should be a snap.

1. 1=9 (so 1=>9 and 9=>1): P = ~~P, Double Negation. As strings, [] =
BB: from the empty string we get BB, and from BB we get the empty
string back.

2. 2=>1: Holds if the system is Sound – this is the definition of
soundness: what is provable is true. A => [].

3. 1=>2: Gödel showed this is not always so: there is a true but
unprovable statement.

4. 8=>6: Gödel then showed this is also not always so: there is a
statement that is neither provable nor refutable.

5. 2=>5: Yes, if P is provable then we can prove that P is provable
(Proof Theory). A => AA.

Gödel’s first theorem, based on ω-consistency, can be formalized quite
nicely using only ABC wffs. For starters, ω-consistency is (aX)|-P(X)
=> ~|-~(aX)P(X), i.e. CA => BABC.
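As a quick check that CA and BABC decode to exactly those two wffs, here is a compact one-variable decoder (my own sketch; the names read and READINGS are hypothetical, not from the post):

```python
# Sketch: decode a one-variable ABC string, where every C contributes
# the same bound variable (aX). Enough to verify the w-consistency pair.

READINGS = {"A": "|-", "B": "~"}  # C is handled separately (binds X)

def read(s, var="X"):
    prefix = "".join("(a%s)" % var if c == "C" else READINGS[c]
                     for c in s)
    return prefix + ("P(%s)" % var if "C" in s else "P")

# read("CA") gives (aX)|-P(X) and read("BABC") gives ~|-~(aX)P(X),
# so CA => BABC is indeed the schema stated above.
```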

What axioms, rules and definitions are needed to show the equivalences
and implications that can be represented?

Occam likes. Better to spend an hour exploring 2 or 3 significant
theorems than to spend an hour scratching your head at a 1,000 line
proof of 1+1=2!

C-B