So if I understand you correctly:
-Complex statements are composed of atomic statements (statements that cannot be broken down into smaller statements)
-These atomic statements are combined by logical operators to form complex statements
-If a statement is composed of a finite number of atomic statements, then the truth of each of the atomic statements can be determined, and therefore the truth of the statement as a whole can be determined
-If a statement is self-referential, then attempting to break it down into a finite number of atomic statements is impossible
-Therefore, if a statement is self-referential, it may be impossible to determine whether it is true or not.
That does not mean all self-referential statements are neither true nor false. Let us imagine that I toss a coin. We can define the statements A and B such that:
A="The coin comes up as 'heads'"
B=A OR NOT(A) OR B
The equation defining B is self-referential, but B appears to be true and not false. Am I right on this?
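If it helps to see this mechanically, here is a brute-force check (a minimal Python sketch; treating B=A OR NOT(A) OR B as a consistency condition on truth assignments is my own framing): it tries every assignment of truth values to A and B and keeps the ones where B equals its own right-hand side.

```python
from itertools import product

# Treat B = A OR NOT(A) OR B as a consistency condition: an assignment
# of truth values is consistent if B equals its own right-hand side.
for A, B in product([False, True], repeat=2):
    rhs = A or (not A) or B
    if B == rhs:
        print(f"A={A}, B={B} is consistent")
```

Only assignments with B=True survive: since A OR NOT(A) is always true, the right-hand side is always true, so B=True is consistent with itself while B=False would contradict itself. That agrees with your conclusion.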
A friend of the family who professes an interest in mathematics mentioned something odd, and I hope someone here can help clear things up. He said that if in roulette the ball lands on red, it is more likely to land on red the next spin, adding "why would it change?". He also said the same applies to tossing a fair coin. This sounded wrong to me and seems like an example of the gambler's fallacy. I mentioned this, and he said he meant mathematical likelihood; in support of his claim he said it was a "proper formula" and refused to explain further. I said I'd look it up, but I have to admit to being stumped. I'd be willing to bet that at least his examples are invalid. Can anyone explain likelihood and how it would apply, if at all, to the mentioned examples? I'm ashamedly inexperienced when it comes to mathematics, something I should fix but am unsure how, so layman's terms would be appreciated.
Let us construct a maximum likelihood estimator.
Let us call the chance that the roulette wheel comes up red P and the chance that it does not Q, with P+Q=1. The roulette wheel does not change probability with each spin, so P and Q are constant. If we take N results from the roulette wheel, then the number of reds, which we shall call X, follows a binomial distribution with parameters N and P. Let us similarly call the number of times the wheel comes up as something other than red Y, so that X+Y=N.
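To make that setup concrete, here is a small Python sketch (the true value P = 18/37, as on a single-zero wheel, is just an assumption for illustration) that draws N spins and counts X and Y:

```python
import random

random.seed(42)

P = 18 / 37   # assumed true chance of red (single-zero wheel), for illustration
N = 1000      # number of spins we observe

# Each spin is red with probability P, independently of every other spin,
# so the count of reds X follows a Binomial(N, P) distribution.
spins = [random.random() < P for _ in range(N)]
X = sum(spins)
Y = N - X
print(X, Y)   # roughly N*P reds, with X + Y = N
```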
Since he is referring to likelihood, we will construct a likelihood estimator. When we do this, we are given a set of results and we use these to estimate the parameters of the distribution. In this case, we are given X and Y and we will use these to estimate P (and hence Q). Since X follows a binomial distribution, for any value of P the likelihood of X is given by:
$$\mathrm{Prob}(X) = \binom{N}{X} P^X Q^Y = \binom{N}{X} P^X (1-P)^Y$$

We are given what X and Y are, so they are constant. We want to find the value of P that makes Prob(X) as large as possible. Note that Prob(X) is a function of P, so we can differentiate it with respect to P. At the maximum value of Prob(X), its derivative with respect to P will be zero. Differentiating the above with respect to P is possible, but a bit awkward. Consider instead the function L = ln(Prob(X)). Since ln is an increasing function, L is at its maximum exactly when Prob(X) is at its maximum, and it is a lot nicer to differentiate:
$$L = \ln\binom{N}{X} + X\ln(P) + Y\ln(1-P)$$

$$\frac{dL}{dP} = \frac{X}{P} - \frac{Y}{1-P}$$

When L is at its maximum, at the value of P we shall call $\hat{P}$, dL/dP will be zero:

$$\frac{X}{\hat{P}} - \frac{Y}{1-\hat{P}} = 0$$

$$\frac{X}{\hat{P}} = \frac{Y}{1-\hat{P}}$$

$$X(1-\hat{P}) = Y\hat{P}$$

$$X = X\hat{P} + Y\hat{P}$$

$$X = \hat{P}(X+Y)$$

$$X = \hat{P}N$$

$$\hat{P} = \frac{X}{N}$$
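As a sanity check, here is a short sketch that maximizes L numerically over a grid of candidate values of P and compares the winner with X/N (the counts are made up for illustration; any X and Y would do):

```python
import math

X, Y = 487, 513        # illustrative counts of red and non-red spins
N = X + Y

def log_likelihood(p):
    # L = ln(C(N, X)) + X*ln(p) + Y*ln(1 - p)
    return math.log(math.comb(N, X)) + X * math.log(p) + Y * math.log(1 - p)

# Brute-force maximization over a fine grid of P values in (0, 1).
p_hat = max((i / 10000 for i in range(1, 10000)), key=log_likelihood)

print(p_hat)   # 0.487
print(X / N)   # 0.487, matching the closed-form result P-hat = X/N
```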
In the example you gave, N=1 and X=1. This would give the estimate $\hat{P} = 1$. However, it should be noted that this is a statistical "best guess" for $P$ given the information available.
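Finally, to tie this back to the original question: the wheel's P never changes from spin to spin, so landing on red tells you nothing about the next spin, only (a little) about P itself. A quick simulation sketch (again assuming P = 18/37 purely for illustration) shows that the frequency of red immediately after a red is the same as the overall frequency of red:

```python
import random

random.seed(0)

P = 18 / 37        # assumed true chance of red, for illustration
N = 1_000_000

spins = [random.random() < P for _ in range(N)]

# Overall frequency of red, versus frequency of red right after a red.
overall = sum(spins) / N
after_red = [nxt for prev, nxt in zip(spins, spins[1:]) if prev]
conditional = sum(after_red) / len(after_red)

print(f"P(red)            ~ {overall:.4f}")
print(f"P(red | red last) ~ {conditional:.4f}")  # essentially identical
```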