I should have been clearer-- A belief, as in, a thing that is held to be true without provenance or basis in something concrete or provable.
To borrow from Reelya--
One can KNOW the earth is round by using trigonometry. This is not a belief.
One can BELIEVE the earth is round, based on any number of unprovable things. ("The fairies said so", etc.)
Basically, if you hold something to be true based on an assertion alone, you are espousing a belief. By its nature, it is not a rational position.
Everyone has beliefs of one kind or another. This is normal and natural. That does not make them rational, however. Understanding that there is a difference between knowledge and belief is important, and necessary, if a person is to behave rationally (within the context of a shared objective reality) about any given subject or course of action.
Failure to recognize that there is such a distinction between the two leads to irreconcilable conflict, and to demonstrably irrational rhetoric, behavior, and policy.
It was this last bit that spurred my initial response to Duna. The people in his religious community hold a set of beliefs that are clearly in contradiction with the greater shared objective reality (the conservative party has been caught bald-faced lying, and literally with their pants down doing very unbiblical things, repeatedly-- yet they cling to the notion that they are the godly party), yet they continue to act and behave as if that were not the case. They do this because they have refused to delineate between knowledge (which deals exclusively with things that are potentially falsifiable but have withstood all means of falsification-- In this circumstance, it could well have been slander applied to the party-- but multiple independent investigators, working the events in question simultaneously from multiple angles, all found evidence showing that the wrongdoing was on the part of the conservative politicians, not on some slanderer--) and belief (the assertion of something's truthfulness without any such rigor). In such circumstances, revealing the falsity of the belief is, for the person on the other end of the discussion, like asserting the falsity of a fact. The actions they undertake "make sense" if, and only if, the subject of the belief is presumed to be true.
As a bit of a tortured example--
Say a person gets drunk, and loses their keys. Their memory is totally compromised-- They do not actually remember much of anything after a certain point in the night. They have a vague notion that they emptied their pockets in the living room. Over the course of the day, they search fruitlessly in the living room for their keys and never find them. (This is because they actually took them out of their pocket in the dining room.) A friend comes by, and notes that the individual has basically ransacked their living room looking for their keys. The friend politely suggests that they try looking in other rooms too-- but gets told no, "I totally set them down in the living room, they have to be in here somewhere."
It can get even more hairy if the friend was their designated driver, and was the one who drove them home. The friend could have witnessed the keys being taken out of the pocket in the dining room the night before. The individual searching futilely for the keys still adamantly insists that the keys are in the living room: they remember taking them out in the living room, so they must be in the living room.
It could be contended that the person looking for the keys is still perfectly rational (**IF** the precondition of their having removed their keys in the living room **WAS** true, then the keys should indeed be there), but is engaged in a completely irrational exercise, based on a belief (They do not actually have real knowledge of where they removed their keys-- They have no way to demonstrate the conditional's truth or falsity.)