The second part of your question, Sbdh Shvpj, was about confirmation bias. Our neurophysiological architecture, as far as we currently understand it, supports mental states that interpret the structure of the world in ways heavily modulated by culture. The resulting framework of knowledge and beliefs contributes significantly to our individual psychological stability, or homeostasis, so whatever appears to destabilise it may be unconsciously perceived as a threat. One way to compensate for this tendency towards psychological homeostasis, or confirmation bias, if one wishes to do so, is to recognise the various forms it can take, e.g.:
- deliberately avoiding (i.e. as a matter of personal 'policy') any source of information contrary to our worldview, which may in some cases verge on the obsessive or the pathological
- inadvertently avoiding such sources of information in order to remain within our 'comfort zone'
- being, alternatively, open to discomforting information, but inadvertently construing it in ways favourable to our worldview
- defending our current worldview so vigorously and constantly, both to ourselves and to others, that conceding any part of it becomes 'unthinkable' and represents a loss of face
There are no simple, one-size-fits-all ways to compensate for confirmation bias. It's worth recognising that the hardest worldviews, or epistemological commitments, to accept changes to are those which, rather than being simply abstract and interesting, are closely intertwined with our actions, behaviours, relationships and status in the world.
As I said, though, this path is for those who want to take it. Psychological stability, after all, is indeterminately related to general health and wellbeing, and to want to identify and compensate for confirmation bias may perhaps be more characteristic of those who are attracted to forums like this one than of the broader community. There is a great deal of literature indicating that a measure of self-deception or illusion helps many people to cope with life.
There is no test that can tell a person how rational they are. Everyone is rational at some times and irrational at others, especially when strong feelings are involved. There are, however, things you can do to alleviate confirmation bias and thereby be more rational. Here are my suggestions:
- Make sure you back up your claims with logic and evidence. Remember that anecdotal evidence and whims/feelings, while sometimes useful, do not qualify as real evidence, and no one is obligated to take what you say as true if you back it up only with weak evidence and your feelings.
- Make it a habit to ask yourself "is it me?" (Example: do I have anything to do with the house being so messy, or is it really all my significant other's fault?)
- Plan ahead for those times when you are bound to be irrational. By 'plan ahead' I mean imagine the scenario and imagine what the best course of action would be; that way, if the scenario ever happens, you are more likely to actually do what you imagined yourself doing, and therefore make a more rational choice than you otherwise would have.
- Shut up and listen: really listen, without interrupting. Many people fall into confirmation bias this way; they don't truly listen to what the other party is saying, or they interrupt and interject, which often leads them to interpret what the other party says in a way that favours their own side.
I posted a few days ago about the 'confirmation bias' part of Sbdh Shvpj's question, and this week I've been reading an article by Ryan McKay and Daniel Dennett* which has a great deal to say about the 'positive illusions' that often help us to live our daily lives. The authors observe that 'even if a general confirmation bias mechanism generates illusions as a well-entrenched subclass of outputs, the serendipitous benefits that those outputs provide might “protect” the confirmation bias mechanism ... from counter-selection, helping to “pay for” its persistence'.
The authors are discussing the issue from the point of view of hypothesised adaptive cognitive mechanisms inherited through natural selection. The 'outputs' they are talking about include those personal and interpersonal 'spin-offs' from positive illusions which tend to entrench the confirmation bias that helped to generate the illusions in the first place. In other words, positive illusions ('he says he's changed, things will be better now', 'I've gotta win this time') and confirmation bias can work in a mutually reinforcing cycle; hence the point made by other posters that persistent, deliberate habits of rational and probabilistic thinking are the way to undermine both irrational positive/negative illusions and confirmation bias.
* Ryan T. McKay & Daniel C. Dennett, 'The Evolution of Misbelief', Behavioral and Brain Sciences, 32 (2009), 545 (homepage.mac.com/ryantmckay; ase.tufts.edu/cogstud/incbios/dennettd/dennettd.htm)
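For anyone who likes to see the 'probabilistic thinking' mentioned above made concrete: confirmation bias amounts to refusing to let disconfirming evidence lower our confidence in a belief, whereas Bayes' rule tells us exactly how much it should. Here is a minimal sketch in Python; the scenario and all the numbers are purely hypothetical, chosen only to illustrate the arithmetic:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothetical scenario: we hold the optimistic belief H = "he has changed",
# and then observe the evidence E = "another broken promise".

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H|E) via Bayes' rule."""
    # Total probability of the evidence under both hypotheses.
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1.0 - prior))
    return p_evidence_given_h * prior / p_evidence

prior = 0.70                        # optimistic prior: "he's changed"
p_broken_if_changed = 0.10          # broken promise is unlikely if he has
p_broken_if_not_changed = 0.60      # but quite likely if he hasn't

posterior = bayes_update(prior, p_broken_if_changed, p_broken_if_not_changed)
print(round(posterior, 3))
```

With these illustrative numbers the belief drops from 0.70 to 0.28. The biased move is to leave the prior untouched, or to explain the evidence away ('he was just having a bad day'), which is equivalent to quietly inflating `p_broken_if_changed` until the update becomes harmless.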