This is a general problem in survey research. In an article in Chance magazine in 1997, "The myth of millions of annual self-defense gun uses: a case study of survey overestimates of rare events" [see here for related references], David Hemenway uses false-positive/false-negative reasoning to explain this bias in terms of probability theory. Misclassifications that induce seemingly minor biases in estimates of certain small probabilities can lead to large errors in estimated frequencies. Hemenway discusses this effect in the context of traditional medical risk problems and then argues that this bias has caused researchers to drastically overestimate the number of times that guns have been used for self-defense. Direct extrapolations from surveys suggest 2.5 million self-defense gun uses per year in the United States, but Hemenway shows how response errors could be causing this estimate to be too high by a factor of 10.
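The arithmetic behind this argument is simple enough to sketch. With made-up illustrative numbers (not Hemenway's exact figures): if the true rate of a rare event is 0.1% and even 1% of the large "no" group mistakenly answers "yes," the survey estimate comes out roughly eleven times too high.

```python
def reported_rate(true_rate, false_pos, false_neg=0.0):
    """Fraction answering 'yes' under misclassification:
    true positives plus false positives from the 'no' group."""
    return true_rate * (1 - false_neg) + (1 - true_rate) * false_pos

# Illustrative: true rate 0.1%, false-positive rate 1%, no false negatives.
true_rate = 0.001
observed = reported_rate(true_rate, false_pos=0.01)
print(f"observed {observed:.2%}, inflation factor {observed / true_rate:.1f}")
```

The key point is the asymmetry: because almost everyone belongs to the "no" group, even a tiny false-positive rate contributes far more spurious "yes" answers than the true positives contribute real ones.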
Here are a couple more examples from Hemenway's 1997 article:
The National Rifle Association reports 3 million dues-paying members, or about 1.5% of American adults. In national random telephone surveys, however, 4-10% of respondents claim that they are dues-paying NRA members. Similarly, although Sports Illustrated reports that fewer than 3% of American households purchase the magazine, in national surveys 15% of respondents claim that they currently subscribe.
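The NRA gap is consistent with the same misclassification arithmetic: when only about 1.5% of respondents are true members, a false-positive rate of even a few percent among the other 98.5% dominates the total. With assumed (not measured) false-positive rates of 3% and 9%, the reported membership lands right in the observed 4-10% range:

```python
def reported_rate(true_rate, false_pos, false_neg=0.0):
    """Fraction answering 'yes' under misclassification."""
    return true_rate * (1 - false_neg) + (1 - true_rate) * false_pos

nra_true = 0.015  # ~1.5% of American adults are dues-paying members
for fp in (0.03, 0.09):  # hypothetical false-positive rates
    print(f"false-positive rate {fp:.0%} -> "
          f"reported membership {reported_rate(nra_true, fp):.1%}")
```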
Gays are estimated to be about 3% of the general population (whether the percentage is higher or lower in the military, I have no idea), so you can see how it can be very difficult to interpret the results of "gaydar" questions.
P.S. This post really is about guns and gaydar, not so much about God, but to maintain consistency with the above title, I'll link to this note on the persistent overreporting of church attendance in national surveys.