One-bit thinking is a common error made particularly by computing people. The error is to treat something which needs to be represented by a real number (a continuum of values) as if it were either true or false: if it is not completely true, it is treated as completely false. In other words, something which needs a real number has been replaced by a single bit: a boolean value.
For example, let's say you're trying to improve the computing security of your organisation. Pretty soon you discover that some of the problems just can't be dealt with: people really rely, for instance, on some application which sends plain-text passwords over the wire, and this application can't be replaced. The response of the one-bit thinker to this is to give up: since the problem can't be completely solved, there is no point in even trying.
It should be obvious that this is a disaster, because computing security is not well-represented by a single bit: it's perfectly possible to improve security while not fixing all the problems. For instance let's say that there are five applications which use plain-text passwords, of which four can be replaced by ones which don't. Replacing four out of five has certainly made things a lot better, even though the problem has not been completely resolved.
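The arithmetic above can be made concrete with a small sketch. This is purely illustrative: the application names and the idea of measuring exposure as the fraction of applications which leak passwords are my own assumptions, not anything more principled.

```python
def one_bit_secure(apps):
    """One-bit view: 'secure' only if *no* application leaks passwords."""
    return not any(apps.values())

def exposure(apps):
    """Continuous view: fraction of applications which leak passwords."""
    return sum(apps.values()) / len(apps)

# Hypothetical starting point: five applications, all sending
# plain-text passwords (True means 'leaks passwords').
before = {f"app{i}": True for i in range(1, 6)}

# Four of the five get replaced by applications which don't leak.
after = dict(before, **{f"app{i}": False for i in range(1, 5)})

print(one_bit_secure(before), one_bit_secure(after))  # False False
print(exposure(before), exposure(after))              # 1.0 0.2
```

The one-bit view reports "insecure" in both cases and so sees no reason to act; the continuous view shows exposure dropping from 1.0 to 0.2, which is exactly the improvement the one-bit thinker throws away.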
The world is made of continuously-varying quantities: representing them by single bits, or by quantities which can have only a few discrete values, is a terrible mistake.
As far as I know, the person who came up with this term, in the form of 'one-bit people', was Erik Naggum, although the problem itself is obviously much older than that.