crack·pot /krakˌpät/ [colloq.] adjective: 1. eccentric; impractical.

Most people who have ever been through the scientific training of graduate school eventually wind up running into one of the little oddities of science:
that handwritten letter that arrives addressed "To the Head of X Department at Y University,"
the slightly befuddled-seeming but intense-eyed person who appears on your floor of the department burning to tell someone, anyone, about his latest theory of how everything is made up of little geared wheels,
the odd person at a scientific convention with a presentation full of graphs about how, yes, the face on Mars is really real, it's really a face, and...
In short, the crackpot. The crackpot is the kind of person you get when an individual doesn't understand, and/or for one reason or another rejects, the commonly held scientific theory of how X works, whether X is germ theory, or human psychology, or the Standard Model of physics, or anthropogenic climate change. (More on the latter in a moment.) Note that simply rejecting the current paradigm doesn't in itself make someone a crackpot. After all, science would never evolve from one way of thinking to another if current theory weren't continually challenged and tested. (Einstein's theory of relativity is still being tested after nearly a hundred years, for instance.) The trouble starts when someone not only rejects current thinking, but also:
doesn't actually understand what it is he is rejecting,
seizes on one particular observation that seems inconsistent with current theory and insists that this observation, irrespective of the whole body of work, is reason enough to declare current theory wrong and to justify an entire alternative theory (rather than seeking any other explanation for the apparent inconsistency),
has an alternative theory driven primarily by an article of faith or an outside agenda rather than by actual scientific evidence, and/or
makes the argument that science is:
  "just another belief system" (never mind that scientific assertions have to be testable and tested, unlike beliefs),
  "just a theory" (confusing the level of testing and rigor required of a scientific theory with the everyday usage of the word "theory"),
  "not settled" (because science requires you to keep testing, you never Know-with-a-capital-K the way you do with faith, so you can't base anything, especially even mildly inconvenient policy, on it), or
  "just a conspiracy by those in Big Science wanting to keep control" (never mind that such a conspiracy would serve little purpose, would eventually be overturned by actual measurements, and that such an argument is particularly dissonant when it claims that science is simultaneously "too settled" by conspiracy yet "not settled enough" to actually base anything upon)
In particular, this kind of thinking all too often occurs when it comes to "skepticism" about climate change. Obviously detailed climate models are constantly being updated and refined, but that's what science does; science doesn't pronounce an article of faith and never adjust it to new data. And yet critics of climate-change science often criticize it both for being too unsettled (because the models keep changing with new data, thus we shouldn't base policy upon it) and at the same time too settled (because consensus is so high, it must be a conspiracy, and thus we shouldn't base policy upon it). Oddly enough, these critics are almost never trained, qualified climate scientists, and they rarely seem interested in addressing climate science as a whole (or in publishing in a peer-reviewed climate science journal); instead they triumphantly wave a single observation or survey as completely disproving current theory, without considering any other possible explanation. It doesn't help when such criticism comes from someone who otherwise has a vociferous conservative agenda or particularly traceable links to Big Oil and the like. Of course, someone's political beliefs or financial ties don't, in themselves, invalidate the particular points they make, but together with everything else, particularly a lack of scientific training and a willingness to cherry-pick facts, they strongly suggest that there is an axe to grind rather than a factual point to be made.

This brings me back to my original point: the crackpot. When someone walks into a university department or a scientific conference with an oddball, not-fully-developed, and utterly untested hypothesis, it is... eccentric, often a little unfortunate, but ultimately generally harmless. But when folks loudly declaim against the climate science consensus and attempt to manufacture a "scientific controversy" that, frankly, does not exist in the science, it's not harmless. To the extent that even the mildest warnings of actual climate scientists are right, this crackpottery is dangerous (and gives crackpots a bad name, given that many of them genuinely believe in something, however misguided, rather than merely railing against a whole field of science, wittingly or unwittingly, on behalf of the profits of others).

This is the difference between genuine skepticism and denialism. The former asks out of a sense of curiosity and a hunger to understand, looking to break down bad ideas and refine good ones. The latter seeks simply to negate an idea that is inconvenient for one's existing beliefs. Denialism arises when we simply don't want to admit that we could even be wrong: a natural enough human impulse.

But this impulse to avoid being wrong is one of the most important reasons of all that scientific debates should happen between trained scientists. It's not just the amount of specific knowledge, though that's important, or being able to look at the big picture of an existing theory, though that's important as well. Perhaps the most important aspect of doing science, and the hardest part, is being able to be wrong. This is the trial-by-fire that graduate school represents: being wrong, over and over and over, on a long journey to some small piece of a testable model of how things actually work. Simply denying, saying "no, I don't believe that, I don't want to believe that," is antithetical to science. And it has no place in a discussion about science.