Climate sensitivity, in simplest terms, is the magnitude of change the climate is expected to undergo in response to a given forcing, relative to some baseline period. It is most often cited as the warming we should expect from a doubling of atmospheric carbon dioxide. The most common measure is the equilibrium sensitivity, which is a little more complicated than the direct effect of adding greenhouse gases. The Earth is a dynamic system, so the equilibrium sensitivity accounts for the various feedbacks that a doubling of carbon dioxide would trigger, such as albedo changes and increased atmospheric water vapour.
Now, climate sensitivity doesn't have to use greenhouse gases as the forcing on the climate system. The calculation could just as easily use changes in solar irradiance, since we have values for what these changes produce in terms of forcing, measured in watts per square meter. Of note, though: identical forcing from solar irradiance and from greenhouse gases will produce the same magnitude of change in mean surface temperature, but with different 'fingerprints'. If solar changes were warming the Earth's climate, we would see a larger warming trend in summer than in winter, and we would expect the upper layers of the atmosphere to warm concurrently. What we see today is the opposite: larger warming trends in the coolest half of the year, and upper layers of the atmosphere cooling while the surface warms. These are fingerprints of a greenhouse-induced warming.
Using paleo-climatology and the laws of physics, computer models produce estimates of the change we can expect. Modern estimates are largely constrained to a range of 2.6-4.1°C per doubling of CO2, with most values clustered around 3°C. The problem I alluded to before is the uncertainty of that measure: the distribution of results has a long, skewed tail.
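As a rough illustration of how these numbers relate (a sketch, not any particular climate model; the 5.35 W/m² coefficient is the widely used simplified expression for CO2 forcing, and the ~3°C figure is the central estimate cited above):

```python
import math

def co2_forcing(c_new, c_old):
    """Approximate radiative forcing (W/m^2) from a change in CO2
    concentration, using the simplified expression F = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_new / c_old)

# Forcing from a doubling of CO2 (e.g. 280 ppm pre-industrial -> 560 ppm)
f_2x = co2_forcing(560.0, 280.0)  # about 3.7 W/m^2

# If equilibrium sensitivity is ~3 C per doubling, the implied
# sensitivity parameter (warming per unit of forcing) is:
lam = 3.0 / f_2x  # about 0.8 C per W/m^2

print(f"Forcing for doubled CO2: {f_2x:.2f} W/m^2")
print(f"Implied sensitivity: {lam:.2f} C per W/m^2")
```

Expressing sensitivity in degrees per W/m² is what lets the same number be applied to any forcing, solar or greenhouse.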
The long tail is a result of paleo-climate studies. It is well known that Earth has been very warm in the past. A good example, one that still has the paleo-climate community scratching their heads, is the Paleocene-Eocene boundary, which is marked by a large increase in global temperatures: roughly 6°C over 20,000 years. (That may sound like a lot, but it works out to only 0.003°C per decade, while our current climate is warming at about 0.17°C per decade. That is almost 60 times faster!)
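The rate comparison above is simple arithmetic and can be checked directly (using the figures cited in the paragraph):

```python
# Paleocene-Eocene warming: 6 C over 20,000 years, expressed per decade
petm_rate = 6.0 / (20_000 / 10)  # = 0.003 C per decade

# Modern warming trend cited above
modern_rate = 0.17  # C per decade

ratio = modern_rate / petm_rate
print(f"PETM rate: {petm_rate} C per decade")
print(f"Modern warming is roughly {ratio:.0f} times faster")
```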
There are a number of hypotheses as to how this climatic change occurred. However, the conclusion common to all of them is that they cannot be squared with our current estimates of climate sensitivity, which suggests those estimates understate the inherent sensitivity of Earth's climate system to forced changes.
This too has led to speculation. The most prevalent idea is that there must be some mechanism that enhances natural variability. A new paper out this week in the Proceedings of the National Academy of Sciences posits such a mechanism as a result of some modeling experiments. Authors Swanson, Sugihara and Tsonis (.pdf) find that inter-decadal variability in ocean circulation shows evidence of this enhanced natural variability on top of the imposed climate forcing. A climate that responds to forcing with increased natural variability means the sensitivity to that forcing must be higher than previous estimates, none of which account for this, and none of which can square with the large natural climatic changes in Earth's past.