Journal of Neurophysiology

Transient Analgesia Evoked by Noxious Stimulus Offset

Joshua D. Grill, Robert C. Coghill

Abstract

Pain has long been thought to wax and wane in relative proportion to fluctuations in the intensity of noxious stimuli. Dynamic aspects of nociceptive processing, however, remain poorly characterized. Here we show that small decreases (1–3°C) in noxious stimulus temperatures (47–50°C) evoked changes in perceived pain intensity that were as much as 271% greater than those evoked by equal-magnitude increases. These decreases in perceived pain intensity were sufficiently large to be indistinguishable from those evoked by 15°C decreases to clearly innocuous levels. Furthermore, decreases in pain ratings following noxious stimulus offset were significantly greater than those occurring during adaptation to constant temperature stimuli. Together, these findings indicate that an analgesic mechanism is activated during noxious stimulus offset. This analgesic phenomenon may serve as a temporal contrast enhancement mechanism that amplifies awareness of stimulus offset and reinforces escape behaviors. Disruption of this mechanism may contribute importantly to chronic pain.

INTRODUCTION

Changes in noxious stimulus intensity have long been known to exert significant effects on various aspects of pain sensation (Hardy et al. 1952). More recently, both psychophysical and neural responses during increases in noxious stimulus intensity have been examined in detail (Harrison and Davis 1999; Pertovaara 1999; Yarnitsky et al. 1992; Yeomans and Proudfit 1996). Such responses during decreases in noxious stimulus intensity, however, remain largely uncharacterized. If the same nociceptive mechanisms process information about both increases and decreases in noxious stimulus intensities, then changes in perceived pain during stimulus decreases would be predicted to be indistinguishable from those evoked during equal magnitude stimulus increases.

Two lines of evidence indirectly suggest that different mechanisms are engaged during dynamic increases and decreases in noxious thermal stimuli. First, Robinson and colleagues (1983), in examining the ability to detect incremental increases in noxious stimulus intensity, observed in passing that changes in pain intensity ratings produced by incremental decreases in noxious stimulus intensity were larger than those produced by increases of equal magnitude. Second, anecdotal evidence suggests that thermal intensities sufficient to produce tissue damage (54°C) can be reached with little or no perception of pain when temperature is raised in a stepwise fashion via alternating decreases of 1°C and increases of 2°C (i.e., 47, 46, 48, 47, 49°C, …) (D. J. Mayer, personal communication). Together, these disparities between perceptual changes and stimulus temperatures suggest that an active analgesic mechanism may be engaged during the termination or reduction of a noxious stimulus.

To better characterize this possible analgesic response, volunteers were recruited to evaluate changes in perceived pain intensity during dynamic alterations in noxious thermal stimuli. Psychophysical responses following incremental decreases in noxious stimulus temperatures were directly compared with sensations following incremental increases in noxious stimulus temperatures to determine if an active analgesic mechanism is engaged during stimulus offset.

METHODS

Subjects

Twelve healthy volunteers (7 males and 5 females), ages 22–31, participated in this investigation. All subjects gave informed consent acknowledging that they understood that the experiment involved the presentation of heat-induced pain, that the methods to be used were clearly explained and understood, that no tissue damage would result from stimulation, and that they were free to terminate stimulation or to withdraw from the study at any time. All procedures were approved by the Institutional Review Board of the Wake Forest University School of Medicine.

Thermal stimulation

Thermal stimuli were delivered to the ventral surface of the dominant forearm via a 16 × 16-mm Peltier device with rise and fall rates of 6°C/s (Medoc TSAII, Ramat Yishai, Israel). This device was attached to the forearm with a Velcro strap and was maintained at a baseline temperature of 35°C. Three different types of stimulus trials were used.

EXPERIMENTAL TRIALS.

These trials were designed to compare responses to incremental decreases in noxious stimulus temperatures with those evoked by incremental increases. Each experimental trial consisted of three contiguous phases: an initial painful stimulus (T1, 47, 48, or 49°C, 5-s duration), a 1, 2, or 3°C increase to a second temperature (T2, 5-s duration), and a decrease to a test stimulus (T3), equal to T1 but with a duration of 20 s (Fig. 1).

Fig. 1.

Time course of experimental and control trial types. Each trial consisted of three contiguous phases: T1, an initial noxious thermal stimulus; T2, a 2nd noxious thermal stimulus at least 1°C greater than T1; and T3, a 3rd thermal stimulus equal to T1 for experimental trials or equal to 35°C for control trials. Pain intensity ratings (↓) were obtained 4 s after the start of T1, 4 s after the start of T2, and 5, 10, 15, and 20 s after the start of T3.

CONTROL TRIALS.

These trials used the same T1 and T2 stimuli as the experimental trials but used a T3 of 35°C. These trials provided a means to determine if the change in pain intensity evoked by a 1, 2, or 3°C decrease in noxious stimulus temperature was distinguishable from that produced by a step down to a clearly innocuous temperature.

CONSTANT TEMPERATURE TRIALS.

The disproportionately large drops in pain ratings following the T2–T3 temperature decrease in the experimental trials could potentially be attributed to the adaptation of primary afferents known to occur during prolonged stimulation (LaMotte et al. 1983). To rule out this possibility, subjects also rated pain intensity during constant temperature stimulation (35, 47, 48, and 49°C; 35-s duration) to characterize the degree of adaptation.

To prevent tissue damage, a maximum T2 temperature of 50°C was used in all trials. To minimize sensitization or adaptation, all trials were separated by approximately 2 min and were performed on previously unstimulated sites on the skin. Experimental and control trials were presented once per subject in a randomized order, while constant temperature trials were presented twice in a randomized order (once near the beginning and once near the end of each testing session).
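For concreteness, the stimulus schedules described above can be summarized as plateau sequences. The following sketch (Python) is illustrative only, not part of the original stimulation software; the helper and variable names are assumptions, and the 6°C/s ramps between plateaus are omitted for simplicity.

    # Minimal sketch of the nominal stimulus schedules described above.
    # Names are illustrative; ramps between plateaus are not modeled.

    BASELINE = 35.0                          # baseline probe temperature, degC
    RATING_TIMES_S = [4, 9, 15, 20, 25, 30]  # 4 s into T1, 4 s into T2,
                                             # and 5, 10, 15, 20 s into T3

    def trial_profile(t1, step, trial_type):
        """Return the plateau sequence as (duration_s, temperature_degC) pairs."""
        t2 = min(t1 + step, 50.0)            # T2 never exceeded 50 degC
        if trial_type == "experimental":
            return [(5, t1), (5, t2), (20, t1)]         # T3 returns to T1 for 20 s
        if trial_type == "control":
            return [(5, t1), (5, t2), (20, BASELINE)]   # T3 steps down to 35 degC
        if trial_type == "constant":
            return [(35, t1)]                           # single 35-s plateau
        raise ValueError(trial_type)

    # Example: the 49 -> 50 -> 49 degC experimental trial
    print(trial_profile(49, 1, "experimental"))  # -> [(5, 49), (5, 50), (20, 49)]

Under this scheme, ratings in the constant temperature trials are prompted at the same clock times as in the other trial types, so that the rating taken 15 s after trial onset corresponds to the first T3 rating of the experimental trials.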

Assessment of pain intensity

Subjects rated pain intensity (as defined by Price et al. 1989) using a mechanical visual analog scale (VAS, 15 cm length, 0–10 range, verbal anchors of “no pain sensation” and “most intense pain sensation imaginable”) (Price et al. 1983, 1994). Pain intensity ratings were obtained 4 s after the start of T1, 4 s after the start of T2, and 5, 10, 15, and 20 s after the start of T3. In the case of constant temperature trials, pain intensity ratings were obtained at analogous points in time. Subjects were prompted for ratings by a computer-controlled audio signal.

Statistical analysis

In experimental and control trials, the difference in psychophysical pain intensity ratings (Δ) produced by the T1-to-T2 (e.g., 48 to 49°C) and T2-to-T3 (e.g., 49 to 48°C) temperature changes was first used to describe the effect that an increase or decrease in stimulus temperature had on pain intensity. Within-subjects analyses of variance (ANOVA) of Δ scores determined if the magnitude of changes in perceived pain intensity evoked during stimulus decreases was significantly different from that evoked during stimulus increases.

For comparisons involving constant temperature stimulus trials, absolute VAS scores served as the dependent variable. Within-subjects analyses of variance determined if pain ratings following incremental decreases in stimulus temperatures were statistically different from ratings of constant temperature stimuli.
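As a sketch of this analysis logic (not the original analysis code; the file and column names below are assumptions), the Δ scores for a given T1/T2 combination can be computed and compared within subjects as follows. With only two within-subject conditions, the within-subjects ANOVA reduces to a paired t-test on the same data, with F(1, n − 1) = t².

    # Illustrative Delta-score comparison for one T1/T2 combination.
    # The CSV file and its column names are hypothetical.
    import pandas as pd
    from scipy import stats

    # One row per subject; VAS pain ratings (0-10) taken 4 s into T1,
    # 4 s into T2, and 5 s into T3.
    ratings = pd.read_csv("ratings_49_50.csv")  # columns: subject, vas_t1, vas_t2, vas_t3

    delta_increase = ratings["vas_t2"] - ratings["vas_t1"]  # change from the T1 -> T2 increase
    delta_decrease = ratings["vas_t2"] - ratings["vas_t3"]  # change from the T2 -> T3 decrease

    # Paired comparison of the two Delta magnitudes across subjects;
    # for two within-subject conditions, F(1, n-1) equals t squared.
    t, p = stats.ttest_rel(delta_decrease, delta_increase)
    n = len(ratings)
    print(f"F(1,{n - 1}) = {t ** 2:.2f}, P = {p:.4f}")

For the constant temperature comparisons, the same paired approach applies, with the absolute VAS scores at matched time points as the dependent variable.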

RESULTS

Effects of incremental increases and decreases in stimulus temperatures

Changes in perceived pain intensity associated with incremental decreases in noxious stimulus temperatures were markedly larger than those evoked by equal-magnitude increases in noxious stimulus temperatures (Fig. 2). Within-subjects ANOVA of the Δ values of the increase from T1 to T2 and the decrease from T2 to T3 revealed that temperature decreases produced significantly greater changes in perceived pain intensity across all T1 and T2 combinations (Table 1, all comparisons significant at P < 0.0227). Furthermore, these differences were substantial. For example, a 1°C decrease from 50 to 49°C evoked a change in ratings of pain intensity 271% larger than that evoked by the increase from 49 to 50°C. For steps of 1°C, the ratios of Δdecrease:Δincrease were consistently large across all T1 temperatures (F(2,22) = 0.03, P = 0.9669). Similarly, these ratios were not significantly altered by the size of the T2 step (F(2,22) = 1.72, P = 0.2024).
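For clarity, these quantities can be written in terms of the VAS ratings obtained at each trial phase; the sign convention below is an assumption, chosen so that both Δ values are positive for the typical response, and follows the definition of Δ given in METHODS:

    \[
    \Delta_{\mathrm{increase}} = \mathrm{VAS}_{T2} - \mathrm{VAS}_{T1},
    \qquad
    \Delta_{\mathrm{decrease}} = \mathrm{VAS}_{T2} - \mathrm{VAS}_{T3},
    \qquad
    R = \frac{\Delta_{\mathrm{decrease}}}{\Delta_{\mathrm{increase}}}.
    \]

Under the usual reading, a figure such as "271% larger" presumably corresponds to 100 × (R − 1) averaged across subjects.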

Fig. 2.

Comparison of changes in pain intensity ratings produced by increases and decreases in stimulus temperatures. In all cases, temperature decreases (T2–T3) produced significantly (*P < 0.05) larger changes in pain intensity ratings than equal magnitude temperature increases (T1–T2). Furthermore, the changes in pain intensity ratings evoked by slight (1–3°C) drops within the noxious range (T2–T3) were indistinguishable from those evoked by considerably larger (13–15°C) drops to clearly innocuous levels (T2 to 35°C). These disproportionately large drops in pain intensity ratings suggest a differential processing of nociceptive information during dynamic decreases in stimulus intensity.

Table 1.

Analyses of changes in pain intensity ratings during increases and decreases in stimulus temperatures

Control trials with a T3 of 35°C confirmed that the 1, 2, or 3°C T2–T3 decreases produced a robust change in perceived pain intensity. The magnitude of the decrease in pain intensity observed in experimental trials was so large that subjects were unable to distinguish a step down to a T3 of 47, 48, or 49°C from a step down to an innocuous 35°C (Fig. 2, Table 1, no comparisons significant). In other words, a 1, 2, or 3°C step down to temperatures within the noxious range felt no different from a 13, 14, or 15°C step down to a 35°C stimulus (Fig. 2).

Static vs. dynamic stimuli

In the constant temperature trials, ratings of pain intensity following 15 s of stimulation (a time point equal to the first T3 rating in the experimental trials, Fig. 1) decreased only 28 ± 0.065% from their initial T1 value for 49°C and 44 ± 0.094% for 48°C. In contrast, pain intensity ratings following the 50 to 48°C and 50 to 49°C decreases in the experimental trials were significantly lower than those of the corresponding constant temperature stimuli (F(1,11) = 8.23, P < 0.015 and F(1,11) = 10.06, P < 0.0089, respectively; Fig. 3, B and C), while pain intensity ratings following the 49 to 48°C decrease exhibited a trend toward being smaller than those of the 48°C constant temperature stimuli (F(1,11) = 3.81, P < 0.077, Fig. 3A). Therefore the relatively large decreases in pain intensity ratings evoked by slight decreases in noxious stimulus temperatures are distinct from the smaller changes in pain intensity that occur during adaptation to 48 and 49°C constant temperature stimuli and indicate that an analgesic mechanism is engaged during stimulus offset. In contrast, perceptual decreases following T2 offset in experimental trials using a 47°C T1 were not distinguishable from the substantial adaptation (84.4 ± 0.67%) that occurred during 47°C constant temperature trials.
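The adaptation percentages above presumably express the drop in rating after 15 s of stimulation relative to the initial T1 rating, i.e.,

    \[
    \%\,\mathrm{adaptation} \;=\; 100 \times
    \frac{\mathrm{VAS}_{4\,\mathrm{s}} - \mathrm{VAS}_{15\,\mathrm{s}}}{\mathrm{VAS}_{4\,\mathrm{s}}},
    \]

where VAS(4 s) is the rating obtained 4 s after stimulus onset (the initial T1 value) and VAS(15 s) is the rating obtained 15 s after onset, the time point matching the first T3 rating of the experimental trials.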

Fig. 3.

Time course of pain intensity ratings during experimental, control, and constant temperature trials. Pain intensity ratings at T3 (5 s) in the experimental trials dropped to levels substantially lower than those occurring during adaptation to corresponding constant temperature stimuli (*P < 0.05). Thus stimulus offset evoked analgesia. After 15 s of continued stimulation, T3 (20 s) pain intensity ratings in the experimental trials rose to levels indistinguishable from those of the constant temperature stimuli. Together, these observations indicate that the analgesia evoked during stimulus offset is distinct from adaptation.

Pain intensity ratings were examined 20 s after the onset of T3 (T3 20 s) to determine if the analgesia at T3 (5 s) diminished during 15 s of continued stimulation. In all cases, analgesia evoked by incremental temperature decreases exhibited a complete reversal by T3 (20 s), in that pain intensity ratings rose to levels indistinguishable from those of the corresponding constant temperature stimuli (50 to 48°C: F(1,11) = 0.71, P = 0.420; 50 to 49°C: F(1,11) = 0.13, P = 0.729; 49 to 48°C: F(1,11) = 0, P = 0.950; Fig. 3). These increases in pain intensity ratings between T3 (5 s) and T3 (20 s) during the experimental trials contrast sharply with the continued, gradual decreases in pain intensity ratings of the constant temperature stimuli and further distinguish the analgesia at T3 (5 s) from adaptation.

DISCUSSION

The present findings demonstrate that a potent analgesia is evoked by slight incremental decreases in noxious stimulus temperatures. We have named this novel analgesic phenomenon offset analgesia. Offset analgesia is distinct from the adaptation and/or primary afferent fatigue that occurs during prolonged and/or repeated noxious stimulation (LaMotte and Campbell 1978; LaMotte et al. 1983). It is temporally coupled with incremental decreases in stimulus temperature, and it is reversed by 15 s of continued noxious stimulation (Fig. 3). This time course indicates that offset analgesia is an active process and raises the possibility that central inhibitory mechanisms may play a critical role in this phenomenon.

Damage to central inhibitory mechanisms has been demonstrated to occur during chronic pain states. Peripheral nerve injuries in animal models of neuropathic pain have been reported to cause excitotoxic loss of inhibitory interneurons in the superficial laminae of the spinal cord (Ibuki et al. 1997; Mayer et al. 1999; Sugimoto et al. 1990). Consistent with this potential loss of inhibition, neuropathic pain patients report that exposure to a brief tactile or thermal stimulus produces painful sensations that long outlast the stimulus (Lindblom 1985; Noordenbos 1959). Thus disruption of central inhibitory mechanisms that potentially mediate offset analgesia may be an integral component of the pathophysiology of chronic pain.

Under normal circumstances, pain is a signal of actual or impending tissue damage. Why, then, do minor decreases in stimulus temperature cause such disproportionately large decreases in pain intensity? Is a 49°C temperature any less dangerous to tissue when preceded by a higher temperature than when presented alone? Signals indicating the termination of painful stimuli are potentially as important as signals indicating the continued presence of injurious stimuli. Just as lateral inhibition serves to enhance spatial contrast in the visual system (Hartline and Ratliff 1957), offset analgesia may serve to enhance temporal contrast during dynamic changes in noxious stimulus intensity. Such amplification of the perception of small decreases in noxious stimulus intensity produces a readily detectable signal that may facilitate escape responses from injurious stimuli.

Acknowledgments

This work was supported by the Forsyth County United Way and Wake Forest University School of Medicine venture funds.

Footnotes

  • R. C. Coghill (E-mail: rcoghill@wfubmc.edu).

REFERENCES
