The purpose of this study was to evaluate how modifications to item response anchors affect responses to survey items. Twenty-nine items were administered to two random samples of Federal Aviation Administration employees in 1993 and 1995 as part of more extensive attitude surveys. Changes to the 5-point Likert response scales between the two administrations ranged from no change at all to extensive re-anchoring of the response categories. Item responses were modeled with two-parameter graded response models from item response theory, and changes in item functioning between the two administrations were assessed using the differential item functioning (DIF) procedure recommended by Muraki (1997).
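To illustrate the modeling approach, the sketch below shows how Samejima's graded response model, the two-parameter model named in the abstract, assigns probabilities to the five ordered response categories of a Likert item. The discrimination and threshold parameters here are hypothetical placeholders, not estimates from the FAA data.

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima graded response model: probability of each ordered
    response category for a respondent at latent trait level theta.

    a          -- item discrimination (slope) parameter
    thresholds -- K-1 ordered category boundary parameters for a
                  K-category item (here K = 5 for a Likert scale)
    """
    # Cumulative curves P(X >= k): 1 below the lowest category,
    # a 2PL logistic at each boundary, 0 above the highest category.
    cum = [1.0]
    cum += [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
    cum.append(0.0)
    # Each category probability is the difference of adjacent
    # cumulative curves: P(X = k) = P(X >= k) - P(X >= k+1).
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]

# Hypothetical parameters for one 5-point item
probs = grm_category_probs(theta=0.5, a=1.2,
                           thresholds=[-1.5, -0.5, 0.4, 1.3])
```

Under this model, DIF between the 1993 and 1995 administrations would show up as year-to-year differences in an item's estimated slope or threshold parameters after the two samples are placed on a common latent scale.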