Traditional techniques for performance analysis provide a means for extracting and analyzing raw performance information from applications. Users then reason about and compare this raw performance data to their performance expectations for important application constructs. This comparison can be tedious, difficult, and error-prone at the scale and complexity of today's architectures and software systems. To address this situation, we present a methodology and prototype that allow users to assert performance expectations explicitly in their source code using performance assertions. As the application executes, each performance assertion in the application implicitly collects the data needed to verify the assertion. Because the user attaches a performance expectation to individual code segments, the runtime system can jettison raw data for measurements that meet their expectations, while reacting to failures with a variety of responses. We present several compelling uses of performance assertions with our operational prototype, including raising a performance exception, validating a performance model, and empirically adapting an algorithm to an architecture at runtime.
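To make the idea concrete, the following is a minimal illustrative sketch of a performance assertion, not the paper's actual prototype or API: all names (`pa_assert`, `PerformanceAssertionError`, `max_seconds`) are hypothetical, and wall-clock time stands in for whatever metric an expectation might constrain. It shows the two behaviors described above: raw data for a passing measurement is discarded, and a failed expectation triggers a response (here, a performance exception).

```python
import time


class PerformanceAssertionError(Exception):
    """Raised when a measured region violates its asserted expectation."""


class pa_assert:
    """Hypothetical context manager wrapping a code segment with a
    performance expectation, here an upper bound on elapsed seconds.

    On exit it verifies the measurement; passing measurements are
    jettisoned (nothing is recorded), while failures raise a
    performance exception.
    """

    def __init__(self, name, max_seconds):
        self.name = name
        self.max_seconds = max_seconds

    def __enter__(self):
        # Data collection is implicit: the user only states the expectation.
        self.start = time.perf_counter()
        return self

    def __exit__(self, exc_type, exc, tb):
        elapsed = time.perf_counter() - self.start
        if exc_type is None and elapsed > self.max_seconds:
            # One possible response to a failed expectation:
            # raise a performance exception.
            raise PerformanceAssertionError(
                f"{self.name}: elapsed {elapsed:.3f}s exceeds "
                f"expectation of {self.max_seconds:.3f}s"
            )
        # Expectation met: discard the raw measurement.
        return False
```

A user would wrap an important code segment, e.g. `with pa_assert("inner_loop", max_seconds=1.0): ...`; other responses mentioned in the text, such as validating a model or switching algorithms, would replace the `raise` with the corresponding action.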