We're trying to capture key performance scenarios.
We've identified some 'large' use cases, like loading a form that contains a dynamic table with 50 rows, and we're tracking both loading time and memory usage.
Our approach basically snapshots memory and starts a stopwatch before performing an operation, then stops the stopwatch and asserts that the elapsed time and memory growth are below specified limits.
I rolled this up into a "PerformanceAsserter" class: you wrap the operation in a using statement, and the assertions run in 'Dispose', so the measured portion of the test is clearly demarcated by the curly braces.
Here is some sample code:
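The original sample wasn't included above, so here is a minimal sketch of what such a class might look like. The constructor parameters, limit types, and the choice to throw a plain exception on failure are all assumptions; the real class may assert through a test framework instead.

```csharp
using System;
using System.Diagnostics;

// Hypothetical sketch of the PerformanceAsserter described above.
// Snapshot memory and start timing in the constructor; check the
// limits in Dispose, which runs when the using block exits.
public sealed class PerformanceAsserter : IDisposable
{
    private readonly TimeSpan _maxDuration;
    private readonly long _maxMemoryBytes;
    private readonly long _startMemory;
    private readonly Stopwatch _stopwatch;

    public PerformanceAsserter(TimeSpan maxDuration, long maxMemoryBytes)
    {
        _maxDuration = maxDuration;
        _maxMemoryBytes = maxMemoryBytes;
        // Snapshot memory first, then start the stopwatch, so the
        // (possibly slow) GC snapshot isn't counted in the timing.
        _startMemory = GC.GetTotalMemory(forceFullCollection: true);
        _stopwatch = Stopwatch.StartNew();
    }

    public void Dispose()
    {
        _stopwatch.Stop();
        long memoryUsed = GC.GetTotalMemory(forceFullCollection: false) - _startMemory;

        if (_stopwatch.Elapsed > _maxDuration)
            throw new Exception(
                $"Operation took {_stopwatch.Elapsed}, limit was {_maxDuration}.");
        if (memoryUsed > _maxMemoryBytes)
            throw new Exception(
                $"Operation used {memoryUsed} bytes, limit was {_maxMemoryBytes}.");
    }
}

// Example usage inside a test (LoadFormWithDynamicTable is hypothetical):
//
// using (new PerformanceAsserter(TimeSpan.FromSeconds(15), 50_000_000))
// {
//     LoadFormWithDynamicTable(rowCount: 50);
// }
```

Note that managed-memory snapshots via GC.GetTotalMemory are approximate, so the memory limit is best treated as a coarse regression guard rather than an exact measurement.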
This isn't an ideal solution; I'd rather see per-test data collected and compared over time, but it gets the job done. It works best when you already have a performance target, like "the screen should load in under 15 seconds". Then it becomes easy to run the test on a range of devices and get feedback on which ones meet the criteria, which ones don't, and by how much.
A couple of people I've spoken to were interested in this, and I thought others might be as well, so I'm posting it here.