Today's processors can decode 720p HD content, but I think only Apple's high-end dual-processor G5 setup is capable of reliably running 1080p content. Below is a screenshot of my processor usage (P4 2.8C) while playing the Batman Begins trailer in 720p from Apple's HD gallery:
For the majority of the trailer the processor usage is between 40 and 50%, spiking to 58% at its highest. Since 720p is less than half as demanding as 1080p, my processor would most likely be pegged at 100% and dropping frames for much of the trailer.
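That "less than half as demanding" figure comes straight from the pixel counts. Here's a quick back-of-the-envelope sketch (assuming decode cost scales roughly with pixels per frame, which ignores bitrate and codec settings):

```python
# Rough pixel-count comparison between 720p and 1080p frames.
# Assumption: decoding work scales roughly with pixels per frame.

def pixels(width, height):
    """Pixels per frame for a given resolution."""
    return width * height

p720 = pixels(1280, 720)     # 921,600 pixels per frame
p1080 = pixels(1920, 1080)   # 2,073,600 pixels per frame

# 1080p carries 2.25x the pixels of 720p, so a CPU at ~50% on 720p
# would be pushed past its limit on 1080p.
print(p1080 / p720)  # 2.25
```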
There's an AnandTech preview of the next generation of ATi GPUs, and it states that even the low-end RV515 will have some form of H.264 decoding (though that could just mean software decoding, so I'd wait for the actual reviews).
I'm interested in the RV530 series: 12 pipelines, low power consumption, and H.264 decoding should make it a good all-round gaming/multimedia card that won't cost a bundle.
Edit: I just realized that because the Batman Begins trailer was only 1280x544 widescreen, it wasn't as accurate an example as a true 1280x720 clip. Here are a couple of processor-usage shots while running clips from The Macaulay Library (1264x780) and the BBC Motion Gallery (1280x720). The yellow line represents 50% processor usage:
BBC Motion Gallery:
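The correction above is also easy to put in numbers: the widescreen trailer frame carries only about three-quarters of the pixels of a true 720p frame, while the Macaulay clip is slightly larger than 720p. A minimal sketch (same assumption as before, that decode cost tracks pixel count):

```python
# Why the 1280x544 trailer understated the load compared to true 720p.
# Assumption: decoding work scales roughly with pixels per frame.

trailer = 1280 * 544    # 696,320 pixels (Batman Begins trailer)
true_720p = 1280 * 720  # 921,600 pixels
macaulay = 1264 * 780   # 985,920 pixels (The Macaulay Library clip)

print(round(trailer / true_720p, 2))   # ~0.76 of a 720p frame
print(round(macaulay / true_720p, 2))  # ~1.07 of a 720p frame
```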