This is Nvidia's consumer card based on their new high end GK110 GPU. It's the same Kepler architecture as the rest of the GeForce GTX whatever cards. Whereas GK104 has 8 SMXes and 4 memory channels, GK110 has 15 and 6.
The GeForce GTX Titan has to clock itself substantially lower than the GeForce GTX 680, 670, 660, etc., however. It's also a salvage part, with one of the SMXes disabled, most likely because yields aren't that great. Not that you'd expect excellent yields on a 551 mm^2 chip.
While Nvidia isn't allowing benchmarks to be posted yet, from the theoretical specs, you can expect it to come in at 40%-50% faster than a GeForce GTX 680. Its advantage over a Radeon HD 7970 GHz Edition should be a bit less than that, but still substantial. This is the new top end GPU chip, and AMD doesn't have a competing product this generation. If you want maximum performance at any price, this is the card you want.
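For those curious where the 40%-50% figure comes from, here's the back-of-the-envelope math. The core counts and base clocks below are my assumptions from rumored specs (14 enabled SMXes at 192 shaders each, ~837 MHz for the Titan; ~1006 MHz for the GTX 680), not confirmed numbers:

```python
# Rough theoretical shader throughput comparison.
# All clocks and core counts are assumptions from rumored specs,
# not official numbers.
titan_cores, titan_mhz = 14 * 192, 837    # 14 of 15 SMXes enabled, ~837 MHz base
gtx680_cores, gtx680_mhz = 1536, 1006     # 8 SMXes, ~1006 MHz base

titan_throughput = titan_cores * titan_mhz
gtx680_throughput = gtx680_cores * gtx680_mhz

print(f"Titan / GTX 680: {titan_throughput / gtx680_throughput:.2f}x")
# roughly 1.46x, i.e. in the middle of the 40%-50% range
```

Real game performance won't track raw shader throughput exactly, of course, but it's a decent first-order guess.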
If you're on any semblance of a budget, however, this is not the card you're looking for. The price tag is $1000. It offers perhaps double the graphical performance of a GeForce GTX 660 or Radeon HD 7870, while costing more than four times as much. That makes it terrible on a price/performance basis, but Nvidia isn't marketing it as a budget-friendly product. If you want top end single-GPU performance, this is the only way to get it, so you pay what it costs or do without.
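To put numbers on the price/performance claim, here's a quick sketch. The ~$230 street price for a GeForce GTX 660 is my assumption, as is treating the Titan as exactly 2x its performance:

```python
# Back-of-the-envelope price/performance. The $230 GTX 660 street price
# and the 2x performance multiplier are assumptions for illustration.
titan_price, titan_perf = 1000.0, 2.0     # perf relative to a GTX 660
gtx660_price, gtx660_perf = 230.0, 1.0

titan_value = titan_perf / titan_price    # performance per dollar
gtx660_value = gtx660_perf / gtx660_price

print(f"Titan perf/$ relative to GTX 660: {titan_value / gtx660_value:.0%}")
# under half the performance per dollar of the midrange card
```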
Nvidia promises a TDP of 250 W. Do I believe that's an honest TDP? In a word: no. The Radeon HD 7970 GHz Edition has a TDP of 250 W. The GeForce GTX 680 has an official TDP of 190 W, though an honest figure is probably a bit higher than that. To stay inside of 250 W, Nvidia would need huge efficiency gains over both of those on the same process node. Do I believe they can do that while burning far more power on far more very highly clocked GDDR5 memory, binning out the least efficient chips (since the good ones go into Tesla and Quadro cards that cost several times as much), and spending a fair bit of power on GPGPU/enterprise bloat, which makes Tahiti substantially less efficient than AMD's other 28 nm GPUs? Definitely not. So I'd assume an honest TDP in the ballpark of 300 W.
Nvidia expects the card to be available for sale next week.