I've nothing specifically relevant to add to the "who has the bigger epeen" hardware discussion; however, I'll note that I'm running a quad-core first-gen Xeon (now about four years old) and a Radeon R7 260 2 GB card, and I've had no performance issues in practically any game. I also run a Dell 30" (2560x1600) and an Acer 22" (1920x1080), switching between them depending on the balance of resolution, detail, and performance I want.
When I previously ran Nvidia graphics cards and AMD processor chipsets, I had a slew of problems: overheating, graphics tearing, and other general abnormalities. I quickly learned that Intel chipsets usually run a lot cooler by default and have been the safer investment for me. I've also noticed that Nvidia tends to run games in a sort of "emulated" fullscreen — borderless windowed mode that looks like true fullscreen — rather than actual exclusive hardware fullscreen (with its own buffering), which in my experience makes them slower on average. I've been observing this for many years.
Not sure what the actual mechanical differences are (I'm not a hardware driver programmer), but observing the practical differences in efficiency has led me personally to favor ATI/AMD chipsets for graphics and Intel for processors. Sure, this also means running at whatever speed the hardware is "locked" at, but it also means my hardware lasts longer and I replace less of it over time.