I understand the appeal of increased resolution, improved textures, overclocking to eliminate the slowdown and flicker that occurred on the original hardware, and so on. It's an interesting new way to look at this stuff, but it should not be considered pure "emulation." Call it what it is: modding. I'm not against this stuff per se, but it should never, ever be considered the end goal of the emulation scene.
Also, don't forget that home systems were meant to be viewed on low-resolution CRTs through analog RCA composite input (at best). Sure, the developers likely worked on RGB computer monitors, but they were still outputting at a low native resolution with no bilinear filtering or upscaling. On a modern display, those games are going to look like crap, and not just because graphics have advanced so much in the interim, but also because they have been completely stripped of the picture degradation introduced by the original delivery methods. There's something to be said for simulating that loss of picture quality/fidelity: it's the only way to reproduce things like the Genesis/Mega Drive's dithered "transparency" and "extra" colors, or the "larger" color palette of composite-mode CGA in old PC games, because in both cases the blur of the composite signal blends adjacent pixels into colors the hardware can't output directly. Sometimes you have to make the picture look "worse" to make it look "better."
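For the curious, here's a toy Python sketch of that blending effect. It is not real NTSC demodulation, and the three-tap kernel and function names are my own invention, not any actual emulator's filter; it's just a crude horizontal low-pass to show how a Genesis-style dither pattern collapses into a single blended color:

```python
# Toy illustration (not real NTSC demodulation): a crude horizontal
# low-pass filter, roughly how composite video smears adjacent pixels.

def composite_blur(scanline, kernel=(0.25, 0.5, 0.25)):
    """Blend each RGB pixel with its horizontal neighbors (edges clamped)."""
    out = []
    n = len(scanline)
    for i in range(n):
        left = scanline[max(i - 1, 0)]
        mid = scanline[i]
        right = scanline[min(i + 1, n - 1)]
        out.append(tuple(
            round(kernel[0] * lv + kernel[1] * mv + kernel[2] * rv)
            for lv, mv, rv in zip(left, mid, right)
        ))
    return out

# Alternating blue and black pixels, a common Genesis "50% transparency" trick.
water = (0, 0, 255)
shadow = (0, 0, 0)
dithered = [water if i % 2 == 0 else shadow for i in range(8)]

print(composite_blur(dithered))
# On a sharp RGB display you see stripes; after the blur, the interior
# pixels all land near (0, 0, 128) -- a single half-blue the dither was
# standing in for.
```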
At least when you add a layer of CRT simulation or fake vector glow/flicker, it's just a "mask" on top of the base system emulation, which is no less accurate than it would be without it.
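To make that "mask" point concrete, here's a minimal sketch (hypothetical names, not taken from any real emulator) of a scanline filter as pure post-processing: the framebuffer comes out of the emulator core exactly as it always would, and the filter only dims alternate rows afterward.

```python
# Minimal sketch: a scanline "mask" applied as a post-process.
# The core's framebuffer is untouched; the darkening is purely
# cosmetic, which is why a filter like this costs nothing in accuracy.

def apply_scanlines(framebuffer, darken=0.5):
    """Dim every other row of an RGB framebuffer (a list of rows of tuples)."""
    return [
        [tuple(int(c * darken) for c in px) for px in row] if y % 2 else row
        for y, row in enumerate(framebuffer)
    ]

# A tiny stand-in for one emulated frame: 4 rows of solid gray.
frame = [[(200, 200, 200)] * 4 for _ in range(4)]
masked = apply_scanlines(frame)

for row in masked:
    print(row)
# Rows 1 and 3 come out dimmed; the source frame -- and the emulation
# that produced it -- are exactly the same either way.
```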