philsoft wrote: I don't think so. It doesn't make sense to say that taking a digital image and converting it to analog can make it better than the source material.
The fact is that every time there is a conversion or an amplification (of analog signals), or virtually any other form of processing, there will be a loss. By loss I mean that it is impossible to retain the original waveform without introducing noise and/or altering it.
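To make that concrete, here is a minimal sketch (a toy simulation, not a model of any real hardware, and the sample rate and noise level are assumptions): a clean test tone is pushed through a hypothetical analog stage plus a 16-bit conversion over and over, and the signal-to-noise ratio drops a little on every pass.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, tone = 48_000, 1_000                     # assumed sample rate and test frequency
t = np.arange(fs) / fs
original = np.sin(2 * np.pi * tone * t)      # one second of a clean sine wave

def analog_stage(x, noise_rms=1e-4):
    """Hypothetical analog hop (cable, amp input): adds a little wideband noise."""
    return x + rng.normal(0.0, noise_rms, x.shape)

def convert_16bit(x):
    """Hypothetical A/D + D/A round trip: rounds to a 16-bit grid."""
    step = 2.0 / (2 ** 16)
    return np.round(x / step) * step

def snr_db(ref, x):
    """Signal-to-noise ratio of x relative to the untouched reference, in dB."""
    noise = x - ref
    return 10 * np.log10(np.sum(ref ** 2) / np.sum(noise ** 2))

signal = original.copy()
for hop in range(1, 6):
    signal = convert_16bit(analog_stage(signal))
    print(f"after {hop} conversion pass(es): SNR = {snr_db(original, signal):.1f} dB")
```

The exact numbers mean nothing; the point is simply that the SNR never goes back up, no matter how the stages are ordered.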
However, if you start with a great source, like original masters for audio or a top-quality film print, and play it back on an analog device capable of rendering it at the resolution and fidelity it was recorded in, you will get picture and sound superior to anything you will ever get from any digital copy of that original.
I doubt that anyone in the consumer market has access to or can afford the kind of equipment that is required to play any material at that level.
Once material is digitized, whether for a disc, for local storage, or for transmission over the internet, it has already incurred some loss, so the goal must be to minimize any further losses. That means converting and processing it as little as possible, which in turn means keeping it digital until it reaches the best device in the chain for that purpose.
Where that best device sits is exactly where the problem comes in. In most homes the TV is the best device for processing video, but it is often not the best for audio, which is why so many people have a separate device for handling the audio.
If the signal arrives in the home in a digital format and the TV really is the best device at making the signal ready for viewing, then the digital video should be kept in that form until the last moment, and in that case the HDMI cables will produce the best picture.
But if there is another device that has better processing (and many of the newest DVD and Blu-ray players have excellent processors), then that device should be used to convert the signal, and in that case the component cables will produce the best picture.
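Here is the same toy idea applied to that trade-off (again a sketch; the bit depths and noise levels are assumptions, not measurements of any real player or TV): if the TV's own converter is the weak link, letting a better player convert early can win even though the signal then crosses extra analog stages.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(48_000) / 48_000
src = np.sin(2 * np.pi * 1_000 * t)              # the digital source signal

def convert(x, bits):
    """Hypothetical D/A conversion, with quality modeled only by bit depth."""
    step = 2.0 / (2 ** bits)
    return np.round(x / step) * step

def analog_hop(x, noise_rms=1e-4):
    """Hypothetical analog cable or input stage: adds a little noise."""
    return x + rng.normal(0.0, noise_rms, x.shape)

def snr_db(ref, x):
    return 10 * np.log10(np.sum(ref ** 2) / np.sum((x - ref) ** 2))

# HDMI-style: stay digital all the way, one conversion done by the TV itself
tv_converts = analog_hop(convert(src, bits=10))            # assumed weaker TV converter
# Component-style: the player converts, then the signal crosses extra analog stages
player_converts = analog_hop(analog_hop(convert(src, bits=16)))

print(f"TV converts (HDMI-style)         : {snr_db(src, tv_converts):.1f} dB SNR")
print(f"player converts (component-style): {snr_db(src, player_converts):.1f} dB SNR")
```

With these assumed numbers the player's better converter wins despite the extra analog hop; set both paths to the same bit depth and the stay-digital chain comes out ahead instead, which is exactly the trade-off described above.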
Now to the real crux of the matter: most people cannot actually tell the difference between the various delivery methods for either audio or video. Since that is true, HDMI should be used as much as possible simply because it reduces interconnection complexity.
Lastly, the bottom line is that each person should use whatever connection method they see as best and "to heck with what anyone else says." Viewing and listening are personal choices, and the decision should be based only on the individual's preferences and what they can afford.
BTW: I can demonstrate (and have done so) that a record made from a high-quality master and played on a mid-level turntable and mid-level amp produces a cleaner and more accurate sound than the commonly used digital copies played on a comparable system. Of course, I cannot actually hear any difference (old ears), so I use digital for all my audio.
By a similar method I am sure that analog really is superior in many cases and that 1080p is better than 720p, but I cannot see the difference, so I use digital connections and 720p, and I do not buy Blu-ray since standard DVD resolution is just fine for me.