Hi!
I work for a small independent movie theater (50 seats, non-profit, all volunteer staff). We recently renewed our equipment, but we don't have the budget for a professional DCI-compliant cinema system, so we are building a kind of "high-end home theater system" instead.
The projector is a Panasonic PT-MZ770. The video source is a Windows 10 machine running VLC, MPC-HC and NeoDCP. Video output is HDMI from a GeForce 1070.
We screen mainly indie features, shorts and so on, so we get a bunch of different formats (DCP, H.264/x264, ProRes, etc.) at various frame rates.
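In case it's useful context, this is roughly how we could check each file's frame rate before a screening (a small sketch, assuming ffprobe from the ffmpeg package is installed on the playback machine; the file name is just a placeholder):

```python
import subprocess

def frame_rate(path: str) -> str:
    """Ask ffprobe for the frame rate of the first video stream in a file."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=r_frame_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True)
    # ffprobe prints the rate as a fraction, e.g. "24000/1001" for 23.976 fps
    return out.stdout.strip()

print(frame_rate("short_film.mov"))  # placeholder file name
```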
The PT-MZ770 has a native resolution of 1920x1200, while of course all the movies are either 1920x1080 files or 2K DCPs. We are having issues with jittering/flickering/jerky playback depending on the file being played. The projector's HDMI input was initially set (not by me) to 1920x1080i. It seems that if an input resolution is forced, the vertical scan frequency must be set to a fixed value as well. I don't know why it was set up like this - maybe because the computer in use when the projector was installed was a Mac mini.
My question is: how should we set up the HDMI input so it handles all video sources "automatically"? Should we leave it on the default EDID? Or set 1920x1080i or 1920x1080p? And at which frequency? Is 60Hz fine for all frame rates?
The manual says:
Select [60Hz], [50Hz], [30Hz], [25Hz], or [24Hz] when [1920x1080p] is selected for [RESOLUTION].
Select [60Hz], [50Hz], or [48Hz] when [1920x1080i] is selected for [RESOLUTION].
Select [60Hz] or [50Hz] when anything other than [1920x1080p] or [1920x1080i] is selected for [RESOLUTION].
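My (possibly wrong) understanding of why the frequency matters: if the source frame rate doesn't divide evenly into the output refresh rate, the player has to hold some frames longer than others, which shows up as judder. A rough sketch of that cadence, assuming the player simply repeats frames with no interpolation:

```python
import math

def cadence(source_fps: float, refresh_hz: float, frames: int = 10):
    """How many display refreshes each source frame is held for,
    assuming the player just repeats frames (no interpolation)."""
    ticks = lambda t: math.floor(t * refresh_hz)
    return [ticks((i + 1) / source_fps) - ticks(i / source_fps)
            for i in range(frames)]

print(cadence(24, 60))  # [2, 3, 2, 3, ...] -> uneven 3:2 cadence, visible judder
print(cadence(24, 24))  # [1, 1, 1, ...]    -> each frame shown exactly once
print(cadence(25, 50))  # [2, 2, 2, ...]    -> even repetition, smooth
```

If that reasoning is right, 60Hz would only be truly even for 30/60 fps material, and 24 or 25 fps files would want a 24Hz or 50Hz output (or we just live with the 3:2 cadence) - but I'd like confirmation from people who know these projectors.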
Is there anything particular we should do to have the projector display a 1920x1080 image directly (is that even possible?), or will it simply always output 1920x1200 with black borders?
I hope my questions are not too silly... thanks for taking the time to help!