Hi all,
Power consumption has become a critical part of the discussion when purchasing new projectors.
I have been involved in attempting to compare two different vendors' offerings.
My general conclusion is that each vendor has advantages and disadvantages in different models, but overall it appears to be a wash: no one vendor has a significant advantage over the other. Some models compare favourably, others worse.
I would like to get independent, non-conflicted input from knowledgeable readers of this forum.
It also brings up the question: how do we objectively compare apples and oranges?
Each vendor quotes power usage based on a different measurement approach.
For example, one vendor turns on a projector, sets it to 100% lamp power, and takes a reading.
Another takes a generalised approach: average power over the life of the projector, assuming DCI-spec brightness and primaries are maintained, with the power raised over the 10 years to achieve it.
There are also differences in comparing total rated hours, as vendors base them on different ambient temperatures. A rough way to normalise the two quoting approaches is sketched below.
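To put the two quoting styles on a common footing, one option is to convert both to total energy over the same operating window. Here is a minimal sketch of that idea; every number and the linear ageing model in it are hypothetical assumptions for illustration, not real vendor figures.

    # Minimal sketch: normalise two different vendor power quotes to
    # total lifetime energy. All figures are hypothetical assumptions.

    HOURS_PER_YEAR = 8 * 365        # assumed screening hours per year
    LIFETIME_YEARS = 10             # comparison window used above

    # Vendor A: single snapshot reading at 100% power (watts).
    vendor_a_snapshot_w = 2400

    # Vendor B: quotes life-average power, already accounting for
    # raising power over time to hold DCI-spec brightness.
    vendor_b_average_w = 2100

    def lifetime_kwh_from_snapshot(snapshot_w, start_frac=0.80, end_frac=1.00):
        """Estimate lifetime energy from a full-power snapshot, assuming
        the unit starts below full power and is ramped up linearly over
        its life to hold brightness (a hypothetical ageing model)."""
        avg_w = snapshot_w * (start_frac + end_frac) / 2
        return avg_w * HOURS_PER_YEAR * LIFETIME_YEARS / 1000

    def lifetime_kwh_from_average(average_w):
        """Lifetime energy when the vendor already quotes a life-average power."""
        return average_w * HOURS_PER_YEAR * LIFETIME_YEARS / 1000

    print(f"Vendor A: ~{lifetime_kwh_from_snapshot(vendor_a_snapshot_w):,.0f} kWh")
    print(f"Vendor B: ~{lifetime_kwh_from_average(vendor_b_average_w):,.0f} kWh")

The point is not the particular numbers but that both quotes only become comparable once you make the duty cycle, lifetime, and ageing assumptions explicit.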
Personally, I don't like subjective and unproven approaches, but I would like others to share their wisdom on how to make comparisons.
In general, laser is quite new, and the expectations vendors portray have not been proven by real field results. Personally, the vendor who has been at this the longest appears to portray a realistic expectation of laser life based on temperature and power settings, but with next-generation systems using different technologies, it is hard to know whether the same understandings apply.
In this difficult calculation, I tend to put weight behind a proven track record.
Keen to hear other opinions, and whether anyone out there has real-world experience to share.