Posted on 2016-10-18 01:47
Last edited by boday on 2016-10-18 01:51.
When replying earlier, I went back and dug through that very long issue discussion on GitHub, but it seems the parts that later drifted off topic have since been deleted, including the part where I asked my question. Then I remembered it should still be in my mailbox, so I might as well paste it here:
The point here is, again, that the concept of color management and the concept of a monitor calibrated to some standard contradict each other. When you use color management, the tone curve of the monitor does not matter (apart from subtleties that are, well, too subtle for this discussion ;–) ).
What does matter, though, is that the ICC display color profile that describes your monitor fits whatever calibration settings you used. In other words, you have to regard the calibration of the monitor and the ICC display color profile that you (hopefully) created with Argyll at the same time as forming a unity.

For mpv, it doesn't matter in the slightest what your monitor is calibrated to. Calibration merely makes color management as simple as possible (and allows you to use simple profiles like 3xCurves+Matrix rather than significantly more complicated LUT profiles), and it helps reproduction in other applications (that aren't color managed), like video games. Ultimately, for mpv, what you see on the screen is the exact same result no matter what your monitor is calibrated to.

...The gamma value of the display does not matter at all for color management, simply because the corresponding display color profile takes the gamma value into account, and as a consequence the CMM "neutralizes" it anyway (the objective of the CMM is to always produce identical colors, based on the data it gets from the color profiles). Originally, the whole gamma thing stems from cathode ray tubes, which are not linear (gamma = 1) in their brightness reproduction but had a gamma of roughly 2 to 2.5. At the time, this was nothing anyone intended; it was simply a physical fact. Unfortunately, this whole thing has proliferated until today (even though LCD displays do not have any "natural"/physical gamma curve).
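A tiny numeric sketch of the "neutralizes it" point above (this is illustrative only, not mpv or CMM code; the function names are made up): the CMM pre-compensates each pixel value through the inverse of the gamma recorded in the display profile, and the monitor then applies that same gamma, so the light actually produced is identical whatever the calibration target was.

```python
# Illustrative sketch: why the display's gamma cancels out under
# color management. Names and values are hypothetical.

def cmm_encode(linear_light, profiled_gamma):
    """CMM pre-compensates using the gamma stored in the display profile."""
    return linear_light ** (1.0 / profiled_gamma)

def display_output(pixel_value, display_gamma):
    """The monitor applies its (calibrated) gamma to the pixel value."""
    return pixel_value ** display_gamma

target = 0.5  # desired linear light intensity
for gamma in (1.8, 2.2, 2.4):
    # Profile matches calibration, so the round trip is the identity.
    shown = display_output(cmm_encode(target, gamma), gamma)
    print(f"gamma {gamma}: light = {shown:.6f}")  # 0.500000 every time
```

The only requirement, as the quote says, is that the profile and the calibration agree; the specific gamma value drops out of the result.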
This is a source of endless confusion for people who want to profile their displays. The main culprits here are the vendors of profiling software, who offer a gamma setting in their software but then don't explain what to do with it. (The manual usually says something like "Choose the value you prefer.") Since color management is supposed to produce "objectively correct" colors, it's completely unclear to users why and how they should "subjectively" choose any specific gamma value.
The truth is that it's completely meaningless for color management. As @haasn has already pointed out, it does matter if you still use any non-color-managed applications, because these have to assume some specific display gamma value to calculate the colors they write to the display. Historically, this value was 2.2 on Windows and 1.8 on the Mac (no deeper truth to it; it just was that way). Nowadays this mostly does not matter (because color management is becoming more and more pervasive), but it did matter earlier. For instance, before OS X 10.6 (Snow Leopard), video wasn't color managed even in QuickTime (computer performance was not up to it at the time). As a result, for video to look correct, you had to calibrate your monitor to gamma 1.8, because that was what QuickTime assumed. For video players on Windows, it was gamma 2.2. This is why cross-platform players that did not compensate for this produced wrong colors on at least one of the two platforms.
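The 1.8 vs. 2.2 mismatch described above can be made concrete with a few lines of arithmetic (illustrative numbers only, not any player's actual pipeline): a non-color-managed player encodes for the gamma it assumes, the monitor decodes with the gamma it is actually calibrated to, and the mid-tones land in the wrong place.

```python
# Illustrative: a non-color-managed player assumes one display gamma,
# but the monitor is calibrated to another, so mid-tones shift.

def encode(linear, assumed_gamma):
    """Player encodes a pixel assuming the display has this gamma."""
    return linear ** (1.0 / assumed_gamma)

def decode(pixel, actual_gamma):
    """Monitor applies the gamma it is actually calibrated to."""
    return pixel ** actual_gamma

linear = 0.5                  # intended mid-tone brightness
pixel = encode(linear, 1.8)   # encoded for a gamma-1.8 (old Mac) display
shown = decode(pixel, 2.2)    # but shown on a gamma-2.2 (Windows) monitor
print(f"intended {linear}, shown {shown:.3f}")  # darker than intended
```

Running the mismatch the other way (encode for 2.2, show on 1.8) brightens the mid-tones instead, which is the washed-out look the opposite platform saw.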