10bit HiDPI in 10.13.1 is orange????
I just upgraded my Mac Pro 5,1 to 10.13.1. I have a 980 Ti card in it, with the latest web driver installed along with the latest SwitchResX. All resolutions work fine as long as I am in 8 bit color, but as soon as I switch to 10 bit color with a HiDPI resolution, blue and orange are swapped: everything in the GUI that was a shade of blue is a shade of orange and vice versa. Anyone else seeing this? It is so strange!

#1
Posted 11/12/2017 06:37 PM   
I don't notice any difference. When I use SwitchResX to switch between Millions and Billions, the screen is disabled (turns black) for a couple seconds, then is reenabled (comes back).

How can you tell if it's outputting 10 bit color? I tried doing gradients but couldn't see a difference.

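One place to check is what System Information reports for the display. I'm not sure the field name is identical on every macOS version, but it should say something like "30-Bit Color (ARGB2101010)" vs "24-Bit Color (ARGB8888)":

[code]
# reported depth per attached display; the exact field name varies by macOS version,
# e.g. "Framebuffer Depth: 30-Bit Color (ARGB2101010)" or "Pixel Depth: 32-Bit Color (ARGB8888)"
system_profiler SPDisplaysDataType | grep -i -E 'depth|resolution'
[/code]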
The following is a bash script to create 12 bit, 10 bit, 8 bit, and 6 bit gradients (grey, R, G, B, Y, C, M, black) in a TIFF file. When I open this in Preview.app, I see no difference between the 12, 10, and 8 bit gradients. The TIFF file uses 16 bits per component RGB.

[code]
# create tiff header: big endian, no compression, 4096 x 2048, 3 x 16 bits per sample,
# pixel data starting at offset 0xcc
(
echo "4d4d 002a 0000 001e 0000 02d0 0000 000a 0000 02d0 0000 000a 0010 0010 0010 000e 00fe 0004 0000 0001 0000 0000 0100 0004 0000 0001 0000 1000 0101 0004 0000 0001 0000 0800 0102 0003 0000 0003 0000 0018 0103 0003 0000 0001 0001 0000 0106 0003 0000 0001 0002 0000 0111 0004 0000 0001 0000 00cc 0115 0003 0000 0001 0003 0000 0116 0004 0000 0001 0000 0800 0117 0004 0000 0001 0300 0000 011a 0005 0000 0001 0000 0008 011b 0005 0000 0001 0000 0010 011c 0003 0000 0001 0001 0000 0128 0003 0000 0001 0002 0000 0000 0000"

numlines=$(( 2048 / 8 / 4 ))  # 2048 lines / 8 colors / 4 bit depths = 64 lines per band
for color in 111 100 010 001 110 011 101 000 ; do
  red=${color:0:1}
  green=${color:1:1}
  blue=${color:2:1}
  # each entry is depth:finalshift:multiplier; the multiplier replicates the depth-bit
  # value across 16 bits, e.g. for 12 bit: (v * 0x1001) >> 8 maps 0..4095 to 0..65535
  for depth in "12:8:2#1000000000001" "10:4:2#10000000001" " 8:0:2#100000001" " 6:2:2#1000001000001" ; do
    shiftfinal=${depth:3:1}
    multiplier=$((${depth:5:32}))
    depth=${depth:0:2}
    shiftright=$((12 - depth))
    for (( y=0; y < numlines; y++ )); do
      for (( i=0; i < 4096; i++ )); do
        value=$(( ((i >> shiftright) * multiplier) >> shiftfinal ))
        printf "%04x %04x %04x " $((value * red)) $((value * green)) $((value * blue))
      done
      echo
    done
  done
done
) | xxd -p -r > 4096x2048gradients.tif
[/code]
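To generate and view the file (the script name here is just an example; the bash loops take a while, and the resulting TIFF is about 50 MB):

[code]
# save the script above as e.g. make_gradients.sh (any name works), then:
bash make_gradients.sh
open -a Preview 4096x2048gradients.tif
[/code]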

#2
Posted 11/24/2017 05:13 PM   
Actually, Preview.app does show a difference between the 8 bit and 10 bit gradients, but the difference shows up in both Millions and Billions of colors, so it must be dithering the 10 bit and 12 bit gradients while in Millions. I'm not sure whether Billions is actually working, though. Most color pickers are 8 bits or less, so they won't show the real pixel values (which is what you'd need to test for dithering). I can't tell the difference between the 10 bit and 12 bit gradients.

Some other apps don't show a difference between 8 bit and 10 bit gradients (they don't do dithering and are probably drawing into a 24 bit buffer).
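A crude way to check for dithering (just a sketch: the capture rectangle below is made up, and this only catches dithering done by the app before compositing, not anything the video card adds on output) is to screenshot a small patch that should be one flat step of the 10 bit gradient and look at the raw bytes of an uncompressed copy:

[code]
# grab a small patch that should be a single flat step of the 10 bit gradient
# (the x,y,w,h rectangle is just an example; point it at the gradient window)
screencapture -R600,400,16,4 patch.png
# convert to uncompressed BMP so the pixel bytes can be read directly
sips -s format bmp patch.png --out patch.bmp
# pixel rows sit at the end of the BMP; alternating byte values in a
# supposedly flat patch mean the image is being dithered down to 8 bit
tail -c 256 patch.bmp | xxd
[/code]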

#3
Posted 11/24/2017 05:27 PM   