[self-interest] true color (was: 4.1 stuff)
Jecel Assumpcao Jr
jecel at merlintec.com
Fri Nov 5 17:44:30 UTC 1999
> Do you have a fix for the X true color bug?
> Sorry about not crediting you with the patch.
Actually, I found out that your code is rather different after
all. But the end result is similar.
I only did enough to be able to see a debugger pop up in a
new Self world, but then got in too deep and never finished.
Since I was testing a 16 bit per pixel mode, this ugly hack
gets the same results in Self 4.1:
traits paint _AddSlots: (|
  asSmallInteger = (((macRed >> 11) << 11) +
                    ((macGreen >> 10) << 5) +
                    (macBlue >> 11))
|)
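The hack packs Mac-style 16-bit color components into an RGB565 pixel:
red keeps its top 5 bits, green its top 6, blue its top 5. A minimal
sketch of the same bit arithmetic in Python (the names and the 0..0xFFFF
component range are assumptions carried over from the snippet above):

```python
def rgb565(mac_red, mac_green, mac_blue):
    """Pack three 16-bit color components into one RGB565 pixel.

    Red lands in bits 11-15, green in bits 5-10, blue in bits 0-4.
    """
    return (((mac_red >> 11) << 11) |
            ((mac_green >> 10) << 5) |
            (mac_blue >> 11))

# Pure red fills only the 5-bit red field:
print(hex(rgb565(0xFFFF, 0, 0)))   # 0xf800
print(hex(rgb565(0, 0xFFFF, 0)))   # 0x7e0
print(hex(rgb565(0, 0, 0xFFFF)))   # 0x1f
```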
Now I can see the debugger pop up but, as I said, the code
that tries to generate a pixmap mask then gets into deep trouble:
drawMaskFromImage: in traits xPixmapCanvas calls
xImagePutData: t0 MappedBy: t1 in traits xlib xImage calls
XImagePutData: t0 MappedBy: t1 IfFail: fb in traits xlib xImage calls
asVMByteVector (since we got a 'badTypeError' above), and this
fails since it tries to stuff a SmallInteger into a byteVector.
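The failure is easy to reproduce outside Self: a 16-bit pixel value
simply does not fit in one byte, so 16 bpp image data has to be split
into two bytes per pixel before it can live in a byte vector. A hedged
Python sketch (the little-endian byte order is an assumption; a real X
server dictates it via its image byte order):

```python
import struct

def pixels_to_bytes_16bpp(pixels, little_endian=True):
    """Split 16-bit pixel values into a byte string, two bytes each.

    Trying bytes([0xF800]) instead raises ValueError, which is the
    analogue of the asVMByteVector failure described above.
    """
    fmt = ('<' if little_endian else '>') + 'H' * len(pixels)
    return struct.pack(fmt, *pixels)

data = pixels_to_bytes_16bpp([0xF800, 0x07E0, 0x001F])
print(data.hex())   # 00f8e0071f00
```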
A related piece of code, the drawFromImage: in traits xPixmapCanvas,
has this telltale segment:
pixmapImage: disp xCreateImage: disp screen defaultVisual
Depth: 8 Format: pixMap zPixmap
Width: w Height: h BitmapPad: 16.
Is the depth always supposed to be 8? That seems strange, though I
am pretty sure that this actually works. But there certainly are
a lot of places which assume that color values are always 255 or
less. Probably none of this code is used on the Mac, however (true
color works fine there, right?), so maybe it isn't worth fixing.
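Those 255-or-less assumptions come from treating an 8-bit visual as a
given; the largest representable pixel value actually depends on the
visual's depth. A trivial illustration (a hypothetical helper, not
taken from the Self sources):

```python
def max_pixel_value(depth):
    """Largest pixel value a visual of the given depth can hold."""
    return (1 << depth) - 1

print(max_pixel_value(8))    # 255 -- the limit baked into the code
print(max_pixel_value(16))   # 65535 -- what a 16 bpp visual needs
print(max_pixel_value(24))   # 16777215
```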