Mail Archives: djgpp/1999/07/14/19:15:25

Message-ID: <378C896D.BA341D0@ameritech.net>
From: S Prasad <aangels AT ameritech DOT net>
X-Mailer: Mozilla 4.04 [en]C-AIT (Win95; I)
MIME-Version: 1.0
Newsgroups: comp.os.msdos.djgpp
Subject: Allegro shading, once again I'm lost
Lines: 25
Date: Wed, 14 Jul 1999 08:58:21 -0400
NNTP-Posting-Host: 199.179.188.234
X-Trace: nntp0.detroit.mi.ameritech.net 931957671 199.179.188.234 (Wed, 14 Jul 1999 09:07:51 EDT)
NNTP-Posting-Date: Wed, 14 Jul 1999 09:07:51 EDT
Organization: Ameritech.Net www.ameritech.net Complaints: abuse AT ameritech DOT net
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp
Reply-To: djgpp AT delorie DOT com

First of all, sorry to be posting the (almost) same question, but I'm
still lost...

Let's say I have a 256-color BMP in 8-bit color mode.  So something like
this:

BMP->line[y][x]

will return a value between 0 and 255.  So, if I needed to, I could use
this value as the subscript for an array, like this:


Color = ShadePalette[ BMP->line[y][x] ][ 0 ]
                      ^^^^^^^^^^^^^^^    ^
                        Color Index   Shading Index

Now for the question... In 16-bit color mode, BMP->line[y][x]
returns a 16-bit number, which ranges from 0 to 65535.  Now, it's pretty
hard to have an array like this: ShadePalette[65535][255].  It's
a little too big!  So using the pixel value as an array subscript is out
of the question.  So, how do I use a LUT for shading in 16-bit color
mode?  Or is there another way?


Thanx in advance!!


