Mail Archives: djgpp/1998/03/27/20:58:35

Message-ID: <351AB7BF.752FC001@primary.net>
Date: Thu, 26 Mar 1998 14:17:04 -0600
From: * benz <benz AT primary DOT net>
MIME-Version: 1.0
Newsgroups: comp.os.msdos.djgpp
Subject: Allegro and 16-bit color
NNTP-Posting-Host: pn7-ppp-75.primary.net
Organization: Primary Network. http://www.primary.net
Lines: 45
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

I have been trying for some time now to get Allegro to work in 16-bit
color mode, but for some reason it never works.  What I am trying to do
is load some 256-color pictures (with different palettes) and then
display them on a 16-bit screen.  I get no errors or warnings when I
compile, but when I run the exe I either get a solid color filling the
screen or a SIGSEGV.  When I symify the crash traceback, it points into
the blit() function (_blit+1025, to be exact).  My exact source is
below, and I would really appreciate some help on what I might be doing
wrong -- I just can't figure it out :(


/*                    Source                    */
#include <allegro.h>

int main(void)
{
   BITMAP *bmp1;
   BITMAP *bmp2;
   PALETTE pal1;
   PALETTE pal2;

   allegro_init();
   set_gfx_mode(GFX_AUTODETECT, 640, 480, 0, 0);
   set_color_depth(16);

   set_color_conversion(COLORCONV_NONE);
   bmp1 = load_bitmap("heroPc1.bmp", pal1);
   bmp2 = load_bitmap("test1.bmp", pal2);

   set_palette(pal1);
   blit(bmp1, screen, 0, 0, 0, 0, 99, 99);
   set_palette(pal2);
   blit(bmp2, screen, 0, 0, 0, 125, 399, 187);

   readkey();
   destroy_bitmap(bmp1);
   destroy_bitmap(bmp2);
   set_gfx_mode(GFX_TEXT, 0, 0, 0, 0);

   return 0;
}


Thanks
Andy Benz
benz AT primary DOT net



  Copyright © 2019   by DJ Delorie     Updated Jul 2019