Mail Archives: djgpp/1997/08/29/08:48:14

From: Mike Darrett <ez073236 AT mailbox DOT ucdavis DOT edu>
Newsgroups: comp.os.msdos.djgpp
Subject: software interrupts
Date: Fri, 29 Aug 1997 01:11:25 -0700
Organization: University of California, Davis
Lines: 76
Message-ID: <Pine.GSO.3.95.970829010635.4265B-100000@dilbert.ucdavis.edu>
NNTP-Posting-Host: dilbert.ucdavis.edu
Mime-Version: 1.0
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

Hi Guys,

	I'm sure there is a very simple solution to this problem, but I've
been working on it for the last three days and it has me stumped. I had
a PCX decoding routine I was porting from Turbo C++ to DJGPP, and I found
that the source of the bug was in the palette-updating routine. In a
nutshell, it goes like the listing below, but the colors it sets are the
WRONG ONES. I know it has something to do with the way I'm issuing a
software interrupt from 32-bit code, but I've tried everything, from
__dpmi_int() to int86(), for the int 10h call (this function takes a
pointer to the palette data in ES:DX and updates the video palette). The
FAQ DJ wrote says to just use int86() and forget about ES, ...

	Any help would be appreciated.

			- Mike


#include <go32.h>
#include <dpmi.h>
#include <conio.h>
#include <dos.h>

typedef unsigned char byte;

typedef struct{
  byte r, g, b;
} point;

point palette[256];


/* Load all 256 DAC registers from 'palette' via BIOS int 10h,
   AX = 1012h, which expects ES:DX -> a table of R,G,B triplets. */
void SetPalette(void)
{
  union REGS r;

  r.x.dx = r.x.di = (unsigned long)palette;  /* 32-bit pointer, but the BIOS wants ES:DX */
  r.x.ax = 0x1012;
  r.x.bx = 0;
  r.x.cx = 256;
  int86( 0x10, &r, &r );
}


/* BIOS int 10h, AH = 00h: set video mode AL = m. */
void SetMode( byte m )
{
  __dpmi_regs r;
        
  r.h.al = m;
  r.h.ah = 0;
  r.h.bl = 0;
  __dpmi_int (0x10, &r);
}


int main(void)
{
  int i;
        
  SetMode( 0x13 );              /* mode 13h: 320x200, 256 colors */

  for( i = 0; i < 256; i++ ){   /* fill every entry with blue (DAC values are 6-bit, 0..63) */
    palette[i].r = 0;
    palette[i].g = 0;
    palette[i].b = 60;
  }
  SetPalette();

  getch();
  SetMode( 3 );                 /* back to 80x25 text mode */

  return 0;
}
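
For comparison, the usual DJGPP pattern for handing a buffer to a real-mode
BIOS call is to copy it into the transfer buffer (__tb, a block of
conventional memory below 1 MB declared in go32.h) and pass that buffer's
real-mode segment and offset in ES and DX through __dpmi_int(). Here is a
minimal sketch of SetPalette rewritten along those lines; the name
SetPaletteTB is hypothetical, and it assumes the 256-entry, 6-bit-per-channel
'palette' array from the listing above.

#include <go32.h>            /* __tb: DJGPP transfer buffer            */
#include <dpmi.h>            /* __dpmi_regs, __dpmi_int                */
#include <sys/movedata.h>    /* dosmemput                              */

/* Hypothetical replacement for SetPalette: BIOS 10h/AX=1012h reads the
   palette through a real-mode ES:DX pointer, so the data must first be
   copied below 1 MB where the BIOS can reach it.                      */
void SetPaletteTB(void)
{
  __dpmi_regs r;

  /* Copy 256 packed R,G,B triplets (768 bytes) into the transfer buffer. */
  dosmemput(palette, 3 * 256, __tb);

  r.x.ax = 0x1012;           /* VGA: set block of DAC registers        */
  r.x.bx = 0;                /* first DAC register to set              */
  r.x.cx = 256;              /* number of registers                    */
  r.x.es = __tb >> 4;        /* real-mode segment of transfer buffer   */
  r.x.dx = __tb & 0x0f;      /* real-mode offset within that segment   */
  __dpmi_int(0x10, &r);
}

Calling SetPaletteTB() in place of SetPalette() after the mode switch should
drive the same int 10h service; the only difference is that the palette
bytes live below 1 MB at the moment the BIOS reads them.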


