Mail Archives: djgpp/1999/04/13/06:50:39

From: "Rafael García" <rafael AT geninfor DOT com>
Newsgroups: comp.os.msdos.djgpp
Subject: BOOL as char/int
Date: Tue, 13 Apr 1999 12:03:07 +0200
Organization: CTV/JET
Lines: 23
Message-ID: <7ev4na$49a$1@lola.ctv.es>
NNTP-Posting-Host: info596.jet.es
X-Trace: lola.ctv.es 923997738 4394 194.224.182.86 (13 Apr 1999 10:02:18 GMT)
X-Complaints-To: usenet AT lola DOT ctv DOT es
NNTP-Posting-Date: 13 Apr 1999 10:02:18 GMT
X-Priority: 3
X-MSMail-Priority: Normal
X-Newsreader: Microsoft Outlook Express 5.00.2014.211
X-MimeOLE: Produced By Microsoft MimeOLE V5.00.2014.211
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp
Reply-To: djgpp AT delorie DOT com

Look at this:

#include <stdio.h>
#include <ctype.h>   /* for isupper() */

typedef char /*int*/ BOOL;
#define TRUE 1
#define FALSE 0

int main(void) {
   BOOL flag=(BOOL)isupper('E');
   puts(flag?"*TRUE*":"*FALSE*");
   return 0;
}

It fails when BOOL is char, but works when it is int.
Can someone explain this reasonably?
It works fine with Borland.
I have been using this typedef for years and it seems standard, robust,
good, pretty, simple, close to the machine, fast, compact...
It seems the gods of chaos are conquering the world of computing.



  Copyright © 2019   by DJ Delorie     Updated Jul 2019