Mail Archives: djgpp-workers/1998/03/23/09:37:05

Date: Mon, 23 Mar 1998 17:34:13 +0300 (IDT)
From: Eli Zaretskii <eliz AT is DOT elta DOT co DOT il>
To: Vik Heyndrickx <Vik DOT Heyndrickx AT rug DOT ac DOT be>
cc: djgpp-workers AT delorie DOT com
Subject: Re: ^Z in text-mode output to the screen
In-Reply-To: <35165AB3.5061@rug.ac.be>
Message-ID: <Pine.SUN.3.91.980323171843.4963A-100000@is>
MIME-Version: 1.0

On Mon, 23 Mar 1998, Vik Heyndrickx wrote:

> Isn't the REAL problem here that the ^Z should never get returned by a
> read operation from a character device/file.

This can be done (and is done by DJGPP's libc) only if the file is read 
in text mode.  Binary reads cannot ignore and/or filter data, or they 
will betray their users.

The case in point is precisely one of those where the file is read in 
binary mode, but written in text mode, because writing to the console in 
binary mode has nasty side effects (which I described).  In this case, I 
don't see how the ^Z can be filtered during input.

> IMO the way text-data is
> stored should be entirely transparent to the user program (AFAIK POSIX
> requires this), this means that the read functions should do CR/LF to NL
> and ^Z to EOF translations. AFAIK this is enough to ensure that ^Z never
> gets passed to the write functions.

This is all so, but only for text-mode reads.  Binary reads don't change 
the file's data at all.

> IMO, there are only two cases: text files/devices and binary
> files/devices. I don't see any use for making a distinction between
> cooked-mode devices and files (I almost wrote cooked devices :-) )

DOS doesn't have a notion of ``binary'' vs ``text'' devices.  As far as
DOS is concerned, a handle which is used for I/O to/from a device can be
either in raw mode or in cooked mode.  Binary vs text mode is an illusion
created by libc functions.  DOS always reads and writes in binary mode. 
DJGPP switches a device to raw mode when the application requests a binary
open of a device.  This is IMO appropriate, since cooked I/O will
interpret some of the characters (^C, ^S, TAB, etc.), which is contrary to
the definition of binary I/O and the expectations of its users.

> IMO, a ^Z (at any place in the output data) should turn a file in EOF
> mode, and let write and family ignore any further output to that file
> (until the EOF indicator gets reset).

This has several problems.  First, you need to search every buffer for ^Z 
characters, which is expensive in functions like `_write' which don't 
usually examine every byte (we could have a 64KB transfer buffer).  
Second, what do you return as the number of bytes written to the caller?

> The fact that 0 will be returned is in fact an error condition in this
> case since a ^Z should never have got read.

This can't be done, AFAIK; see above.

> > handle the case where ^Z is the first character in the buffer.  If ^Z
> > is somewhere in the middle, the caller will get a smaller return value
> > than the size of buffer it wanted to write, and will typically try to
> > write the rest of the buffer beginning with the next unwritten
> > character, which is ^Z.
> 
> Why would a user make the assumption that after a partial write, the
> remainder will get written successfully?

In practice, that is what many programs do, presumably due to their Unix 
legacy.  OTOH, if the application doesn't continue to write, no harm is 
done, I think.  You just have the effect you requested: nothing gets 
written after ^Z.
