Eli Zaretskii wrote:
> > I don't see that as a bad thing. I think it makes the language more
> > flexible.
> It also makes the code less predictable. The pointer dereference
> example is exactly the case in point. Imagine that your program uses
> a lot of third-party classes for which you don't have sources. If
> some of those classes overload the operator `*', in many cases you
> will have no idea what goes on behind the scenes when your code uses
> those classes, and deeply-nested, multiple-inheritance hierarchies of
> these classes together with insufficient documentation could prevent
> you from ever finding out. It is quite possible that you will not
> even be aware that `*' is overloaded, until it is too late.
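To make that concern concrete, here is a minimal sketch (the class name `Tracked' is made up for illustration) of how an overloaded `*' can do work that never shows up at the call site:

#include <cstdio>

class Tracked {
    int value;
public:
    Tracked(int v) : value(v) {}
    // Reads like a plain dereference where it is used, but has a
    // side effect the caller's source never hints at.
    int& operator*() {
        std::printf("dereference!\n");  /* the hidden work */
        return value;
    }
};

int main() {
    Tracked t(42);
    int n = *t;   /* looks exactly like a pointer dereference */
    std::printf("%d\n", n);
    return 0;
}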
But the point of this was to allow new datatypes that act as old
ones, so that code is reusable (in template form).
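As a sketch of that reuse argument (the names `read' and `IntHandle' are invented for the example), one template written against the pointer interface accepts both raw pointers and any class that overloads `*' the same way:

#include <cstdio>

// One template, written once against the pointer interface.
template <class Ptr>
int read(Ptr p) { return *p; }

// A new datatype that "acts as" an old one by overloading *.
class IntHandle {
    int value;
public:
    IntHandle(int v) : value(v) {}
    int operator*() const { return value; }
};

int main() {
    int n = 7;
    std::printf("%d %d\n", read(&n), read(IntHandle(7)));
    return 0;
}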
> Try to write a time-critical application that way, and you will
> understand what I mean.
But most programs these days are NOT time-critical. For the most part,
companies that hire programmers want software that is quick to develop and
works. Speed isn't that important (else VB wouldn't be so popular).
--
(\/) Endlisnis (\/)
s257m AT unb DOT ca
Endlisnis AT GeoCities DOT com
Endlis AT nbnet DOT nb DOT ca