Ben Brosgol wrote:
>
> Tucker Taft wrote:
> > Ben Brosgol wrote:
> > > 2) "Elaboration order" is more precisely defined in Java than in Ada.
> > > In Ada one often has to use elaboration pragmas to avoid
> > > access-before-elaboration exceptions or uninitialized variables. The
> > > Java analog to Ada's package elaboration is the initialization phase
> > > of dynamic class loading. When a class is initialized is defined by
> > > the language rules.
> >
> > This is a bit misleading. In fact, Java "solves" the elaboration order
> > problem by ignoring it. Access before elaboration is permitted in Java.
> > If you read the JVM definition carefully, what you will find is that
> > it does *not* require that class initialization be *completed* before
> > you reference a field or call a method of a class. It simply requires
> > that class initialization be *started*. So if two classes depend on
> > each other cyclically, and reference each other during their elaboration,
> > one is certain to not be fully initialized when referenced by the other.
>
> I am aware that this is how Java addresses the cyclic elaboration problem,
> but I believe that such examples are pathological rather than normal, and
> as I understand the rules the effect is still deterministic. Indeed that
> was the point of my list, to answer Mark Lundquist's question about areas
> where Ada left the effect implementation dependent but where Java
> specified the semantics.
> ...

It will again be interesting to see how this is addressed in natively
compiled versions of Java. In our experience, it is essentially impossible
to implement Java semantics in a natively compiled and/or statically linked
implementation, as might be expected for a real-time system, because it
requires one class initialization to be called in the middle of another
class initialization, if it is the first point of use.
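The cyclic case under discussion fits in a few lines. In this sketch (class
names A and B are made up for illustration), each class's static initializer
reads a static field of the other; since the JVM only requires that
initialization be *in progress*, not complete, when a class is referenced,
the second initializer observes the first class's field at its default value:

```java
class A {
    static int value = B.value + 1;  // touching A starts B's initialization
}

class B {
    static int value = A.value + 1;  // A is mid-initialization: reads default 0
}

public class CyclicInit {
    public static void main(String[] args) {
        // Referencing A first: A's initializer triggers B's, which in turn
        // references A while A's initialization is still in progress, so B
        // sees A.value == 0 (the field's default), giving B.value == 1 and
        // then A.value == 2.
        System.out.println("A.value = " + A.value);  // 2
        System.out.println("B.value = " + B.value);  // 1
    }
}
```

Note that this is consistent with both positions above: the result is odd but
deterministic for a given first point of use (touch B first and the values
swap), which is exactly the property a static topological-sort scheme has
trouble reproducing.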
So instead what happens is that a topological sort is performed on the
class initialization routines, and if there is a cyclic interdependence,
or if there is no interdependence, an arbitrary order is chosen.

I think what it points out is that as long as the language is defined with
the presumption that the semantics are defined by a virtual machine
interpreter, you can be very specific. As soon as you try to convert to
natively compiled, statically linked code, you end up being forced to
somehow loosen and/or modify the rules.

One final comment. For the purposes of formal analysis, determinism is not
always as important as a well-defined set of guarantees. If you look at
the language Dijkstra defined for his own use in "A Discipline of
Programming," he made the two fundamental control structures
non-deterministic (if and while). The key thing is that the proof rules
are relatively straightforward, so it is clear what is relevant in proving
that a program is correct, or in proving that a set of tasks will meet
their deadlines. This should probably be the focus more than arbitrary
determinism.

> Ben Brosgol
> [log in to unmask]

--
-Tucker Taft   [log in to unmask]   http://www.averstar.com/~stt/
Technical Director, Distributed IT Solutions  (www.averstar.com/tools)
AverStar (formerly Intermetrics, Inc.)   Burlington, MA  USA