Subject: Fw: Are we losing out because of grammars?
The following message from the xml-dev group seems very germane to this group's work.

Martin Bryan

----- Original Message -----
From: "Sean McGrath" <firstname.lastname@example.org>
Sent: 7 February 2001 13:33
Subject: Re: Are we losing out because of grammars?

> Single models (monolithic) fail in the real world, not in the theoretical
> world.
>
> In theory, people in a smoke-filled room can agree on a top-down
> model of data interchange. In practice, they cannot. In theory, developers
> can manage the state-space explosion inherent in processing
> monolithic content models, but in practice they cannot.
>
> I think the mantra that "content + presentation == document" is part
> of the problem.
>
> In reality there are two main sub-divisions of "content":
> "semantics + aggregation + presentation == document"
>
> Agreeing on semantic elements (invoice, voltage, footnote) is far more
> politically/technically feasible than agreeing on aggregation elements
> (ledger, TV set, Technical Manual).
>
> Moreover, I think a multi-dimensional XML modelling technique,
> in which the *expression* of the aggregation is itself an XML
> instance, is a powerful and general modelling approach worthy
> of consideration in many contexts.
>
> There are a number of analogies. All are useful to some degree
> but break down if you push them too far...
>
> Polymorphism - a containership model in which the type of the
> things contained is not relevant.
>
> Merchant shipping - what has been standardized? The design of
> the ship, or the design of the containers loaded onto the ships
> and subsequently transported by truck/rail?
>
> Bottom-up analysis - start by modelling the component units
> of a system and work upwards towards ways of aggregating
> them together.
>
> Tupperware (tm) - a containership model in which content and
> containers can be intermixed. The contents of the containers
> are never modelled in the outer container(s).
>
> regards,
>
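[Editor's note: the "aggregation expressed as its own XML instance" idea above can be sketched in a few lines of Python. This is a minimal illustration, not anything from the original thread; the element names (ledger, item, href) and the in-memory store are hypothetical. The aggregator only knows the container contract, never the content models of the things contained, which is the polymorphism/Tupperware point.]

```python
import xml.etree.ElementTree as ET

# Hypothetical semantic instances, each agreed in its own domain
# (invoice, footnote, ...). Their content models are independent.
INVOICE = '<invoice id="inv-42"><amount currency="EUR">120.00</amount></invoice>'
FOOTNOTE = '<footnote id="fn-7">See section 3.</footnote>'

# The aggregation is itself an XML instance: it references the
# contained items but never models their internals.
LEDGER = """
<ledger>
  <item href="inv-42"/>
  <item href="fn-7"/>
</ledger>
"""

def resolve(aggregation_xml, store):
    """Walk an aggregation instance and fetch each referenced item.

    Contents are treated polymorphically: the aggregator only needs
    the container contract (an 'item' carrying an 'href'), not the
    schemas of the things contained.
    """
    agg = ET.fromstring(aggregation_xml)
    return [store[item.get("href")] for item in agg.iter("item")]

# A stand-in for wherever the semantic instances actually live.
store = {
    "inv-42": ET.fromstring(INVOICE),
    "fn-7": ET.fromstring(FOOTNOTE),
}

docs = resolve(LEDGER, store)
print([d.tag for d in docs])  # tags of the aggregated semantic elements
```

Note how swapping the ledger for a different aggregation instance requires no change to the semantic element definitions, or vice versa; the two can be standardized and evolved separately.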