From: "Rainer Schoepf"
Sender: "LaTeX-L Mailing list"
To: "Rainer M. Schoepf"
Reply-To: "LaTeX-L Mailing list"
Subject: Validating LaTeX output
Date: Thu, 30 Jan 1992 16:29:52 +0100
In-Reply-To: lingnau%INFORMATIK.UNI-FRANKFURT.DE@vm.gmd.de's message of Thu, 30 Jan 92 16:00:39 <9201301504.AA26170@ufer.ZIB-Berlin.DE>

Anselm Lingnau <lingnau%INFORMATIK.UNI-FRANKFURT.DE@vm.gmd.de> wrote:

   > 3. The output of dvitype slightly differs between different
   >    implementations. This makes it almost unusable, except for
   >    specialists.

   Wouldn't it be possible to write a program that parses the dvitype
   output and checks whether the deviation from some `standard' is
   within acceptable limits?

I think it would be possible to write some sort of context-sensitive
`diff' program. But that solves only the last problem.

   Anyhow, it'd be nice to be able to check an implementation of LaTeX
   by running a program that said

           Well, some dimensions were off by as much as 0.0001%.
           Other than that, everything seems O.K.

Very nice. But how can you check that the .dvi file produced from a
.tex file and a set of .sty files is the correct one?
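The tolerance-aware `diff' discussed above could be sketched roughly as follows. This is purely an illustrative sketch (in Python, chosen only for brevity): it assumes a simplified line-oriented transcript in which dimensions appear as integers, which is an assumption about the dvitype output format, not a faithful parser of it.

```python
import re

# Toy sketch of a context-sensitive diff for dvitype-style transcripts:
# two lines "match" if their non-numeric structure is identical and every
# numeric field agrees within a relative tolerance.  The line format is
# an illustrative assumption, not real dvitype syntax.

NUMBER = re.compile(r'-?\d+')

def lines_match(ref_line, test_line, rel_tol=1e-6):
    """True if the lines agree except for numbers within rel_tol."""
    # The non-numeric skeleton must be identical.
    if NUMBER.sub('#', ref_line) != NUMBER.sub('#', test_line):
        return False
    refs = [int(m) for m in NUMBER.findall(ref_line)]
    tests = [int(m) for m in NUMBER.findall(test_line)]
    return all(
        r == t or abs(r - t) <= rel_tol * max(abs(r), abs(t))
        for r, t in zip(refs, tests)
    )

def compare(ref_text, test_text, rel_tol=1e-6):
    """Worst relative deviation seen, or None on a structural mismatch.

    Simplification: transcripts are compared line by line in lockstep.
    """
    worst = 0.0
    for ref_line, test_line in zip(ref_text.splitlines(),
                                   test_text.splitlines()):
        if not lines_match(ref_line, test_line, rel_tol):
            return None  # structurally different: a real failure
        refs = [int(m) for m in NUMBER.findall(ref_line)]
        tests = [int(m) for m in NUMBER.findall(test_line)]
        for r, t in zip(refs, tests):
            if r != t:
                worst = max(worst, abs(r - t) / max(abs(r), abs(t)))
    return worst
```

A checker built on this could then report exactly the kind of summary imagined above ("some dimensions were off by as much as 0.0001%"), while still flagging any structural difference as a hard failure.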
An example I experienced recently while doing the LaTeX update: after
some change I got the report that labels in the bibliography
environment were left-aligned, whereas they had been right-aligned
previously. No problem, I changed it. A while later it was reported
that this now had the effect that labels in the alpha bibstyle came
out right-aligned as well, whereas they should have been left-aligned.

Two points follow:

1. LaTeX is too complex to check every feature against every other,
   i.e. one has to select certain of them for testing.

2. Given such a set of tests, against what do you compare the output?
   I.e. what is the standard, the known-to-be-correct thing?

I don't know much about the subject of software testing; maybe someone
more knowledgeable can tell us more.

Rainer
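One common answer to point 2 is a regression-test harness: freeze the output of a version believed correct as a reference file, and treat any later difference as a question for a human, who decides whether it is a bug or an intended change (and, only in the latter case, updates the reference). A toy sketch, with file names and layout invented purely for illustration:

```python
import pathlib

# Sketch of a "known-to-be-correct thing": one stored reference
# transcript per test case, updated only by deliberate decision.
# The refs/ directory layout is an invented illustration.

def check(test_name, actual_output, ref_dir=pathlib.Path("refs")):
    """Compare a test's output against its stored reference transcript."""
    ref_file = ref_dir / (test_name + ".ref")
    if not ref_file.exists():
        # First run: record the output as the provisional standard.
        ref_dir.mkdir(exist_ok=True)
        ref_file.write_text(actual_output)
        return "recorded"
    if ref_file.read_text() == actual_output:
        return "ok"
    return "changed"  # a human decides: bug, or intended change?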