From: Will Robertson
To: LATEX-L@LISTSERV.UNI-HEIDELBERG.DE
Subject: Re: Back to "token list" nomenclature; was Re: \tlist_if_eq:nn
Date: Wed, 31 Dec 2008 11:44:52 +1030
Message-ID: <8A5C4EC7-9242-4A0E-9644-7BCE1BA029CC@gmail.com>
In-Reply-To: <18778.35736.43421.950797@morse.mittelbach-online.de>
Reply-To: Mailing list for the LaTeX3 project

On 31/12/2008, at 7:29 AM, Frank Mittelbach wrote:

> So I personally would go KISS and offer functions that limit the
> accepted input to balanced token lists without #.

Another random thought: since users "shouldn't" be defining new macros
within the body of their documents, I think LaTeX3 should have something
like the equivalent of

    \AtBeginDocument{\catcode`\#=12}

Of course, there could be markup to allow them to write more definitions
mid-document if they really want/need.
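[A rough sketch of how such markup might look in plain 2e terms; the
`macrodefs` environment name is made up for illustration, not an
existing interface:]

```latex
% Make # an ordinary "other" character once the document body starts,
% so stray parameter tokens typed by the user are harmless:
\AtBeginDocument{\catcode`\#=12}

% Hypothetical opt-back-in markup: environments form TeX groups, so
% the catcode change is local and reverts at \end{macrodefs}.
\newenvironment{macrodefs}
  {\catcode`\#=6 } % restore # as the macro-parameter character
  {}
```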
* * *

But that's rather separate from what we've been talking about :)

It seems to me that the two most plausible options (if we do anything)
are to change tlp -> tlist or tlist -> toks, depending on how robust we
can make the inline functions. Can we be guaranteed that all (currently)
\tlist_ functions can deal gracefully with # tokens?

Alternative: use tlp -> tlist regardless and say that tlist functions
that take inline arguments are generally more robust with # tokens than
saving data to a tlist pointer. (Since I kind of like the =tlist= name.
Wishy-washy, I know.)

* * *

Also, is there a way that the naming of the \token_ module can be
incorporated into our naming scheme above? Or is that stretching things
too far? (My current feeling is that it is.)

Will
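[For readers outside the thread, the asymmetry behind the tlp-vs-toks
question above can be seen in plain TeX terms; register and macro names
below are illustrative only:]

```latex
% A toks register stores parameter tokens without any special care:
% in a token-list assignment, # is just another token.
\newtoks\mytoks
\mytoks={a # token}      % fine as written

% A macro used as storage (a "tlp") is a \def, so # in the body must
% be doubled, and it mangles again under x-type (\edef) expansion:
\def\mytlp{a ## token}   % \def\mytlp{a # token} would be an error
```

This is why functions that merely pass # tokens along inline can be
made more robust than ones that round-trip data through a macro.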