Look, I understand you like newLISP a lot. I don't think that's a good reason for downplaying this project.
You have a language you like? Good, stick to it. You don't need to paint other people's bikeshed, do you?
I don't really know newLISP very well, but I know it uses hygienic macros and my brain is just not wired for them. So forgive me if I prefer Arc.
You seem to know about the "goals set for Arc". What would they be? To "put fun back into programming"? Every new language and its dog claims to do that nowadays. I'm having a lot of fun using LISP already, thankyouverymuch. (I mean any LISP.)
And I'm interested in everything that has to do with LISP. If you want to bring in technical merits of newLISP or discuss technical points about Arc, you are welcome, I'm all ears. Otherwise, you are better off doing something different with your time than starting religious flame wars. Personally, I really have had enough of that.
You really think that Arc can't do better than newLISP? Good. There's nothing to talk about. Time will tell, and you'll even get to laugh as Arc implodes. You'll have lots of fun. Let us have ours. Thanks.
Common Lisp, Scheme and Arc are similar to each other in many ways, so you won't have many problems if you change your mind. One of the exceptions is the macro system: CL and Arc are similar ("raw" macros) while Scheme takes the hygienic approach. Understanding a macro system and learning its idiosyncrasies takes a while.
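To make the contrast concrete: a raw macro can deliberately capture a name in the caller's scope, which hygienic systems prevent by default. A sketch of an anaphoric if in Arc (Arc ships its own aif; this is just an illustration of intentional capture):

```arc
(mac my-aif (expr then (o else))
  `(let it ,expr        ; "it" is captured on purpose
     (if it ,then ,else)))

(my-aif (find odd '(2 4 5 6))
        (* it 10))      ; "it" refers to the found element, 5
```

A hygienic system would rename "it" behind your back, so the caller could never see it; that is exactly the behavior the anaphoric style depends on.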
I think "On Lisp" makes some very valid points in favor of Arc's approach, but probably you'll find one system more intuitive than the other and it will boil down to a matter of personal preference.
For these reasons I think the book is more important than the language. Find a book that resonates with you and learn. You'll be in a better position to decide then.
You probably know that, but let me point out that the order of the arguments is wrong. That said, the expression shouldn't hang.
The rem function hangs because it tries to coerce the function to a cons (arc.arc:483) but coerce returns what it was given when it doesn't know what to do (ac.scm:787). So rem calls itself indefinitely. Maybe coerce should throw an error in that case? I'll try that out and let you know.
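For the record, a sketch of the two argument orders (rem's test comes first, per arc.arc):

```arc
(rem odd '(1 2 3 4))  ; intended usage: drops the odd numbers
(rem '(1 2 3 4) odd)  ; swapped: rem tries (coerce odd 'cons), gets the
                      ; function back unchanged, and recurses on the
                      ; same arguments forever
```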
Another thing that has been very useful to me is a function that creates a table with a given list of keys all initialized to a given initial value. For instance:
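The snippet seems to have been lost here; a minimal sketch of what table-w/keys (the name used below) might look like, assuming it takes a list of keys and an initial value:

```arc
(def table-w/keys (keys init)
  (let h (table)
    (each k keys
      (= (h k) init))    ; every key starts at the same initial value
    h))

(= h (table-w/keys '(cars trucks) 0))  ; h maps cars and trucks to 0
```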
table-w/keys is useful for counting. After creating the table in icemaze's example you can, for instance, do
(++ (h 'cars))
Even more useful, however, is to specify a default value for keys not yet in the table, so you can do the above without first initializing the keys you intend to use. PG mentioned in Arc at 3 Weeks that lookup failure should return the global value *fail*, in which case you would be able to do
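The snippet appears to be missing here; given the result described just below, it was presumably a counting loop along these lines (table-w/default is a hypothetical constructor whose tables return the given default for absent keys):

```arc
(let h (table-w/default 0)
  (each color '(red blue blue green)
    (++ (h color)))     ; no initialization needed: missing keys read as 0
  h)
```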
which returns a table where red is 1, blue is 2, green is 1, and no other keys exist. The example is kind of stupid, but the concept is quite powerful.
Good point about fill-table. I didn't think of how easy it is to give this function a newly created table. It still kind of feels like a shortcoming that (table) doesn't accept values like (list) does. And I don't want to define a different function to behave like this. If it's a good idea, it should be in the main table constructor. PG seems to have thought of this already because there's a comment in ac.arc with a definition that involves fill-table. I do, however, see that this would slightly bloat the constructor.
(mac >> body
  `(let it ,(car body)
     ,(if (cdr body)
          `(>> ,@(cdr body))
          'it)))
It works a little like CL's let*, but it's anaphoric and much easier to read. It's a pipeline: each expression is evaluated and "it" is bound to the result for the next expression to use. Example:
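A toy run, with the expansion written out (assuming the definition above):

```arc
(>> 5
    (+ it 1)
    (* it 2))
; expands to: (let it 5 (let it (+ it 1) (let it (* it 2) it)))
; evaluates to 12
```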
It's very convenient sometimes. Plus, since most of Arc's functions take their main argument last (thanks devteam!), it could be modified to append "it" to every expression in the body. Whether that's worth it depends on how it gets used in the real world.
(def ablast (l)
  (if (no (cdr l))
      nil
      (cons (car l) (ablast (cdr l)))))
(def replc (x y l)
  (if (atom l)        (if (is x l) y l)
      (no l)          nil
      (is x (car l))  (cons y (replc x y (cdr l)))
      (acons (car l)) (cons (replc x y (car l)) (replc x y (cdr l)))
                      (cons (car l) (replc x y (cdr l)))))
(mac imp body
  (if (no body)
      nil
      (replc 'it `(imp ,@(ablast body)) (last body))))
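To see what imp does (assuming the definitions above): it takes the last expression and substitutes the pipeline built from the preceding expressions for every "it" inside it, expanding outside-in:

```arc
(imp 5 (+ it 1) (* it 2))
; => (* (imp 5 (+ it 1)) 2)
; => (* (+ (imp 5) 1) 2)
; => (* (+ 5 1) 2), i.e. 12
```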
Not really. Arc has macros and yet all the syntactic sugar ([], :, ~) is implemented in Scheme. Access to the read-table would be a start.
What I would really like to see though is a requirement for the code walker to be exposed and extendable.
Ah, ok! Now I get your point. Yes, it would be really useful to be able to extend syntax like that. However I believe readtable access should be limited. Let me explain.
I don't mean "limited" as in "dumbed down" but as in "predictable". It's nice to extend syntax, but it's even nicer when your editor can indent and colorize your program without a hitch. You are right: macros don't alter syntax significantly. And that's what makes them so nice: parens are still balanced, editors don't barf and, if you use with-gensyms and once-only, referential transparency is broken only where necessary.
A good place to start, IMHO, would be the ability to define operators: infix (like ':'), prefix (like '~'), postfix, and "wrapping" (like '[...]'). Operators would satisfy all of Arc's syntax-extension needs (unless I'm missing something). For example:
(operator wrapping #\[ #\] body
  `(fn (_) ,@body))

(operator prefix #\~ symbol
  `(fn args (not (apply ,symbol args))))
...or something like that.
Simonb: do you think this solution would fill the gap? Can you point out some situation where you would like even more control over the read-table?
The problem with giving only limited support for something is that people will invariably want more and start to abuse the system. For instance, with operators you risk ending up with a slew of little languages full of glyphs rich with semantic meaning (think Perl).
Macros allow you to build "up", to add new [abstraction] layers to your language, but sometimes it would be nice to be able to go "down" as well and add new semantics that interact non-trivially with the host language. Re-entrant continuations or proper tail recursion for CL come to mind. Another example of such friction is Qi [http://www.lambdassociates.org/aboutqi.htm], where all sorts of hacks (case sensitivity etc.) are needed instead of it simply being a library.
Theoretically one could design a language powerful enough that this would never be necessary, but even then you would want code walkers for optimisations. CL with its compiler macros is a step in the right direction, but it soon runs aground when you want to do things such as CPS optimisations. Say you want to merge nested map operations in CL:
(map 'vector #'^2 (map 'vector #'+ a b))
=>
(map 'vector (lambda (:g1 :g2) (^2 (+ :g1 :g2))) a b)
This is something compiler macros are great for, but as soon as the inner map is wrapped in a function we have a problem. What our compiler macro sees is something like:
(map 'vector #'^2 (vector+ a b))
and at that point we have no idea what vector+ looks like.
In terms of a compromise, access to the environment at compile time would go a long way (CL has provisions for this, but the standard doesn't say what the environment should look like), and maybe a facility to get function definitions (forced inlining, if you will).
I don't think we want that at this point. Every change in the language would require porting the language itself to its new version and a lot of effort could be potentially wasted this way. Later, maybe.