Ah ah ah.... no! The cute thing is that a 'bool presents the "scanner abstraction". Basically, a scanner is anything that overloads the functions 'car, 'cdr, 'scanner, and 'unscan. Thus, you don't have to check the type of an object: you just pass it through 'scanner. If 'scanner throws, it's not a "list". If 'scanner returns a value, you can be sure that value can legitimately be passed to 'car and 'cdr.
From this point of view, a "list" is anything that happens to be a scanner.
So nil is a list. So are 'cons cells. So is anything that overloads 'scanner, 'unscan, 'car, and 'cdr.
Try:
(using <lazy-scanner>v1)
...then play around with (lazy-scanner a d)
Or, for a slightly easier demo, generate an infinite list of positive integers (after doing the 'using thing):
(generate [+ _ 1] 1)
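If I have the scanner stuff right, the result should work with the ordinary list functions even though it isn't made of cons cells - something along these lines (untested):

(let xs (generate [+ _ 1] 1)
  (prn (car xs))        ; 1
  (prn (car (cdr xs)))  ; 2
  (prn (cut xs 0 5)))   ; the first five positive integers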
Edit: to summarize: a list is not a cons cell. A cons cell is a list. ^^
Ah... but... what if you want to define a generic function that operates differently on lists and bools (i.e. not a scanner, but a general generic function)? I haven't had a close look at Arc-F yet, so maybe I need to play around a bit more to understand what you're saying :)
Well, a "list" is a "scanner". So your "not a scanner" doesn't make sense, at least from the point of view of Arc-F.
However, if you mean "list" as in a sequence of cons cells:
(def works-on-cons-cells-and-bools (x)
  (err "this works only on cons cells and bools!"))
(defm works-on-cons-cells-and-bools ((t x cons))
  (work-on-cons-cells x))
(defm works-on-cons-cells-and-bools ((t x bool))
  (work-on-bool x))
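Calls then dispatch on the type of the argument, so (if I have the 'defm semantics right) you get something like:

(works-on-cons-cells-and-bools '(1 2 3)) ; dispatches to the cons method
(works-on-cons-cells-and-bools nil)      ; nil is a bool, so the bool method
(works-on-cons-cells-and-bools "huh")    ; no method matches, the base def errs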
Note that you can even define a unifying "type class" function which ensures that the given data is a cons cell or a bool, or is convertible to one (i.e. an analog to 'scanner). For example, you might want a "hooper" type class:
(def hooper (x)
  (err "Not convertible to a bool or cons cell" x))
(defm hooper ((t x cons))
  x)
(defm hooper ((t x bool))
  x)
Then you can rewrite works-on-cons-cells-and-bools in terms of the type class:
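Something like this, I think - the base case passes its argument through 'hooper and tries again, so anything 'hooper can convert ends up in the cons or bool methods (and anything it can't convert gets hooper's error):

(def works-on-cons-cells-and-bools (x)
  ; not already a cons or bool: convert via 'hooper and retry
  (works-on-cons-cells-and-bools (hooper x)))

The (defm ...) methods for cons and bool stay exactly as before.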
Then, someone can make a type which supports the "hooper" type class either by overloading hooper (and returning an actual cons cell or bool), or by overloading both hooper and works-on-cons-cells-and-bools:
choice one:
(defm hooper ((t x my-type))
  (convert-my-type-to-cons x))
choice two:
(defm hooper ((t x my-type))
  x)
(defm works-on-cons-cells-and-bools ((t x my-type))
  (work-on-my-type x))
In reply to the summary: nice.. some day I'll have to look into the belly of the beast. Does Arc-F no longer have lists built up from cons cells, or are lists built from cons cells, which are just one special kind of list?
Well, I prefer to think of cons-as-lists as one implementation of lists. It's possible to define alternative implementations of lists; all that is necessary is to define the overloads for 'car, 'cdr, 'scanner, and 'unscan. With generic functions in Arc-F, that is enough to iterate, cut, search, filter, join, map, and more on any list, regardless of whether it's made of 'cons cells or 'lazy-scanner objects.
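For instance (untested, and glossing over 'unscan, whose exact calling convention I'd have to double-check), a made-up "countdown" type could act as a list just by overloading the relevant generics:

; toy list type: (countdown 3) acts like the list (3 2 1)
(def countdown (n)
  (annotate 'countdown n))
(defm car ((t x countdown))
  (rep x))
(defm cdr ((t x countdown))
  (if (> (rep x) 1)
      (countdown (- (rep x) 1))))
(defm scanner ((t x countdown))
  x)

After that, map, filter, and friends should treat (countdown 10) the same way they treat an ordinary ten-element list.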
Not that this is necessarily a bad thing - it's good to take your time thinking about hard problems. I just hope the Arc community doesn't get so pissed off with pg's slow pacing that they go off and create yet another could-have-been-great-but-for-x-y-and-z new lisp dialect.
A possible implementation: use the Anarki repository as the website and build the package manager on top of git. I don't know much about gems so I don't know if this would work, but it would save a lot of reinventing the wheel.
To knock down my own suggestion: git isn't suitable for this because it doesn't have any way of partially downloading a repository. At least, none that I can find.
I think you've hit the nail on the head. Hygienic macros and unhygienic macros are very different things (unlike dynamic vs lexical scoping, which are just different ways to create a function). Lisp macros are 'true' macros (Wikipedia: "Macro: a set of instructions that is represented in an abbreviated format"). Hygienic macros are more like a new abstraction that was inspired by Lisp macros.
Well, I'd rather not argue about what 'true' macros are, but I would point out that your definition is basically data compression for programs (which, by the way, I think is an interesting approach to take to programming language design). I'm pretty sure both types of macros and normal functions would all fall under it.
As for the hygienic vs. unhygienic difference, unhygienic macros are certainly easier to define: they rearrange source code into other source code.
The one thing I can think of that hygienic macros can do that unhygienic ones can't is that while they are rearranging source code, hygienic macros can insert references to things that aren't bound to any variable in the expansion scope. The common example I've seen for this is that it lets you protect against people redefining your variables weirdly. For instance, if you insert a reference to 'car', it means whatever 'car' meant where you defined your hygienic macro, even if 'car' has been redefined to be something crazy in the place where your macro is used. The Scheme hygienic macro system also has a way to break hygiene if you want to, so it can do everything other Lisp macros can do.
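In Arc terms (untested, but I think it illustrates the point), an unhygienic macro that expands to a 'car call picks up whatever 'car means at the use site:

(mac first-of (xs)
  `(car ,xs))

(let car cadr            ; rebind car at the use site
  (first-of '(1 2 3)))   ; expands to (car '(1 2 3)), which now returns 2

A hygienic first-of would keep referring to the car that was in scope where the macro was defined, and return 1 here.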
I guess the question then is, is it useful to be able to do that?
And if you decide you want to be able to do that, are Scheme-style hygienic macros the right way to go about it?
(One option would be to just let you insert objects straight into forms, instead of symbols that you think should reference those objects. This would be fine unless you wanted to be able to set those things later, in which case you'd need some way to get at the actual underlying variable boxes.)
But what is the 'lexically obvious object' for symbols in a macro expansion?
(= z '(list y))
(with (x 1 y 2)
  (mac foo () z))
(with (x 3 y 4)
  (pr (list x))
  (pr (foo)))
which expands to
(with (x 3 y 4)
  (pr (list x))
  (pr (list y)))
To my mind, the lexically obvious value of y is 4 in that last expression, just as the lexically obvious value of x is 3. I see nothing lexically obvious about giving y the value 2, because there is no lexical reference to y in that scope.
So in my opinion, lexical scoping and unhygienic macros are both doing the lexically obvious thing. Hygienic macros introduce a whole new kind of scoping that is neither lexical nor dynamic, but looks like lexical scoping in specific circumstances.
Unhygienic macros preserve true lexical scoping while hygienic macros try to override it. So from my point of view, the question should be "If lexical scoping is a good idea, why would you want hygiene?"
or something like that. My regexp knowledge is a little rusty. I also agree with stefano's point that they should be part of the standard library. This would be possible if there were easy ways to reprogram the Arc syntax in a library.
> This would be possible if there were easy ways to reprogram the Arc syntax in a library.
Again, like I said, this is probably implementable using reader macros, but fooling around with the reader is always troublesome.
Consider some random programmer who uses /ca/ as a variable name in his or her programs for some inexplicable reason. Whether this is treated as a regular expression or as a valid symbol will then depend on whether the code is loaded before or after the regular expression library.
If you want nice regular expression syntax, then it must be standardized as part of the language syntax, so that everyone knows to avoid such variable names. Alternatively, give some method for specifying a reader for each module file. And don't tell me nobody would do that: this is an exploratory language, and someone will try using /ca/ as a variable name unless you specifically ban it. I promise you that.
This is only partially implementable using ssyntax, but again, that might be considered "fooling around with the reader".
If strings as regular expressions work for you, then it's okay, since strings are already standardized in the syntax:
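e.g. (with a completely hypothetical re-match function - the actual name would depend on the library):

(re-match "c.r" "car")  ; the pattern is just an ordinary string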