Token Filters for Lua

In an attempt to finally make those haughty Lisp fanatics shut up for a minute about their macros, people have been playing with a patch for Lua that makes some of the same tricks possible: token filters. The idea is that you register your own Lua function with the interpreter, and it gets a chance to examine and modify the token stream of any Lua code that is subsequently compiled. Your function is called for each token and may output one or more tokens in its place, but thanks to the tricks made possible by Lua coroutines, you can pretend that you simply get() as many further tokens as you need and put() tokens to the output stream at your own pace.
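Here is a minimal sketch of what such a filter might look like. The names are assumptions for illustration: the patch's actual entry point, the exact token representation, and the get()/put() helpers (which come from a small coroutine-based wrapper rather than the patch itself) may differ in their details.

    -- Sketch of a token filter, assuming a FILTER() entry point and the
    -- get()/put() helpers described above; the real patch's API may differ.
    -- This filter rewrites the made-up keyword `lambda` into `function`,
    -- so that `lambda (x) return 2*x end` compiles as a plain function.
    function FILTER(get, put)
      while true do
        local line, token, value = get()      -- pull the next token
        if token == nil then break end        -- end of the stream
        if token == "<name>" and value == "lambda" then
          put(line, "function")               -- emit a replacement token
        else
          put(line, token, value)             -- pass everything else through
        end
      end
    end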

Of course, this is much more inconvenient than Lisp macros, because a token stream is an order of magnitude harder to examine and make intelligent decisions about than Lisp's syntax-free S-expressions. Still, it's a nice hack, and people have already started doing genuinely interesting and useful things with it, like this function parameter checking facility.
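To give a flavor of the kind of rewrite such a facility performs, here is a purely hypothetical before/after; the annotation syntax and the generated check are my own illustration, not the actual facility's output.

    -- What the programmer writes (hypothetical annotation syntax, not valid
    -- stock Lua; the filter consumes it before the compiler ever sees it):
    --   function area(r : number)
    --     return math.pi * r * r
    --   end

    -- What the filter could hand to the compiler instead:
    function area(r)
      assert(type(r) == "number", "bad argument 'r' (number expected)")
      return math.pi * r * r
    end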

The main problem I see is not the awkwardness of the token-stream mechanics; if this catches on, I think sensible frameworks for working with higher-level Lua constructs will emerge. The problem is the uber-simplistic Lua compiler, which does virtually no optimization. More often than not, the point of macros (and of other metaprogramming techniques, such as template metaprogramming in C++) is to introduce simpler syntax for things that get expanded or calculated, with most of the resulting inefficiency optimized away, and all of that happening at compile time. With Lua, there's nothing to optimize your code afterwards, so often you'll have nothing to gain by running some calculation at compile time instead of at runtime. And the lack of optimization is a feature - it comes with the "tiny, simple, embeddable" part of the Lua bundle.
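To make that concrete with a hypothetical example: suppose a filter expands a to_degrees(x) "macro" into its definition. The Lua compiler emits the expanded expression essentially as written, so every execution still pays for the table lookup and the division, whereas an optimizing compiler given the same expansion would typically reduce it to a single multiplication by a precomputed constant.

    -- Hypothetical expansion performed by a token filter: the source said
    --   local d = to_degrees(angle)
    -- and the filter spliced in the macro body instead:
    local angle = 1.5707963267949     -- roughly pi/2, just to make it runnable
    local d = angle * 180 / math.pi   -- compiled literally: an index into
                                      -- `math`, a multiply and a divide on
                                      -- every execution
    print(d)                          --> approximately 90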
