
Squawk (simple queues using awk)

If you are easily offended, look away now …

Reliable message queues (ActiveMQ in particular) are pretty handy things. They make it a lot easier to build reliable systems which are able to ride out network problems, hardware trouble and temporary weirdness. However, they always feel pretty heavyweight; suitable for “enterprise systems” but not quick shell scripts.

Well, let’s fix that. My aim is to publish and receive messages to and from an ActiveMQ broker from the unix shell with a minimum of overhead. I want a ‘consume’ script which reads messages from a queue and pipes them to a handler. If the handler script succeeds, the message is acknowledged and we win. If the handler script fails, the message is returned to the queue, and can be re-tried later (possibly by a different host).

STOMP is what makes this easy. It’s a ‘simple text-oriented messaging protocol’ which is supported directly by ActiveMQ. So we won’t need to mess around with weighty client libraries. A good start.
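
To give a flavour of what goes over the wire, here is roughly what the frames look like (sketched from the STOMP spec rather than captured from ActiveMQ; ^@ stands for the NUL byte, and each frame ends at its ^@):

CONNECT

^@
SUBSCRIBE
destination:/queue/a
ack:client

^@
SEND
destination:/queue/a

my message^@

Each frame is a command line, some headers, a blank line and an optional body. The ack:client header is what lets a consumer hold off acknowledging a message until its handler has succeeded.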

But we still need to write a ‘consume’ program which will speak STOMP and invoke the message handler script. There are existing STOMP bindings for perl and ruby, but I’m pitching for a pure unix solution.

In STOMP, messages are NUL-separated, which made me wonder whether it’d be possible to use awk by setting its ‘record separator’ to NUL. The short answer is: yes, awk can do reliable messaging – win!
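
As a tiny sanity check of that idea (a toy pipeline, nothing to do with the real scripts; gawk at least is happy with a NUL record separator):

printf 'first frame\0second frame\0' | awk 'BEGIN { RS = "\0" } { print NR ": " $0 }'

which prints "1: first frame" and "2: second frame", one record per NUL-terminated chunk.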

We’ll need some network glue. Recent versions of awk have builtin network support, but I’m going to use netcat because it’s more common than bleeding-edge awks.
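
As a quick smoke test of the glue (a throwaway one-liner; depending on your netcat you may need something like -q 1 to make it hang around for the reply):

printf 'CONNECT\n\n\0' | nc localhost 61613

If the broker is up, a CONNECTED frame comes back.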

I also want to keep ‘consume’ as a single file, but I don’t want to pull my hair out trying to escape everything properly. So, I’ll use a bash here document to write the awk script out to a temporary file before invoking awk. (Is there a nicer way to do this?)
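
The shape of that trick is something like this (a sketch only; the real consume script has rather more in it):

#!/bin/bash
# write the awk program to a temp file; the quoted 'EOF' stops bash
# from expanding things like $0 inside the script
AWKSCRIPT=$(mktemp)
trap 'rm -f "$AWKSCRIPT"' EXIT

cat > "$AWKSCRIPT" <<'EOF'
BEGIN { RS = "\0" }
{ print "got a frame: " $0 }
EOF

awk -f "$AWKSCRIPT"

Anything that genuinely needs to come from the shell (the queue name, say) can then be passed in with awk’s --assign rather than relying on bash expanding the here document.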

There’s not much more to say except here are the scripts: consume and produce.
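
To give a feel for how little is involved, here’s a minimal sketch of the produce side (not the actual script linked above; it skips content-length headers and all error handling):

#!/bin/bash
# usage: echo 'my message' | ./produce host port /queue/name
HOST=$1
PORT=$2
QUEUE=$3
BODY=$(cat)

# one TCP conversation: connect, send the body, disconnect.
# the DISCONNECT makes the broker drop the connection, so nc exits.
printf 'CONNECT\n\n\0SEND\ndestination:%s\n\n%s\0DISCONNECT\n\n\0' \
  "$QUEUE" "$BODY" | nc "$HOST" "$PORT" > /dev/null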

To try it out, you’ll need to download ActiveMQ and start it up; just do ./bin/activemq and you’ll get a broker which has a STOMP listener on port 61613.

To publish to a queue, run: echo 'my message' | ./produce localhost 61613 /queue/a

To consume, first write a message handler, such as:

#!/bin/bash
echo Handling a message at $(date).  Message follows:
cat
echo '(message ends)'
exit 0

and then run: ./consume localhost 61613 /queue/a ./myhandler.

To simulate failure, change the handler to “exit 1”. The message will be returned to the queue. By default, the consumer will then immediately try again, so I added in a ‘sleep 1’ to slow things down a bit. ActiveMQ has many tweakable settings to control backoff, redelivery attempts and dead-letter queue behaviour.
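
For reference, a deliberately failing handler for that experiment can be as dumb as this (a throwaway test script):

#!/bin/bash
# swallow the message, then claim we couldn't handle it
cat > /dev/null
echo "Refusing a message at $(date)"
sleep 1   # stop the retry loop spinning flat out
exit 1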

I’m done.

If you want to learn more about awk, check out the awk book on my amazon.com bookshelf.

Y’know, come the apocalypse, the cockroaches’ programming language of choice is probably going to be awk.

4 replies on “Squawk (simple queues using awk)”

you can get away with this:
#! /bin/bash

(echo "GET / HTTP/1.0" ; echo "") | nc www.yahoo.com 80 | awk '
BEGIN{
i=0
}
{
print i++ "> " $0;
}' | head -20

Alternatively, if you need more processing, use " instead, or you can even process the current file and pipe it to awk if stdin is spare (I’ve used ## comments to give a grep target); using backticks to get extra nested shells is a neat trick too..

Ken: Yep, that works fine if you are using netcat as a data source and filtering its output with awk. But for squawk I also need the output from awk to be fed back into netcat’s stdin – it’s a two-way conversation. I use netcat -c "my_awk_command" for this. But that means my awk command invocation would need to be escaped, otherwise bash will expand things like '$0' which I intend for awk.

Hmm, actually, I think in this case I can surround the awk invocation with single quotes and then consistently use double quotes within the awk script. In an earlier version I was relying on the awk script being bash-expanded to do variable substitution for the queue name. But now I explicitly pass the arguments in with --assign.

Ah, no .. I still need to shell-expand the --assign part of the awk invocation, so it can’t be in single quotes.
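
Concretely, the invocation ends up shaped something like this (variable names made up for illustration, and it assumes a netcat whose -c runs its argument through the shell, as described above):

# double quotes, so the shell fills in the temp file name and the --assign
# value; the awk program itself sits in $AWKSCRIPT where bash never touches it
nc -c "awk -f $AWKSCRIPT --assign queue=$QUEUE" "$HOST" "$PORT"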

This page has some suggestions; I’ll see if I can get it to work: http://www.gnu.org/manual/gawk/html_node/Quoting.html. I didn’t know that if you run something like this:

ls 'foo'"bar" (no space between the quotes)

.. then ls receives a single argument – I’d always assumed it’d get two arguments.
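
A toy example of that concatenation trick applied to an awk one-liner (just an illustration of the quoting, nothing from squawk):

QUEUE=/queue/a
# three adjacent pieces: single-quoted, double-quoted, single-quoted;
# bash glues them into one argument before awk ever sees it
awk 'BEGIN { print "consuming from '"$QUEUE"'" }'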
