Latest posts for tag devel

Next time we iterate on Himblick design and development, the Raspberry Pi 4 will be able to run plain standard Debian, which should make a lot of things easier and cleaner when developing products based on it.

Somewhat related to nspawn-runner, here are some random links tied to my feeling that nspawn comes from an ecosystem with a stronger focus on security and solidity than Docker's:

I did a lot of work on A38, a Python library to deal with FatturaPA electronic invoicing, and it was a wonderful surprise to see a positive review spontaneously appear! ♥: Fattura elettronica, come visualizzarla con python | TuttoLogico

A beautiful, hands-on explanation of git internals, as a step by step guide to reimplementing your own git: Git Internals - Learn by Building Your Own Git

I recently tried meson and liked it a lot. I then gave unity builds a try, since it supports them out of the box, and found myself with doubts. I found I wasn't alone, and I liked The Evils of Unity Builds as a summary of the situation.

A point of view I liked on technological debt: Technical debt as a lack of understanding

Finally, a classic, and a masterful explanation for a question that keeps popping up: RegEx match open tags except XHTML self-contained tags

While traveling around Germany, one notices that most towns have a Greek or Italian restaurant, and they all kind of have the same names. How widespread is that lack of imagination?

Let's play with Overpass Turbo: select a bounding box and run this query:


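The query text itself was lost in formatting; here is a plausible reconstruction in Overpass QL, assuming the cuisine=greek tag (mirroring the cuisine=italian variant mentioned below) and an amenity=restaurant filter:

```
node[amenity=restaurant][cuisine=greek]({{bbox}});
out;
```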
Export the results as gpx and have some fun on the command line:

sed -nre 's/^ *<name>([^<]+).*/\1/p' /tmp/greek.gpx \
   | sed -re 's/ *(Grill|Restaurant|Tavern[ae]) *//g' \
   | sort | uniq -c | sort -nr > /tmp/greek.txt

Likewise, with Italian restaurants, you can use cuisine=italian and something like:

sed -nre 's/^ *<name>([^<]+).*/\1/p' /tmp/italian.gpx \
   | sed -re 's/ *(Restaurant|Ristorante|Pizzeria) *//g' \
   | sort | uniq -c | sort -nr > /tmp/italian.txt

Here are the top 20 that came out for Greek:

    162 Akropolis
     91 Delphi
     86 Poseidon
     78 Olympia
     78 Mykonos
     78 Athen
     76 Hellas
     74 El Greco
     71 Rhodos
     57 Dionysos
     53 Kreta
     50 Syrtaki
     49 Korfu
     43 Santorini
     43 Athos
     40 Mythos
     39 Zorbas
     35 Artemis
     33 Meteora
     29 Der Grieche

Here are the top 20 that came out for Italian, with a sadly ubiquitous franchise as an outlier:

     66 Vapiano
     64 Bella Italia
     59 L'Osteria
     54 Roma
     43 La Piazza
     38 La Dolce Vita
     38 Dolce Vita
     35 Italia
     32 Pinocchio
     31 Toscana
     30 Venezia
     28 Milano
     28 Mamma Mia
     27 Bella Napoli
     25 San Marco
     24 Portofino
     22 La Piazzetta
     22 La Gondola
     21 Da Vinci
     21 Da Pino

One can play a game while traveling: being the first to spot a Greek or Italian restaurant earns more points the more unusual its name is. But beware of being too quick! If you try to claim points for one of the restaurants with the top-5 most common names, you will actually lose points!

Have fun playing with other combinations of areas and cuisine: the Overpass API is pretty cool!


Rather than running XML through sed, one can export GeoJSON, then parse it with the excellent jq:

jq -r '.features[].properties.name' italian.json \
   | sed -re 's/ *(Restaurant|Ristorante|Pizzeria) *//g' \
   | sort | uniq -c | sort -nr > /tmp/italian.txt

One-page guide to ES2015+: usage, examples, and more. A quick overview of new JavaScript features in ES2015, ES2016, ES2017, ES2018 and beyond.

Rich offline experiences, periodic background syncs, push notifications (functionality that would normally require a native application) are coming to the web. Service workers provide the technical foundation that all these features rely on.

The Service Worker Cookbook is a collection of working, practical examples of using service workers in modern web sites.

One overriding problem that web users have suffered with for years is loss of connectivity. The best web app in the world will provide a terrible user experience if you can’t download it. There have been various attempts to create technologies to solve this problem, as our Offline page shows, and some of the issues have been solved.

One piece of software I maintain for work is a GUI data browser that uses Tornado as a backend and a web browser as a front-end.

It is quite convenient to start the command and have the browser open automatically on the right URL. It's quite annoying to start the command and be told that the default port is already in use.

I've needed this trick quite often, also when writing unit tests, and it's time I note it down somewhere, so it's easier to find than going through Tornado's unittest code where I found it the first time.

This is how to start Tornado on a free random port:

from tornado.options import define, options
import tornado.netutil
import tornado.httpserver

define("web_port", type=int, default=None, help="listening port for web interface")

application = Application(self.db_url)

if options.web_port is None:
    sockets = tornado.netutil.bind_sockets(0, '')
    self.web_port = sockets[0].getsockname()[1]
    server = tornado.httpserver.HTTPServer(application)
    server.add_sockets(sockets)
else:
    server = tornado.httpserver.HTTPServer(application)
    server.listen(options.web_port)
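The same trick can be sketched with the standard library alone, for programs that don't use Tornado (the function name here is illustrative; note that a port found this way can in principle be grabbed by another process before you bind it again):

```python
import socket

def find_free_port() -> int:
    # Binding to port 0 makes the kernel pick any free port;
    # getsockname() then tells us which one it chose
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]

print(0 < find_free_port() < 65536)  # True
```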

I am writing a little application server for microservices written as compiled binaries, and I would like to log execution statistics from getrusage(2).

The application server is written using asyncio, and processes are managed using asyncio subprocesses.

Unfortunately, asyncio uses os.waitpid instead of os.wait4 to reap child processes, and to get rusage information one has to delve into the asyncio innards, and provide a custom ChildWatcher implementation. Here's how I did it:

import asyncio
from asyncio.log import logger
from contextlib import contextmanager
import os

class ExtendedResults:
    def __init__(self):
        self.rusage = None
        self.returncode = None

class SafeChildWatcherWithRusage(asyncio.SafeChildWatcher):
    """
    SafeChildWatcher that uses os.wait4 to also get rusage information.
    """
    rusage_results = {}

    @classmethod
    @contextmanager
    def monitor(cls, proc):
        """
        Return an ExtendedResults that gets filled when the process exits
        """
        assert proc.pid > 0
        pid = proc.pid
        extended_results = ExtendedResults()
        cls.rusage_results[pid] = extended_results
        try:
            yield extended_results
        finally:
            cls.rusage_results.pop(pid, None)

    def _do_waitpid(self, expected_pid):
        # The original is in asyncio/; on new python versions, it
        # makes sense to check changes to it and port them here
        assert expected_pid > 0

        try:
            pid, status, rusage = os.wait4(expected_pid, os.WNOHANG)
        except ChildProcessError:
            # The child process is already reaped
            # (may happen if waitpid() is called elsewhere).
            pid = expected_pid
            returncode = 255
            rusage = None
            logger.warning(
                "Unknown child process pid %d, will report returncode 255",
                pid)
        else:
            if pid == 0:
                # The child process is still alive.
                return

            returncode = self._compute_returncode(status)
            if self._loop.get_debug():
                logger.debug('process %s exited with returncode %s',
                             expected_pid, returncode)

        extended_results = self.rusage_results.get(pid)
        if extended_results is not None:
            extended_results.rusage = rusage
            extended_results.returncode = returncode

        try:
            callback, args = self._callbacks.pop(pid)
        except KeyError:  # pragma: no cover
            # May happen if .remove_child_handler() is called
            # after os.waitpid() returns.
            if self._loop.get_debug():
                logger.warning("Child watcher got an unexpected pid: %r",
                               pid, exc_info=True)
        else:
            callback(pid, returncode, *args)

    @classmethod
    def install(cls):
        loop = asyncio.get_event_loop()
        child_watcher = cls()
        child_watcher.attach_loop(loop)
        asyncio.set_child_watcher(child_watcher)

To use it:

from .hacks import SafeChildWatcherWithRusage

# Call this in the main thread before starting the event loop
SafeChildWatcherWithRusage.install()

    def run(self, *args, **kw):
        kw["stdin"] = asyncio.subprocess.PIPE
        kw["stdout"] = asyncio.subprocess.PIPE
        kw["stderr"] = asyncio.subprocess.PIPE
        self.started = time.time()

        self.proc = yield from asyncio.create_subprocess_exec(*args, **kw)

        with SafeChildWatcherWithRusage.monitor(self.proc) as results:
            # The tasks gathered here were lost in formatting; assuming they
            # consumed the process output, something like:
            yield from asyncio.tasks.gather(
                self.proc.stdout.read(),
                self.proc.stderr.read())
        self.returncode = yield from self.proc.wait()
        self.rusage = results.rusage
        self.ended = time.time()
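As a standalone illustration of the underlying primitive: os.wait4 is a drop-in replacement for os.waitpid that also returns a resource.struct_rusage for the reaped child (Unix only; a minimal sketch):

```python
import os
import subprocess
import sys

# Spawn a short-lived child, then reap it with os.wait4 instead of
# os.waitpid: the third return value carries its resource usage
proc = subprocess.Popen([sys.executable, "-c", "pass"])
pid, status, rusage = os.wait4(proc.pid, 0)

print(pid == proc.pid)            # True
print(os.WIFEXITED(status))       # True: the child exited normally
print(rusage.ru_utime >= 0.0)     # True: user CPU time of the child
```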

Debian conveniently distributes JavaScript libraries, and expects packaged software to use them rather than embedding its own copy.

Here is a convenient custom StaticFileHandler for Tornado that looks for the Debian-distributed versions of JavaScript libraries, and falls back to the vendored versions if they are not found:

from tornado import web
import pathlib

class StaticFileHandler(web.StaticFileHandler):
    """
    StaticFileHandler that allows overriding paths in the static directory with
    system provided versions
    """
    SYSTEM_ASSET_PATH = pathlib.Path("/usr/share/javascript")

    @classmethod
    def get_absolute_path(cls, root, path):
        path = pathlib.PurePath(path)
        if not path.parts:
            return super().get_absolute_path(root, path)

        system_dir = cls.SYSTEM_ASSET_PATH.joinpath(path.parts[0])
        if system_dir.is_dir():
            # If that asset directory exists in the system, look for things in
            # there
            return cls.SYSTEM_ASSET_PATH.joinpath(path)
        else:
            # Else go ahead with the default static dir
            return super().get_absolute_path(root, path)

    def validate_absolute_path(self, root, absolute_path):
        """
        Rewrite of tornado's validate_absolute_path not to raise an error for
        paths in /usr/share/javascript/
        """
        root = pathlib.Path(root)
        absolute_path = pathlib.Path(absolute_path)

        is_system_root = absolute_path.parts[:len(self.SYSTEM_ASSET_PATH.parts)] == self.SYSTEM_ASSET_PATH.parts
        is_static_root = absolute_path.parts[:len(root.parts)] == root.parts

        if not is_system_root and not is_static_root:
            raise web.HTTPError(403, "%s is not in root static directory or system assets path",
                                self.path)

        if absolute_path.is_dir() and self.default_filename is not None:
            # need to look at the request.path here for when path is empty
            # but there is some prefix to the path that was already
            # trimmed by the routing
            if not self.request.path.endswith("/"):
                self.redirect(self.request.path + "/", permanent=True)
                return None
            absolute_path = absolute_path.joinpath(self.default_filename)
        if not absolute_path.exists():
            raise web.HTTPError(404)
        if not absolute_path.is_file():
            raise web.HTTPError(403, "%s is not a file", self.path)
        return str(absolute_path)

This is how to use it:

class DebianApplication(tornado.web.Application):
    def __init__(self, *args, **settings):
        from .static import StaticFileHandler
        settings.setdefault("static_handler_class", StaticFileHandler)
        super().__init__(*args, **settings)

And from HTML it's simply a matter of matching the first path component to what is used by Debian's packages under /usr/share/javascript:

    <link rel="stylesheet" href="{{static_url('bootstrap4/css/bootstrap.min.css')}}">
    <script src="{{static_url('jquery/jquery.min.js')}}"></script>
    <script src="{{static_url('popper.js/umd/popper.min.js')}}"></script>
    <script src="{{static_url('bootstrap4/js/bootstrap.min.js')}}"></script>

I find it quite convenient: this way I can start writing prototype code without worrying about fetching javascript libraries to bundle.

I only need to start worrying about it if I need to deploy outside of Debian, or to old stable versions of Debian that don't contain the required JavaScript dependencies. In that case, I just cp -r from a working /usr/share/javascript into Tornado's static directory, and I'm done.
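The containment check in validate_absolute_path deserves a note: comparing pathlib parts prefixes, rather than string prefixes, avoids false positives like /usr/share/javascript-evil. A small self-contained sketch of the same comparison (the helper name is made up):

```python
import pathlib

SYSTEM_ASSET_PATH = pathlib.Path("/usr/share/javascript")

def is_under(path: pathlib.Path, root: pathlib.Path) -> bool:
    # A path is inside root when root's components are a prefix of its own
    return path.parts[:len(root.parts)] == root.parts

print(is_under(pathlib.Path("/usr/share/javascript/jquery/jquery.min.js"), SYSTEM_ASSET_PATH))  # True
print(is_under(pathlib.Path("/usr/share/javascript-evil/x.js"), SYSTEM_ASSET_PATH))             # False
```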

Python mailbox.mbox is not good at opening compressed mailboxes: pointed at a .gz or .xz file, it parses the compressed bytes as if they were plain text, and finds no messages in them:

>>> import mailbox
>>> print(len(mailbox.mbox("/tmp/test.mbox")))
>>> print(len(mailbox.mbox("/tmp/test.mbox.gz")))
>>> print(len(mailbox.mbox("/tmp/test1.mbox.xz")))
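To make the failure mode concrete, here is a self-contained sketch (the sample messages are made up): the uncompressed mbox parses fine, while the gzip-compressed copy is read as plain text, so the messages inside it are not found:

```python
import gzip
import mailbox
import tempfile

# A tiny two-message mbox archive
raw = (b"From a@example.org Thu Jan  1 00:00:00 1970\n"
       b"Subject: one\n\nbody one\n\n"
       b"From b@example.org Thu Jan  1 00:00:00 1970\n"
       b"Subject: two\n\nbody two\n")

with tempfile.NamedTemporaryFile(suffix=".mbox", delete=False) as f:
    f.write(raw)
    plain_path = f.name
with tempfile.NamedTemporaryFile(suffix=".mbox.gz", delete=False) as f:
    f.write(gzip.compress(raw))
    gz_path = f.name

print(len(mailbox.mbox(plain_path)))  # 2
print(len(mailbox.mbox(gz_path)))     # not 2: the gzip container hides them
```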

For a prototype rewrite of the MIA team's Echelon (the engine behind mia-query), I needed to scan compressed mailboxes, and I had to work around this limitation.

Here is the alternative mailbox.mbox implementation:

import lzma
import gzip
import bz2
import mailbox
from typing import BinaryIO

class StreamMbox(mailbox.mbox):
    """
    mailbox.mbox does not support opening a stream, which is sad.

    This is a subclass that works around it
    """
    def __init__(self, fd: BinaryIO, factory=None, create: bool = True):
        # Do not call parent __init__, just redo everything here to be able to
        # open a stream. This will need to be re-reviewed for every new version
        # of python's stdlib.

        # Mailbox constructor
        self._path = None
        self._factory = factory

        # _singlefileMailbox constructor
        self._file = fd
        self._toc = None
        self._next_key = 0
        self._pending = False       # No changes require rewriting the file.
        self._pending_sync = False  # No need to sync the file
        self._locked = False
        self._file_length = None    # Used to record mailbox size

        # mbox constructor
        self._message_factory = mailbox.mboxMessage

    def flush(self):
        raise NotImplementedError("StreamMbox is a readonly class")

class UsageExample:
    DECOMPRESS = {
        ".gz": gzip.open,
        ".bz2": bz2.open,
        ".xz": lzma.open,
    }

    @classmethod
    def scan(cls, path: Path) -> Generator[ScannedEmail, None, None]:
        decompress = cls.DECOMPRESS.get(path.suffix)
        if decompress is None:
            with open(path.as_posix(), "rb") as fd:
                yield from cls.scan_fd(path, fd)
        else:
            with decompress(path.as_posix(), "rb") as fd:
                yield from cls.scan_fd(path, fd)

    @classmethod
    def scan_fd(cls, path: Path, fd: BinaryIO) -> Generator[ScannedEmail, None, None]:
        mbox = StreamMbox(fd)
        for msg in mbox:
            ...  # build and yield a ScannedEmail for each message

This code:


class Test:
    def __init__(self, items=[]):
        self.items = items

    def add(self, item):
        self.items.append(item)

a = Test()
b = Test()
a.add("foo")
b.add("bar")
print(a.items)
print(b.items)

"obviously" prints:

['foo', 'bar']
['foo', 'bar']

Because the default value of the items argument is a mutable list, constructed just once, when the function definition is evaluated, and then reused for every call that does not pass items.

So, in Python, mutable items in default arguments are a good way to get more fun time with debugging.
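The usual fix is to use None as a sentinel and build the mutable default inside the function, so each call (here, each instance) gets a fresh list:

```python
class Test:
    def __init__(self, items=None):
        # A new list per instance, instead of one shared default list
        self.items = items if items is not None else []

    def add(self, item):
        self.items.append(item)

a = Test()
b = Test()
a.add("foo")
b.add("bar")
print(a.items)  # ['foo']
print(b.items)  # ['bar']
```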