Friday, February 26

The Setup / Charlie Loyd:

Too many of my daily tools are overfitted. What could I be using that’s not designed by and for affluent Anglophone citydwellers pretty much like me? I want to be learning more from my gear. One of my favorite designers is a blacksmith, Jim Wester, who makes blades for woodworking. Carvers all over the world use his adzes and knives. As a lark, sometimes he makes cutlery, and I have one of his paring knives. It breaks all sorts of kitchen knife norms: it’s obviously not designed by a chef. You sharpen it at a shallow angle, because it has an idiosyncratic bevel; its handle expects a slightly different motion than you do; it’s tool steel, not stainless, so you have to worry about rust - and so on. It’s weird. And I expect to use it and love it until, many years from now, I wear it out. It cuts like a cartoon laser. Its weirdness comes from the hand and mind of a master, and it’s precious to me. My dream setup has more of that feeling.

Wednesday, February 17

process management with pm2

Edit: There may be misleading stuff in these notes. In the end, I wound up going the simpler-to-explain route of just managing pm2 stuff as a user with a home directory and login shell. I don't exactly like this, but then ¯\_(ツ)_/¯

Here’s the tutorial I was writing all week.

I’m putting (hopefully) the last few pieces together in a tutorial on migrating hosted Parse applications to an instance of Parse Server running on a standalone box.

Parse is a “mobile backend as a service”, which is to say that it’s an API you can wire mobile apps up to, providing storage and authentication and a bunch of other stuff. Except it won’t be for much longer: Facebook, which acquired Parse back in 2013, is shutting the service down.

Anyhow, the Parse folks (to their credit) have published an MIT-licensed Node.js version of this API, called Parse Server, which you can kinda run wherever, although the instructions are geared towards Heroku and other platformy stuff like that.

On an Ubuntu 14.04 system, I want to make sure that a parse-server process is started at system boot, running under a specific user called parse. There’s a Node-based process manager called PM2, which lets you do something like (assume I’m doing this all as root; sudo may complicate life):

# npm install -g pm2
# pm2 start parse-server

And then—this is pretty cool—you can say:

# pm2 save
# pm2 startup ubuntu

…and it’ll do something like this:

[PM2] Generating system init script in /etc/init.d/
[PM2] Making script booting at startup...
[PM2] -linux- Using the command:
      su -c "chmod +x /etc/init.d/ && update-rc.d defaults"
 System start/stop links for /etc/init.d/ already exist.
[PM2] Done.

Which is to say that it tries to write out a system-appropriate startup script which will resurrect the processes you saved. You can even specify a user and working homedir:

# pm2 startup ubuntu -u parse --hp /opt/parse-wrapper/

…which sets the user the startup script will run PM2 as, and the home directory it will use for its state.

You can even write a JSON config file that will set environment variables and such, which I definitely need to do:

{
  "apps" : [{
    "name"        : "parse-wrapper",
    "script"      : "/usr/bin/parse-server",
    "watch"       : true,
    "merge_logs"  : true,
    "cwd"         : "/opt/parse-wrapper",
    "env": {
      "PARSE_SERVER_DATABASE_URI": "mongodb://",
      "PARSE_SERVER_APPLICATION_ID": "some_key",
      "PARSE_SERVER_MASTER_KEY": "some_key"
    }
  }]
}
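
If you save that config to disk, pm2 can start everything in it directly. A sketch, with the /tmp path here purely for illustration, and a sanity check of the JSON first, since a stray comma is easy to ship:

```shell
# Write a pared-down app declaration (path and contents illustrative):
cat > /tmp/ecosystem.json <<'EOF'
{
  "apps": [{
    "name": "parse-wrapper",
    "script": "/usr/bin/parse-server",
    "cwd": "/opt/parse-wrapper"
  }]
}
EOF

# Make sure it actually parses before pm2 sees it:
python3 -m json.tool /tmp/ecosystem.json > /dev/null && echo "valid JSON"

# Then hand it to pm2 and persist the resulting process list:
#   pm2 start /tmp/ecosystem.json
#   pm2 save
```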

The problem here, or at least the one I think I’m having, is that PM2 saves its state in $HOME/.pm2/. I checked, and it seems like I can just cp that directory to whatever location I want, but since I’m doing this in public, with impressionable nerds watching, it sure seems like there should be a less hacky approach to the problem. It also persists state I don’t want (like the user that owns the process, which is kind of the whole point).

If I can specify a homedir for generating the startup process, and in config files, shouldn’t I be able to do that when saving the process list and such?
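
On that question: PM2 appears to honor a PM2_HOME environment variable for the location of its state directory, which, if I’m reading things right, would let you plant the state somewhere deliberate up front instead of cp-ing $HOME/.pm2 around afterwards. A sketch, with the actual pm2 invocations left as comments:

```shell
# Assumption: pm2 reads PM2_HOME and keeps its state (process dump, logs,
# pid files) there instead of defaulting to $HOME/.pm2.
export PM2_HOME=/opt/parse-wrapper/.pm2

# Any pm2 run in this environment then shares that state directory:
#   pm2 start parse-server
#   pm2 save        # should write $PM2_HOME/dump.pm2
echo "pm2 state directory: $PM2_HOME"
```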

There’s an entire deployment mechanism built into this thing, for some reason, which I think I could probably leverage to solve this problem, but come on.

Wednesday, February 10

listening to a port with netcat

I always forget this one is so easy:

nc -l localhost 1234

Tuesday, February 9

day 9 of stout month

  1. 9/9 so far.
  2. Coffeeshops are for the most part kind of shitty places to get work done.
  3. So are living room couches.
  4. Precarity is usually closer than you think.
  5. I do not know what I am doing.
  6. I just heard a guy say “I don’t do American New Year”. I don’t know that he has the wrong idea.
  7. I wanted this list to be 0-indexed, but I’m writing it in a markup language which doesn’t support that behavior without escaping to HTML. If (for some reason) you want to know how and why forms of representation which give up expressive power for reduced complexity of syntax will cause you pain and eventually drift towards syntax complexity, I guess you could dwell in the space created by this problem and its relatives for a decade or two.
  8. There are probably better courses in life.
  9. From re-watching the first season on DVD, I still really like Babylon 5. The first episode of the new X-Files kind of hooked me a little bit. Nostalgia is frequently a dead letter, but I guess evaluating the genre environment of your teens in the light of your middle thirties doesn’t always have to be a depressing project.

Saturday, February 6

Getting a list of files in a given target directory matching some criteria, in Perl, using File::Find:

use File::Find;

# collect a list of things to render
sub get_source_files {
  my ($root_dir) = @_;

  my @source_files;
  find(
    sub {
      if (! /index$/) {
        push @source_files, $File::Find::name;
      }
    },
    $root_dir
  );

  return @source_files;
}

Notes: You can use $_ (the current filename) inside the wanted subroutine, but if you want the full path, you need to use $File::Find::name. The callback is invoked with no arguments, so you can’t do the normal thing where you pull them off of @_; everything arrives via those package variables.

Experientially, File::Find is actually kind of a crap replacement for shelling out to find(1). I seem to have used it a bunch of times over the years, but I never remember how the interface works and every time I find out I’m irritated.
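
For comparison, a rough find(1) equivalent of the Perl above (files only, skipping anything whose name ends in "index"), demonstrated on a throwaway tree:

```shell
# Build a disposable directory tree to run against:
mkdir -p /tmp/find-demo/sub
touch /tmp/find-demo/2016-02-06 /tmp/find-demo/index /tmp/find-demo/sub/index

# Every file under the root whose name doesn't end in "index":
find /tmp/find-demo -type f ! -name '*index'
# -> /tmp/find-demo/2016-02-06
```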

Friday, February 5

moving from cgi to static site generation

This is the kind of post that people used to refer to as “administrivia” on the old-timey artisanal internet of yore. It’s about this web site itself, which is usually a boring thing for a web site to talk about.

Still, I’m trying to take this “keep a technical logbook” idea seriously and write down all the miscellaneous stuff that doesn’t go into a tutorial for my full-time gig.

First off, I have been thinking for a while that this site’s rendering code could be used as a static site generator instead of running as dynamic code under a CGI script. I finally spent time on this, and it turned out to be (mostly) simpler than I had expected.

When I started, the way this worked was that a Perl FastCGI script lived in /var/www/p1k3. FastCGI means that a long-running process loops, blocking until the web server hands it a fresh request. In this case, it would just pass the request parameters to a method on an instance of Display, which would return some text to be printed. Pretty much like this. In order to use pretty URLs, I wrote a bunch of Apache mod_rewrite rules to pass things like:

Off to:

Which would look for a file in:


…and do markup processing, then push it through a Perl template of sorts to return HTML.

This code hasn’t, to my knowledge, gotten me owned for a while now, and part of me liked continuing to run old-school CGI in some little corner of my life. That said, it turns out that static site generators are popular with the cool kids for some good reasons.

For one thing, all running code is both an attack surface and a source of bugs. For another, a lot of the incidental complexity vanishes from the whole scenario when you’re just serving static files. For example, those RewriteRules, always cheesy and inflexible to maintain, just went away. So did the need to enable CGI at all. And absent those dependencies, I’m now free to serve with Apache, Nginx, Lighttpd, or just about anything else.

Anyway, what I do now is convert the same file to:


There’s some other rigamarole with copying static assets, and the code is still real stupid because I wrote most of it when I was 21ish and an idiot and then later tried to refactor and made it worse.

I should chop this all down into something a lot simpler. Maybe I should move it to some other language.