Monday, December 29

What follows is a grab bag of stuff I wrote some time in the past year or so but never polished enough to feel like I should post it. It may be garbage, but just this one time I’m going to clear out my work-in-progress directory by doing something besides rm *.



Dan Lyke writes:

You know what the real revelation of This is a motherfucking web site is? Just how little we've gained from CSS and HTML3/4/5.

I’m not sure what I think about this. On the one hand, yeah, we’ve certainly walked down a lot of dead-end paths and seen a lot of promise revealed as so much smoke & mirrors. On the other hand, so many aspects of the bad old days have been obscured by the last decade’s unbelievable progress in web tech that we can easily forget how much we’ve learned and built.

I started making web pages some time around 1995. What we have now that we didn’t really have then is at least:

  • Rich layout and presentation mechanics expressed at least partially in a separate layer from content structure. It’s not the shining dream of the semantic web, but on the other hand it’s nothing to sneeze at.

  • Pretty ok looking type.

  • High resolution, high color images.

  • Animation, video, and sound that pretty much work.

  • Clients which work most of the time.

  • Well-established means of building richer UI than mainframe-style form-submission.

  • Client-side scripting stable and used to solve real problems instead of to build scrolling stock tickers and crash browsers. (I came later to the “JavaScript is acceptable for real work” train than most, and I still think it’s used way too often to overload standard browser behavior, but at this stage if you refuse to acknowledge the utility of full-featured client-side scripting, I’m forced to assume you’re privileging the pleasure of your contrarian stance over good empiricism.)

Some of this you can lay at the feet of more bandwidth and more powerful computers, but I don’t think we’d be here without better markup than we had then. I’ve had my current gig since 2007, and in just that much time we’ve gone from pecking at snarled masses of nested tables and inline fonts to straightforward HTML in a handful of nested divs. It’s imperfect, and frankly kind of ugly, but there’s no question that it improves on the clunky fragility of the way markup was usually done not so long ago.
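For anyone who missed the bad old days, a hypothetical before-and-after fragment shows the shift (the class names are invented for illustration):

```html
<!-- circa 1999: layout and styling jammed into nested tables
     and inline font tags, repeated on every page -->
<table width="600" border="0" cellpadding="4">
  <tr>
    <td width="150" bgcolor="#cccccc">
      <font face="Arial" size="2">Navigation</font>
    </td>
    <td>
      <font face="Arial" size="3">Article text goes here.</font>
    </td>
  </tr>
</table>

<!-- the same page now: structure in the markup,
     all presentation deferred to a stylesheet -->
<div class="page">
  <nav class="sidebar">Navigation</nav>
  <article class="content">Article text goes here.</article>
</div>
```

The second version isn’t beautiful, but a change to the sidebar’s width or typeface touches one stylesheet rule instead of every `<td>` and `<font>` on the site.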

‘course, I do have a set of complaints about the modern web and how we got here.

  • Until at least the era of HTML5, the HTML standards process and the general technical direction the official-ish leadership kept pursuing were both plagued by architectural detours, overdesign, and a pretty amazing disconnect from real-world implementations. I know, let’s retcon HTML into an XML dialect! What could possibly go wrong besides XML being a scheme of the Adversary and an abomination? I know! We’ll define a rich, top-down semantics for content implemented by exactly none of the working clients! So on and so forth.

  • CSS and its accompanying intellectual apparatus have utterly failed to provide a set of core abstractions that map cleanly to the most basic problems of visual layout. Its original semantics are obviously geared towards privileging typography and simple text flow mechanics, despite which they manage to elide or needlessly complicate questions like “how do I put this text in the middle”. I think they’re working on this, but there’s no plausible claim that they’ve gotten it right yet.

  • Even for the average bright, generally informed web developer, the idea of “semantic markup” has often become a kind of superstition, a received dogma with little direct connection to original notions of “separating content from presentation”. For an entire generation of nerds, “semantic” has come to reliably stand for believing in a visceral way that it’s better to put tabular data inside of <div>s than inside of <table>s. Think about that for a minute, because it’s fucking nuts. A vocabulary for describing rows of cells was rich enough to be abused as a workaround for the total lack of general-purpose layout semantics. This turned out to be really painful to live with. And the lesson we manage to impart to everyone afterwards? That the explicit table semantics were really distasteful but generic container tags are totally groovy because semantics! It leaves me kind of embarrassed for the species.

  • I used air quotes because it’s far from clear that “separating content from presentation” was ever entirely consonant with reality to begin with.

  • Rigid, brittle, device-dependent layouts and interfaces are all over the place. Relative and fluid techniques and units are almost universally deprecated in favor of static widths and absolute values. Ridiculous, boredom-driven hacks for basic problems proliferate to the extent that they displace existing basic solutions and techniques. People keep re-inventing things like the scroll bar. Badly.

  • Browsing on a mobile device is still a really craptacular experience, and direct attempts to target mobile tech backfire horribly at least as often as they ameliorate the problem.

  • We’re fucking up so badly on the questions of security and distribution of power that the entire project of democracy itself may suffer a major setback. The difficulty of authorship and publication outside of big corporate platforms like Facebook and Google is part of this problem, and some of its roots are in how hard it is for average people to write for the basic web. HTML is partly to blame for this, but I suspect that where we’re really failing is in terms of infrastructure, not the makeup of individual documents.
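For what it’s worth, the “put this text in the middle” complaint above is the one place the CSS people seem to have finally landed a clean answer, in the flexbox module that was stabilizing around the time I wrote this. A sketch, with an invented class name:

```css
/* Center a child both horizontally and vertically inside
   its container — one rule, no absolute-positioning hacks,
   no negative margins, no table-cell impersonations. */
.container {
  display: flex;
  justify-content: center; /* horizontal centering */
  align-items: center;     /* vertical centering */
  height: 100vh;           /* fill the viewport so "middle" means something */
}
```

That it took roughly two decades to get here rather proves the point.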


isolation (from some time in july)

So the wifi sucks and I want to figure out how to do a thing with filesystems.

Do you remember trying to solve a software problem without the Internet? Like really without the Internet?

We used to do this constantly. Like, constantly. A huge portion of my adolescence was composed entirely of confronting things like which brutally arcane permutation of AUTOEXEC.BAT and CONFIG.SYS would allow a game to allocate enough memory to actually start, continue running, and play properly. In a summer like this one 15 or 20 years ago, other people were out learning how to play football, developing social skills, meeting their first girlfriends, having jobs, etc. I was memorizing the AT command set so I could play the shareware versions of first person shooters against the kid down the road.

It’s true that everything was harder then because the tools were limited and the systems agonizingly unstable, but that’s not really the whole story. I mean, some of the software widely used then was delightful in ways that have very rarely been equaled (I think of LIST.COM, the early adventure games, the Mac’s Finder, ResEdit, HyperCard), and the hardware of the day had merits too little appreciated in the present climate. The problem was that when there was something wrong with the system, the resources you had for solving it amounted to the local folklore of computing, a manual if you were lucky, but mostly just the system itself.

I guess that’s still true. It’s just that the system itself has become the entire web and all the people connected to it. The Computer is the Network. Except when the network sucks and there’s something I’d normally google. Then the Computer is a Terminal Window and a Filesystem Full of Crap I Only Sort of Remember.


a bad idea for a novel

(I have these every once in a while, and they are always good reminders that I’m not cut out for fiction.)

Near future. Colorado or similar Western mountain state.

Protagonist is a journeyman hacker type - has a technical day job making some small segment of the economy move. (WRITE WHAT YOU KNOW, YEAH?) Not operating on any cutting edges but basically aware of the state of things. Kind of knows, very imperfectly, what’s on the radar. Decides to go hiking solo for a while in late summer, that time that’s shading into early fall in the mountains.

It’s pretty much a near-future just a couple notches further along the obvious curves than late 2014. The always-on network is ubiquitous anywhere bigger than a few hundred people. Watches are popular again, having largely supplanted phones once designers noticed that they could pack Real Computers into really beautiful form factors that were small enough to lose easily. Cheap, extremely high-resolution, low-power displays are everywhere. Serious direct brain interface tech has been on the market for long enough that stuff with potential is starting to shake out from the novelties & overdesigned dead ends. People walk around wearing the computational equivalent of an early-2000s datacenter, and a vast high-latency-but-high-bandwidth sneakernet has emerged.

Our boy gets into the mountains and turns off his antennae — he knows the bandwidth is going to be terrible, and anyhow he wants to get away from the constant hum of the internet long enough to remember what it felt like before everything was talking to you all at once.

{poetic camping interlude}

Dude heads back to civilization. He doesn’t meet anybody else on the trail coming out of the backcountry, but it’s getting late in the year and he isn’t surprised by this. It’s only when he rolls through the first small town down off the road to the trailhead and nothing is moving at all that he notices he’s been afraid for a while.

Gradually we realize that the singularity or something like it has happened. That the nerd rapture came all at once, in full-on secular apocalypse mode, leaving behind only a remnant population. Everyone caught outside the network during the last handful of days has been marooned in human cognitive space, in human physical form. It’s impossible to say what’s happening in the rest of the world, because the wreckage is terrifying. The dead are everywhere, in their millions, and millions more — most of the remainder — are too scattered, shell-shocked, and unsupplied to survive the winter. And yet: All the obviously dead and the doomed survivors put together would number no more than a tenth of the American population, most of them at the geographical or economic fringes of society. The rest have simply vanished, and it never becomes entirely clear where or how.

There’s vast confusion among the survivors. Most of those who weren’t just off the grid during the last week recognize that the net was a vector for the catastrophe, that some kind of fundamental phase shift seemed to happen all at once in the general mode of communication, that strange idioms and literally incomprehensible conflicts multiplied exponentially in the final hours. Many fall back on conventional religious explanations or political conspiracy.

Everyone is terrified. Everything is horrifying. The human world is broken.

{compelling and somehow believable story as our viewpoint character struggles desperately to find and integrate with a community which stands some chance of survival, juxtaposed with his and others' attempts to make intellectual and moral sense of a fundamentally unassimilable historical moment}

Of course there’s no narrative here. It’s just an ugly daydream I kept having for a while: What if I walked into the mountains for a minute and when I walked back out the bulk of humanity had accidentally bootstrapped itself into minor godhood and fucked off to some techno-spectral plane and left a horror story behind for the survivors?

This isn’t, to be clear, a scenario that I think has a shred of plausibility, or one based on anything I think is a very interesting notion of a posthuman future. It’s just this momentary fascination with what a sufficiently literal “singularity” looks like to more-or-less baseline humans, combined with the idea that if some great phase shift actually were going to spring itself on the species, maybe it’s already a lot further along than you’d think. Frogs in boiling pots, snowflakes crystallizing the surface of a near-frozen pond, that kind of idea.


scattered notes on trust

Google Reader is an example of something we shouldn’t have trusted and should never have designed for.

GMail is the same only vastly more dangerous. (But the ecosystem of mail is still more robust because widely used, so maybe there’s hope?)

Don’t trust power. Don’t trust those with power, and don’t trust what acquiring power itself might do to you or those you care about. Don’t trust your relationship to power.

Cultivate your memory, but don’t trust it. You’ve been lying to yourself since before you knew words.

Centralization is powerful, necessary, and invariably dangerous.

Certain modes of redundancy are valuable. Leave the house with enough layers to get you through the whole day. Leave town with enough gear to get you through sleeping, eating, staying warm, hopping buses, etc. etc.

An individual should be careful of what they don’t own but rely on anyway.

The public:

  • should be wary of what it needs, doesn’t own, and can’t get from a diverse market.

  • should be wary of owning things badly.

Maintaining e-mail is terrible.

  • This hasn’t changed.

  • Is it terrible enough to justify the risk in letting someone else own it in exchange for the conveniences provided by their centralization?

  • Has the centralization of systems like e-mail made them easier to maintain in the general case, or harder?

Is it better to have to trust more entities?

Trust yourself, but don’t trust yourself: Be confident. Be in the world. Put yourself in situations to do hard things and good work. But don’t put yourself where you’ll fail and break what matters just because you think you ought to be strong enough in the moment. The decisions that matter just as often aren’t the ones in the moment. They’re the ones hours and days and weeks before the moment comes.

If you decided not to put on your seatbelt, if you left camp without water, if you wrote code to ignore the error case: You’re going to have a bad time.