2015

IETF GeoJSON

Having busted up my technical/administrative blogjam, I'm going to try to catch up on the news from my little corner of the world. The new IETF GeoJSON working group that I've been incubating finally hatched on October 1. The chairs are Martin Thomson and Erik Wilde and I'm playing the role of lead editor. All but one of the original format spec authors are participating in the working group along with others from the original email list and a bunch of IETF folks. We're going to ship two documents: a GeoJSON format specification (with clarifications and extension guidance) and a document describing a format for a streamable sequence of GeoJSON texts (as I've sketched out in this blog previously).
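If GeoJSON sequences are new to you, the idea is that a producer writes GeoJSON texts one after another so that a consumer can process them incrementally instead of parsing one giant FeatureCollection. Here's a minimal sketch of the producing side; the features are made up and the newline framing is only one possibility, since the exact framing is among the things the working group will settle.

import json

# Two made-up features standing in for a much larger dataset.
features = [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [-105.18, 40.58]},
     "properties": {"name": "Lory State Park"}},
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [-105.15, 40.52]},
     "properties": {"name": "Horsetooth Mountain"}},
]

# Write one GeoJSON text per line. A reader can handle each line as it
# arrives without holding the whole sequence in memory.
with open('/tmp/features.jsons', 'w') as f:
    for feature in features:
        f.write(json.dumps(feature) + '\n')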

The IETF GeoJSON WG will operate out of the existing https://github.com/geojson/draft-geojson repo and use an IETF GeoJSON mailing list https://www.ietf.org/mailman/listinfo/geojson for discussion and announcements. See https://github.com/geojson/draft-geojson/blob/master/CONTRIBUTING.md for all the details about contributing. The WG's draft is https://datatracker.ietf.org/doc/draft-ietf-geojson/ and replaces the old https://datatracker.ietf.org/doc/draft-butler-geojson/ draft that we've been iterating on for a year or so. This is to say that the IETF GeoJSON WG has adopted the draft of the independent GeoJSON working group and that the independent GeoJSON working group has become the IETF GeoJSON WG.

To get to the next revision of draft-ietf-geojson, I'm going through the outstanding issues in the tracker, summarizing them to the discussion list, and trying to reach consensus on whether to add text to the spec, remove text from the spec, or leave it alone. Some of these issues are more contentious than others, such as whether and how GeoJSON should be extended. So far we have consensus that there is extensibility, but no consensus that there is a capital-E Extension Framework.
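To make that distinction concrete, small-e extensibility is the kind of thing shown below: a Feature carrying a member the spec doesn't define (the "title" member here is hypothetical), which consumers that don't understand it can simply ignore. A capital-E Extension Framework would be something more formal, e.g. rules for naming, registering, or negotiating such members.

{
  "type": "Feature",
  "title": "A member the GeoJSON spec does not define",
  "geometry": {"type": "Point", "coordinates": [-105.1, 40.6]},
  "properties": {}
}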

If you've ever felt that the GeoJSON specification wasn't clear enough about something or has fallen out of step with recent technological advances, now is the time to jump in and help improve it.

Migration of my blog

A while back I switched my blog from a database-backed dynamic site to static HTML generated by Sphinx and Tinkerer. This weekend I finished migrating eight years of old posts (with their comments) to the new blog. And I configured my new blog and home page to support HTTPS. Hello, 2015.

You may, when visiting old posts, see mixed-content warnings in your browser because of embedded HTTP resources: images mainly, but also the odd script here and there. I have also not yet implemented the redirect rules that will get you from old-style number/slug URLs to the new YYYY/MM/DD/slug URLs, though I have mapped them all out. Please bear with me, I'll address these issues soon.

Doing this lets me retire my last dedicated server. My blog now runs on a very cheap ($5 per month) virtual server provisioned (Nginx, SSL certs and keys, the works) using Ansible. I build my blog HTML on my laptop and rsync it to the server, also using Ansible.
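For the curious, the publish step boils down to something like the two commands below; the paths and host are illustrative, and in my setup the rsync call is wrapped in an Ansible play.

$ tinker --build
$ rsync -avz blog/html/ deploy@example.com:/var/www/blog/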

I'm pretty sure this blog migration is the final chapter of my Zope story. The following is very likely my final Zope external method, ever.

import json


def dumper(context):
    """Dump Python blog posts to JSON"""
    for result in context.searchResults(
            meta_type='COREBlog Entry', sort_on='created',
            sort_order='descending'):
        post = result.getObject()

        doc = {}
        doc['id'] = post.id
        doc['slug'] = post.entry_slug()
        doc['title'] = post.title_or_id()
        doc['body'] = post.body
        doc['categories'] = post.categories()
        doc['published'] = post.published()
        doc['html'] = (post.format == 2)

        coms = []
        for comment in post.comment_list():
            com = {}
            com['title'] = comment.title
            com['author'] = comment.author
            com['url'] = comment.url
            com['body'] = comment.body
            coms.append(com)
        doc['comments'] = coms

        # Reduce the ISO publication timestamp to a YYYYMMDD prefix
        # for the output filename.
        date = post.published().split('T')[0]
        date = date.replace('-', '')

        outfilename = '/tmp/blog/{0}-{1}-{2}.json'.format(
            date, post.id, post.entry_slug())

        with open(outfilename, 'w') as f:
            json.dump(doc, f, indent=2)

Black Squirrel Half Marathon

Saturday was the 3rd running of the Black Squirrel Half Marathon. It's a roughly triangular route up and over the top of Lory State Park and back along the East Valley Trail.

I finished 97th out of 312 in 2:23:24. This is 7 minutes slower than my trail half on Taylor Mountain outside Issaquah, WA in June. I had a cold and hadn't slept well in a few days, so I wasn't at my best. Between the last two aid stations I was going backwards, but rallied enough over the last 2.5 miles to move up 3 places. Although I didn't do as well as I'd hoped, I enjoyed the race and finished with nothing left in my tank and no regrets. High points: hanging out with my friend and former ultimate teammate Jeanne (who finished 3rd in the women's masters category), not falling on any rocks, shady pine groves on the West Ridge Trail, and a great view of Longs Peak from the top.

Ruth battled back from a slow start to finish in 2:35:57. She said she passed about 50 runners on the singletrack going up and that the intermittent speed bursts and lack of consistent pace were exhausting. Aaron Anderson, who works in a lab down the hall from Ruth, was the winner in 1:30:02. This was 8 minutes faster than the winning time in Issaquah!

It's humbling to be a mere mortal among actual elite athletes, but it's also completely okay. See also Jacob Kaplan-Moss's great PyCon keynote on being average in running and programming.

Here's a photo of the sun rising over the plains to the east before the start. I'm not a morning person at all, but I love the light and stillness of dawn.

https://farm6.staticflickr.com/5726/21198158526_d28c213cff_b_d.jpg

Lory State Park at dawn

Crazy Legs Trail Run

https://farm8.staticflickr.com/7694/17899867135_d1a98948a3_c_d.jpg

Sunday morning Ruth and I got up again at dawn to race, this time at Larimer County's Devil's Backbone Open Space west of Loveland. This 10+ kilometer trail run, organized for the last 8 years by Paul Stoyko, reminded me very much of the ultimate (frisbee) tournaments I played in the olden days: low key, low tech, high enthusiasm. It was an out-and-back route (map below), taking the left-hand side of the Wild, Hunter, and Laughing Horse loops along the way. The final loop (at the top of the hills in the photo above) was pretty tough: 500 feet above the start and lots of ups and downs over fractured slickrock ledges.

I finished 24th out of 96 with a time of 1:05:10. Ruth finished a few minutes after me in 31st place. Here we are holding the popsicle sticks we grabbed at the finish line. Old school!

https://farm8.staticflickr.com/7745/17900375501_e2ea41e7c5_c_d.jpg

I've driven by Devil's Backbone many times but had never been to the trailhead or up the trail before. It's beautiful and wild(ish) and the trail network extends all the way to Horsetooth Mountain Park. Foothill wildflowers are starting to kick off right now and there were blue Penstemon (P. virens) and Britton's Skullcap all along the trail.

Thanks for putting this race together, Paul. We'll be back.

Running

Sunday, May 3rd, I completed my first ever half marathon. The Colorado Marathon starts up the Poudre (pronounced "Poo-der") Canyon and follows the Cache La Poudre River into the Old Town of Fort Collins. State Highway 14 was closed upstream of Ted's Place (intersection with US 287) during the run and we rode chartered buses from the City bus terminal to the starting line before dawn.

My time: 1:52:41. 13 minutes faster than my final 13 mile training run! I think I may have become a runner. I'm hardly sore at all today and am looking forward to splashing down the trail again tomorrow.

Tracking my training with a mobile app has been surprisingly fun. For no reason other than that it's a Mapbox customer, I've been using Runkeeper. I like it. The charts make sense, the data export works, and it is perfectly adequate for tracking mileage. I can't say how useful it is for real sports physiology. Since February 15, when I started training for the race, I've run 215 miles.

Next race: a 10k on the Devil's Backbone Trail outside Loveland, May 17. I'm looking forward to more rocks and dirt and less pavement.

Fiona, Rasterio, Shapely binary wheels for OS X

Numpy and SciPy binaries for OS X have been up on PyPI for a few months and I've recently figured out how to do the same for Fiona, Rasterio, and Shapely. As the SciPy developers do, I've used delocate-wheel (see its README, and the example command after the list) to:

  • find dynamic libraries imported from python extensions
  • copy needed dynamic libraries to directory within package
  • update OSX install_names and rpath to cause code to load from copies of libraries
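
In practice that's a single command per wheel, something like the following (the wheel filename here is shortened for readability):

$ delocate-wheel -w fixed_wheels rasterio-0.17.1-cp27-none-macosx_10_6_intel.whl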

The new Fiona and Rasterio binaries are beefy (14MB) because they include the non-standard libraries that enable format translation, cartographic projection, and computational geometry operations:

$ delocate-listdeps ~/code/frs-wheel-builds/dist/rasterio-0.17.1-cp27-none-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
@loader_path/.dylibs/libgdal.1.dylib
@loader_path/libgeos-3.4.2.dylib
@loader_path/libgeos_c.1.dylib
@loader_path/libjasper.1.0.0.dylib
@loader_path/libjson-c.2.dylib
@loader_path/libproj.0.dylib

For the small price of a larger download, Mac users now get batteries-included binaries that work immediately. No Xcode required. Just pip install rasterio and start using it.
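If you want a quick smoke test after installing, importing the package and printing its version is enough to confirm that the bundled libraries load:

$ pip install rasterio
$ python -c "import rasterio; print(rasterio.__version__)"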

The new binaries are built on 10.9 using Python 2.7.9 and 3.4.2 downloaded from python.org. These Pythons were compiled using the 10.6 SDK for both i386 and x86_64 architectures, and I've similarly set MACOSX_DEPLOYMENT_TARGET=10.6 and -arch i386 -arch x86_64 in my own builds. In practice the wheels are intended for 10.9 and 10.10, but will probably work on 10.7 and 10.8. They should work with just about any OS X Python, whether from the system, Homebrew, MacPorts, or python.org.
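For anyone building wheels of their own with matching settings, the environment looks roughly like this (a sketch, not the exact invocation from my build scripts):

$ export MACOSX_DEPLOYMENT_TARGET=10.6
$ export ARCHFLAGS="-arch i386 -arch x86_64"
$ pip wheel rasterio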

If you'd rather continue to compile Rasterio's modules (for example) against your own GDAL installation, you've got an out in pip's --no-use-wheel option:

$ GDAL_CONFIG=/path/to/gdal-config pip install --no-use-wheel rasterio

To contribute to development of these binaries or report installation bugs, please head over to https://github.com/sgillies/frs-wheel-builds. Most importantly, help me spread the word that installation of Fiona, Rasterio, and Shapely on OS X is easier than ever.