2009

Adieu, 2009

I've never had a year like 2009. My family and I moved to Montpellier, France, to profit (as they say here) from my wife's first sabbatical from teaching at Colorado State. We're at the halfway point of our séjour, glad we came, and looking forward to even more hilarious thrills and spills in 2010. Not only am I fortunate to have such an adventurous family, I am extremely fortunate in these times to be able to take my job with me, and to work for and with such enlightened people. I've travelled in France before, but I had never made my home outside the Rocky Mountains (Utah and Colorado, specifically). I've lived nowhere else in the United States, let alone Europe, so the experience is mind expanding. In my personal life, I can't think of anyone who has impressed me more in the past year than my oldest daughter, whom we tossed into the deep end of French école maternelle (preschool) and who is making friends and doing fine, and my wife, who has already lapped me in French fluency and wills us all through various opaque bureaucracies. Every time I went to the Préfecture or the OFII office, I thought about the people who run the immigration gauntlet without the security of a home to return to, with little savings, often as a single parent. Humbling.

The past decade? So much good for me, personally, and I've witnessed a lot of inspiring work by others, but for our civilization and the world ... well, it largely stunk, didn't it? A punch in the eye from terrorists in the beginning, an insane stampede into a senseless war, and a kick in the teeth from bankers at the end. All to the soundtrack of the giant sucking sound of our liberties going down the "war on terror" drain. And that's just in the US. Good riddance.

WMS and URI addressability

Recently I commented that the saving grace of the OGC's Web Map Service protocol is that one can use URL rewriting or other tricks to hide parameters like service, request, and version – junk, as in "junk DNA", to those of us who will never in practice rely on protocol version negotiation – and expose to users not a service endpoint but an infinite space of map images addressed by URLs. For example, this request:

GET /geoserver/wms?service=WMS&request=GetMap&version=1.1.1
    &format=image/png&width=800&height=600&srs=EPSG:4326
    &layers=states&styles=population&bbox=-180,0,0,90
Host: example.com

Can be replaced by:

GET /maps?format=image/png&width=800&height=600&srs=EPSG:4326
    &layers=states&styles=population&bbox=-180,0,0,90
Host: example.com

Or even, using a specialized host, by:

GET /EPSG/4326/states_population.png?bbox=-180,0,0,90&size=800,400
Host: maps.example.com

It is the constraint of HTTP called URI addressability that makes this possible. URI addressability means that "a URI alone is sufficient for an agent to carry out a particular type of interaction" [TAG Finding 21 March 2004]. Rewritability isn't the only nice property of resources that are addressable by URIs; it's also then possible to bookmark resources, share links to them, cache them effectively, and migrate users to new locations.

More precisely, the saving grace of WMS is that its protocol parameterization doesn't obstruct URI addressability.
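
To make the rewriting idea concrete, here's a minimal sketch of the mapping as a Python function. It isn't how you'd deploy it (you'd more likely use mod_rewrite or a proxy in front of the WMS), and the path scheme and backend location are just the hypothetical ones from the examples above:

# Sketch only: turns a clean map URL from the last example into the
# equivalent WMS 1.1.1 GetMap request against a hypothetical backend.
from urllib.parse import parse_qs, urlencode

WMS_BACKEND = "http://example.com/geoserver/wms"  # assumed backend endpoint

def rewrite(path, query):
    # /EPSG/{code}/{layer}_{style}.{ext} -> srs, layer, style, format
    _, _, code, name = path.split("/")
    stem, ext = name.rsplit(".", 1)
    layer, style = stem.rsplit("_", 1)
    params = parse_qs(query)
    width, height = params["size"][0].split(",")
    return WMS_BACKEND + "?" + urlencode({
        "service": "WMS", "request": "GetMap", "version": "1.1.1",
        "format": "image/" + ext, "srs": "EPSG:" + code,
        "layers": layer, "styles": style,
        "bbox": params["bbox"][0], "width": width, "height": height,
    })

print(rewrite("/EPSG/4326/states_population.png", "bbox=-180,0,0,90&size=800,400"))

Every resource in the clean URL space stays URI addressable; only the junk parameters disappear from view.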

Comments

Re: WMS and URI Addressability

Author: Matt Priour

This is precisely what I would favor. Build good 'geo' server technologies and then use HTTP protocols & URI handlers to provide interaction by REST & OGC Standards techniques.

One could go even further with your example and have the root URI of the resource respond with metadata and provide the required parameters for gridded (server cacheable) tiles, defaulting to standard tile size, grid bbox & resolutions.

In this way you could have any reference system, area, tile size, etc. referenced using an X Y Z URI construction that is popular in the commercial web mapping APIs, and so easily used in OpenLayers and other mapping clients as well.

ArcGIS Server is the only server that I know of that is currently implementing something like this. It would be relatively simple to implement a service wrapper for any of the other OGC-compliant servers that would allow that kind of interaction.

While I like Atom RSS type interactions for feature reporting, I think URI addressing is a much better method for any dynamic, arbitrarily styled and generated image resource.

La volaille

Adam Gopnik, at the end of the second essay ("The Strike") in his collection Paris to the Moon, writes:

The turkey, not quite incidentally, was so much better than any other turkey I have ever eaten that it might have been an entirely different kind of bird.

We (by which I mean my family and I) are having similar experiences with la volaille. On the morning of Christmas Eve, I went down to Les Arceaux to pick up the dinde de Noël my wife had ordered the week before from a local producer. A flock of birds, 30 or so, was waiting behind the plexiglass counter of the trailer, each marked with a weight, price, and name (one of them mine). I didn't see a bird over 5 kilos. Mine, at 3 kg, would be considered smallish, but not tiny. The producer asked me if I wanted him to finish the turkey; I could have brought it home with feet, head, and lungs attached if I'd desired. I did not. He whacked off the extremities, removed the organs, and replaced the ones normally considered edible. I paid him and brought home a fowl that, if not overcooked (it was, slightly), could have been the best we'd ever eaten.

We've been getting chicken from the same producer: whole birds and wings, plump meaty wings from well-exercised birds, driven up into the trees several times each day, perhaps, by a well-trained French farm dog. My wife is permanently astounded at the smell of the chicken, or rather the lack of smell. Sushi-grade poulet, this is. Even supermarket chicken seems years fresher than its American counterpart. We don't completely understand the factors. Is there a shorter supply chain? We can buy directly from a producer, but I get the sense that mass market birds also spend less time in carcass state around here. A very good thing. Unconsolidated industry? Geography? We'll look into it.

One thing for certain is that the French still, in the 21st century, expect their ingredients to be fresh and their meals to be well made. Yesterday I rode in the Font-Romeu télécabine with a pair of 20-something French dudes and eavesdropped on their conversation about food. One of them was wearing one of those ridiculous rainbow poly fleece dreadlock ski hats, not something I'd expect to see on the head of a gourmet. "What's for dinner?" "Cuisse de canard (duck leg, roasted)." "With what?" "Potatoes." "Boiled? I'd prefer a gratin." "That's a little heavy to go with duck, in my opinion ..." They then veered into a reminiscence of favorite meals past. This wasn't the first time I'd experienced this; it's a bit like living in the Food Channel here.

Comments

Re: La volaille

Author: Jerome

Hey, talking about food while eating sure is French.

Chickens you can buy from the farm right next door taste a lot better than the ones you can buy in the supermarket (even though there are many quality grades: poulet portion, poulet jaune, poulet de bresse ... yours may be "poulet de basse cour").

The breed, the food they eat, and their growth rate are the main parameters. On one hand you have the industrialized, über-selected breeds, fed with low-cost, efficient food to get the standard chicken "portion" as fast as possible. On the other hand are the old kinds, fed with grains and killed "when they look fat enough".

As for almost everything, "the slower the better" :/

Least power

Volker Mische on the future of GeoCouch:

You will lose functionality like reprojection. The spatial index won't know anything about projections. So GeoCouch won't be projection aware anymore, but your application still can be. For example if you want to return your data in a different projection than it was stored, you do the transformation after you've queried GeoCouch.

You would also lose fancy things for geometries, like boolean operations on them. But this is something I'd call complex analytics, and not simple querying.

GeoCouch would only support three simple queries: bounding search, polygon search and radius/distance search. If the search would be within a union of polygons, let's say all countries of the European Union, you would simply make the union operation before you query GeoCouch.

To me, this design acknowledges something like a Rule of Least Power for GIS interfaces:

When designing computer systems, one is often faced with a choice between using a more or less powerful language for publishing information, for expressing constraints, or for solving some problem. This finding explores tradeoffs relating the choice of language to reusability of information. The "Rule of Least Power" suggests choosing the least powerful language suitable for a given purpose.
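
A rough sketch of the division of labor Volker describes, with Shapely doing the application-side work and a hypothetical bbox_query function standing in for GeoCouch's simple bounding search (none of this is GeoCouch's actual API):

# Sketch: the server answers only the least powerful query (a bounding
# box); the union and the exact containment test stay in the application.
from shapely.geometry import shape
from shapely.ops import unary_union
from shapely.prepared import prep

def features_in_union(countries, bbox_query):
    # "Complex analytics": union the EU countries before querying.
    union = unary_union([shape(c["geometry"]) for c in countries])
    # Simple querying: ask the store for candidates in the union's bbox.
    candidates = bbox_query(union.bounds)  # (minx, miny, maxx, maxy)
    # Back in the application: keep only the exact hits.
    prepared = prep(union)
    return [f for f in candidates if prepared.contains(shape(f["geometry"]))]

Reprojection, if you needed it, would happen at this stage too, exactly as Volker suggests.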

Comments

Re: Least power

Author: Kirk Kuykendall

It sure seems like domain specific languages are supposed to provide a least power approach too. I wonder why spatial DSLs haven't been developed.

Re: Least power

Author: Michael Weisman

Perhaps I'm missing something (I am far from an expert on CouchDB), but given how easy it is to query very large datasets in a distributed environment with map reduce functions, and how there really is nothing else that I am aware of for doing distributed GIS (the best tool I have found was Hadoop streaming with Python scripts), I think there is a good case for PostGIS-caliber geometry functions in CouchDB.

Re: Least power

Author: Sean

JEQL and CQL come to mind. MapServer has a spatial filter language too.

Last market before Christmas

Never before have I had the chance to buy a fresh truffle at my local farmer's market, and I seized that opportunity this morning.

http://farm5.static.flickr.com/4044/4196825461_dfa9cafa27_d.jpg

I'm told that the season for Tuber melanosporum is just getting started. This one was sold to me by folks who specialize in beautiful dry-cured hams and is from the nearby oak woods. I actually smelled them before I saw them, and the scent is everywhere around us: the car, the kitchen, the pantry. We wasted no time in shaving off a good quantity to use in omelettes made of farm-fresh organic eggs.

http://farm3.static.flickr.com/2591/4196825841_71268acf12_d.jpg

The rest of that nugget is destined to be tossed with roasted roots and bulbs: sweet potato, céleri-rave (celeriac), rutabaga, and fenouil (fennel) for a Christmas dinner side dish.

It was frosty this morning at the market, but it was just as busy as ever. Freezing in open-air shopping solidarity, with the shared mission of acquiring fresh ingredients for holiday dinners, seemed to draw people together. I had more interaction with other market-goers than on any other trip.

Comments

Re: Last market before Christmas

Author: Pierre GIRAUD

May I suggest you put the rest in an airtight jar with some fresh eggs? Omelettes made with those eggs will taste of the truffe as if they were cooked with it.

Re: Last market before Christmas

Author: Sean

Yes, I've read about that practice, and am curious to see how well it works.

Re: Last market before Christmas

Author: atanas entchev

Buying a fresh truffle at a farmer's market is on my bucket list. You are fortunate to have done it.

Re: Last market before Christmas

Author: Jason E. Rist

You have no idea how jealous I am.

Sweeping your front door

Good advice:

Finding #8: Starting by sweeping your front door.

Before you agonize about how RESTful your back-end management protocol is, how about you make sure that your management application (the user front-end) is a decent Web application? One with cool URIs, where the back button works, where bookmarks work, where the data is not hidden in some over-encompassing Flash/Silverlight thingy. Just saying.

William Vambenepe's article is as good as advertised (via Tim Bray).

Simple and reusable spatial queries

Quoting James Fee:

First we’ve got new functionality in the Google Maps Data API. First off you can now perform geospatial and attribution queries on data stored on Google’s MyMap. Now of course this isn’t paleo-type spatial queries, just simple stuff that solve 80% of all queries you’d need to complete. Simple web apps need not fancy complicated APIs and clearly Google is the master of this. So upload your data into Google’s My Maps and then query it to display on a Google Maps application. Simple and sweet.

This is also sweet from an engineering perspective. Yes, simple bounding box or radius searches are "only" "80-20 rule" features, but at the same time they are features that power users are going to use almost 100 percent of the time for a lot of tasks. In other words: highly reusable features. Let's say you want to select the features of a collection that are contained within another feature. You could build up a result set by iterating over every feature in the collection, no matter where it is, and evaluating containment against the filtering feature for every single one of them. Better would be to first eliminate candidate features that aren't even in the same ballpark as the filtering feature by using an indexed bounding box search. This is the default mode of operation for newer PostGIS functions like ST_Contains, for example:

This function call will automatically include a bounding box comparison that will make use of any indexes that are available on the geometries. To avoid index use, use the function _ST_Contains.

(By the way, nice new docs there, PostGIS people!) Simple indexed search is the initial boost to more interesting queries. Indexed operations are best provided by the map data collections themselves. Higher order processing, some of which can be done in parallel, can be left to other resources or services. This is, of course (except for the parallel considerations), the premise of the OGC's Web Processing Service, but taken to an extreme where a WFS provides only bounding box and attribute filtering and the WPS does everything else. A familiar pattern, but I'd be surprised if a future Google spatial data processing platform wasn't as different from OGC WPS as the maps data API is from WFS.
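
To anchor the earlier point about indexed pre-filtering, here is the same two-pass idea in a few lines of Python, with Shapely's STRtree standing in for the index that PostGIS consults (assuming Shapely 2.x, where query returns candidate indices; an illustration, not PostGIS internals):

# Cheap pass: the spatial index returns features whose bounding boxes
# intersect the filter. Exact pass: true containment, survivors only.
from shapely.geometry import shape
from shapely.strtree import STRtree

def contained_features(filter_geom, features):
    geoms = [shape(f["geometry"]) for f in features]
    tree = STRtree(geoms)
    candidates = tree.query(filter_geom)  # indices of bounding-box hits
    return [features[i] for i in candidates if filter_geom.contains(geoms[i])]

Most of the collection never gets the expensive test, which is where the speedup comes from.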

OpenLayers.Format.Atom

Right on the heels of this announcement by a company dipping its toe into queryable geographic feature services comes word that my patch for an Atom format has been accepted into OpenLayers. OL features can be serialized to Atom + GeoRSS for posting to an AtomPub collection, and fetched feeds with GeoRSS-annotated entries can be deserialized into OL feature collections, with a large degree of conformance to RFC 4287. Thanks, Tim.

It'll be interesting to see if this takes off and in which direction: position as extension or payload?
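
For the "extension" flavor, a GeoRSS-Simple point simply rides alongside the usual Atom elements. Here's a little illustrative Python (standard library only, not the OpenLayers code) that builds such an entry; the id, title, and coordinates are made up:

# Builds an Atom entry whose position travels as a georss:point
# extension element ("lat lon"), rather than as geometry in the payload.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"
ET.register_namespace("", ATOM)
ET.register_namespace("georss", GEORSS)

entry = ET.Element("{%s}entry" % ATOM)
ET.SubElement(entry, "{%s}id" % ATOM).text = "urn:uuid:1225c695-cfb8-4ebb-aaaa-80da344efa6a"
ET.SubElement(entry, "{%s}title" % ATOM).text = "A feature"
ET.SubElement(entry, "{%s}updated" % ATOM).text = "2009-12-17T00:00:00Z"
ET.SubElement(entry, "{%s}point" % GEORSS).text = "43.61 3.88"

print(ET.tostring(entry, encoding="unicode"))

The payload option would instead carry the geometry inside the entry's content, which generic feed readers would ignore.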

DDOS on climate science?

Ed Parsons has been beating the drum for open climate data. I like open data, but it's not without its own problems. A potential problem for science, and scientific consensus, in a brave new world where we are all now climate scientists, is the ramping up of the social denial-of-service attacks identified by Steve Easterbrook:

But in reality, the denialists don’t care about the science at all; their aim is a PR campaign to sow doubt in the minds of the general public. In the process, they effect a denial-of-service attack on the scientists – the scientists can’t get on with doing their science because their time is taken up responding to frivolous queries (and criticisms) about specific features of the data. And their failure to respond to each and every such query will be trumpeted as an admission that an alleged error is indeed an error. In such an environment, it is perfectly rational not to release data and code – it’s better to pull up the drawbridge and get on with the drudgery of real science in private. That way the only attacks are complaints about lack of openness. Such complaints are bothersome, but much better than the alternative.

In this case, because the science is vitally important for all of us, it’s actually in the public interest that climate scientists be allowed to withhold their data. Which is really a tragic state of affairs. The forces of anti-science have a lot to answer for.

Joe Gregorio has this social denial of service thing nailed:

Let's go back to electronic denial-of-service attacks. They worked because of an inherent asymmetry between the attacker and the attacked. [i.e. from earlier in Gregorio's post: The attacker performs very little computation to send the packets, but the server has to accept them and perform some computation to determine if they are valid or bogus. In this way an attacker with the same or less computational power can overwhelm a bigger host.] The same is true of the social denial-of-service attack where arguments, responses, rebuttals and more importantly time has to be spent responding to the bad faith objections, which are easily written up and tossed onto the mailing list.

Denial of service on climate science was bad enough before the leaked emails; now scientists have to read the emails, parse them, and explain how they don't falsify the science in every public forum and every media outlet. Next, add to the mix climate data and models. What happens when some blog or cable TV gasbag complains that not only do the model results of scientists not match his interpretation of the data, but that he couldn't even get the model to run on his computer, no matter how hard he tried, and that the code itself might be fraudulent? That's not going to be a victory for transparency.

Perhaps we need to match open climate data and models with a change in the rules of our climate debate. Gregorio explains the rules used by the IETF:

Remember that one way to fight a denial of service attack is to raise the amount of computation required by the attacker. In the case of a Working Group the way to do that is by requiring disruptions to take more time and energy. This is where the call for "camera ready copy in the form of a Pace" comes from in the AtomPub WG. Camera ready copy is much more difficult to write than a one or two line objection tossed into a mailing list. Only if you are willing to put in the work to write up a Pace with reasonable text will it start to take up the time of the WG. Your willingness to put in the time and effort to create camera ready copy will distinguish your proposals and objections from those of an attacker.

Similarly, in the climate debate, we could demand that denialists publish their arguments and supporting evidence in peer-reviewed journals. (Note that I'm distinguishing denialists from the skeptics who already do publish in peer-reviewed journals.) Does it risk giving them unwarranted credibility? Maybe, but I think that it's balanced by increased cost. Even low-cost electronic journals completely stacked with friendly reviewers will help level the asymmetry that makes a DOS attack possible. Forcing the denialists to read and personally sign off on the work of others, or even just keeping them occupied correcting each other's grammar and spelling, would be a good start.

We could demand this, and by "we" I mostly mean our media, but that would require our media to transform itself into something that infotains us a little less and edifies us a little more, and that's probably too much to ask, yeah? I don't have an answer, but it's interesting to look at some aspects of the climate debate as a denial of service attack, and I didn't see that perspective come up in any of the many comments on Ed's blog. I also recommend Bryan Lawrence's post on this topic. He doesn't use the word "attack", but certainly expresses some frustration at the extra load put on climate scientists in these times.

Comments

Re: DDOS on climate science?

Author: Paul Bissett

Sean,

I agree w/ many of your arguments. However, it is not as easy as you suggest.

1) There is a difference between those who deny that warming has occurred since the onset of the industrial revolution and those who question the relative impact of anthropogenic activities on the background natural climate variability. These two groups often get lumped together, but they are very distinct. Propagandists tend to be in the first category (warming deniers); credible scientists tend to be in the other (anthropogenic questioners).

2) Having worked in the field of predictive oceanic modeling for nearly 20 years, published in peer-reviewed journals, and guest edited peer-reviewed journals, I can tell you the science of modeling depends heavily on the assumptions of the model, the mathematical equations used to approximate the physical environment, the tuning parameters of those equations, the data input to the models, the validation data used to verify the models, and the computational horsepower to run those models. The only people qualified to run those models are those in climate research centers. These models require huge computers, large staffs, and millions of dollars of infrastructure support.

The only way to create a critical review of the models predicting anthropogenic impacts is to fund a separate effort to develop and tune the models differently, to see if alternative theories could explain the observations. In practice this is almost never done, because peer-reviewed research is subject to peer-reviewed funding. The bigger the project, the more group support you need to get your project funded. This tends to create positive feedback in the scientific community; a noted flaw, but, like democracy, it is the best system compared to everything else.

Good observations eventually rule. Part of the current debate lost in the noise is that the last decade has been marked by "unexplained" cooling since 1998. This is unexplained only in terms of how the models were previously being forced, which just goes to say the models were not quite right, and they don't quite know why.

Modelers step in when observations are too sparse or limited to definitively make a case. Climate researchers do not have the equivalent of the Large Hadron Collider (a true shame). If they did, the scientific debate would be a lot easier.

Re: DDOS on climate science?

Author: Kirk Kuykendall

So maybe Al Gore just didn't want to distract Dr. Maslowski from his work ...

http://www.timesonline.co.uk/tol/news/environment/copenhagen/article6956783.ece

Re: DDOS on climate science?

Author: Paul Bissett

Kirk,

That's the problem on both sides: rhetoric rather than facts. NASA measurements show a 7.8% increase in seasonal ice cover since the low stand in 2007.

http://www.nasa.gov/topics/earth/features/seaicemin09.html

Let the scientists do their jobs, fund the research adequately, and quit politicizing the science. The facts will surface, but they will take time and good measurements.

Re: DDOS on climate science?

Author: Kirk Kuykendall

Paul,

Yes, it will take time, but I don't think the Navy is waiting. I recall hearing about a classified doc produced by the Navy several years ago discussing where to move ports in response to rising sea levels. At the same time, other parts of the gov't were saying there just isn't enough evidence to take action. Maybe the Navy's models are classified. (Maslowski, incidentally, works for the Naval Postgraduate School.)

Re: DDOS on climate science?

Author: Jeff Thurston

Interesting points.

Having managed research in a University for a long time I offer the following.

1) Most research (esp. this type) is conducted by teams of people. The days of lone scientists completing work like this are few and far between. The denial-of-service attacks would have to cover huge areas and large numbers of people.

2) Most universities expect scientists to do three things: a) teach, b) research, and c) community work. The last item is the one that gets the shortest shrift, yet it is the one needing the most attention - for the situations you describe.

There needs to be more people explaining good science to everyday people in terms they understand, and to be doing it continually. Informed people can make better judgements.

What I find interesting is that few institutions actually sit down with the media to develop these kinds of relationships.

Re: DDOS on climate science?

Author: Sean

Thanks for the comments. I'm very sympathetic to scientists who feel that global warming is overhyped and studies of it funded beyond reasonable levels. As an undergrad, I worked in a molecular biology lab under a professor who argued that the Human Genome Project was going to take more than its fair share of the pie and sideline other important work. I studied under some of the prominent skeptics as an atmospheric science grad student. I admire people who'll take an unpopular stand when necessary. I don't admire those who'll whip up the anti-intellectual segment of our societies into a DDOS on consensus.

Paul, I'm with you on observations, but there are some things I'm not willing to risk losing forever while we wait for absolute certainty. Personally, I'm a bit more concerned about direct damage to ecosystems and landscapes (over-fishing, deforestation, mountaintop removal) and the scientists studying these human impacts are just as vulnerable to consensus-jamming.

Re: DDOS on climate science?

Author: Kirk Kuykendall

I forgot to point out how the Navy is highly experienced in countering jamming efforts. (For some interesting history, read this story about the birth of spread spectrum.) In addition to rising sea levels, the Navy is also preparing for an ice free arctic. I expect renewed interest in Alfred Mahan.

Re: DDOS on climate science?

Author: Paul Bissett

Sean,

I think we're aligned in our concerns. Part of the reason for starting WeoGeo was an attempt to open up critical geospatial information that was locked in the silos of individual organizations. Kinda "Think globally, act locally" w/ respect to geo-content.

It has taken me away from direct science endeavors (which I miss), but I am hoping that by enabling easy access to quality measurements, our contributions will have an equally lasting impact.

Keep swinging at the blowhards. I got your back...

Re: DDOS on climate science?

Author: Bill

Sean, the points you raise have merit to be sure. But science is all about the examination of data. To deny the data to other scientists is to deny that science is being done. Denying access to the data is unforgivable in my view.

Second, you downplay the legitimate concerns of the people trying to get access. There is no question that there have been many serious errors in, for example, the hockey stick studies (data sets mislocated, data sets misrepresented, data sets used in the opposite sense of the original authors). We all make mistakes. But scientists go back and look at what they did, and at least do not repeat the same mistakes in the next paper. This is demonstrably not the case with Mann and his studies. That no one questioned follow on papers when these errors were publicly known is strong evidence of the corruption of the peer review process. The continued use of stripbark proxies when nearly everyone on the planet except these few agree that they are misleading is another example.

Third, while I agree that the world has warmed in the last ~130 years, the question of the rate of warming is a legitimate question at this point. As people begin to look at what parts of the record are available they are discovering that the adjustments are questionable. This has occurred in Australia, New Zealand, Siberia, Alaska, and Norway to mention just a few from the past couple of weeks. It may be the adjustments are appropriate, but in some cases it would be hard to see how this could be so (Darwin for example). And ultimately, it is the rate of warming that is most important. It is the rate(s) that will tell if we have serious climate issues or not. So it is vital we get it right.

Re: DDOS on climate science?

Author: Sean

Bill, you've got me wrong: I'm in favor of open data. As to the discoveries you say have been made in the past couple of weeks: let's get them written up and submitted to a peer-reviewed journal.

Montpellier street art

There is some inspired street art in Montpellier, the likes of which I've never seen in the uptight places I've previously called home. The works (by Mear One and Asylm, I learned today) on the exterior of the Gymnase Alain Achille caught my eye on my first tram ride to the centre-ville and today I finally took the time to hop off at the Stade Philippidès tram stop and take some photos. The first 4 are of the east-facing wall (click for higher-resolution images):

http://farm5.static.flickr.com/4044/4178208793_026a922b1a_d.jpg http://farm3.static.flickr.com/2773/4178210913_083a81c13b_d.jpg http://farm3.static.flickr.com/2789/4178971168_20e653c0e0_d.jpg http://farm3.static.flickr.com/2753/4178207797_8792b159f1_d.jpg

The north-facing wall has some interesting art too, either unsigned or having lost its signature:

http://farm3.static.flickr.com/2593/4178200955_bbe8d3b83c_d.jpg http://farm3.static.flickr.com/2738/4178203041_29fa140e0b_d.jpg http://farm3.static.flickr.com/2682/4178203949_41073d6459_d.jpg

A little research turned up documentation by Sophie. of this amazing work on a gymnase in Boutonnet. Wow! Must find this one. Mear One, again, I'm guessing, and also guessing that the humping deer weren't part of the original. The gym on Avenue Rimbaud, in the direction of Celleneuve, has another good one (street view) that I'm hoping to be able to post photos of soon.

http://farm1.static.flickr.com/83/231824309_1c40630e8f_d.jpg

Sophie.'s Montpellier set also has some good shots of "Jonny Style" on the streets of Montpellier, like this on Rue du Père Soulas:

http://farm1.static.flickr.com/72/229510818_75d1124019_d.jpg