2007 (old posts, page 17)

Resource-Oriented WFS: Filters

Update: I made the example a little more clear by pulling the filter out of the wfs:Query.

In a previous post, I wrote that a more RESTful WFS should never accept POST as the HTTP method for a query. Well then, what to do about huge filters? IE limits your GET URI to about 2K characters, and Apache by default handles only up to 8K. A WFS filter containing literal geometries (like 1:250K boundaries of Norway -- potentially megabytes of fjords) could be many times larger than these limits. So should we POST them, even though it would be a shame to have to resend the same filter with every request?

No. Posting a query destroys the uniform interface, and should only be done if there is no other option. In this case, there is another option, and a fine one: implement filter resources subordinate to feature type resources. The filtered type could then be accessed in the same manner as the feature type itself. The URI:

http://gis.example.com/places

is a feature type, and:

http://gis.example.com/places/filters/21f10d70b724f5215cb183d96f27053e

would be a subordinate filtered type. You'll recognize that this approach is not very different in concept from a Technorati watch list or a Google custom search engine.
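A sketch of how a server might mint such filter URIs. The hash-like path segment in the example suggests a digest of the POSTed filter document, though that detail, the host name, and the paths are my assumptions, not part of any spec:

```python
import hashlib

# Hypothetical: the feature type resource under which filters live.
FEATURE_TYPE_URI = "http://gis.example.com/places"

def mint_filter_uri(filter_xml):
    """Derive a stable, subordinate URI for a POSTed filter document
    by digesting its bytes."""
    digest = hashlib.md5(filter_xml.encode("utf-8")).hexdigest()
    return "%s/filters/%s" % (FEATURE_TYPE_URI, digest)

filter_xml = """<ogc:Filter xmlns:ogc="http://www.opengis.net/ogc">
  <ogc:PropertyIsEqualTo>
    <ogc:PropertyName>type</ogc:PropertyName>
    <ogc:Literal>High School</ogc:Literal>
  </ogc:PropertyIsEqualTo>
</ogc:Filter>"""

uri = mint_filter_uri(filter_xml)
# The same filter document always maps to the same URI, so GETs on the
# filtered type are cacheable and bookmarkable like any other resource.
assert uri == mint_filter_uri(filter_xml)
```

A digest-derived URI also means a client re-POSTing an identical filter gets back the same resource instead of creating a duplicate.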


Re: Resource-Oriented WFS: Filters

Author: Bill Thorp

It's nice to see you talking about this. How would you reference this filter? Unfortunately, referencing a URI from a query string has a logical fault (fitting 2K+ characters into 2K). Your post on GML+XLink rings a strange bell; but it's not REST.

Re: Resource-Oriented WFS: Filters

Author: Chris Tweedie

Neat idea Sean. I'm a bit confused why a RESTful WFS service should never accept POST queries, when your solution still involves POSTing the query template anyway? Would you envisage users being able to create filter templates with dynamic content? E.g., using your example, define a generic school query, define the literal as a variable, $type$, then reference the created filter: http://gis.example.com/places/filters/21f10d70b724f5215cb183d96f27053e?type=High+School. Please forgive my poor REST principles if I am going against the grain :) I could see something like this as being very powerful indeed.

Re: Resource-Oriented WFS: Filters

Author: Keyur

Sean, this approach indirectly helps circumvent the browser cross-domain problem for mashup development as well: POST large datasets (such as large geometries / filters) to the origin server. This creates a new resource and returns a URI to the posted data. Now you can issue a query (a GET) to a server on another domain (with dynamic script tags) by GETting the data from the resource you just created. One caveat is that certain web servers' security policies might prohibit them from accessing external URLs. Not sure what approach would help in this scenario -- proxy servers, maybe? Cheers.

Re: Resource-Oriented WFS: Filters

Author: Sean

Bill, I don't imagine that these filters have any use other than in the context of their "parent" feature type. If you want to use a particular feature or geometry in your filter (referenced by URI) instead of a literal value, by all means do so.

Re: Resource-Oriented WFS: Filters

Author: Sean

Chris, I tried to explain in my previous article why POST is not to be used for queries, but maybe I wasn't clear enough. The HTTP spec states that POST is to be used to submit data. Period. In the context of a feature query, the filter is not data; it is part of the request scoping. However, in the context of creating a new filter resource using the filter factory, the XML-encoded filter *is* data. In HTTP REST you use GET to read only, and POST to write only. Why? Interoperability. The templates are an interesting idea. I've been thinking more about desktop WFS clients with powerful forms and wizards for creating arbitrary filters, and they wouldn't need or use such templates. For other clients, I think you want a simpler API: instead of a parameterized filter URI, expose a resource to be accessed in the same manner as the feature type itself.
Keyur: I hadn't considered the proxying implications at all. Interesting.
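For concreteness, Chris's template idea could be sketched with Python's string.Template. The stored filter and the $type parameter are hypothetical, and this is the parameterized approach Sean argues against, shown only to make the discussion tangible:

```python
from string import Template

# A stored filter template with a $type placeholder, to be filled
# from a query string parameter at request time.
stored_template = Template(
    """<ogc:Filter xmlns:ogc="http://www.opengis.net/ogc">
  <ogc:PropertyIsEqualTo>
    <ogc:PropertyName>type</ogc:PropertyName>
    <ogc:Literal>$type</ogc:Literal>
  </ogc:PropertyIsEqualTo>
</ogc:Filter>"""
)

# e.g. a GET of .../filters/<id>?type=High+School would substitute:
filter_xml = stored_template.substitute(type="High School")
```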

Are GML Documents Hypermedia?

XLink is part of GML, but I've never seen a WFS return GML that links to other resources. Does anybody use GML like this, and what client would they use?


Re: Are GML Documents Hypermedia?

Author: Paul Ramsey

"what client would they use" is the ur-question that everyone ignores despite it's hyper-significance. Why do neogeographers build using JSON or RSS? Because there's a client that can consume them, the web browser. Why is KML relevant? Because there's a client that can consume it! Format relevance is 100% tied to the installed base of products capable of consuming the format -- shape files are still more relevant than GML, because of the huge installed base. If OGC wants GML to be relevant, they should be building and giving away (BSD license) good GML read/write libraries because doing so would drastically up the odds of products integrating GML support.

Re: Are GML Documents Hypermedia?

Author: Sean

Paul, parsing GML to graphically render features is one thing, and that's being done in the browser already. What I'm asking is what clients exist for traversing a web of xlinked GML? In my opinion, without hypermedia links, the "GeoWeb" is just hot air.

Re: Are GML Documents Hypermedia?

Author: Paul Ramsey

In my opinion, without a client, the "GeoWeb" is just hot air... which is saying much the same as you are. Note that KML includes the notion of hypermedia links and there is a KML client that can traverse them. So perhaps the GeoWeb is not hot air, just the GML-GeoWeb.

Re: Are GML Documents Hypermedia?

Author: Bill Thorp

Definitely an interesting question. The XLink spec says "The hyperlink's effect on windows, frames, go-back lists, style sheets in use, and so on is determined by user agents, not by the hyperlink itself." GML's vision of XLink seems to be only for defining properties in a non-inline way, e.g.: <gml:location xlink:href="http://www.ga.gov.au/bin/gazm01?placename=leederville&placetype=R&state=WA+"/>. In my brief look, I found no other use cases. I don't think they imagined user-interactive "hyper" links at all, simply machine-readable distributed data fetching. Building a client that treats XLinks as user-controlled "hyper" links would involve delayed XLink parsing, which is probably contrary to most XLink-aware XML APIs out there.
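The machine-side XLink handling Bill describes can be sketched in a few lines: collect the xlink:href values from a GML document so a client could decide which to dereference. The GML fragment below is a toy modeled on his gazetteer example, not output from any real WFS:

```python
import xml.etree.ElementTree as ET

# XLink attributes are namespace-qualified in ElementTree's notation.
XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

gml = """<gml:FeatureCollection xmlns:gml="http://www.opengis.net/gml"
    xmlns:xlink="http://www.w3.org/1999/xlink">
  <gml:featureMember>
    <Town>
      <gml:location xlink:href="http://www.ga.gov.au/bin/gazm01?placename=leederville&amp;placetype=R&amp;state=WA"/>
    </Town>
  </gml:featureMember>
</gml:FeatureCollection>"""

def linked_resources(doc):
    """Collect every xlink:href in a GML document, without fetching any."""
    root = ET.fromstring(doc)
    return [el.get(XLINK_HREF) for el in root.iter() if el.get(XLINK_HREF)]

hrefs = linked_resources(gml)
```

The "delayed parsing" point is visible here: the function only discovers links; whether and when to GET them is left to the agent, exactly as the XLink spec intends.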

Re: Are GML Documents Hypermedia?

Author: Allan

I don't think the hyperlinking envisioned by GML was ever meant to be browsed with a browser without some intervening process like an XSLT engine to style things. But I do think it was originally meant to go beyond simply defining non-local properties. The way I understood that part of the vision was that eventually GML data could be stitched together using xlinks.

INSPIRE Tech Choice is Discouraging

REST and GeoRSS may be alpha-geek stuff and not quite yet ready for the masses, but, Ed, you're not helping when you frame them as lite technologies (emphasis mine):

It's not difficult to appreciate the problem, REST based interfaces, KML, GeoJSON, GeoRSS etc might actually be the best technologies to use today and would be the tools of choice of many, however like many other Government IT projects INSPIRE needs to follow the low risk route of SOAP, WSDL, WMS, WFS etc.

So we may find that organisations will use OGC style interfaces to communicate to other public sector organisations and the commission, while using lighter weight technologies to publish information to their citizens. This is no bad thing !!

HTTP REST is not about being lightweight (RFC 2616 is just as heavy as a WxS spec); it's about working with the grain of the Web.

The final paragraphs of Ed's (excellent, despite my one quibble) post hint at the biggest risk from using SOAP and WxS:

I am however disappointed by the continued focus on metadata driven catalogue services as the primary mechanism to find geospatial data, I don't believe this will work as nobody likes creating metadata, and catalogue services are unproved.

INSPIRE needs GeoSearch !!

Agreed. Service and data discovery is seriously hindered by the fact that SOAP and WxS services aren't of the WWW.

Rendering Shapely Geometries in Matplotlib

Update: I just received a reminder that numpy.asarray doesn't copy data.

Here's an example of using Shapely to connect OGR data sources to Matplotlib:

import ogr
import pylab
from numpy import asarray
from shapely.wkb import loads

source = ogr.Open("/tmp/world_borders.shp")
borders = source.GetLayerByName("world_borders")

fig = pylab.figure(1, figsize=(4,2), dpi=300)

while 1:
    feature = borders.GetNextFeature()
    if not feature:
        break

    # Make a Shapely geometry from exported WKB
    geom = loads(feature.GetGeometryRef().ExportToWkb())

    # Wrap the geometry in a Numpy array, slice out lat/long, and plot
    a = asarray(geom)
    pylab.plot(a[:,0], a[:,1])

pylab.show()


The result:


I hope you'll agree that this is considerably simpler than the code I used at the 2005 Open Source Geospatial workshop. An even more direct solution for ogr.py fans would be to provide the Numpy array interface directly from OGR geometries.
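The array interface idea can be illustrated with a toy class. CoordSequence is hypothetical, not an OGR or Shapely API: the point is only that any object exposing __array_interface__ can be wrapped by numpy.asarray as a view, without copying its coordinates, which is what the update at the top of this post cautions about:

```python
import numpy

class CoordSequence(object):
    """Hypothetical stand-in for an OGR geometry's coordinate sequence
    that exposes the Numpy array interface."""

    def __init__(self, coords):
        self._data = numpy.array(coords, dtype=float)

    @property
    def __array_interface__(self):
        # Delegate to the backing array's own interface: a dict with
        # 'shape', 'typestr', and a pointer to the underlying buffer.
        return self._data.__array_interface__

seq = CoordSequence([(0.0, 0.0), (1.0, 1.0), (2.0, 0.5)])
a = numpy.asarray(seq)

# Slicing out x and y columns, as in the plotting loop above:
x, y = a[:, 0], a[:, 1]
```

Because asarray sees the buffer pointer in the interface dict, `a` shares memory with the sequence's data rather than duplicating it.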


svn location

Author: brentp

this looks very useful. took me a while to find it. in case anyone has the same problem: svn co http://svn.gispython.org/svn/gispy/Shapely/trunk/ shapely

Re: Rendering Shapely Geometries in Matplotlib

Author: Sean

I've updated the wiki.

Buh-bye Blogroll

I dropped the blogroll from my blog's home page since it wasn't accurately reflecting what I read. Go ahead and unlink me if you're keeping score.

Good Things



non-OGC presentations

Author: Jeremy Cothran

Although the titles of the two FOSS4G presentations that I'd like to present on ObsKML (http://www.foss4g2007.org/presentations/view.php?abstract_id=86) and XeniaPackage (http://www.foss4g2007.org/presentations/view.php?abstract_id=87) do not have the word 'REST' in them, I very much advocate a simpler data/document-centric view of handling geospatial data which reduces or removes the inefficiencies of an OGC/web-services-only approach. ObsKML in particular is designed to facilitate a periodic report/publish approach to data sharing.

The Future of Geo-Blogs and Advertisement

So, everybody has linked to Bruce Sterling's recent piece for Wired, but it seems like nobody agrees with me that the best part is right at the top where he sends up geobloggers and the way they (in the future) shill for shiny toys:

My new Sensicast-Tranzeo 3000 is everything palmtops and cell phones have been struggling to become. I can already feel this device completely changing my life. And a wireless consortium pays me to promote it! You should buy one right now. See that handy link there? Did I mention the free shipping? This mobile is so location-aware, it can ship itself!

I wonder if Sterling isn't reading Planet SpaceNavigator/N95/iPhone.


Re: The Future of Geo-Blogs and Advertisement

Author: Fantom Planet

He's the offspring of Glenn and Frank!!!

Re: The Future of Geo-(Blogs and )Advertisement

Author: Luistxo Fernandez

Global geo-ads are here, btw.


My wife has deep roots in Northern California, and that's where we're headed on vacation: some vino-tourism in the Anderson Valley and lazy days on the Mendocino and Sonoma coasts. Our neighbors (thanks, Dan!) have promised to keep the yard and garden alive during this dry stretch. If anyone can keep the Overton windows moving on WxS and REST, and make sure that GeoDRM doesn't creep out of the basement, I'd be much obliged.

Designing Simple GIS Services for Zope

Interest in WFS for Zope/Plone is rising again. The happy future I envision for geographic computing in the digital humanities is more RESTful than OGC, but I still need to provide a way for people to get Pleiades places into the desktop GIS that they bought and installed within the last few years. For now that's WFS, though I expect we might also make shapefile snapshots for people on legacy systems.

I've identified a veritable continent of common ground in Zope for services that I consider to be otherwise orthogonal: WFS and the Atom Publishing Protocol (APP). We Zope users routinely use containers (think of a container as a dict or hash) to model our information systems. In this design, a container full of geo-referenced content, like "places", becomes a WFS feature type or an APP collection. Some such containers are aggregated in yet another, like "root", which becomes a WFS or APP service.


For APP, we register an atomserv view on the outer, service container, and an atom view on the inner, collection containers. These views return APP service and feed documents. Add HTML (and perhaps JSON) views of the collection items, and you've got a geo web service that also happens to act a lot like a web site. Every resource is directly addressable.

WFS is RPC, and has no surface, but rather an endpoint. That endpoint is entirely implemented by the wfs view registered on the outer, service container.

Below the views, there is a big potential for sharing and reuse of interfaces and adapters. I envision that all these views will be acting upon the same IGeoService, IGeoCollection, IGeoItem, and IGeometry.
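The container model above can be sketched in plain Python, with nested dicts standing in for Zope containers. The names and the path-enumeration helper are illustrative only, not Zope or Pleiades code; the point is that the service, each collection, and each item all get their own directly addressable path:

```python
# Nested dicts standing in for Zope containers: a "root" service
# container holding a "places" collection of geo-referenced items.
service = {
    "places": {
        "athens": {"title": "Athens", "geometry": (23.72, 37.98)},
        "rome": {"title": "Rome", "geometry": (12.48, 41.89)},
    },
}

def resource_paths(container, base=""):
    """Enumerate every addressable resource: the service itself,
    each collection, and each item within a collection."""
    paths = [base or "/"]
    for name, child in sorted(container.items()):
        path = "%s/%s" % (base, name)
        if isinstance(child, dict) and "geometry" not in child:
            # A sub-container: a WFS feature type / APP collection.
            paths.extend(resource_paths(child, path))
        else:
            # A leaf item: a feature / APP collection member.
            paths.append(path)
    return paths

paths = resource_paths(service)
# -> ['/', '/places', '/places/athens', '/places/rome']
```

The atomserv, atom, and HTML views described above would then hang off the first two levels of this hierarchy, while the wfs view exposes only the root as an endpoint.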