How Louisville, KY is driving the trend of data-centric traffic optimization: by laying out a network of city-owned gigabit fiber and a place to store data, the city can hook up more and more signals and sensors to its own network.

Louisville started with Waze data: the city and its DOT process that data into an internal database and mine it for information about speeds, jams, and events. They are now adding sensors and more data sources. The ultimate goal is to automate things like signal timings based on real-time traffic flow, optimizing the flow not only of cars but also of buses, emergency vehicles, and pedestrians using the data coming into the system.
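As a rough illustration of what that first ingestion step might look like, here is a minimal sketch that pulls a Waze partner-style JSON feed and stores the jam records for later mining. The feed URL and table schema are illustrative assumptions, not Louisville's actual pipeline.

```python
# Minimal sketch: fetch a Waze CCP-style JSON feed and store jam records.
# The feed URL and schema below are hypothetical placeholders.
import json
import sqlite3
import urllib.request

FEED_URL = "https://example.com/waze/ccp-feed.json"  # hypothetical partner feed URL

def ingest(feed_url: str, db_path: str = "waze.db") -> int:
    """Download the feed and upsert its jam records; returns the row count."""
    with urllib.request.urlopen(feed_url) as resp:
        feed = json.load(resp)

    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS jams (
               uuid TEXT PRIMARY KEY,
               street TEXT,
               speed_kmh REAL,      -- current average speed on the jammed segment
               delay_sec INTEGER,   -- delay versus free-flow travel time
               pub_millis INTEGER   -- feed publication timestamp
           )"""
    )
    rows = [
        (j.get("uuid"), j.get("street"), j.get("speedKMH"),
         j.get("delay"), j.get("pubMillis"))
        for j in feed.get("jams", [])
    ]
    conn.executemany("INSERT OR REPLACE INTO jams VALUES (?, ?, ?, ?, ?)", rows)
    conn.commit()
    conn.close()
    return len(rows)

if __name__ == "__main__":
    print("ingested", ingest(FEED_URL), "jam records")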

SCHNUERLE: I’m the data officer, so my specialty isn’t transit in particular, but having worked with the Waze data that any city can sign up for, looking at that data and mining it for information — citizen reports, travel speeds, all sorts of things — is really valuable and shouldn’t be overlooked. I know many cities haven’t even started the process of looking at that data; they’re just collecting it without looking at it, because it is a bit cumbersome to process. Part of this project, actually, is that when we get the Waze data ingested into AWS, we want to publish our code and our CloudFormation template online so that other cities can replicate this processing in the cloud without having to do all the legwork up front.
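To make that replication idea concrete, the sketch below shows how another city might stand up a published CloudFormation stack with boto3. The template URL, stack name, and parameter names are placeholders; Louisville's actual template may look quite different.

```python
# Sketch: deploy a published pipeline template with boto3.
# Template URL and parameter names are hypothetical.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

cfn.create_stack(
    StackName="waze-data-pipeline",
    TemplateURL="https://example-bucket.s3.amazonaws.com/waze-pipeline.template",  # hypothetical
    Parameters=[
        {"ParameterKey": "WazeFeedUrl",
         "ParameterValue": "https://example.com/waze/ccp-feed.json"},
    ],
    Capabilities=["CAPABILITY_IAM"],  # allow the template to create IAM roles for ingestion
)

# Block until the stack finishes creating (or raise on failure).
waiter = cfn.get_waiter("stack_create_complete")
waiter.wait(StackName="waze-data-pipeline")
print("pipeline stack is up")
```

The appeal of shipping a template rather than documentation is exactly what Schnuerle describes: another city runs one deployment command instead of re-doing the integration legwork.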

I say just look at all your different data sources, even some that you think are not directly related to transit, like 311, or partner sources like Waze, and don’t overlook those, because I think what can happen sometimes in the traffic and transportation world is people can get a little bit of tunnel vision with the data they’re looking at every day and they don’t consider the other sources.

Amazon provides an extensible bucket of tools in the cloud that are flexible and expandable, and we can use the right combination of those tools to funnel in the data we need easily. If we were doing that internally, like we did with the Waze data, we’re limited by staff time and maintenance of those servers, and there’s a cost there as well for maintenance and software licenses. And we’re mostly a Microsoft shop internally, so that does lend to some extra cost as opposed to using more open-source tools. So it basically speeds our development process along and reduces the overall cost of deploying the system versus doing it internally.
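One example of the pay-per-use tooling he alludes to is querying the ingested records serverlessly, with no database servers to maintain. The sketch below uses Amazon Athena; the database, table, and bucket names are illustrative assumptions, not the city's actual setup.

```python
# Sketch: mine the ingested Waze jam records with Athena (serverless SQL
# over S3). Database, table, and bucket names are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

resp = athena.start_query_execution(
    QueryString="""
        SELECT street,
               AVG(speed_kmh) AS avg_speed,
               COUNT(*)       AS jam_reports
        FROM waze.jams
        WHERE from_unixtime(pub_millis / 1000)
              > current_timestamp - interval '1' day
        GROUP BY street
        ORDER BY jam_reports DESC
        LIMIT 20
    """,
    QueryExecutionContext={"Database": "waze"},  # hypothetical Athena database
    ResultConfiguration={
        "OutputLocation": "s3://example-city-waze/athena-results/"  # hypothetical bucket
    },
)
print("query started:", resp["QueryExecutionId"])
```

A query like this — the 20 most jammed streets over the last day, with their average speeds — is the kind of mining for travel speeds and problem spots the interview describes, without the staff time and license costs of running servers in-house.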