I am curious about how people deal with the need to run a server to provide realtime updates to apps. When I first started looking at adding realtime support to my applications, I was hoping I would be able to query the feed for particular stops. As that isn't possible, I set about developing the apps to use my own server.
I set the server up to download the feed once per minute from 05:00 to 23:59. I was somewhat horrified to find that this amounted to about 5 GB per day, which used up half of my monthly broadband allocation in five days! As this is obviously not tenable, I was wondering what other people do.
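For context, the polling side of my setup is nothing more elaborate than a loop like the sketch below (fetch_and_store_feed is just a placeholder for the actual download-and-save step):

import time
from datetime import datetime

POLL_INTERVAL = 60  # seconds between downloads


def within_polling_window(now):
    # Only poll between 05:00 and 23:59 local time
    return 5 <= now.hour <= 23


while True:
    if within_polling_window(datetime.now()):
        fetch_and_store_feed()  # placeholder: download the full feed and persist it
    time.sleep(POLL_INTERVAL)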
Will let others respond to you re the realtime updates. FYI we are intending to offer some parameters via an API but for now it is the full feed. (No ETA for this either that we can share.)
Some of the app developers who are on this forum should be able to share how they do things.
I didn't know anything about that. So this (presumably set with a --header on the curl command) will cause the feed to be compressed? That should certainly help!
If you're using curl, use --compressed. Just setting the header isn't much use if you can't decode the response (marked with Content-Encoding: gzip).
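In case it helps anyone not on curl, the same distinction shows up with, say, Python's standard library: you have to both ask for gzip and decode the response yourself, which is what --compressed does for you in one flag. A rough sketch (the feed URL is a placeholder):

import gzip
import urllib.request

FEED_URL = "https://example.com/gtfs-rt/feed"  # placeholder, not the real endpoint

# Ask the server for a compressed response...
req = urllib.request.Request(FEED_URL, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req, timeout=30) as resp:
    body = resp.read()
    # ...and decode it if the server actually sent it compressed
    if resp.headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)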
I don't think there's much you can do about the high volume of data transfer. @jayen's suggestion of making sure you're allowing the server to compress the content is a good one but, beyond that, these are non-incremental GTFS-RT feeds. They're updated every minute or so, so you can't build server infrastructure without pulling down the full feeds every time they're updated!
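That said, once you have pulled a feed down, filtering it to the stops you care about on your own server is straightforward. Here's a rough sketch using the gtfs-realtime-bindings package; the URL and stop IDs are placeholders, and it assumes a trip updates feed:

import requests
from google.transit import gtfs_realtime_pb2

FEED_URL = "https://example.com/gtfs-rt/tripupdates"  # placeholder
STOPS_OF_INTEREST = {"200060", "200070"}              # placeholder stop_ids

# requests asks for gzip by default and decompresses the response transparently
response = requests.get(FEED_URL, timeout=30)
response.raise_for_status()

feed = gtfs_realtime_pb2.FeedMessage()
feed.ParseFromString(response.content)

for entity in feed.entity:
    if not entity.HasField("trip_update"):
        continue
    for stu in entity.trip_update.stop_time_update:
        if stu.stop_id in STOPS_OF_INTEREST:
            arrival = stu.arrival.time if stu.HasField("arrival") else None
            print(entity.trip_update.trip.trip_id, stu.stop_id, arrival)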
One other option might be to use a third-party Stop Monitoring / Data API so you can do the "per stop" requests that you're after. Drop me a private message or email, as we're launching something similar in the very near future and are looking for beta testers.