Thanks Matthias!
Yep, that script would definitely do the job. One problem is that it uses the Google search API, which has a bit of a nasty EULA that only allows around 100 automated queries per day. That's not really a big deal in this case, but since we'll also need to dig up contact details for each specific local government, there's already some manual labor involved. Adding a manual "site:" search doesn't make the manual part that much bigger.
I do see that this might be perceived as a "brain-dead" job. That doesn't really worry me: I worked as a code monkey at a firm for two years, banging out Typo3 and Joomla sites round the clock, so "brain-dead" doesn't bother me much :-) But anyone who wants to chip in with half a brain can use this script. I tried it and it works pretty well. It throws an error when it doesn't find anything, but I guess the programmer was in a bit of a pessimistic mood when he wrote it ;)
It needs the simplejson module. To install this on Debian Squeeze you can use:
sudo aptitude install python-simplejson
You can run it like this (I know Almere's website has Adobe Reader advertisements on it): python find-acrobat-commercial.py almere.nl
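For anyone curious what the script roughly does under the hood, here is a minimal sketch of how a "site:"-restricted query to the Google search API might be constructed. The endpoint, parameter names, and search phrase are my assumptions for illustration; they are not taken from the attached script.

```python
from urllib.parse import urlencode

# Assumed endpoint of the (old) Google AJAX Search API; the real
# script may use a different URL or parameters.
API_URL = "http://ajax.googleapis.com/ajax/services/search/web"

def build_query_url(domain, phrase="get adobe reader"):
    """Build a search URL restricted to one domain via the 'site:' operator."""
    query = 'site:%s "%s"' % (domain, phrase)
    return "%s?%s" % (API_URL, urlencode({"v": "1.0", "q": query}))

print(build_query_url("almere.nl"))
```

The "site:" operator is the same trick you would use for the manual search, so the automated and manual approaches find the same pages.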
Cheerio, Jelle
On 07/15/2011 02:20 PM, Matthias Kirschner wrote:
- Jelle Hermsen <jelle@fsfe.org> [2011-07-13 13:28:26 +0200]:
Scratch my idea to create a MAS, or webcrawler to automate the search. Searching manually is faster.
I don't know if the attached script might help with that. Would be interested in your view.
Regards, Matthias