blog.ch currently has more than 580 blogs in its database, and the usual approach of fetching each feed one after the other no longer works reliably, because some feeds just time out or respond very slowly and drag the whole process down.
There's a post by Wez about how to fetch more than one “feed” at once with non-blocking sockets. This works perfectly (I didn't test it with 500 feeds, which may be too many, but we could still split the feeds into smaller batches), but it has one little drawback: we would have to implement a reasonably robust HTTP header parser (302 and 304 status codes, etc.) to cope with all the possibilities out there… HTTP_Request from PEAR does quite a good job at that, but doesn't work with stream_socket_client.
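To make the idea concrete, here's a minimal sketch of that non-blocking approach: open all the sockets asynchronously, multiplex them with `stream_select`, and at least pull the status code out of whatever comes back. This is my own illustration, not Wez's actual code — the batch timeout, the HTTP/1.0 request line, and the `http_status_code()` / `fetch_batch()` function names are all assumptions for the example, and real redirect/304 handling would still have to be built on top.

```php
<?php
// Pull the status code out of a raw HTTP response whose first line looks
// like "HTTP/1.1 302 Found"; returns 0 if it doesn't look like HTTP at all.
function http_status_code($response) {
    if (preg_match('#^HTTP/\d\.\d\s+(\d{3})#', $response, $m)) {
        return (int) $m[1];
    }
    return 0;
}

// Fetch a batch of URLs concurrently; returns an array of url => raw response.
// Feeds that haven't answered by the deadline are simply dropped.
function fetch_batch(array $urls, $timeout = 10) {
    $pending  = array();  // connecting, request not yet written
    $requests = array();
    foreach ($urls as $url) {
        $p    = parse_url($url);
        $host = $p['host'];
        $port = isset($p['port']) ? $p['port'] : 80;
        $path = isset($p['path']) ? $p['path'] : '/';
        $errno = $errstr = null;
        // Async connect, so one slow host can't stall the whole batch.
        $fp = @stream_socket_client("tcp://$host:$port", $errno, $errstr,
            $timeout, STREAM_CLIENT_CONNECT | STREAM_CLIENT_ASYNC_CONNECT);
        if (!$fp) continue;
        stream_set_blocking($fp, false);
        $pending[$url]  = $fp;
        $requests[$url] = "GET $path HTTP/1.0\r\nHost: $host\r\n"
                        . "Connection: close\r\n\r\n";
    }

    $reading   = array();  // request written, waiting for data
    $responses = array();
    $deadline  = time() + $timeout;
    while (($pending || $reading) && time() < $deadline) {
        $read   = array_values($reading);
        $write  = array_values($pending);
        $except = null;
        if (stream_select($read, $write, $except, 1) === false) break;
        foreach ($write as $fp) {          // connected: send the request
            $url = array_search($fp, $pending, true);
            fwrite($fp, $requests[$url]);
            $reading[$url]   = $fp;
            $responses[$url] = '';
            unset($pending[$url]);
        }
        foreach ($read as $fp) {           // data (or EOF) is ready
            $url  = array_search($fp, $reading, true);
            $data = fread($fp, 8192);
            if ($data === '' || $data === false) {
                fclose($fp);               // EOF: this feed is done
                unset($reading[$url]);
            } else {
                $responses[$url] .= $data;
            }
        }
    }
    foreach (array_merge($pending, $reading) as $fp) fclose($fp);
    return $responses;
}
```

With something like this, the 580 feeds could be split into batches of, say, 20 and each batch given a hard deadline — but as said above, the ugly part that remains is interpreting the headers (following 302s, honoring 304s) once the raw responses are in.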
Does anyone know whether someone has already done that? Or of any other way to help blog.ch out? Matthias would be grateful :)