A couple more notes on this subject since the Webinar from a couple of weeks ago:
Steve Arnold of Beyond Search asks in a blog post:
…the notion of integrating log files is a good one but I wondered how long it takes to suck big log files, determine deltas, and then update the indexes.
We’ve published some of the information from the Webinar in a case study about our work with Boomi:
The logging-and-searching service is characterized by frequent commits to make the data available for search: every 5 seconds or every 10,000 transaction messages. … Between two and ten million log transactions are generated daily, and each may trigger two or more Solr entries. Boomi maintains a rolling 30-day record of log entries.
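A commit policy like the one described above maps onto Solr's autoCommit settings in solrconfig.xml. The fragment below is a minimal sketch under the numbers quoted in the case study, not Boomi's actual configuration:

```xml
<!-- solrconfig.xml: commit whenever either threshold is reached -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxTime>5000</maxTime>           <!-- commit at least every 5 seconds -->
    <maxDocs>10000</maxDocs>          <!-- or after 10,000 new documents -->
    <openSearcher>true</openSearcher> <!-- make committed data searchable -->
  </autoCommit>
</updateHandler>
```

A rolling 30-day window can then be enforced separately, for example with a periodic delete-by-query against a timestamp field.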
Not to be outdone, there’s some interesting new input on using Solr for this kind of application from Symplicity, an integrator that builds government and university applications. Symplicity’s Solr credentials include fbo.gov, a site for searching business opportunities within the Federal government through the General Services Administration:
For a while we used a commercial solution to centralize and search our logs, but they wanted to charge us tens of thousands of dollars for just one gigabyte/day more of indexed data. So I said forget it, I’ll write my own solution!
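The roll-your-own approach amounts to turning raw log lines into Solr documents and batching them to the update handler. Here is a minimal sketch; the field names (`message_t`, `source_s`, `timestamp_dt`) are hypothetical dynamic-field names, not Symplicity's actual schema, and the core name in the URL comment is likewise an assumption:

```python
import datetime
import json

def to_solr_doc(line, source):
    """Convert one raw log line into a Solr JSON document.

    Field names here are illustrative dynamic fields (text, string,
    date), not any particular production schema.
    """
    return {
        "id": f"{source}-{hash(line) & 0xFFFFFFFF}",
        "source_s": source,
        "message_t": line.rstrip("\n"),
        "timestamp_dt": datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ"),
    }

def to_update_payload(lines, source):
    """Batch log lines into the JSON array Solr accepts for updates.

    The result would be POSTed to a core's update handler, e.g.
    /solr/logs/update (core name "logs" is an assumption).
    """
    return json.dumps([to_solr_doc(line, source) for line in lines])
```

With frequent commits configured server-side, a tail-and-post loop feeding this payload builder is enough to make logs searchable within seconds of being written.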