I have been busy working on a Big Data paper, so I have not been blogging as often. Getting into the technical details of Big Data has been easy, and then it hit me that Big Data has a lot in common with the monitoring and management systems used in data centers and IT. Collecting gigabytes or even terabytes of data a day to monitor operations is a big data center problem.
While researching the Big Data topic, it was interesting to see the intersection of Dell, Hadoop, and Splunk.
Barton George has a post on Splunk.
Hadoop World: Talking to Splunk's Co-founder
Last but not least in the 10 interviews I conducted while at Hadoop World is my talk with Splunk's CTO and co-founder Erik Swan. If you're not familiar with Splunk, think of it as a search engine for machine data, allowing you to monitor and analyze what goes on in your systems. To learn more, listen to what Erik has to say:
Barton references a GigaOm post on Splunk and Hadoop.
Splunk connects with Hadoop to master machine data
Splunk has integrated its flagship product with Apache Hadoop to enable large-scale batch analytics on top of Splunk's existing sweet spot around real-time search, analysis and visualization of server logs and other machine-generated data. Splunk has long had to answer questions about why anyone should use its product over Hadoop, and the new integration not only addresses those concerns but actually opens the door for hybrid environments.
Dell's Barton George was interviewed himself at Hadoop World as well.
Hadoop World: What Dell is up to with Big Data, Open Source and Developers