It’s déjà vu all over again.
Back in 2001, the industry was abuzz about network monitoring and finding data. I
remember working on a project to aggregate data across different systems.
Here we are in 2012, and it doesn’t seem like things have moved much.
After speaking with attendees at Interop 2012, I found that one of the main
themes was that we all have too much data generated by different logs, events,
and systems. Finding a needle in a haystack seemed to be the problem. No one
is denying the possibility of finding some value in a large quantity of data,
but is anyone questioning the reliability of that data?
Even my niece in her high school science class knows that solid scientific
research requires a consistent approach and a repeatable process. The
integrity of the data is only as reliable as the integrity of the process
that produced it. Within identity and access management, there is ample
opportunity to create event and system data. What some vendors are pitching
today as the business intelligence of identity management is nothing more
than data aggregation built on fallible business processes.
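A toy sketch can make the point concrete. Suppose two systems log the same people under different identifiers (the system names, identifiers, and event shapes below are hypothetical, purely for illustration): naive aggregation produces a confident-looking number that is simply wrong, and only a process decision about normalizing identities makes the aggregate reflect reality.

```python
# Hypothetical events: the HR system logs short usernames, the ERP system
# logs email addresses, so the "same" users appear under different names.
events_hr = [{"user": "jdoe", "action": "login"},
             {"user": "asmith", "action": "login"}]
events_erp = [{"user": "jdoe@example.com", "action": "login"},
              {"user": "asmith@example.com", "action": "login"}]

combined = events_hr + events_erp

# Naive aggregation: count distinct identifier strings.
naive_unique = len({e["user"] for e in combined})   # reports 4 "users"

def normalize(user):
    # A process decision, not a tooling one: map both identifier
    # conventions onto a single canonical form.
    return user.split("@")[0].lower()

true_unique = len({normalize(e["user"]) for e in combined})  # 2 real users

print(naive_unique, true_unique)
```

The tooling ran flawlessly in both cases; it is the inconsistent process behind the data that makes the first answer unreliable.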
Analyzing so much data would require an employee with both analytic skills
and a background in the business process. How do we justify adding that cost
to already overburdened IT resources to analyze data that may not yield
valid results?