Big Data analytics is a journey. If the "first mile" challenge is to craft a good strategy and build the road map to navigate it, then the "last mile" problem in analytics is deployment: can you improve how you get results from analytics?
There is the difficulty of linking business outcomes to analytics. Beyond capturing the range of outcomes, the challenge is to determine the effect of decisions amidst the background noise of business variability. This has to be done by gathering data across locations and over time. It is non-trivial, yet required to enable the feedback loop and organizational learning.
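The measurement problem above can be made concrete with a minimal sketch. The data, metric names, and the crude noise estimate here are all hypothetical; the point is only to show what "effect of a decision amidst background noise" might look like when outcomes are gathered across locations and over time.

```python
import statistics

# Hypothetical weekly outcome metric (e.g., conversion rate) gathered
# across several locations, before and after an analytics-driven decision.
before = [0.112, 0.108, 0.115, 0.109, 0.111, 0.107]
after  = [0.118, 0.121, 0.116, 0.123, 0.119, 0.120]

def effect_vs_noise(before, after):
    """Compare the mean shift to the background variability (noise)."""
    shift = statistics.mean(after) - statistics.mean(before)
    noise = statistics.stdev(before + after)  # crude pooled variability
    return shift, shift / noise  # effect size relative to the noise

shift, ratio = effect_vs_noise(before, after)
print(f"mean shift = {shift:.4f}, shift/noise = {ratio:.2f}")
```

A shift that is small relative to the noise is exactly the case where the feedback loop fails: the decision may have helped, but the data cannot confirm it without more locations or a longer observation window.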
There is a tendency to misstate and misrepresent problems, focusing on small, isolated situations rather than the broader dynamics. For instance, many organizations believe in streamlining and downsizing; size and complexity are often posed as problems. But efficiency is no substitute for growth: from a strategy perspective, growth matters more than efficiency.
There are pitfalls in benchmarking. To deal with the Big Data deployment problem, you need some basis or benchmark against which to measure it. Given how challenging it often is to follow up on implementation results, benchmarks tend to be crude or non-existent, or outcomes become disassociated from the benchmarks. It is therefore not certain that the benchmarks were appropriate in the first place: whether they actually described real problems, or whether the question is really the effectiveness of the remedial strategy. Under ideal circumstances, determining efficacy is a matter of continuing to compile metrics and confirming positive outcomes; there is nothing too complicated about the general approach.
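"Continuing the compilation of metrics and confirming positive outcomes" can be sketched as a trivial check that keeps outcomes associated with their benchmark. The benchmark value, metric, and three-week window below are assumptions chosen for illustration, not part of any real methodology.

```python
# Hypothetical: track a deployment metric against its benchmark over time,
# so outcomes stay associated with the benchmark they are judged by.
benchmark = 0.95  # assumed target, e.g., an on-time fulfilment rate
weekly_results = [0.91, 0.93, 0.96, 0.97, 0.95, 0.98]

def confirm_outcomes(results, benchmark, window=3):
    """Efficacy check: the trailing window of results meets the benchmark."""
    recent = results[-window:]
    return sum(recent) / len(recent) >= benchmark

print(confirm_outcomes(weekly_results, benchmark))  # prints True
```

The simplicity is the point: the hard part is not the check itself but sustaining the compilation of results and being sure the benchmark described a real problem to begin with.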
There are tradeoffs between the function and the whole, and between the short term and the long run. Businesses improve analytical results through real-time operations simulation and analysis of daily performance results, modifying and updating model parameters or even tracking error in big data. However, the metrics for achieving these specific goals might indicate functional success while the organization as a whole is no better off. The data system's role focuses on internal, proximal achievements while insulating the organization from external market conditions. Moreover, the organization might find itself optimized for short-term but inconsequential outcomes. This is true not just of how people are compensated but of how success is attributed, inflating the apparent leadership skills of individuals who are mostly skilled at promoting conformance behaviors.
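The daily modify-and-update loop mentioned above can be illustrated with a toy sketch. The update rule, learning rate, and data are all hypothetical; it shows only the mechanical part (nudging a model parameter from observed daily error), which can succeed as a functional metric while saying nothing about whether the organization as a whole is better off.

```python
# Hypothetical sketch of a nightly parameter update from daily error,
# i.e., the "modify, update the model parameters" feedback loop.
def update_parameter(param, predicted, actual, learning_rate=0.1):
    """Nudge the parameter toward reducing the observed error
    (a crude gradient-style correction; the scaling is illustrative)."""
    error = actual - predicted
    return param + learning_rate * error

param = 1.0
daily = [(1.0, 1.2), (1.1, 1.15), (1.2, 1.1)]  # (predicted, actual) pairs
for predicted, actual in daily:
    param = update_parameter(param, predicted, actual)
print(round(param, 3))  # prints 1.015
```

Note that the loop's own success metric (shrinking daily error) is exactly the kind of internal, proximal achievement the paragraph above warns about.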
The Big Data journey is full of adventure. Sometimes it isn't about knowing exactly what to solve and then solving it; it's about not knowing what to solve and coming to grips with that. An organization might not know what it should have known until after the fact. Still, there are techniques for big data deployment that make the last mile run smoothly, and businesses have to walk the talk and keep learning about future possibilities. An organization would be quite a market force if it can run the first mile smoothly (knowing what data to collect) and finish the last mile solidly (making effective use of it from the standpoint of strategic adaptation).