Big Data: The Number of Failed Projects is Huge, But Technology is Seldom the Culprit

By Dick Weisinger

In 2014, Gartner analyst Roxane Edjlali predicted that 60 percent of Big Data projects would fail before making it into production because they could not demonstrate value.

Now, more than three years later, Gartner says the problem is much worse than thought: the actual failure rate is likely closer to 85 percent of Big Data projects.

Nick Heudecker, a Gartner analyst, told TechRepublic that the reasons for failure vary: difficulty integrating with existing business processes and applications, resistance from management, internal politics, a lack of skills, and security and governance challenges.

A New Vantage Partners report on Big Data came to a similar conclusion: the problem “lies in the apparent difficulty of organizational and cultural change around Big Data… Big Data technology itself is not the problem; management understanding, organizational alignment, and general organizational resistance are the culprits. If only people were as malleable as data.”
