IT - Big Data

So your big data project isn't panning out the way you wanted? You're not alone.

The poor success rate of big data projects has been a persistent theme over the past 10 years, and the same types of struggles are showing up in AI projects too. While a 100% success rate isn't a feasible goal, there are some tweaks you can make to get more out of your data investments.

As the world generates more data, it also comes to rely on data more heavily, and companies that don't embrace data-driven decision making risk falling further behind. Fortunately, the sophistication of data collection, storage, management, and analysis has advanced dramatically over the past 10 years, and studies show that companies with the most advanced data capabilities generate higher revenues than their peers.


With organizations increasingly turning to data science to derive business value, the tools that support the work are proliferating. Here are the key tools successful data scientists rely on.

The boom in data science continues unabated. The work of gathering and analyzing data was once reserved for a few scientists back in the lab. Now every enterprise wants to harness the power of data science to streamline its operations and keep customers happy.

The world of data science tools is growing to support this demand. Just a few years ago, data scientists worked with the command line and a few good open source packages. Now companies are creating solid, professional tools that handle many of the common chores of data science, such as cleaning up the data.
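To give a flavor of what those tools automate, here is a minimal sketch of routine data cleanup in Python. The use of pandas and the sample fields are our own illustrative choices, not tools or data named in the article.

```python
# A minimal sketch of the routine cleanup chores that newer data science tools
# automate: deduplication, type coercion, and dropping incomplete records.
# The library (pandas) and the sample data are illustrative assumptions.
import pandas as pd

# Hypothetical raw customer records with common quality problems.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "signup_date": ["2024-01-05", "2024-01-06", "2024-01-06", None, "2024-02-30"],
    "spend": ["100.5", "20", "20", "NaN", "75.25"],
})

# Drop exact duplicate rows.
clean = raw.drop_duplicates().copy()

# Coerce columns to proper types; invalid values become NaT/NaN instead of raising errors.
clean["signup_date"] = pd.to_datetime(clean["signup_date"], errors="coerce")
clean["spend"] = pd.to_numeric(clean["spend"], errors="coerce")

# Keep only rows with the required fields present after coercion.
clean = clean.dropna(subset=["signup_date", "spend"])

print(clean)
```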


Analysts such as Gartner are claiming that data fabric is the future of data management. But, in fact, the future is already here.

We see many signs of market maturity, ranging from total-addressable-market projections to vendors pushing ROI. Data fabric's unique ability to integrate enterprise data and reduce the need for repetitive tasks in data discovery, analysis, and implementation is the reason many believe this will be the breakout year for the modern data integration approach.

Gartner defines data fabric as a design concept that serves as an integrated layer, or fabric, of data and connecting processes. A data fabric enables data that is dispersed across various locations and used by different applications to be accessed and analyzed in real time within a unifying data layer, under the same management and security. It does this by leveraging both human and machine capabilities.
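To make the "unifying data layer" idea concrete, here is a minimal, hypothetical Python sketch: dispersed sources are registered behind one query interface, and the same access policy is enforced across all of them. The class and method names are invented for illustration and do not come from Gartner or any particular vendor.

```python
# A hypothetical sketch of a data-fabric-style access layer: one entry point
# that routes queries to data sources living in different systems and applies
# a shared security policy. Names and structure are illustrative assumptions.
from typing import Callable, Dict, List, Set


class DataFabric:
    def __init__(self) -> None:
        self._sources: Dict[str, Callable[[str], List[dict]]] = {}
        self._permissions: Dict[str, Set[str]] = {}

    def register_source(self, name: str, query_fn: Callable[[str], List[dict]],
                        allowed_roles: Set[str]) -> None:
        """Plug a data source (warehouse, SaaS API, lake, ...) into the unified layer."""
        self._sources[name] = query_fn
        self._permissions[name] = allowed_roles

    def query(self, source: str, statement: str, role: str) -> List[dict]:
        """Run a query through the fabric, enforcing the shared access policy."""
        if role not in self._permissions.get(source, set()):
            raise PermissionError(f"role '{role}' may not query '{source}'")
        return self._sources[source](statement)


# Two dispersed "systems" exposed through the same layer.
fabric = DataFabric()
fabric.register_source("crm", lambda q: [{"customer": "Acme", "query": q}],
                       allowed_roles={"analyst", "admin"})
fabric.register_source("finance", lambda q: [{"invoice": 42, "query": q}],
                       allowed_roles={"admin"})

print(fabric.query("crm", "recent signups", role="analyst"))
```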
