Chris J Powell

Mastering "Medium Data" Before Trying to Tackle Big Data

On Wednesday a very interesting article was posted on the Harvard Business Review titled "Nonprofits: Master “Medium Data” Before Tackling Big Data." The basic gist of the article is really about learning how to walk before trying to run.  Nonprofits are not the only types of organizations that can take a lesson from this concept, though; data tends to be a fickle mistress.

Everything about Big Data needs a new Alpha and Omega; many experts recommend rebuilding the entire architecture when moving to Big Data. I agree for the most part, but it also needs the guidance of a skilled mind: knowing what the objective is and having a plan to get there.  The leap can be simple and elegant (heck, I did it multiple times on my own, with little or no support), but taking a hard look at the data you have today and foresee having in the near future is the real benefit of understanding your Medium Data problems before jumping in and creating a new mess with a Big Data problem!

Data Warehousing (and this does not have to be on a huge scale) can be a huge benefit to any organization.  When you look at all the sources of data (and this is by far the biggest challenge of Big Data, in my opinion), it takes an Architect to build out the structure and the connections that enable the predictive nature of Big Data Analytics to function.

A simple definition of a Data Warehouse, like the one illustrated above from Wikipedia, focuses on having the plan and tracing the flow from the sources, to the integration layer, to the warehouse, and out to the different outputs (Data Marts and Strategic Marts).  When I get talking data with my customers, there is always an aha moment when they realize that my talk about not being a typical Sales Guy is not just a sales pitch in itself…I truly enjoy talking about how things are structured and the vast differences from company to company and industry to industry.
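To make that flow concrete, here is a minimal sketch of the source-to-mart idea in Python. All of the system names, fields, and numbers are hypothetical examples, not anything from a real deployment:

```python
# Sketch of the classic flow: sources -> integration -> warehouse -> data mart.

# 1. Sources: two operational systems with inconsistent field names.
crm_records = [
    {"cust": "Acme", "region": "East", "amt": 1200.0},
    {"cust": "Globex", "region": "West", "amt": 800.0},
]
erp_records = [
    {"customer_name": "Acme", "sales_region": "East", "total": 450.0},
]

# 2. Integration layer: map each source into one common schema.
def integrate(record, mapping):
    """Rename a source record's fields to the warehouse's standard names."""
    return {standard: record[source] for source, standard in mapping.items()}

# 3. The warehouse holds the integrated, standardized rows.
warehouse = []
for r in crm_records:
    warehouse.append(
        integrate(r, {"cust": "customer", "region": "region", "amt": "amount"})
    )
for r in erp_records:
    warehouse.append(
        integrate(r, {"customer_name": "customer", "sales_region": "region", "total": "amount"})
    )

# 4. Data mart: a subject-oriented rollup (total sales by region).
sales_by_region = {}
for row in warehouse:
    sales_by_region[row["region"]] = sales_by_region.get(row["region"], 0.0) + row["amount"]

print(sales_by_region)  # {'East': 1650.0, 'West': 800.0}
```

The real work in a warehouse project is almost entirely in step 2: agreeing on that one common schema across every source you own.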

The challenge with the typical Data Warehouse is that it is well suited for “Medium Data,” but when the move to Big Data arrives, a few other layers of complexity are thrown in. Understanding the concepts behind the warehouse model does allow for a speedier transition to a model that often looks far more confusing than it really is…and that is the power of a non-relational data block.
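As a rough illustration of what "non-relational" means here, compare a flat warehouse-style row with a nested document holding the same information. This is a generic sketch with made-up field names, not any particular Big Data product:

```python
# A flat, relational-style row, as a traditional warehouse would store it:
flat_row = {"customer": "Acme", "region": "East", "amount": 1200.0}

# The same customer as a non-relational "document": nested, schema-flexible,
# and able to carry fields that vary from record to record.
document = {
    "customer": "Acme",
    "orders": [
        {"region": "East", "amount": 1200.0, "tags": ["priority"]},
        {"region": "East", "amount": 450.0},  # no "tags" field, and that's fine
    ],
}

# When you do need relational-style rows, you flatten the document back out.
rows = [
    {"customer": document["customer"], "region": o["region"], "amount": o["amount"]}
    for o in document["orders"]
]
print(rows)
```

The confusing-looking part of most Big Data architectures is really just this: the data lives in the flexible nested form, and the flattening happens on demand instead of up front.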

The “typical” Big Data architecture model above comes from one of my favorite sources of inspiration, the Venu Anuganti Blog.

Don’t be overly intimidated by Big Data…it is not an onerous beast that has come to steal your firstborn…it is an amazing resource that is sitting there waiting to be realized for its great potential.  I firmly believe that there is no type of organization (regardless of size) that cannot benefit from looking to Big Data, and it does not necessarily need to cost as much as Watson did (estimates show the hardware alone cost IBM more than $3 million). I did locate an interesting source that has a pretty solid walk-through on how to build a Watson Jr. all your own!

Take the first step to data dominance: instead of trying to master the world of Big Data, just take a look at your Medium Data (even if its scale and size would make you think Big Data) and move from there.

