
TeleTech Blog

Starting Your Big Data Journey Begins With a Lab


Big Data is everywhere. Although the jury may still be out on the ROI of Big Data initiatives, one thing is clear: Big Data has the potential to add significant value across almost every area of the enterprise.

Big Data is also constantly evolving, which means the plethora of technology options out there is both confusing and ever changing. Making sense of Big Data and ensuring it adds value also requires a scarce set of skills that lives at the intersection of business, technology, and advanced analytics.

Given all of the above, you may be wondering how best to start the Big Data journey. My recommendation is to start small until you have proven its effectiveness. Instead of investing right away in an enterprise-scale Big Data platform, firms can first set up a Big Data Lab to prove its value on multiple use cases and to prioritize applications of Big Data across the enterprise.

There are too many examples of large enterprises investing in a Hadoop platform simply because they thought it was necessary, and then realizing, many millions of dollars later, that the ROI is worse than that of their existing data and analytics infrastructure. The best approach is to first conduct several pilots in a Big Data Lab before productionizing. To be effective, the goals of the lab should be to:

  • Deliver “Quick Wins” to demonstrate the value of Big Data technologies, analytics, and strategy from both an IT and business perspective. 
  • Create a “Proof-of-Concept” that shows how Big Data thinking can be integrated into the existing enterprise, both from an IT architecture and a business process perspective.
  • Develop a roadmap for the business to act on Big Data at scale.
  • Establish a “Big Data Innovation Center” within the company for Big Data and analytics skill-building, and for testing new technologies and Big Data applications.
  • Provide Big Data thought leadership throughout the enterprise. 

Components of a Big Data Lab

There are two critical components of a valuable Big Data Lab – people and technology – each described below.

People: An effective Big Data Lab requires three types of people:
 
  1. Software architects and developers well versed in the vast and ever-changing open source landscape. In today’s world, familiarity with Spark and Hadoop is key.
  2. Advanced analytics professionals who are experienced in writing machine learning and statistical algorithms at scale.
  3. Business liaisons who are adept at focusing on key strategic objectives and have C-suite access and influence. 

Technology: A Big Data Lab should leverage open source software and platforms as much as possible to stay nimble, foster efficiency, and drive innovation. It should provide access to technologies in the following areas:
 
  • Data collection -- batch and real-time
  • Data storage
  • Data curation and management -- batch and real-time
  • Data processing -- batch and real-time
  • Data exploration and visualization
  • Unstructured data processing 
  • Data analytics and machine learning
  • Event scoring
  • Channel adapters to integrate with enterprise systems (e.g., CRM, ERP, etc.)
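To make the batch-processing style of platforms like Hadoop and Spark concrete, here is a toy sketch of the map/shuffle/reduce pattern they execute at scale. It is written in plain Python purely for illustration (a real lab would run this on a cluster); the function names and sample data are invented for this example.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every line.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    # Shuffle: group all emitted counts by key (word).
    groups = defaultdict(list)
    for word, count in pairs:
        groups[word].append(count)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data adds value", "big data labs prove value"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
# counts: {'big': 2, 'data': 2, 'adds': 1, 'value': 2, 'labs': 1, 'prove': 1}
```

Frameworks like Spark generalize this pattern: the map and reduce steps run in parallel across many machines, and the shuffle moves data between them.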


There is a lot of opportunity for companies to leverage Big Data across their organizations, and those that do will reap great rewards.

While committing to a Big Data journey is a smart move, it’s important to do so in a way that protects your company from losing millions of dollars in unnecessary technology costs. A Big Data Lab can help.


Like this? Subscribe to our blog here.

Also, check out the most recent issue of our eNewsletter.