DWH Optimization with Hadoop – 1-day workshop

Enterprises are looking for fresher data – from daily, to hourly, to real-time – as well as access to data from more sources and for longer periods of time. And they need it faster and cheaper. Meanwhile, traditional approaches to processing data in the data warehouse (ELT) can't keep pace, and data warehouse costs are exploding along with data volumes.

One emerging strategy is data warehouse optimization using Hadoop as an enterprise data hub to augment an existing warehouse infrastructure. By deploying the Hadoop framework to stage and process raw or rarely used data, you can reserve the warehouse for high-value information frequently accessed by business users.
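To make the pattern concrete, the sketch below shows one common offload step: pulling rarely accessed historical rows out of the warehouse over JDBC with Spark and landing them on HDFS as partitioned Parquet, where Hadoop SQL engines such as Hive or Impala can query them. This is a minimal illustration under assumed conditions – the sales_history table, JDBC endpoint, credentials and paths are hypothetical placeholders, not part of the workshop material.

    // Minimal Spark (Scala) sketch of a DWH offload job. All table names,
    // endpoints and paths below are hypothetical placeholders.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, year}

    object WarehouseOffload {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("dwh-offload")
          .getOrCreate()

        // Read cold, rarely queried history out of the warehouse over JDBC.
        val cold = spark.read
          .format("jdbc")
          .option("url", "jdbc:postgresql://dwh-host:5432/warehouse") // hypothetical endpoint
          .option("dbtable",
            "(SELECT * FROM sales_history WHERE sale_date < DATE '2014-01-01') AS cold") // hypothetical table
          .option("user", "etl_user")
          .option("password", sys.env("DWH_PASSWORD"))
          .load()

        // Land the rows on HDFS as Parquet, partitioned by year, so Hive or
        // Impala can query them while the warehouse keeps only hot data.
        cold.withColumn("sale_year", year(col("sale_date")))
          .write
          .mode("overwrite")
          .partitionBy("sale_year")
          .parquet("hdfs:///data/offload/sales_history")

        spark.stop()
      }
    }

After a run like this, the matching rows can be retired from the warehouse, reserving its capacity for the high-value data business users query most.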

Big Industries can guide your organization through offloading your DWH to Hadoop. We engage with you to design an architecture blueprint for augmenting your legacy DWH to increase capacity, maximize productivity and lower cost.

Workshop Scope

The workshop is delivered over two half-days.
First half day:

  • Overview of Hadoop
  • Overview of typical Hadoop EDWH offload strategies
  • Identification of key stakeholder needs and critical success factors for the EDWH
  • Gap analysis of the as-is environment (key areas for improvement)

Second half day:

  • Presentation of findings
  • Structure and prioritize the proposed architectural description, goals and transition plan

Workshop Deliverables

  • Gap analysis of the as-is environment, with key areas for improvement
  • Proposed architecture blueprint for augmenting the existing DWH with Hadoop
  • Prioritized goals and transition plan

Who should attend:

  • Business users
  • Technology procurement
  • BICC team
  • Enterprise/Data architects

The engagement is delivered by Rob Gibbon, an experienced Solution Architect with extensive applied knowledge of Hadoop and Hadoop ecosystem technologies.

Big Industries can run this workshop in our training facilities or at your premises.