Designing Clusters

Apache Hadoop 1.0    |    Intermediate
  • 6 videos | 32m 23s
  • Includes Assessment
  • Earns a Badge
Rating: 4.2 (6 users)
Hadoop is a framework that provides fast, reliable analysis of large data sets. This course introduces supercomputing and explores the design principles behind using Hadoop as a supercomputing platform.

WHAT YOU WILL LEARN

  • Describe the principles of supercomputing
  • Recall the roles and skills needed for the Hadoop engineering team
  • Recall the advantages and shortcomings of using Hadoop as a supercomputing platform
  • Describe the three axioms of supercomputing
  • Describe the "dumb hardware and smart software" and "share nothing" design principles
  • Describe the "move processing, not data," "embrace failure," and "build applications, not infrastructure" design principles

IN THIS COURSE

  1.  5m 50s
  2.  5m 43s
  3.  Exploring Big Data Solutions — 6m 41s
  4.  Examining Axioms of Supercomputing — 6m 36s
  5.  Exploring Design Principles for Hadoop — 2m 57s
  6.  Examining Additional Design Principles — 4m 37s

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft gives you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
