Designing Clusters
Apache Hadoop 1.0 | Intermediate
- 6 videos | 32m 23s
- Includes Assessment
- Earns a Badge
Hadoop is a framework providing fast and reliable analysis of large data sets. Introduce yourself to supercomputing, and explore the design principles of using Hadoop as a supercomputing platform.
WHAT YOU WILL LEARN
- Describe the principles of supercomputing
- Recall the roles and skills needed for the Hadoop engineering team
- Recall the advantages and shortcomings of using Hadoop as a supercomputing platform
- Describe the three axioms of supercomputing
- Describe the dumb hardware and smart software, and the share nothing design principles
- Describe the design principles for move processing not data, embrace failure, and build applications not infrastructure
IN THIS COURSE
- 5m 50s
- 5m 43s
- 6m 41s
- 6m 36s
- 2m 57s
- 4m 37s
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.