Clusters
Apache Hadoop
| Intermediate
- 8 videos | 48m 2s
- Includes Assessment
- Earns a Badge
Clusters are used to store and analyze large volumes of data in a distributed computing environment. Explore the best practices to follow when implementing clusters in Hadoop.
WHAT YOU WILL LEARN
- Configure an Ubuntu server with SSH and Java for Hadoop
- Set up Hadoop on a single node
- Set up Hadoop on four nodes
- Describe the different cluster configurations, including single-rack deployments, three-rack deployments, and large-scale deployments
- Add a new node to an existing Hadoop cluster
- Format HDFS and configure common options
- Run an example MapReduce job to perform a word count
- Start a Hadoop cluster and run a MapReduce job
IN THIS COURSE
- 6m 9s: In this video, find out how to configure an Ubuntu server with SSH and Java for Hadoop.
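Preparing the host typically means installing a JDK, enabling passwordless SSH (Hadoop's control scripts use it), and exporting `JAVA_HOME`. A minimal sketch, assuming a recent Ubuntu release and OpenJDK 8; package names and paths may differ from those used in the course:

```shell
# Install a JDK and the SSH server (assumed versions, not from the course).
sudo apt-get update
sudo apt-get install -y openjdk-8-jdk openssh-server

# Hadoop's start/stop scripts need passwordless SSH to each node.
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Point Hadoop at the JDK; the path varies by architecture and version.
echo "export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64" >> ~/.bashrc
```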
- 7m 40s: In this video, you will learn how to set up Hadoop on a single node.
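A single-node (pseudo-distributed) setup mainly amounts to pointing `fs.defaultFS` at localhost. A sketch of the relevant `core-site.xml` fragment, assuming `$HADOOP_HOME` is set and port 9000 is used (the course may choose a different port):

```shell
# Minimal core-site.xml for a pseudo-distributed single node.
cat > $HADOOP_HOME/etc/hadoop/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
```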
- 8m 58s: Find out how to set up Hadoop on four nodes.
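In a multi-node cluster, every node points at the same NameNode and the master lists its workers. A sketch assuming hypothetical hostnames `master` and `worker1`–`worker3` (not from the course); Hadoop 3.x uses a `workers` file, while older releases call it `slaves`:

```shell
# On the master, list the worker hosts (Hadoop 3.x).
cat > $HADOOP_HOME/etc/hadoop/workers <<EOF
worker1
worker2
worker3
EOF

# Every node's core-site.xml names the same NameNode, e.g.:
#   <property>
#     <name>fs.defaultFS</name>
#     <value>hdfs://master:9000</value>
#   </property>
```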
- 3m 52s: Upon completion of this video, you will be able to describe the different cluster configurations, including single-rack deployments, three-rack deployments, and large-scale deployments.
- 4m 50s: In this video, you will add a new node to an existing Hadoop cluster.
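Adding a node boils down to registering it with the master and starting its daemons. A sketch for a hypothetical new host `worker4` that has already been prepared with Java, SSH, and the same Hadoop build (Hadoop 3.x command style):

```shell
# 1. On the master, register the new host.
echo "worker4" >> $HADOOP_HOME/etc/hadoop/workers

# 2. On worker4, start the HDFS and YARN worker daemons.
hdfs --daemon start datanode
yarn --daemon start nodemanager

# 3. Confirm the new DataNode has joined the cluster.
hdfs dfsadmin -report
```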
- 5m 16s: Find out how to format HDFS and configure common options.
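The common options live in `core-site.xml` and `hdfs-site.xml`; formatting initializes the NameNode's metadata directory and is done once before the first start. A sketch; the property values shown are illustrative, not from the course:

```shell
# Commonly configured options (illustrative values):
#   fs.defaultFS            -> hdfs://master:9000      (core-site.xml)
#   dfs.replication         -> 3                       (hdfs-site.xml)
#   dfs.namenode.name.dir   -> local dir for NameNode metadata
#   dfs.datanode.data.dir   -> local dirs for block storage

# Format HDFS ONCE, before the first start; reformatting a live
# cluster wipes the NameNode's metadata.
hdfs namenode -format
```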
- 6m 9s: Learn how to run an example MapReduce job to perform a word count.
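The word-count job ships with Hadoop in the bundled examples jar. A sketch of running it against a file already on the local disk; the jar version and paths are assumptions, not from the course:

```shell
# Stage input in HDFS (localfile.txt is a placeholder name).
hdfs dfs -mkdir -p /user/$USER/input
hdfs dfs -put localfile.txt /user/$USER/input

# Run the bundled word-count example (jar version is illustrative).
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.6.jar \
    wordcount /user/$USER/input /user/$USER/output

# Inspect the reducer output.
hdfs dfs -cat /user/$USER/output/part-r-00000
```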
- 5m 9s: In this video, you will learn how to start a Hadoop cluster and run a MapReduce job.
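Bringing the cluster up uses Hadoop's control scripts, run from the master. A sketch assuming a Hadoop 3.x layout with `$HADOOP_HOME/sbin` on the `PATH`:

```shell
# Start HDFS (NameNode, DataNodes, SecondaryNameNode) and YARN
# (ResourceManager, NodeManagers) from the master node.
start-dfs.sh
start-yarn.sh

# Verify the daemons are running before submitting a job.
jps
```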
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft gives you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.