Loading & Querying Data with Hive
Apache Hive 2.3.2 | Beginner
- 13 videos | 1h 19m 23s
- Includes Assessment
- Earns a Badge
One of the most popular data warehouse technologies used for data science, Apache Hive simplifies working with large data sets in files by representing them as tables. In this 12-video Skillsoft Aspire course, learners explore how to create, load, and query Hive tables. For this hands-on course, learners should have a conceptual understanding of Hive and its basic components, prior experience querying data from tables with SQL (Structured Query Language), and familiarity with the command line. Key concepts covered include clusters, joining tables, and modifying tables. Demonstrations include using the Beeline client for Hive for simple operations, and creating tables, loading them with data, and running queries against them. Only tables with primitive data types are used here, with data loaded into these tables from HDFS (the Hadoop Distributed File System) and from local machines. Learners will also work with the Hive metastore and with temporary tables, and see how each can be used. By the end of the course, you will be familiar with the basics of the Hive query language and comfortable working with HDFS.
WHAT YOU WILL LEARN
- Use the Google Cloud Platform's Dataproc service to provision a Hadoop cluster
- Define and create a simple table in Hive using the Beeline client
- Load a few rows of data into a table and query it with simple SELECT statements
- Run Hive queries from the shell of a host where a Hive client is installed
- Define and run a join query involving two related tables
- Describe the structure of the Hive metastore on the Hadoop Distributed File System (HDFS)
- Create, load data into, and query an external table in Hive and contrast it with a Hive-managed table
- Use the ALTER TABLE statement to change the definition of a Hive table
- Work with temporary tables that are only valid for a single Hive session and recognize how they differ from regular tables
- Populate Hive tables with data in files on both HDFS and the file system of the Hive client
- Load data into multiple tables from the contents of another table
- Use the Hadoop shell to execute Hive query scripts and work with Hive tables
IN THIS COURSE
- 2m 14s
- 6m 22s: In this video, you will learn how to use the Google Cloud Platform's Dataproc service to create a Hadoop cluster.
- 6m 23s: Find out how to define and create a simple table in Hive using the Beeline client.
- 7m 15s: Learn how to load a few rows of data into a table and query it with simple SELECT statements.
- 3m 56s: In this video, find out how to run Hive queries from the shell of a host where a Hive client is installed.
- 4m 22s: In this video, learn how to define and run a join query involving two tables that are related to each other.
- 9m 23s: After completing this video, you will be able to describe the structure of the Hive metastore on the Hadoop Distributed File System.
- 8m 52s: In this video, you will learn how to create, load data into, and query an external table in Hive, and how it differs from a Hive-managed table.
- 5m 37s: In this video, you will use the ALTER TABLE statement to change the definition of a Hive table.
- 5m 42s: In this video, you will work with temporary tables that are only valid for a single Hive session and recognize how they differ from regular tables.
- 9m 24s: In this video, you will populate Hive tables with data in files on both HDFS and the file system of the Hive server.
- 4m 15s: In this video, find out how to load data into multiple tables from the contents of another table.
- 5m 38s: In this video, you will learn how to use the Hadoop shell to execute Hive query scripts and work with Hive tables.
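EXAMPLE SKETCHES
The short sketches below illustrate the kinds of commands and HiveQL statements behind each topic in the outline above. They are rough, hypothetical sketches rather than material from the course; cluster names, JDBC URLs, table and column names, and file paths are all placeholders.

Provisioning the Hadoop cluster: a minimal sketch of creating a Dataproc cluster from the gcloud CLI (the course may use the Cloud Console instead); the cluster name, region, and worker count are assumptions.

    # Create a small managed Hadoop/Hive cluster on Google Cloud Dataproc
    gcloud dataproc clusters create my-hive-cluster \
        --region=us-central1 \
        --num-workers=2

    # Delete the cluster when finished to stop incurring charges
    gcloud dataproc clusters delete my-hive-cluster --region=us-central1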
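Creating a simple table with Beeline: a sketch assuming a default HiveServer2 JDBC URL and a hypothetical employees table with primitive column types.

    -- Connect first with: beeline -u jdbc:hive2://localhost:10000
    CREATE TABLE employees (
        id      INT,
        name    STRING,
        salary  DOUBLE,
        dept_id INT
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

    -- Confirm the table definition
    DESCRIBE employees;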
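Loading a few rows and querying them: a sketch against the hypothetical employees table above; INSERT ... VALUES is supported in Hive 2.x.

    -- Insert a handful of rows directly
    INSERT INTO employees VALUES
        (1, 'Alice', 75000.0, 10),
        (2, 'Bob',   62000.0, 20);

    -- Simple SELECT statements
    SELECT * FROM employees;
    SELECT name, salary FROM employees WHERE salary > 70000;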
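Running queries from the shell: a sketch of non-interactive Beeline calls from a host with a Hive client installed; the JDBC URL and script name are placeholders.

    # Run a single query and exit
    beeline -u jdbc:hive2://localhost:10000 -e "SELECT COUNT(*) FROM employees;"

    # Run a file of HiveQL statements
    beeline -u jdbc:hive2://localhost:10000 -f queries.hql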
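Joining two related tables: a sketch that adds a hypothetical departments table related to employees through dept_id.

    CREATE TABLE departments (
        dept_id   INT,
        dept_name STRING
    );

    INSERT INTO departments VALUES (10, 'Engineering'), (20, 'Marketing');

    -- Inner join on the related column
    SELECT e.name, e.salary, d.dept_name
    FROM employees e
    JOIN departments d
      ON e.dept_id = d.dept_id;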
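Inspecting table storage on HDFS: Hive-managed tables keep their data under the warehouse directory (by default /user/hive/warehouse), with the metastore recording the metadata; a sketch of browsing that layout with the HDFS shell, assuming default paths.

    # List the warehouse directory that backs Hive-managed tables
    hdfs dfs -ls /user/hive/warehouse

    # A table in the default database appears as a subdirectory holding its data files
    hdfs dfs -ls /user/hive/warehouse/employees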
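External vs. managed tables: a sketch of an external table whose data files live at a caller-chosen HDFS location; the location and schema are hypothetical.

    -- Hive stores only metadata for an external table; dropping it leaves
    -- the files at the LOCATION untouched, unlike a managed table
    CREATE EXTERNAL TABLE transactions_ext (
        txn_id INT,
        amount DOUBLE
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    LOCATION '/data/transactions';

    SELECT COUNT(*) FROM transactions_ext;

    -- Removes the table definition only; /data/transactions remains on HDFS
    DROP TABLE transactions_ext;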
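Altering a table definition: a few common ALTER TABLE forms, applied to the hypothetical employees table.

    -- Add a new column
    ALTER TABLE employees ADD COLUMNS (hire_date STRING);

    -- Rename a column and restate its type
    ALTER TABLE employees CHANGE COLUMN name full_name STRING;

    -- Attach a table-level property
    ALTER TABLE employees SET TBLPROPERTIES ('comment' = 'employee master data');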
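Temporary tables: a sketch of a session-scoped table created from a query; it is visible only to the current session and disappears when the session ends.

    CREATE TEMPORARY TABLE high_earners AS
    SELECT * FROM employees WHERE salary > 70000;

    SELECT COUNT(*) FROM high_earners;
    -- Once the Beeline session is closed, high_earners no longer exists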
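Loading files from HDFS and from the local file system: a sketch of LOAD DATA with and without the LOCAL keyword; the file paths are placeholders.

    -- File already on HDFS: it is moved into the table's warehouse directory
    LOAD DATA INPATH '/data/staging/employees.csv' INTO TABLE employees;

    -- File on the local file system of the Hive host: LOCAL copies it instead
    LOAD DATA LOCAL INPATH '/home/hadoop/employees.csv' INTO TABLE employees;

    -- OVERWRITE replaces the table's existing contents rather than appending
    LOAD DATA INPATH '/data/staging/employees.csv' OVERWRITE INTO TABLE employees;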
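Loading multiple tables from one source: a sketch of Hive's multi-table INSERT, which scans the source table once; the high_paid and low_paid target tables are hypothetical and assumed to already exist with matching columns.

    FROM employees e
    INSERT OVERWRITE TABLE high_paid SELECT e.id, e.name WHERE e.salary > 70000
    INSERT OVERWRITE TABLE low_paid  SELECT e.id, e.name WHERE e.salary <= 70000;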
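Working from the Hadoop shell: a sketch of staging files with the Hadoop file system commands and running a HiveQL script non-interactively; the script and file names are placeholders.

    # Stage a data file on HDFS with the Hadoop shell
    hadoop fs -mkdir -p /data/staging
    hadoop fs -put employees.csv /data/staging/

    # Execute a HiveQL script, then inspect the table's files in the warehouse
    hive -f load_employees.hql
    hadoop fs -ls /user/hive/warehouse/employees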
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses. The badge can be shared on any social network or business platform.
Digital badges are yours to keep, forever.