Practical Apache Spark: Using the Scala API

  • 1h 53m
  • Dharanitharan Ganesan, Subhashini Chellappan
  • Apress
  • 2018

Work with Apache Spark using Scala to deploy and set up single-node, multi-node, and high-availability clusters. This book discusses the various components of Spark, such as Spark Core, DataFrames, Datasets and SQL, Spark Streaming, Spark MLlib, and R on Spark, with practical code snippets for each topic. Practical Apache Spark also covers the integration of Apache Spark with Kafka, with examples. You’ll follow a learn-by-doing approach: learn the concepts, practice the code snippets in Scala, and complete the given assignments to gain broad hands-on exposure.

On completion, you’ll have knowledge of the functional programming aspects of Scala and hands-on expertise in the various Spark components. You’ll also become familiar with machine learning algorithms and their use in real-time applications.

What You Will Learn

  • Discover the functional programming features of Scala
  • Understand the complete architecture of Spark and its components
  • Integrate Apache Spark with Hive and Kafka
  • Use Spark SQL, DataFrames, and Datasets to process data using traditional SQL queries (see the brief sketch after this list)
  • Work with different machine learning concepts and libraries using Spark's MLlib packages
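To give a flavor of the Spark SQL and DataFrames topics listed above, here is a minimal Scala sketch (not taken from the book) that builds a DataFrame from sample data and queries it with a traditional SQL statement; the application name, master setting, and sample data are illustrative assumptions.

    import org.apache.spark.sql.SparkSession

    object SparkSqlSketch {
      def main(args: Array[String]): Unit = {
        // Local session for experimentation; app name and master are illustrative.
        val spark = SparkSession.builder()
          .appName("spark-sql-sketch")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // A small, made-up dataset converted to a DataFrame.
        val people = Seq(("Alice", 34), ("Bob", 28)).toDF("name", "age")

        // Register a temporary view so it can be queried with plain SQL.
        people.createOrReplaceTempView("people")
        spark.sql("SELECT name FROM people WHERE age > 30").show()

        spark.stop()
      }
    }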

Who This Book Is For

Developers and professionals who deal with batch and stream data processing.

In this Book

  • Scala: Functional Programming Aspects
  • Single and Multinode Cluster Setup
  • Introduction to Apache Spark and Spark Core
  • Spark SQL, DataFrames, and Datasets
  • Introduction to Spark Streaming
  • Spark Structured Streaming
  • Spark Streaming with Kafka
  • Spark Machine Learning Library
  • Working with SparkR
  • Spark Real-Time Use Case
