2.92 out of 5
6 reviews on Udemy

Apache Spark 2 for Beginners

Get to grips with data processing using Spark, Python, and Scala. A complete beginner's guide!
Instructor:
Packt Publishing
56 students enrolled
English [Auto-generated]
Understand the fundamentals of Apache Spark
Process and display data with Python and Scala
Stream processing, machine learning and graph processing
Develop a complete Spark application

No matter where you are in your coding journey, this course will get you up and running with Apache Spark, taking you from installation and configuration to power user with 5.5 hours of top-quality video tutorials.

The first chapters are a step-by-step guide through the fundamentals of Spark programming, covering DataFrames, aggregations, and Datasets.

Next, you’ll dive into what you can do with all the data you collect using Spark: filtering results with R and exposing your data to Python for deeper processing and presentation with charts and graphs. After that, you’ll go further into the capabilities of Spark’s stream processing, machine learning, and graph processing libraries.

The last chapter combines all the skills you learned in the preceding chapters to develop a real-world Spark application. By the end of this course, you will be able to consolidate data processing, stream processing, machine learning, and graph processing into one unified and highly interoperable framework with a uniform API, using Scala or Python.

About The Author

Rajanarayanan Thottuvaikkatumana, Raj, is a seasoned technologist with more than 23 years of software development experience at various multinational companies. He has lived and worked in India, Singapore, and the USA, and is presently based out of the UK. His experience includes architecting, designing, and developing software applications. He has worked on various technologies including major databases, application development platforms, web technologies, and big data technologies. Since 2000, he has been working mainly in Java related technologies, and does heavy-duty server-side programming in Java and Scala. He has worked on very highly concurrent, highly distributed, and high transaction volume systems. Currently he is building a next generation Hadoop YARN-based data processing platform and an application suite built with Spark using Scala.

Raj holds a master’s degree in Mathematics and a master’s degree in Computer Information Systems, and has many certifications in ITIL and cloud computing to his credit. Raj is the author of Cassandra Design Patterns – Second Edition, published by Packt.

When not working on the assignments his day job demands, Raj is an avid listener of classical music and watches a lot of tennis.

Spark Fundamentals

1
The Course Overview

This video gives an overview of the entire course.

2
An Overview of Apache Hadoop

This video will take you through an overview of Apache Hadoop. You will also explore the Apache Hadoop framework and the MapReduce process.

3
Understanding Apache Spark

By the end of this video, you will have learned about Spark and its advantages in depth. You will also go through the Spark libraries and then dive into the Spark programming paradigm.

4
Installing Spark on Your Machines

In this video, you will learn how to install Python and R. Finally, you will set up the Spark environment on your machine.

Spark Programming Model

1
Functional Programming with Spark and Understanding Spark RDD

Understand how side effects in program logic can prevent a program or function from producing consistent results, making many applications very complex, and how Spark's functional programming model and RDDs avoid this problem.

2
Data Transformations and Actions with RDDs

Learn to process data using RDDs built from the relevant data source, such as text files and NoSQL data stores.

3
Monitoring with Spark

Learn to use the tools for monitoring the jobs running in a given Spark ecosystem.

4
The Basics of Programming with Spark

Understand the core concepts of Spark programming, starting from the elementary data items it operates on.

5
Creating RDDs from Files and Understanding the Spark Library Stack

Learn to choose the appropriate Spark connector program and the appropriate API for reading data.

Spark SQL

1
Understanding the Structure of Data and the Need for Spark SQL

What if you cannot make use of the RDD-based Spark programming model because it requires some amount of functional programming? The solution is Spark SQL, which you will learn about in this video.

2
Anatomy of Spark SQL

This video will take you through the structure and internal workings of Spark SQL. 

3
DataFrame Programming

This video will demonstrate the two DataFrame programming models, one using SQL queries and the other using the DataFrame API for Spark.

4
Understanding Aggregations and Multi-Datasource Joining with SparkSQL

Spark SQL allows the aggregation of data. Instead of running SQL statements on a single data source located on a single machine, you can use Spark SQL to do the same on distributed data sources.

5
Introducing Datasets and Understanding Data Catalogs

This video will show you the methods used to create a Dataset, along with its usage, the conversion of an RDD to a DataFrame, and the conversion of a DataFrame to a Dataset. You will also learn the usage of the Catalog API in Scala and Python.

Spark Programming with R

1
The Need for SparkR and the Basics of the R Language

This video will help you understand the need for SparkR and the basic data types of the R language.

2
DataFrames in R and Spark

You may encounter several situations where you need to convert an R DataFrame to a Spark DataFrame or vice versa. Let’s see how to do it.

3
Spark DataFrame Programming with R

This video will show you how to write programs with SQL and R DataFrame APIs. 

4
Understanding Aggregations and Multi-Datasource Joins in SparkR

In SQL, the aggregation of data is very flexible, and the same is true in Spark SQL. Let’s see its use and the implementation of multi-datasource joins.

Spark Data Analysis with Python

1
Charting and Plotting Libraries and Setting Up a Dataset

This video will walk you through the Charting and Plotting Libraries and give a brief description of the application stack. You will also learn how to set up a dataset with Spark in conjunction with Python, NumPy, SciPy, and matplotlib. 

2
Charts, Plots, and Histograms

There are several instances where you need to create charts and plots to visually represent different aspects of a dataset, performing data processing, charting, and plotting along the way. This video will enable you to do this with Spark.

3
Bar Chart and Pie Chart

This video will let you explore more types of charts and plots, namely the stacked bar chart, donut chart, box plot, and vertical bar chart. So, let’s do it!

4
Scatter Plot and Line Graph

Through this video, you will learn about scatter plots and line graphs in detail using Spark. You will also see how to enhance a scatter plot.

Spark Stream Processing

1
Data Stream Processing and Micro Batch Data Processing

Many data sources generate data as a stream, and many real-world use cases require it to be processed in real time. This video will give you a deep understanding of stream processing in Spark.

2
A Log Event Processor

These days, it is very common to have a central repository of application log events in many enterprises. Also, the log events are streamed live to data processing applications in order to monitor the performance of the running applications on a real-time basis. This video demonstrates the real-time processing of log events using a Spark Streaming data processing application. 

3
Windowed Data Processing and More Processing Options

This video will introduce the different processing options you can pick in Spark, including windowed processing, to work smartly with any data.

4
Kafka Stream Processing

Kafka is a publish-subscribe messaging system used by many IoT applications to process a huge number of messages. Let’s see how to use it! 

5
Spark Streaming Jobs in Production

When a Spark Streaming application is processing the incoming data, it is very important to have an uninterrupted data processing capability so that all the data that is getting ingested is processed. This video will take you through those tasks that enable you to achieve this goal. 

Spark Machine Learning

1
Understanding Machine Learning and the Need for Spark

This video will teach you the basics of machine learning and help you understand Spark's ability to achieve the goals of machine learning efficiently.

2
Wine Quality Prediction and Model Persistence

By the end of this video, you will be able to perform predictions on large data such as the wine quality dataset, which is widely used in data analysis.

3
Wine Classification

Let’s use Spark to perform Wine classification by using various algorithms.

4
Spam Filtering

Spam filtering is a very common use case, ubiquitous in e-mail applications, and one of the most widely used classification problems. This video will enable you to deal with this problem and show you the best approach to solving it in Spark.

5
Feature Algorithms and Finding Synonyms

It is not easy to get raw data into the appropriate form of features and labels needed to train a model. Through this video, you will learn to work with raw data and prepare it efficiently for processing.

Spark Graph Processing

1
Understanding Graphs with Their Usage

Graphs are widely used in data analysis. Let’s explore some commonly used graphs and their usage. 

2
The Spark GraphX Library

Many graph processing libraries are available in the open source world; Giraph, Pregel, GraphLab, and Spark GraphX are some of them. Spark GraphX is one of the most recent entrants into this space. Let’s dive into it!

3
Graph Processing and Graph Structure Processing

Just like any other data structure, a graph also undergoes lots of changes because of the change in the underlying data. Let’s learn to process these changes. 

4
Tennis Tournament Analysis

Since the basic graph processing fundamentals are in place, it is now time to take up a real-world use case that uses graphs. Let’s use the results of a tennis tournament for it.

5
Applying PageRank Algorithm

When searching the web using Google, pages that are ranked highly by its algorithm are displayed. In the context of graphs, instead of web pages, if vertices are ranked based on the same algorithm, lots of new inferences can be made. Let’s jump right in and see how to do this. 
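Spark GraphX ships a built-in `pageRank` operator; the iteration it performs can be illustrated in plain Python on an invented three-vertex graph, just to show how rank flows along the edges (a standalone sketch, not the GraphX API):

```python
# Each vertex distributes its rank equally among its outgoing edges;
# a damping factor blends that with a constant baseline.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
damping = 0.85
ranks = {page: 1.0 for page in links}

for _ in range(20):
    # Sum the contributions each vertex receives from its in-neighbours.
    contribs = {page: 0.0 for page in links}
    for page, outlinks in links.items():
        share = ranks[page] / len(outlinks)
        for target in outlinks:
            contribs[target] += share
    # Apply the damped update rule.
    ranks = {page: (1 - damping) + damping * c for page, c in contribs.items()}

top = max(ranks, key=ranks.get)
print(top, {p: round(r, 3) for p, r in ranks.items()})
```

Vertex C receives rank from both A and B, so it converges to the highest rank, exactly the kind of inference the video draws from ranking vertices instead of web pages.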

6
Connected Component Algorithm

In a graph, finding a subgraph consisting of connected vertices is a very common requirement with tremendous applications. This video will enable you to find the connected vertices, making it easy for you to work on the given data. 

7
Understanding GraphFrames and Its Queries

GraphFrames is a new graph processing library, available as an external Spark package developed by Databricks. Through this video, you will learn the concepts and queries used in GraphFrames.

Designing Spark Applications

1
Lambda Architecture

Application architecture is very important for any kind of software development. Lambda Architecture is a recent and popular architecture that's ideal for developing data processing applications. Let’s dive into it! 

2
Micro Blogging with Lambda Architecture

In recent years, microblogging has brought the general public into the culture of blogging. Let’s see how we can build a microblogging application with the Lambda Architecture and have fun!

3
Implementing Lambda Architecture and Working with Spark Applications

Since the Lambda Architecture is a technology-agnostic architecture framework, when designing applications with it, it is imperative to capture the technology choices used in the specific implementations. This video does exactly that. 

4
Coding Style, Setting Up the Source Code, and Understanding Data Ingestion

You may need to use different coding styles and perform data ingestion. This video will enhance your knowledge and enable you to implement these tasks with ease.

5
Generating Purposed Views and Queries

This video will show you how to create the purposed views and queries discussed in the previous videos of this section. 

6
Understanding Custom Data Processes

Let’s explore custom data processes with this video!


Detailed Rating

5 stars: 0
4 stars: 3
3 stars: 1
2 stars: 0
1 star: 2
30-Day Money-Back Guarantee

Includes

6 hours on-demand video
Full lifetime access
Access on mobile and TV
Certificate of Completion