Companies store a lot of data, but in most cases, it is not available in a format that makes it easily accessible for analysis and reporting tools. Ralph Kimball realized this a long time ago, so he paved the way for the star schema.
Learning Pentaho Data Integration walks you through the creation of an ETL process to build a data mart for a fictional company. This course will show you, step by step, how to source the raw data and prepare it for the star schema. The practical approach of this course will get you up and running quickly and will explain the key concepts in an easy-to-understand manner.
Learning Pentaho Data Integration teaches you how to source raw data with Pentaho Kettle and transform it so that the output can be a Kimball-style star schema. After sourcing the raw data with our ETL process, you will quality-check the data using an agile approach. Next, you will learn how to load slowly changing dimensions and the fact table. The star schema will reside in a column-oriented database, so you will learn how to bulk-load the data whenever possible. You will also learn how to create an OLAP schema and easily analyze the output of your ETL process.
By covering all the essential topics in a hands-on approach, this course will put you in a position to create your own ETL processes within a short span of time.
Getting Started
Get an insight into the raw data we will be working with in this video tutorial.
Create a Star Schema derived from the raw data.
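To make the target concrete: in a Kimball-style star schema, descriptive attributes live in dimension tables and measures live in a central fact table that references the dimensions through surrogate keys. The minimal sketch below is only illustrative; the table and column names are hypothetical, not necessarily the ones used in the course.

```sql
-- Minimal star schema sketch (illustrative names, not the course's exact model).
CREATE TABLE dim_customer (
    customer_tk   INTEGER PRIMARY KEY,   -- surrogate (technical) key
    customer_id   INTEGER,               -- natural key from the source system
    customer_name VARCHAR(100),
    city          VARCHAR(50)
);

CREATE TABLE dim_date (
    date_tk  INTEGER PRIMARY KEY,        -- e.g. 20240115
    the_date DATE,
    year     INTEGER,
    month    INTEGER,
    day      INTEGER
);

CREATE TABLE fact_sales (
    order_id    INTEGER,                 -- source ID kept for change data capture
    customer_tk INTEGER REFERENCES dim_customer (customer_tk),
    date_tk     INTEGER REFERENCES dim_date (date_tk),
    quantity    INTEGER,
    revenue     DECIMAL(12, 2)
);
```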
We will create the required databases for our project, add JDBC drivers to PDI, and create JNDI connections.
Agile BI – Creating ETLs to Prepare Joined Data Set
Create an ETL transformation that imports your raw data so that you can apply further manipulation downstream and output the data to the data mart.
We will learn how to easily make sure that the data types of the ETL output step are in sync with the database table column types.
Loading huge amounts of data in the traditional way takes too long; speed it up by using the bulk loader.
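To illustrate what a bulk loader does under the hood: most databases offer a command that streams a delimited file straight into a table instead of issuing row-by-row INSERTs. The PostgreSQL form is shown below purely as an example with a hypothetical file and table; column-oriented databases such as MonetDB or Vertica have their own equivalents.

```sql
-- Illustrative bulk load (PostgreSQL syntax; adapt to your column store).
COPY fact_sales (customer_tk, date_tk, quantity, revenue)
FROM '/tmp/fact_sales.csv'
WITH (FORMAT csv, HEADER true);
```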
Agile BI – Building OLAP Schema, Analyzing Data, and Implementing Required ETL I
In this first step to Agile ETL development, you will learn how to create a Pentaho Analysis Model so that you can analyze the data later on in Pentaho Analyzer.
A very important point is to understand the quality of the data: are there any duplicates, misspellings, and so on? We will find such problems and feed this new knowledge back into the ETL design.
Learn how to implement ETL improvements to iron out the data problems found.
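A quick way to surface the problems mentioned above is to run a couple of profiling queries against the staged data; the table and column names below are hypothetical.

```sql
-- Natural keys that occur more than once point to possible duplicates.
SELECT customer_id, COUNT(*) AS occurrences
FROM staging_customers
GROUP BY customer_id
HAVING COUNT(*) > 1;

-- Listing distinct values of a text column exposes misspellings and
-- inconsistent casing that the ETL should clean up.
SELECT DISTINCT city
FROM staging_customers
ORDER BY city;
```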
Slowly Changing Dimensions
Learn how to populate a simple dimension.
Learn how to populate a simple dimension and make it future proof.
Learn how to keep historic versions in your dimension table.
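A common way to keep history, and the model behind PDI's Dimension lookup/update step, is to version the dimension rows with validity dates: every change inserts a new row and closes the previous one. The sketch below extends the hypothetical dim_customer table from earlier with these columns.

```sql
-- Type 2 dimension sketch: one row per version of a customer (illustrative names).
CREATE TABLE dim_customer (
    customer_tk   INTEGER PRIMARY KEY,   -- surrogate key, unique per version
    customer_id   INTEGER,               -- natural key, repeated across versions
    customer_name VARCHAR(100),
    city          VARCHAR(50),
    version       INTEGER,               -- 1, 2, 3, ... per natural key
    date_from     TIMESTAMP,             -- start of validity
    date_to       TIMESTAMP              -- end of validity (far future for the current row)
);
```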
Populating the Date Dimension
To make our date dimension transformation more dynamic, we will allow users to define a start and an end date to specify the period.
Based on the provided parameters, the number of days between the start and end date will be calculated. This figure will be used to generate a data set with the same number of rows.
In this part, you will learn how to derive various date attributes, such as year, week, day, and so on, from the input date.
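In the course these attributes are derived inside the PDI transformation; expressed in SQL for clarity, the same derivation looks roughly like the query below (PostgreSQL-style EXTRACT calls, hypothetical column names).

```sql
-- Illustrative SQL equivalent of the date-attribute derivation
-- (in the course this happens in PDI transformation steps).
SELECT
    the_date,
    EXTRACT(YEAR  FROM the_date) AS year,
    EXTRACT(MONTH FROM the_date) AS month,
    EXTRACT(WEEK  FROM the_date) AS week,
    EXTRACT(DAY   FROM the_date) AS day,
    EXTRACT(DOW   FROM the_date) AS day_of_week
FROM dim_date;
```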
Creating the Fact Transformation
Learn how to efficiently create an input query for your fact transformation.
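An efficient input query selects only the measures and the natural keys that the later lookup steps need, restricted to the rows that still have to be loaded. A hypothetical example is shown below; the ${...} placeholder stands for a PDI variable that the Table input step can substitute, and it is filled in by the change data capture logic covered later.

```sql
-- Hypothetical fact input query: natural keys and measures only;
-- surrogate keys are resolved further down the stream.
SELECT
    o.order_date,
    o.customer_id,
    o.product_id,
    o.quantity,
    o.quantity * o.unit_price AS revenue
FROM orders AS o
WHERE o.order_id > ${LAST_LOADED_ORDER_ID};
```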
Learn how to configure the step to look up the SCD type 1 keys.
Learn how to configure the step to look up the SCD type 2 keys.
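The difference between the two lookups: a type 1 dimension holds only the latest version, so the join is on the natural key alone, while a type 2 lookup must also match the fact's date against the row's validity range. In PDI this is handled by the lookup steps themselves; the SQL below merely illustrates the logic with hypothetical names.

```sql
-- Illustrative SQL for the two lookup styles (PDI performs this per row).
SELECT f.order_date,
       dp.product_tk,                 -- type 1: only the current version exists
       dc.customer_tk                 -- type 2: the version valid on the fact date
FROM staging_fact  AS f
JOIN dim_product   AS dp ON dp.product_id  = f.product_id
JOIN dim_customer  AS dc ON dc.customer_id = f.customer_id
                        AND f.order_date >= dc.date_from
                        AND f.order_date <  dc.date_to;
```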
Orchestration
In our setup, dimensions can be loaded in parallel; therefore, we will create an ETL job that runs the dimension transformations in parallel.
We will create the main job, which runs all the required child jobs and transformations.
ID-based Change Data Capture
In this section, you will learn how new data can be automatically loaded into the data mart using the Change Data Capture (CDC) approach.
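The idea behind ID-based CDC is simple: remember the highest ID already loaded and only pull rows above it on the next run. A hypothetical sketch of the two queries involved is shown below; in PDI the first result is typically written to a variable that the second query then reads.

```sql
-- 1) Highest ID already in the data mart (could also live in a small control table).
SELECT COALESCE(MAX(order_id), 0) AS last_loaded_order_id
FROM fact_sales;

-- 2) Only the new rows are pulled from the source on the next run.
SELECT *
FROM orders
WHERE order_id > ${LAST_LOADED_ORDER_ID};
```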
We will define the order of execution for all the transformations involved.
Final Touches: Logging and Scheduling
We will create a dedicated environment for logging.
Pentaho Kettle features built-in logging. You will learn how to configure it.
Learn how to schedule a daily run of your ETL process.