4.29 out of 5
499 reviews on Udemy

SAP Data Services (BODS) - Extraction, Transformation and Loading

Learn SAP BO Data Services hands-on to help you with ETL projects, moving and transforming complex data
Instructor:
Junaid Ahmed
2,242 students enrolled

SAP Data Services is a leader in the data integration and transformation space. It is a critical component of the SAP HANA solution. With SAP Data Services, you can quickly discover, cleanse, and integrate data, and make it available for real-time analysis. Discover all formats of data in your organization, including unstructured data, to understand the business better, discover profitable patterns, and make better decisions.

Key Learnings:

  • Understand Data Services big picture
  • Work with heterogeneous Source and Target Connections
  • Build ETL Jobs using data services transforms and functions
  • Trace, Validate, and Debug Data Services Jobs
  • Work with Global/Local variables, scripts, Custom Functions
  • Implement Change Data Capture in Data Services
  • Implement error handling

Engaging Teaching:

  1. Expressive and immersive teaching with demonstrations and learning tools
  2. Small, purposeful videos, packed with information, designed specifically for a virtual, self-paced audience.
  3. Exercises for hands-on practice

Introduction to the course, use case, tool, and basics

1
Welcome to the course! Let's take a quick peek at what's coming ahead

Hello and welcome!

In this video I will walk you through the course structure and layout, what to expect from each topic, and some tips for the best learning outcomes.

2
What is ETL, and why do we need it?

In this video we will understand the ETL process and its aspects from a 50,000-foot level and set the stage for the coming lectures.

3
How Data Services can help us and its core functionalities

We will understand the positioning of the tool, its core functionalities, its connectivity capabilities, and its technical and business benefits.

Architecture and GUI

1
BODS Architecture

Here we look into the several unique components that Data Services relies on to accomplish the data integration and data quality activities required to manage corporate data.

2
The Designer - your bread and butter

Data Services Designer is a Windows client application used to create, test, and manually execute jobs that transform data and populate a data warehouse. Using the Designer, you create data management applications that consist of data mappings, transformations, and control logic.

Connect to DBs, flat files, and XML data sources

1
Concept of Datastores in Data Services

You are responsible for extracting data into the company's Business Warehouse system and want to convert it using Data Services as the new data transfer process.

A datastore provides a connection or multiple connections to data sources such as a database. Using the datastore connection, Data Services can import the metadata that describes the data from the data source.

2
Create a datastore to connect to an SAP HANA database

You are working as an ETL developer using SAP Data Services Designer. You will create data stores for the source, target, and staging databases.

3
How to connect to SQL Server

You are working as an ETL developer using SAP Data Services Designer. You will create data stores for the source, target, and staging databases.

4
How to discover metadata and profile data of the datastores

Data Services determines and stores a specific set of metadata information for tables. Import metadata by naming, searching, and browsing. After importing metadata, edit column names, descriptions, and data types.

5
Exercise 1
6
Create a datastore to connect to an SAP ERP system

Let's connect to an SAP ERP system from Data Services. For most Enterprise Resource Planning (ERP) applications, Data Services generates SQL that is optimized for the specific target database (for example, Oracle, DB2, SQL Server, and Informix).

7
Template Tables

We will learn to use template tables in early application development, when you are designing and testing a project.

8
How to work with flat files

We will see how to work with flat files, create file formats, and work with different delimiters.

9
Exercise 2
10
Working with multiple flat files

Here we learn how to read multiple flat files with identical formats through a single file format, by substituting a wildcard character or a list of file names.

11
How to import Excel files

We will learn how to import Excel workbooks, create formats, filter data, and more.

12
How to work with XML files

Data Services can interpret XML as well. Let's see how to create XML file formats and import data.

Concept, Working, and Scheduling of an ETL Job

1
Create an ETL Job in DS: Structure and Sequencing

Let's understand the Data Services job structure, its different components, and how to sequence them in parallel and in series.

2
Let's move data using a data flow and a JOB!

Let's see how to build data flows. They contain the source, transform, and target objects that represent the key activities in data integration and data quality processes.

3
Exercise 3
4
Where are the errors, logs, and statistics? - Top customer priority

Troubleshooting is a key skill for developers. It starts with understanding where to find the error logs, traces, and job statistics.

5
Scheduling a Job - top customer Priority

A top customer concern is being able to schedule a job for automatic execution. Let's see how to do that.

A job is the only executable object in Data Services. When developing data flows, you can
manually execute and test jobs directly in Data Services.

Platform Transforms - They make developers' lives easy!

1
Using the Query Transform (filter, join, and more)

Here we will see that the Query transform is the most commonly used transform and is included in most data flows. It enables you to select data from a source and filter or reformat it as it moves to the target.

2
Exercise 4
3
Using the Case Transform

Here we learn to use the Case transform to simplify branch logic in data flows by consolidating case, or decision-making, logic into one transform.
 

4
Exercise 5
5
Using the Merge Transform

Here we learn to use the Merge transform to combine incoming data sets with the same schema structure. The merge produces a single output data set with the same schema as the input data sets.

6
Exercise 6
7
Using the Validation Transform


The Validation transform enables you to create validation rules and move data into target objects based on whether they pass or fail validation.

8
Exercise 7
9
Using the SQL Transform

Use the SQL transform to submit SQL commands that generate data to be moved into target objects.

10
Exercise 8

Using Variables, Parameters, and Scripts

1
Global, Local, and Substitution Variables

Local variables are restricted to the job or work flow in which they are created. Use parameters to
pass local variables to the work flows and data flows in the object. A local variable is included as
part of the definition of the work flow or data flow, and so it is portable between jobs.
Global variables are also restricted to the job in which they are created. However, they do not
require parameters to be passed to work flows and data flows in that job. You can reference the
global variable directly in expressions for any object of the job.
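
As a minimal sketch in the Data Services scripting language, assume a job that declares a global variable $G_LoadDate (the name is illustrative):

# Script at the start of the job: initialize the global variable
$G_LoadDate = sysdate();
# A global variable can be referenced directly in any object of the job
print('Load date set to: [$G_LoadDate]');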

2
Creating a Script, with a demo of Global, Local, and Substitution variables

A script is a single-use object that is used to call functions and assign values in a work flow.
Execute a script before data flows for initialization steps and use a script with conditionals to
determine execution paths. You may also use a script after work flows or data flows to record
execution information such as time, or to record a change in the number of rows in a data set.
Use a script to calculate values that are passed on to other parts of the work flow or to assign
values to variables and execute functions.
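
As a sketch, an initialization script placed before the data flows might look like the following; the global variables are assumed to be declared at the job level, and the names are illustrative:

# Record execution context before the data flows run
print('Job [job_name()] started at [systime()] by [system_user_name()]');
$G_StartDate = sysdate();
# Default a variable if it was not set at launch (nvl is a built-in function)
$G_Region = nvl($G_Region, 'DEFAULT');
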

3
Creating and Using Custom Functions

If the built-in functions that are provided by Data Services do not meet your requirements, you can create your own custom functions using the Data Services scripting language.
Create your own functions by writing script functions in the Data Services scripting language using the Smart Editor. Saved custom functions appear under the Custom Functions category in the Function Wizard and the Smart Editor. Custom functions are also displayed on the Custom Functions tab of the object library.
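
As a sketch, the body of a hypothetical custom function CF_FULL_NAME, with input parameters $P_First and $P_Last declared in the Smart Editor, could be as simple as:

# Concatenate the two input parameters and return the result
Return ($P_First || ' ' || $P_Last);

It could then be called like any built-in function, for example in a Query mapping as CF_FULL_NAME(FIRST_NAME, LAST_NAME).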

4
Exercise 9

Using Built-In Functions

1
Use the lookup_ext() Function

Retrieve a value in a table or file based on the values in a different source table or file (see the sketch after this list):
• Return multiple columns from a single lookup.
• Choose from additional operators to specify a lookup condition.
• Specify a return policy for your lookup.
• Perform multiple lookups.
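
For orientation, a hand-written call in the shape that the function wizard generates might look like the following; the datastore, table, and column names are hypothetical, and in practice the wizard assembles the argument list for you:

# Look up a customer name by ID from a cached lookup table
$CustName = lookup_ext(
    [DS_SRC.DBO.CUSTOMER, 'PRE_LOAD_CACHE', 'MAX'],  # lookup table, cache spec, return policy
    [NAME],                                          # column(s) to return
    [NULL],                                          # default value(s) when no match is found
    [CUST_ID, '=', $P_CustId]);                      # lookup condition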

2
Exercise 10
3
Use the search_replace Function

It performs a simple search and replace based on a string value, a word value, or an entire field.

4
Exercise 11
5
Functions to get object info (job, work flow, data flow, system)

# Print functions

print('Host Name: [host_name()]');

print('Repository Name: [repository_name()]');

print('Starting execution of Job: [job_name()] as user: [system_user_name()]');

print('Work Flow Name: [workflow_name()]');

# SYSTEM Info

print('System Date: [sysdate()]');
print('System Time: [systime()]');

print('System_user_name: [system_user_name()]');

6
Functions for file processing, like checking for the existence of a file

$FileName_Pre = 'C:\Users\hana1\Desktop\CSV files\CSV.txt';
$FileName = $FileName_Pre;
print($FileName);
$NewFileName = 'C:\Users\hana1\Desktop\CSV files2\CDVNEW.txt';
# file_exists() returns 1 when the file is present
if (file_exists($FileName) = 1)
begin
    print('File exists');
    # Move the file to the new location; the third argument (1) allows overwrite
    file_move($FileName, $NewFileName, 1);
    print('File has been copied');
    # file_delete($FileName);
    # print('Old File has been deleted');
end
else
begin
    print('**********************************');
    print('*****File does not exist**********');
    print('*****Waiting For The File*********');
    # Wait up to 120 seconds for the file, polling every 60 seconds
    wait_for_file($FileName, 120000, 60000);
end

7
String functions

# to_char formats dates as strings
print(to_char(sysdate(), 'MM'));
print(to_char(sysdate(), 'yyyy.mm.dd'));
print(to_char(sysdate(), 'dd.mm.yyyy'));
# Other useful format strings: 'MONTH', 'DD', 'YY', 'YYYY', 'HH', 'MI', 'SS', 'FF'

# to_date parses a string into a date using a format mask
print(to_date('jun 18,2012', 'MON DD,YYYY'));

# cast converts a value to the named data type
print(cast('13.54', 'INT'));
print(cast('12.20', 'DECIMAL(3,1)'));

# ltrim strips leading characters that appear in the second argument
print(ltrim('Marilyn', 'Ma'));
print(ltrim('ABCABCD', 'ABC'));

# ceil rounds up to the next integer
print(ceil(12.12345));
print(ceil(-12.223));

Managing Slowly Changing Dimensions in BODS

1
Types of slowly changing dimensions
2
Source CDC vs Target CDC Methods

Setting up a full CDC solution within Data Services may not be required. Many databases now have CDC support built into them, such as Oracle, SQL Server, DB2, and SAP Sybase. Alternatively, you can combine surrogate keys with the Map Operation transform to change all UPDATE row types to INSERT row types to capture changes.

3
Using Source-Based CDC Concept - DEMO

Source-based Changed Data Capture (CDC) is the preferred method of updating data because it improves performance by extracting the fewest rows. Source-based CDC, also referred to as incremental extraction, extracts only the changed rows from the source.

4
Exercise 12
5
Using Target-Based (CDC) Concept - PART 1 - PREP

Target-based Change Data Capture (CDC) compares the source to the target to determine
which records have changed.

6
Using Target-Based (CDC) Concept - PART 2 -DEMO

Target-based Change Data Capture (CDC) compares the source to the target to determine
which records have changed.

7
Using Target-Based CDC - History Preserving Demo

The History Preserving transform converts rows flagged as UPDATE to UPDATE plus INSERT, so that the original values are preserved in the target. Specify the column in which to look for updated data.

8
Exercise 13
9
Using the Pivot Transform

Use the Pivot transform to convert columns into rows, and the Reverse Pivot transform to convert rows back into columns.

10
Exercise 14

Batch Job Troubleshooting

1
Annotations and Tracing Jobs

An annotation is a sticky note with a folded-down corner, used to describe and document the objects in your workspace.

Tracing jobs - Use trace properties to select the information that Data Services monitors and writes to the trace log file during a job. Data Services writes trace messages to the trace log associated with the current Job Server and writes error messages to the error log associated with the current Job Server.

2
Exercise 15
3
Debugging Data Flows - using Debugger

Use the Interactive Debugger to examine what happens to data after each transform or object in
a flow and to troubleshoot any issues that arise when executing your jobs.

4
Exercise 16
5
Auditing Data Flows

Learn to set up audit rules to ensure the correct data is loaded to the target when executing jobs.

6
Exercise 17

Error Handling

1
Setting Up Error Handling Implementation Concept

Resolve issues if a Data Services job execution is not successful, for example, if a server failure prevents the completion of the job. Use recoverable work flows and try/catch blocks to recover data for sophisticated error handling.

2
Setting Up Error Handling Implementation Demo

Resolve issues if a Data Services job execution is not successful, for example, if a server failure prevents the completion of the job. Use recoverable work flows and try/catch blocks to recover data for sophisticated error handling.

3
Exercise 18
4.3 out of 5
499 Ratings

Detailed Rating

5 stars: 179
4 stars: 202
3 stars: 93
2 stars: 17
1 star: 11
30-Day Money-Back Guarantee

Includes

4 hours on-demand video
19 articles
Full lifetime access
Access on mobile and TV
Certificate of Completion