Loading and Transformation Scenarios

The following sections offer examples of typical loading and transformation tasks:

■ Key Lookup Scenario
■ Business Rule Violation Scenario
■ Data Error Scenarios
■ Pivoting Scenarios


Key Lookup Scenario
A typical transformation is the key lookup. For example, suppose that sales
transaction data has been loaded into a retail data warehouse. Although the data
warehouse's sales table contains a product_id column, the sales transaction data
extracted from the source system contains Universal Product Codes (UPC) instead of
product IDs. Therefore, it is necessary to transform the UPC codes into product IDs
before the new sales transaction data can be inserted into the sales table.
To perform this transformation, a lookup table is needed that relates the product_id
values to the UPC codes. This table might be the product dimension table, or perhaps
another table in the data warehouse that has been created specifically to support this transformation.

For this example, we assume that there is a table named product,
which has a product_id and an upc_code column.
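
A minimal sketch of this assumed lookup table (the column types are illustrative and not taken from the original example):

CREATE TABLE product (
  product_id NUMBER       NOT NULL,
  upc_code   VARCHAR2(12) NOT NULL
);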

This data substitution transformation can be implemented using the following CTAS
statement:

CREATE TABLE temp_sales_step2 NOLOGGING PARALLEL AS
SELECT sales_transaction_id,
       product.product_id sales_product_id,
       sales_customer_id, sales_time_id, sales_channel_id,
       sales_quantity_sold, sales_dollar_amount
FROM   temp_sales_step1, product
WHERE  temp_sales_step1.upc_code = product.upc_code;

This CTAS statement will convert each valid UPC code to a valid product_id value.
If the ETL process has guaranteed that each UPC code is valid, then this statement
alone may be sufficient to implement the entire transformation.
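
If that guarantee needs to be verified rather than assumed, a quick pre-check can count the unmatched rows before the load; a sketch using the same tables as above:

SELECT COUNT(*) AS missing_upc_count
FROM   temp_sales_step1 s
WHERE  NOT EXISTS (SELECT 1 FROM product p WHERE p.upc_code = s.upc_code);

A nonzero count means the data requires the error handling described in the next section.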


Business Rule Violation Scenario

In the preceding example, if you must also handle new sales data that does not have
valid UPC codes (a logical data error), you can use an additional CTAS statement to
identify the invalid rows:

CREATE TABLE temp_sales_step1_invalid NOLOGGING PARALLEL AS
SELECT * FROM temp_sales_step1 s
WHERE  NOT EXISTS (SELECT 1 FROM product p WHERE p.upc_code = s.upc_code);

This invalid data is now stored in a separate table, temp_sales_step1_invalid,
and can be handled separately by the ETL process.
Another way to handle invalid data is to modify the original CTAS to use an outer
join, as in the following statement:

CREATE TABLE temp_sales_step2 NOLOGGING PARALLEL AS
SELECT sales_transaction_id,
       product.product_id sales_product_id,
       sales_customer_id, sales_time_id, sales_channel_id,
       sales_quantity_sold, sales_dollar_amount
FROM   temp_sales_step1, product
WHERE  temp_sales_step1.upc_code = product.upc_code (+);

Using this outer join, the sales transactions that originally contained invalid UPC codes are assigned a product_id of NULL. These transactions can be handled later. Alternatively, you could use a multitable insert that separates the rows with a NULL product_id into their own table, as in the following statement. This approach can be beneficial when the expected error count is small relative to the total data volume, because subsequent processing then touches only the small exception table rather than the large target table.

INSERT /*+ APPEND PARALLEL */ FIRST
  WHEN sales_product_id IS NOT NULL THEN
    INTO temp_sales_step2
    VALUES (sales_transaction_id, sales_product_id,
            sales_customer_id, sales_time_id, sales_channel_id,
            sales_quantity_sold, sales_dollar_amount)
  ELSE
    INTO temp_sales_step1_invalid
    VALUES (sales_transaction_id, sales_product_id,
            sales_customer_id, sales_time_id, sales_channel_id,
            sales_quantity_sold, sales_dollar_amount)
SELECT sales_transaction_id, product.product_id sales_product_id,
       sales_customer_id, sales_time_id, sales_channel_id,
       sales_quantity_sold, sales_dollar_amount
FROM   temp_sales_step1, product
WHERE  temp_sales_step1.upc_code = product.upc_code (+);


Note that for this solution, the empty tables temp_sales_step2 and temp_sales_step1_invalid must already exist.
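
One way to create these empty tables is a CTAS with a predicate that matches no rows, so that both tables take exactly the column shape the multitable insert expects. This is a sketch, not part of the original example:

CREATE TABLE temp_sales_step2 NOLOGGING AS
SELECT sales_transaction_id,
       product.product_id sales_product_id,
       sales_customer_id, sales_time_id, sales_channel_id,
       sales_quantity_sold, sales_dollar_amount
FROM   temp_sales_step1, product
WHERE  1 = 0;

CREATE TABLE temp_sales_step1_invalid NOLOGGING AS
SELECT * FROM temp_sales_step2 WHERE 1 = 0;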


Additional approaches to handling invalid UPC codes exist. Some data warehouses
may choose to insert null-valued product_id values into their sales table, while
others may not allow any new data from the entire batch to be inserted into the sales
table until all invalid UPC codes have been addressed. The correct approach is
determined by the business requirements of the data warehouse. Irrespective of the
specific requirements, exception handling can be addressed by the same basic SQL
techniques as transformations.

Data Error Scenarios

If the quality of the data is unknown, the example discussed in the preceding section
could be enhanced to handle unexpected data errors, for example, data type
conversion errors, as shown in the following:

INSERT /*+ APPEND PARALLEL */ FIRST
  WHEN sales_product_id IS NOT NULL THEN
    INTO temp_sales_step2
    VALUES (sales_transaction_id, sales_product_id,
            sales_customer_id, sales_time_id, sales_channel_id,
            sales_quantity_sold, sales_dollar_amount)
    LOG ERRORS INTO sales_step2_errors('load_20040804')
    REJECT LIMIT UNLIMITED
  ELSE
    INTO temp_sales_step1_invalid
    VALUES (sales_transaction_id, sales_product_id,
            sales_customer_id, sales_time_id, sales_channel_id,
            sales_quantity_sold, sales_dollar_amount)
    LOG ERRORS INTO sales_step2_errors('load_20040804')
    REJECT LIMIT UNLIMITED
SELECT sales_transaction_id, product.product_id sales_product_id,
       sales_customer_id, sales_time_id, sales_channel_id,
       sales_quantity_sold, sales_dollar_amount
FROM   temp_sales_step1, product
WHERE  temp_sales_step1.upc_code = product.upc_code (+);

This statement tracks the logical data error of a missing valid UPC code in the table temp_sales_step1_invalid, and all other possible errors in a DML error logging table called sales_step2_errors. Note that an error logging table can be used for several DML operations.
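
Such an error logging table is typically created with the DBMS_ERRLOG package; a sketch using the names from this example (optional parameters are left at their defaults):

BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(
    dml_table_name     => 'TEMP_SALES_STEP2',
    err_log_table_name => 'SALES_STEP2_ERRORS');
END;
/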

An alternative to this approach is to enforce the business rule of having a valid UPC code at the database level with a NOT NULL constraint on the target column (see the sketch after the statement). With the outer join, all orders without a valid UPC code are mapped to a NULL product_id and are then treated as data errors. The DML error logging capability is used to track these errors in the following statement:

INSERT /*+ APPEND PARALLEL */ INTO temp_sales_step2
SELECT sales_transaction_id, product.product_id sales_product_id,
       sales_customer_id, sales_time_id, sales_channel_id,
       sales_quantity_sold, sales_dollar_amount
FROM   temp_sales_step1, product
WHERE  temp_sales_step1.upc_code = product.upc_code (+)
LOG ERRORS INTO sales_step2_errors('load_20040804')
REJECT LIMIT UNLIMITED;
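
For this alternative to reject NULL product IDs, the constraint must be in place before the load; a minimal sketch, assuming temp_sales_step2 already exists:

ALTER TABLE temp_sales_step2 MODIFY (sales_product_id NOT NULL);

With the constraint in place, every row that the outer join maps to a NULL product_id violates it and is written to the error logging table instead of failing the whole statement.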


The error logging table contains all records that would have caused the DML operation to fail. You can use its content to analyze and correct any error. The content of the error logging table is preserved irrespective of whether the DML operation itself succeeds or fails.

Let us assume the following SQL statement failed because the reject limit was reached:

SQL> INSERT /*+ APPEND NOLOGGING PARALLEL */ INTO sales_overall
2 SELECT * FROM sales_activity_direct
3 LOG ERRORS INTO err$_sales_overall ('load_test2')
4 REJECT LIMIT 10;
SELECT * FROM sales_activity_direct
*
ERROR at line 2:
ORA-01722: invalid number


The error Oracle raises is the first error encountered after the reject limit is reached; here, the eleventh error is the one that makes the statement fail. Because the message displayed comes from whichever error exceeded the limit, it can differ from the errors logged before it; for example, the ninth error could be different from the eleventh error.

The target table sales_overall shows no records entered (assuming the table was empty before), but the error logging table contains 11 rows (REJECT LIMIT + 1):

SQL> SELECT COUNT(*) FROM sales_overall;
COUNT(*)
----------
0
SQL> SELECT COUNT(*) FROM err$_sales_overall;
COUNT(*)
----------
11
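
The tag supplied in the LOG ERRORS clause ('load_test2' in the statement above) can be used to isolate the rows rejected by one particular load; a sketch:

SELECT ora_err_number$, ora_err_mesg$
FROM   err$_sales_overall
WHERE  ora_err_tag$ = 'load_test2';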

A DML error logging table consists of several fixed control columns that are mandatory for every error logging table. Besides the Oracle error number, the error message is stored as well. In many cases, the error message provides additional information for analyzing and resolving the root cause of the data error. The following SQL output of a DML error logging table shows this difference. Note that the second output contains the additional information for rows that were rejected due to NOT NULL violations.

SQL> SELECT DISTINCT ora_err_number$ FROM err$_sales_overall;

ORA_ERR_NUMBER$
---------------
           1400
           1722
           1830
           1847

SQL> SELECT DISTINCT ora_err_number$, ora_err_mesg$ FROM err$_sales_overall;

ORA_ERR_NUMBER$ ORA_ERR_MESG$
--------------- --------------------------------------------------
           1400 ORA-01400: cannot insert NULL into
                ("SH"."SALES_OVERALL"."CUST_ID")
           1400 ORA-01400: cannot insert NULL into
                ("SH"."SALES_OVERALL"."PROD_ID")
           1722 ORA-01722: invalid number
           1830 ORA-01830: date format picture ends before
                converting entire input string
           1847 ORA-01847: day of month must be between 1 and last
                day of month
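
For reference, the fixed control columns as created by DBMS_ERRLOG.CREATE_ERROR_LOG look like the following sketch of a DESCRIBE; the target table's own columns follow them, captured in a VARCHAR2 representation:

SQL> DESC err$_sales_overall
 Name             Type
 ---------------- ---------------
 ORA_ERR_NUMBER$  NUMBER
 ORA_ERR_MESG$    VARCHAR2(2000)
 ORA_ERR_ROWID$   UROWID
 ORA_ERR_OPTYP$   VARCHAR2(2)
 ORA_ERR_TAG$     VARCHAR2(2000)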



Pivoting Scenarios
A data warehouse can receive data from many different sources. Some of these source
systems may not be relational databases and may store data in very different formats
from the data warehouse. For example, suppose that you receive a set of sales records
from a nonrelational database having the form:

product_id, customer_id, weekly_start_date, sales_sun, sales_mon, sales_tue,
sales_wed, sales_thu, sales_fri, sales_sat

The input table looks like the following:

SELECT * FROM sales_input_table;

PRODUCT_ID CUSTOMER_ID WEEKLY_ST SALES_SUN SALES_MON SALES_TUE SALES_WED SALES_THU SALES_FRI SALES_SAT
---------- ----------- --------- --------- --------- --------- --------- --------- --------- ---------
       111         222 01-OCT-00       100       200       300       400       500       600       700
       222         333 08-OCT-00       200       300       400       500       600       700       800
       333         444 15-OCT-00       300       400       500       600       700       800       900

In your data warehouse, you would want to store the records in a more typical
relational form in a fact table sales of the sh sample schema:
prod_id, cust_id, time_id, amount_sold


Thus, you need to build a transformation such that each record in the input stream is converted into seven records for the data warehouse's sales table. This operation is commonly referred to as pivoting, and Oracle Database offers several ways to do this. The result of the previous example will resemble the following:

SELECT prod_id, cust_id, time_id, amount_sold FROM sales;
PROD_ID CUST_ID TIME_ID AMOUNT_SOLD
---------- ---------- --------- -----------
111 222 01-OCT-00 100
111 222 02-OCT-00 200
111 222 03-OCT-00 300
111 222 04-OCT-00 400
111 222 05-OCT-00 500
111 222 06-OCT-00 600
111 222 07-OCT-00 700
222 333 08-OCT-00 200
222 333 09-OCT-00 300
222 333 10-OCT-00 400
222 333 11-OCT-00 500
222 333 12-OCT-00 600
222 333 13-OCT-00 700
222 333 14-OCT-00 800
333 444 15-OCT-00 300
333 444 16-OCT-00 400
333 444 17-OCT-00 500
333 444 18-OCT-00 600
333 444 19-OCT-00 700
333 444 20-OCT-00 800
333 444 21-OCT-00 900


Example Pivoting
The following example uses the multitable insert syntax to insert into the demo table
sh.sales some data from an input table with a different structure. The multitable
INSERT statement looks like the following:

INSERT ALL
  INTO sales (prod_id, cust_id, time_id, amount_sold)
    VALUES (product_id, customer_id, weekly_start_date,   sales_sun)
  INTO sales (prod_id, cust_id, time_id, amount_sold)
    VALUES (product_id, customer_id, weekly_start_date+1, sales_mon)
  INTO sales (prod_id, cust_id, time_id, amount_sold)
    VALUES (product_id, customer_id, weekly_start_date+2, sales_tue)
  INTO sales (prod_id, cust_id, time_id, amount_sold)
    VALUES (product_id, customer_id, weekly_start_date+3, sales_wed)
  INTO sales (prod_id, cust_id, time_id, amount_sold)
    VALUES (product_id, customer_id, weekly_start_date+4, sales_thu)
  INTO sales (prod_id, cust_id, time_id, amount_sold)
    VALUES (product_id, customer_id, weekly_start_date+5, sales_fri)
  INTO sales (prod_id, cust_id, time_id, amount_sold)
    VALUES (product_id, customer_id, weekly_start_date+6, sales_sat)
SELECT product_id, customer_id, weekly_start_date, sales_sun,
       sales_mon, sales_tue, sales_wed, sales_thu, sales_fri, sales_sat
FROM   sales_input_table;

This statement scans the source table only once and then inserts the appropriate data for each day of the week.
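
On database releases that support the UNPIVOT operator (Oracle Database 11g and later), the same pivoting can be written more compactly. The following is a sketch equivalent to the multitable insert above; day_offset and the numeric labels are names introduced here for illustration. Note that UNPIVOT excludes NULL measure values by default, a slight semantic difference from the multitable insert:

INSERT INTO sales (prod_id, cust_id, time_id, amount_sold)
SELECT product_id, customer_id,
       weekly_start_date + day_offset, amount_sold
FROM   sales_input_table
UNPIVOT (amount_sold FOR day_offset IN (
         sales_sun AS 0, sales_mon AS 1, sales_tue AS 2, sales_wed AS 3,
         sales_thu AS 4, sales_fri AS 5, sales_sat AS 6));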
