Pushdown Optimization in Informatica PowerCenter
The Informatica Pushdown Optimization Option increases performance by providing the flexibility to push transformation
processing to the most appropriate processing resource. Using Pushdown
Optimization, data transformation logic can be pushed to the source database, to the target database, or processed on the PowerCenter server. This gives
the ETL architect the option to choose the best of the available resources for data
processing.
The Pushdown Optimization Option enables data transformation
processing to be pushed down into a relational database to make the best use
of database processing power. It converts the transformation logic into SQL
statements that can execute directly on the database. This minimizes the need to
move data between servers and utilizes the power of the database engine.
How Pushdown Optimization Actually Works:
When you run a session configured for pushdown optimization, the
Integration Service analyzes the mapping and transformations to determine
the transformation logic it can push to the database. If the mapping contains a
mapplet, the Integration Service expands the mapplet and treats the
transformations in the mapplet as part of the parent mapping. The Integration
Service converts the transformation logic into SQL statements and sends them to the
source or the target database to perform the data transformation. The amount of
transformation logic one can push to the database depends on the database,
transformation logic, and mapping and session configuration.
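As a hedged illustration of this conversion (the table and column names below are invented for this sketch, not taken from any real mapping), a pipeline containing a Filter transformation and an Expression transformation might be collapsed by the Integration Service into a single statement like:

```sql
-- Hypothetical sketch: a Filter transformation (ACTIVE_FLAG = 'Y')
-- and an Expression transformation (UPPER(CUST_NAME), SALARY * 1.1)
-- pushed down as one SELECT that runs on the source database.
SELECT
    CUST_ID,
    UPPER(CUST_NAME) AS CUST_NAME,
    SALARY * 1.1     AS ADJUSTED_SALARY
FROM CUSTOMERS
WHERE ACTIVE_FLAG = 'Y';
```

Because the filtering and the expression logic run inside the database, only the already-transformed rows cross the network to the Integration Service.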
Types of Pushdown Optimization:
You can
configure pushdown optimization in the following ways.
1. Source-side pushdown optimization
2. Target-side pushdown optimization
3. Full pushdown optimization
Source-side pushdown optimization:
When you run a session configured for
source-side pushdown optimization, the Integration Service analyzes the mapping
from the source to the target or until it reaches a downstream transformation
it cannot push to the database.
The Integration Service generates a SELECT
statement based on the transformation logic for each transformation it can push
to the database. When you run the session, the Integration Service pushes all
transformation logic that is valid to push to the database by executing the
generated SQL statement. Then, it reads the results of this SQL statement and
continues to run the session. If you run a session that contains an SQL
override or lookup override, the Integration Service generates a view based on
the override. It then generates a SELECT statement and runs the SELECT statement
against this view. When the session completes, the Integration Service drops
the view from the database.
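The override-to-view behavior can be sketched as follows. This is a hypothetical example: the view name `PM_V1` and all table names are invented here, and the exact statements the Integration Service generates depend on the database and session configuration.

```sql
-- Hypothetical sketch of source-side pushdown with a SQL override.
-- The override query becomes a temporary view...
CREATE VIEW PM_V1 AS
SELECT ORDER_ID, CUST_ID, AMOUNT
FROM ORDERS
WHERE ORDER_DATE >= DATE '2023-01-01';

-- ...and the pushed-down transformation logic is a SELECT against it.
SELECT ORDER_ID, CUST_ID, AMOUNT * 0.9 AS DISCOUNTED_AMOUNT
FROM PM_V1;

-- When the session completes, the view is dropped.
DROP VIEW PM_V1;
```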
Target-side pushdown optimization:
When you run a session configured for
target-side pushdown optimization, the Integration Service analyzes the mapping
from the target to the source or until it reaches an upstream transformation it
cannot push to the database.
The Integration Service generates an INSERT,
DELETE, or UPDATE statement based on the transformation logic for each
transformation it can push to the database, starting with the first
transformation in the pipeline it can push to the database. The Integration
Service processes the transformation logic up to the point that it can push the
transformation logic to the target database. Then, it executes the generated
SQL.
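As a hedged sketch of what such a generated statement could look like (the target table, columns, and the monthly-to-annual expression are invented for illustration), transformation logic feeding the target can be folded into the DML statement itself, with the Integration Service supplying the row values it has read and processed upstream:

```sql
-- Hypothetical sketch of target-side pushdown: the expression logic
-- (UPPER on the name, monthly salary * 12) executes inside the INSERT
-- on the target database; '?' marks bind values supplied per row.
INSERT INTO EMP_TARGET (EMP_ID, EMP_NAME, ANNUAL_SALARY)
VALUES (?, UPPER(?), ? * 12);
```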
Full pushdown optimization:
The Integration Service pushes as much
transformation logic as possible to both source and target databases. If you
configure a session for full pushdown optimization, and the Integration Service
cannot push all the transformation logic to the database, it performs partial
pushdown optimization instead.
To use full pushdown optimization, the source
and target must be in the same database. When you run a session configured for
full pushdown optimization, the Integration Service analyzes the mapping
starting with the source and analyzes each transformation in the pipeline until
it analyzes the target. It generates SQL statements that are executed against
the source and target database based on the transformation logic it can push to
the database. If the session contains an SQL override or lookup override, the
Integration Service generates a view and runs a SELECT statement against this
view.
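When the whole pipeline can be pushed, the result is typically a single statement that moves data from source to target entirely inside the database. The following is a hypothetical sketch (tables, columns, and the aggregation are invented), assuming an Aggregator-style mapping between a `SALES` source and a `SALES_SUMMARY` target in the same database:

```sql
-- Hypothetical sketch of full pushdown: source and target live in the
-- same database, so filter + aggregate + load become one INSERT ... SELECT
-- and no rows ever pass through the Integration Service.
INSERT INTO SALES_SUMMARY (REGION, TOTAL_AMOUNT)
SELECT REGION, SUM(AMOUNT)
FROM SALES
WHERE SALE_DATE >= DATE '2023-01-01'
GROUP BY REGION;
```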
Configuring Session for Pushdown Optimization
A session can be configured to use pushdown optimization from the
Informatica Workflow Manager.
Go to Workflow Manager >> Open/edit the session >> Session
Properties >> select Pushdown Optimization >> choose Source,
Target, or Full pushdown optimization.
You can additionally choose a few options to control pushdown
optimization:
- Allow Temporary View for Pushdown: allows the Integration Service to create temporary view objects in the database when it pushes the session to the database.
- Allow Temporary Sequence for Pushdown: allows the Integration Service to create temporary sequence objects in the database.
- Allow Pushdown for User Incompatible Connections: indicates that the database user of the active database has read permission on the idle databases.
Pushdown Optimization Viewer:
1. Use the Pushdown Optimization Viewer to examine
the transformations that can be pushed to the database.
2. Select a pushdown option or pushdown group in
the Pushdown Optimization Viewer to view the corresponding SQL statement
generated for the specified selections. You can invoke the viewer from
the 'Pushdown Optimization' option in the session properties.
3. The Pushdown Optimization Viewer pops up in a new
window and shows how the Integration Service converts the data transformation
logic into SQL statements for a particular mapping.
4. When you select a pushdown option or pushdown
group in the viewer, you do not change the pushdown configuration. To change
the configuration, you must update the pushdown option in the session
properties.
Points to Consider Before Using Pushdown Optimization
Consider the following database performance issues when pushdown
optimization generates a long transaction:
- A long transaction uses more database resources.
- A long transaction locks the database for longer periods of time, which reduces database concurrency and increases the likelihood of deadlock.
- A long transaction increases the likelihood that an unexpected event may occur.