Source Qualifier Transformation in Informatica

Last updated on Dec 16 2021
Santosh Singh

Table of Contents

Source Qualifier Transformation in Informatica

The Source Qualifier transformation is an active and connected transformation. It represents the rows that the Integration Service reads when it runs a session. We need to connect the Source Qualifier transformation to the relational or flat file source definition in a mapping.

The Source Qualifier transformation converts the source data types to the Informatica native data types, so there is no need to alter the data types of the ports.

The Source Qualifier transformation performs the following tasks (a sample default query illustrating several of them follows the list):

  • Joins: We can join two or more tables from the same source database. By default, the sources are joined based on the primary key and foreign key relationships. This can be changed by specifying the join condition in the "User-Defined Join" property.
  • Filter rows: We can filter the rows from the source database. In the default query, the Integration Service adds a WHERE clause.
  • Sorting input: We can sort the source data by specifying the number of sorted ports. In the default SQL query, the Integration Service adds an ORDER BY clause.
  • Distinct rows: We can get distinct rows from the source by choosing the "Select Distinct" property. In the default SQL query, the Integration Service adds a SELECT DISTINCT statement.
  • Custom SQL Query: We can write our own SQL query to perform calculations.
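
For illustration, here is a minimal sketch of the kind of default query the Integration Service could generate when a user-defined join, a source filter, sorted ports, and Select Distinct are all configured. The tables STUD and MARKS and their columns are hypothetical names assumed for this sketch, not objects generated by Informatica.

-- Hypothetical default query; STUD, MARKS and their columns are assumed names
SELECT DISTINCT STUD.STUDNO, STUD.SNAME, MARKS.TOTAL   -- "Select Distinct" enabled
FROM STUD, MARKS
WHERE STUD.STUDNO = MARKS.STUDNO   -- user-defined join condition
AND MARKS.TOTAL > 50               -- source filter
ORDER BY STUD.STUDNO, STUD.SNAME   -- number of sorted ports = 2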

Source Qualifier Transformation Properties

The Source Qualifier transformation has the following properties:

  • SQL Query: Defines a custom query that replaces the default query the Integration Service uses to read data from sources. A custom query overrides entries for a user-defined join or a source filter.
  • User-Defined Join: Specifies the condition used to join data from multiple sources represented in the same Source Qualifier transformation.
  • Source Filter: Specifies the filter condition the Integration Service applies when querying the rows.
  • Number of Sorted Ports: Indicates the number of columns used when sorting rows queried from relational sources. If we choose this option, the Integration Service adds an ORDER BY clause to the default query when it reads source rows. The ORDER BY clause includes the number of ports specified, starting from the top of the transformation. When selected, the database sort order must match the session sort order.
  • Tracing Level: Sets the amount of detail included in the session log when we run a session containing this transformation.
  • Select Distinct: Specifies whether to select only unique rows. The Integration Service includes a SELECT DISTINCT statement if you choose this option.
  • Pre-SQL: Pre-session SQL commands that run against the source database before the Integration Service reads the source.
  • Post-SQL: Post-session SQL commands that run against the source database after the Integration Service writes to the target.
  • Output is Deterministic: Relational source or transformation output that does not change between session runs when the input data is consistent between runs. When we configure this property, the Integration Service does not stage source data for recovery if transformations in the pipeline always produce repeatable data.
  • Output is Repeatable: Relational source or transformation output that is in the same order between session runs when the order of the input data is consistent. When the output is both deterministic and repeatable, the Integration Service does not stage source data for recovery.
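
For the Pre-SQL and Post-SQL properties, commands like the following could be configured. This is only a minimal sketch assuming an Oracle source database; the tables STUD_STAGE and SRC_AUDIT are hypothetical names invented for the illustration.

-- Hypothetical Pre-SQL: runs against the source database before the read
TRUNCATE TABLE STUD_STAGE;
-- Hypothetical Post-SQL: runs against the source database after the target load
INSERT INTO SRC_AUDIT (RUN_DATE, RUN_STATUS) VALUES (SYSDATE, 'COMPLETED');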

 

Examples

In this example, we will modify the Source Qualifier of the mapping "m_stud_stud_target" so that, rather than returning all the columns, it returns only selected columns.

Step 1: Open the mapping "m_stud_stud_target" in the Mapping Designer.

Step 2: Double-click the Source Qualifier transformation "SQ_STUD". This opens the Edit Transformations window for it. Then:

  1. Click on the Properties tab.
  2. Click the modify option of the SQL Query property; this opens the SQL editor window.

Step 3: In the SQL editor window:

  1. Enter the following query:

SELECT STUDNO, SNAME, CLASS, SEC FROM STUD

Note: We are selecting only the columns STUDNO, SNAME, CLASS, and SEC from the source, so we have kept only those in the SELECT query.

  2. Click on the OK button.

Step 4: In the "Edit Transformations" window:

  1. Select the Ports tab.
  2. Under the Ports tab, you will see all the ports. Keep only the ports STUDNO, SNAME, CLASS, and SEC, and delete the other ports.

Step 5: After deleting the ports, click the OK button.

Now, click on the Properties tab in the Edit Transformations window again; only the selected columns remain. Open the SQL editor window once more, and:

  1. Confirm that the columns you have chosen are correct and ready to be loaded into the target table.
  2. Click on the OK button.

Save the mapping (using Ctrl+S) and execute the workflow. After execution, only the chosen columns will be loaded into the target.
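
As a quick sanity check after the run, a query like the following could be executed against the target database. The target table name STUD_TARGET is an assumption for this sketch; replace it with the actual name of your target definition.

-- Hypothetical verification query; STUD_TARGET is an assumed target table name
SELECT STUDNO, SNAME, CLASS, SEC FROM STUD_TARGET;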

In this way, we can override in the Source Qualifier which columns need to be fetched from the source, and this is the only way to control which specific columns are brought into the mapping.

So, this brings us to the end of the blog. This Tecklearn 'Source Qualifier Transformation in Informatica' blog helps you with commonly asked questions if you are looking out for a job in Informatica. If you wish to learn Informatica and build a career in the Data Warehouse and ETL domain, then check out our interactive Informatica Training, which comes with 24*7 support to guide you throughout your learning period. Please find the link for course details:

https://www.tecklearn.com/course/informatica-training-and-certification/

Informatica Training

About the Course

Tecklearn’s Informatica Training will help you master Data Integration concepts such as ETL and Data Mining using Informatica PowerCenter. It will also make you proficient in Advanced Transformations, Informatica Architecture, Data Migration, Performance Tuning, and Installation & Configuration of Informatica PowerCenter. You will get trained in Informatica Workflow Manager, data warehousing, Repository Management and other processes.

Why Should you take Informatica Training?

  • Informatica professionals earn up to $130,000 per year – Indeed.com
  • GE, eBay, PayPal, FedEx, EMC, Siemens, BNY Mellon & other top Fortune 500 companies use Informatica.
  • Key advantages of Informatica PowerCenter: Excellent GUI interfaces for Administration, ETL Design, Job Scheduling, Session monitoring, Debugging, etc.

What you will Learn in this Course?

Informatica PowerCenter 10 – An Overview

  • Informatica & Informatica Product Suite
  • Informatica PowerCenter as ETL Tool
  • Informatica PowerCenter Architecture
  • Component-based development techniques

Data Integration and Data Warehousing Fundamentals

  • Data Integration Concepts
  • Data Profile and Data Quality Management
  • ETL and ETL architecture
  • Brief on Data Warehousing

Informatica Installation and Configuration

  • Configuring the Informatica tool
  • How to install the Informatica operational administration activities and integration services

Informatica PowerCenter Transformations

  • Visualize PowerCenter Client Tools
  • Data Flow
  • Create and Execute Mapping
  • Transformations and their usage
  • Hands On

Informatica PowerCenter Tasks & Workflows

  • Informatica PowerCenter Workflow Manager
  • Reusability and Scheduling in Workflow Manager
  • Workflow Task and job handling
  • Flow within a Workflow
  • Components of Workflow Monitor

Advanced Transformations

  • Look Up Transformation
  • Active and Passive Transformation
  • Joiner Transformation
  • Types of Caches
  • Hands On

More Advanced Transformations – SQL (Pre-SQL and Post-SQL)

  • Load Types – Bulk, Normal
  • Reusable and Non-Reusable Sessions
  • Categories for Transformation
  • Various Types of Transformation – Filter, Expression, Update Strategy, Sorter, Router, XML, HTTP, Transaction Control

Various Types of Transformation – Rank, Union, Stored Procedure

  • Error Handling and Recovery in Informatica
  • High Availability and Failover in Informatica
  • Best Practices in Informatica
  • Debugger
  • Performance Tuning

Performance Tuning, Design Principles & Caches

  • Performance Tuning Methodology
  • Mapping design tips & tricks
  • Caching & Memory Optimization
  • Partition & Pushdown Optimization
  • Design Principles & Best Practices

Informatica PowerCenter Repository Management

  • Repository Manager tool (functionalities, create and delete, migrate components)
  • PowerCenter Repository Maintenance

Informatica Administration & Security

  • Features of PowerCenter 10
  • Overview of the PowerCenter Administration Console
  • Integration and repository service properties
  • Services in the Administration Console (services, handle locks)
  • Users and groups

Command Line Utilities

  • Infacmd, infasetup, pmcmd, pmrep
  • Automate tasks via command-line programs

More Advanced Transformations – XML

  • Java Transformation
  • HTTP Transformation

Got a question for us? Please mention it in the comments section and we will get back to you.
