This section includes curated Informatica interview questions with answers to sharpen your knowledge and support interview preparation.

1.

What is Domain in Informatica?

Answer»

PowerCenter services are administered and managed by the Informatica domain, which is the fundamental administrative unit in Informatica. A domain consists of nodes and services, and each service and node is organized into folders and sub-folders according to the administration requirements.

2.

What is the role of a repository manager?

Answer»

A repository manager is an administrative tool used to administer and manage repository folders, objects, groups, etc. A repository manager provides a way to navigate through multiple folders and repositories, as well as manage groups and user permissions.

3.

What is target load order?

Answer»

Target load order, also referred to as the target load plan, specifies the order in which target tables are loaded by the Integration Service. You can define a target load order based on the source qualifier transformations in a mapping. In Informatica, this lets you control the order in which data is loaded into targets when multiple source qualifier transformations are connected to multiple targets.
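
To make the idea concrete, here is a minimal plain-Python sketch (not Informatica code) of loading targets in a configured order; the table names and the load_target() helper are hypothetical, but they show why dimension targets are usually loaded before fact targets.

```python
# Minimal sketch of the target-load-order idea: targets fed by different
# source qualifiers are loaded in an explicit, configured sequence.
# Table names and load_target() are hypothetical stand-ins.

TARGET_LOAD_ORDER = ["DIM_CUSTOMER", "DIM_PRODUCT", "FACT_SALES"]  # parents before the fact table

def load_target(table_name, rows):
    """Stand-in for the work the Integration Service does per target."""
    print(f"Loading {len(rows)} row(s) into {table_name}")

def run_session(extracted):
    # 'extracted' maps each target to the rows produced by its source qualifier.
    for table_name in TARGET_LOAD_ORDER:      # honor the configured order
        load_target(table_name, extracted.get(table_name, []))

run_session({
    "DIM_CUSTOMER": [{"id": 1}],
    "DIM_PRODUCT": [{"id": 10}],
    "FACT_SALES": [{"cust_id": 1, "prod_id": 10, "amount": 99.0}],
})
```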

4.

Explain data driven sessions.

Answer»

In the Informatica server, data-driven properties determine how data is treated when an Update Strategy transformation is used in a mapping. When using an Update Strategy transformation, you must specify whether each record should be flagged with DD_UPDATE (constant for updating a record), DD_INSERT (constant for inserting a record), or DD_DELETE (constant for deleting a record). A mapping can contain more than one Update Strategy transformation, so the Data-Driven property must be specified in the session properties for that mapping in order for the session to execute successfully (a conceptual plain-Python sketch follows the example below).

Example: 

  • DD_UPDATE (Any time a record is marked as an update in the mapping, it will be updated in the target.)
  • DD_INSERT (Any time a record is marked as an insert in the mapping, it will be inserted in the target.)
  • DD_DELETE (Any time a record is marked as delete in the mapping, it will be deleted in the target.)
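
Below is a conceptual sketch in plain Python, not Informatica expression syntax (inside the transformation the flag is typically assigned with an expression such as IIF(condition, DD_UPDATE, DD_INSERT)). The tag_row() rule and sample rows are hypothetical; the point is that each row carries a flag and the writer acts on it.

```python
# Conceptual sketch of data-driven row handling. Each row is tagged the way an
# Update Strategy transformation would tag it, and the "writer" acts on the tag.
DD_INSERT, DD_UPDATE, DD_DELETE = 0, 1, 2   # the constants Informatica defines

def tag_row(row, existing_keys):
    """Hypothetical strategy: delete rows marked inactive, update known keys,
    insert new ones."""
    if row.get("inactive"):
        return DD_DELETE
    return DD_UPDATE if row["id"] in existing_keys else DD_INSERT

def write(row, flag, target):
    if flag == DD_INSERT:
        target[row["id"]] = row
    elif flag == DD_UPDATE:
        target[row["id"]].update(row)
    elif flag == DD_DELETE:
        target.pop(row["id"], None)

target = {1: {"id": 1, "name": "old"}}
for row in [{"id": 1, "name": "new"}, {"id": 2, "name": "fresh"}, {"id": 1, "inactive": True}]:
    write(row, tag_row(row, set(target)), target)
print(target)   # {2: {'id': 2, 'name': 'fresh'}} after one update, one insert, one delete
```
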
5.

Can we store previous session logs in Informatica? If yes, how?

Answer»

Yes, that is possible. If the session log is saved by timestamp, Informatica does not automatically overwrite earlier session logs; each run produces its own log file.

Click on Session Properties –> Config Object –> Log Options. 

The properties should be chosen as follows: 

  • Save session log by –> Session Runs
  • Save session log for these runs –> Set the number of log files you wish to keep (default is 0)
  • If you want to keep every log file generated by each run, choose Save session log by –> Session TimeStamp instead.

The properties listed above can be found in the session/workflow properties. 

6.

Name the output files that are created by the Informatica server at runtime.

Answer»

 During runtime, the Informatica server creates the following output files:

  • Informatica server log: This file is generally stored in Informatica's home directory and logs all status and error messages (default name: pm.server.log). An error log can also be generated for all error messages.
  • Session log file: A session log file is created for each session; it stores information about the session, such as the initialization process, the SQL commands created for readers and writers, errors encountered, and the load summary. The amount of detail in the session log file depends on the tracing level that you set.
  • Session detail file: Each target in a mapping has its own load statistics, such as the target name and the number of rows written or rejected. This file can be viewed by double-clicking the session in the Monitor window.
  • Performance detail file: Created by selecting the performance detail option on the session properties sheet; it includes session performance details that can be used to optimize performance.
  • Reject file: This file contains rows of data that the writer does not write to targets.
  • Control file: Created by the Informatica server when a session uses the external loader; it contains information about the target flat file, such as loading instructions for the external loader and the data format.
  • Post-session email: A post-session email automatically informs recipients about the session run. You can create two different messages: one for a session that succeeded and one for a session that failed.
  • Indicator file: When a flat file is used as a target, the Informatica server can create an indicator file. It contains a number indicating whether each target row was marked for insert, update, delete, or reject.
  • Output file: If a session writes to a target file, the Informatica server creates the target file based on the file properties entered in the session property sheet.
  • Cache files: The Informatica server also creates cache files when it creates a memory cache.
7.

Explain the difference between active and passive transformation.

Answer»

Transformation can be classified into two types: 

  • Active transformation: Can change the number of rows that pass from the source to the target, for example by eliminating rows that do not meet the transformation condition. It can also change the transaction boundary or the row type.
  • Passive transformation: Unlike active transformations, passive transformations do not change the number of rows, so every row passes from source to target unchanged in count. They also maintain the transaction boundary and row type. (A small sketch contrasting the two follows this list.)
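
To illustrate the distinction, here is a tiny plain-Python sketch (not Informatica objects; the sample rows and step names are made up): the filter-like step can change the row count, while the expression-like step returns exactly one output row per input row.

```python
# An "active" step may change how many rows come out; a "passive" step
# emits exactly one output row per input row.
rows = [{"amount": 50}, {"amount": 150}, {"amount": 300}]

def filter_step(data):       # behaves like an active (Filter) transformation
    return [r for r in data if r["amount"] > 100]

def expression_step(data):   # behaves like a passive (Expression) transformation
    return [{**r, "with_tax": round(r["amount"] * 1.1, 2)} for r in data]

print(len(filter_step(rows)))       # 2 rows out of 3 -> row count changed
print(len(expression_step(rows)))   # 3 rows out of 3 -> row count preserved
```
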
8.

An unconnected lookup can have how many input parameters?

Answer»

An unconnected lookup can take numerous input parameters. No matter how many parameters are passed in, the return value is always one. For instance, you can pass column 1, column 2, column 3, and column 4 to an unconnected lookup, but there is only one return value. 
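
As a rough analogy in plain Python (not Informatica syntax; the lookup name, ports, and sample data below are hypothetical), think of an unconnected lookup as a call such as :LKP.lkp_get_price(item_id, region, currency, order_date): however many arguments go in, exactly one value comes back.

```python
# Many input parameters, exactly one return value, just like an unconnected lookup.
PRICE_LOOKUP = {("A100", "EU"): 19.99, ("A100", "US"): 21.50}   # made-up lookup source

def lkp_get_price(item_id, region, currency="EUR", order_date=None):
    """Extra inputs may refine the lookup, but only one value is ever returned."""
    return PRICE_LOOKUP.get((item_id, region))

print(lkp_get_price("A100", "EU", "EUR", "2024-01-01"))   # -> 19.99
```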

9.

Write the difference between connected lookup and unconnected lookup.

Answer»

Lookup transformations can be used in both connected and unconnected modes. The following is a comparison of connected and unconnected lookup transformations (a loose plain-Python analogy follows the list): 

  • Data flow: A connected lookup receives input values directly from the mapping pipeline and takes part in the data flow; an unconnected lookup is not part of the pipeline and receives values only through the result of a :LKP expression.
  • Invocation: A connected lookup is used in one place in a mapping and cannot be called more than once; an unconnected lookup can be called multiple times within the same mapping, from any transformation that supports expressions.
  • Cache: A connected lookup supports both dynamic and static caches (a dynamic cache stays synchronized with the target as rows are inserted or updated); an unconnected lookup supports only a static cache.
  • Default values: A connected lookup supports user-defined default values; an unconnected lookup does not.
  • Return values: A connected lookup can return more than one column value (multiple output ports); an unconnected lookup returns only a single column value.
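
As a loose analogy in plain Python (hypothetical names and data, not Informatica syntax), a connected lookup behaves like a step wired into the pipeline that runs for every row, while an unconnected lookup behaves like a helper function that other steps call only when they need it:

```python
# Connected-style: the lookup step is wired into the pipeline and sees every row.
# Unconnected-style: a helper that is invoked only when an expression decides to.
RATES = {"EU": 1.1, "US": 1.0}          # made-up lookup source

def connected_lookup_step(rows):
    # Runs for every row as part of the data flow.
    return [{**r, "rate": RATES.get(r["region"])} for r in rows]

def unconnected_lookup(region):
    # Called on demand; always returns a single value.
    return RATES.get(region)

rows = [{"region": "EU", "amount": 10.0}, {"region": "US", "amount": 5.0}]
in_flow = connected_lookup_step(rows)                        # part of the pipeline
on_demand = [r["amount"] * unconnected_lookup(r["region"])   # called from an "expression"
             for r in rows if r["amount"] > 8]
print(in_flow, on_demand)
```
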
10.

Name the different types of transformations that are important.

Answer»

Transformations in Informatica are repository objects that transform the source data according to the needs of the target system and ensure that the quality of the loaded data is maintained. Informatica provides the following transformations to accomplish specific functionalities (a plain-Python sketch of a few of them follows the list): 

  • Aggregator transformation: An active transformation used to compute aggregates such as averages and sums, especially across multiple rows or groups.
  • Expression transformation: A passive transformation suitable for calculating values in a single row. It can also test conditional statements before rows are written to target tables or other transformations.
  • Filter transformation: An active transformation used to filter out rows in a mapping that do not meet the given condition.
  • Joiner transformation: An active transformation that joins data from different sources or from the same source.
  • Lookup transformation: Looks up data in a source, source qualifier, or target to retrieve relevant values. The results of the lookup are returned to another transformation or to the target object. A Lookup transformation is active when it can return more than one row and passive when it returns only a single row.
  • Normalizer transformation: An active transformation used to normalize records from COBOL sources, whose data is usually in denormalized format. It can transform a single row of data into multiple rows.
  • Rank transformation: An active transformation used to select top or bottom rankings.
  • Router transformation: An active transformation that provides multiple conditions for testing the source data.
  • Sorter transformation: An active transformation that sorts data by a field in ascending or descending order; it can also be configured for case-sensitive sorting.
  • Sequence Generator transformation: A passive transformation used to generate numeric values, typically unique primary keys or surrogate keys that uniquely identify each record in a table.
  • Source Qualifier transformation: An active transformation that reads rows from a flat-file or relational source while a session runs and adds them to the mapping. It also converts source data types into Informatica native data types.
  • Stored Procedure transformation: A passive transformation used to automate time-consuming or labor-intensive tasks. It is also used to handle errors, determine database space, drop and recreate indexes, and perform specialized calculations.
  • Update Strategy transformation: An active transformation used to update data in a target table, either to maintain its history or to incorporate recent updates.
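
The plain-Python sketch below (not PowerCenter objects; the sample rows are made up) mimics what a few of these transformations do to data: a Filter drops rows, an Aggregator groups and sums, a Sorter orders rows, and a Rank keeps the top N.

```python
# Rough data-shaping equivalents of Filter, Aggregator, Sorter, and Rank.
from collections import defaultdict

orders = [
    {"region": "EU", "amount": 120.0},
    {"region": "US", "amount": 80.0},
    {"region": "EU", "amount": 40.0},
    {"region": "US", "amount": 200.0},
]

# Filter: drop rows that fail the condition (active - row count can shrink).
filtered = [r for r in orders if r["amount"] >= 100]

# Aggregator: sum amounts per group (active - one row per group).
totals = defaultdict(float)
for r in orders:
    totals[r["region"]] += r["amount"]

# Sorter: order rows by a field, descending.
ordered = sorted(orders, key=lambda r: r["amount"], reverse=True)

# Rank: keep the top-2 rows by amount.
top_2 = ordered[:2]

print(filtered, dict(totals), top_2, sep="\n")
```
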
11.

What is Informatica PowerCenter? Write its components

Answer»

Informatica PowerCenter is an ETL tool that enables data integration (combining data from different sources into a single dataset). It provides the ability to extract, transform, and load data from heterogeneous OLTP (Online Transaction Processing) source systems in order to build enterprise data warehouses. The US Air Force, Allianz, Fannie Mae, ING, and Samsung are among the top clients using Informatica PowerCenter, and it has a wide range of applications. The 10.x releases are the most recent major versions of Informatica PowerCenter. Informatica PowerCenter is available in the following editions: 

  • Standard edition 
  • Advanced edition 
  • Premium edition 

PowerCenter consists of the following important components: 

  • PowerCenter Service 
  • PowerCenter Clients 
  • PowerCenter Repository 
  • PowerCenter Domain 
  • Repository Service 
  • Integration Service 
  • PowerCenter Administration Console 
  • Web Service Hub 
12.

What is ETL (Extract, Transform, Load)? Write some ETL tools.

Answer»

Essentially, ETL stands for extract, transform, and load. The ETL process involves extracting data from different databases, transforming it, and loading it into the target database or file. It forms the basis of a data warehouse. Here are a few ETL tools: 

  • IBM Datastage
  • Informatica PowerCenter
  • Ab Initio
  • Talend Studio, etc.

It performs the following functions (a toy plain-Python sketch follows the list):   

  • Obtains data from sources
  • Analyzes, transforms, and cleans up data
  • Indexes and summarizes data
  • Loads data into the warehouse
  • Monitors changes to source data needed for the warehouse
  • Restructures keys
  • Keeps track of metadata
  • Updates data in the warehouse
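
The end-to-end flow can be sketched in a few lines of plain Python (a toy example with made-up rows and an in-memory SQLite table standing in for the warehouse, not how a production tool would be used):

```python
# Toy ETL: extract rows from a pretend OLTP source, transform them, load them
# into an in-memory SQLite table that stands in for the warehouse.
import sqlite3

def extract():
    # Pretend these rows came from an OLTP source system.
    return [("alice", "2024-01-03", "129.90"),
            ("bob", "2024-01-04", "  80.00 "),
            ("eve", None, "15.50")]

def transform(rows):
    # Clean and reshape: drop rows with no sale date, cast amounts to float.
    return [(name, sale_date, float(amount)) for name, sale_date, amount in rows if sale_date]

def load(rows):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (customer TEXT, sale_date TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    return con.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()

print(load(transform(extract())))   # -> (2, 209.9)
```
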
13.

What do you mean by Enterprise data warehouse?

Answer»

A data warehouse (DW) or enterprise data warehouse (EDW), a form of corporate data repository, stores and manages enterprise data and information collected from multiple sources. The collected data is made available for analysis and business intelligence, to derive valuable business insights, and to improve data-driven decision-making. Users across the organization (with the appropriate privileges) can access and use the data it contains. With an EDW, data is accessed through a single point and delivered to the server via a single source.