
1.

Explain the dimension.

Answer»

Dimension tables are tables in the star schema of a data warehouse that contain the keys, values, and attributes of dimensions. A dimension table generally contains the description, or textual information, about the facts stored in a fact table.

2.

What do you mean by the star schema?

Answer»

The star schema comprises one fact table and one or more dimension tables, and is considered the simplest data warehouse schema. It is so called because of its star-like shape, with radial points radiating from a center: a fact table sits at the core of the star, and dimension tables sit at its points. This approach is most commonly used to build data warehouses and dimensional data marts.
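The fact-table-plus-dimensions layout can be sketched with plain Python structures. This is a minimal illustration, not any real warehouse: the `fact_sales`, `dim_product`, and `dim_date` tables and all column names are hypothetical.

```python
# Minimal star-schema sketch: one fact table referencing two dimension tables.
# All table and column names here are illustrative.
dim_product = {1: {"name": "Widget", "category": "Tools"},
               2: {"name": "Gadget", "category": "Toys"}}
dim_date = {20240101: {"year": 2024, "month": 1}}

# The fact table stores keys and measures only; descriptive
# (textual) information lives in the dimension tables.
fact_sales = [
    {"product_key": 1, "date_key": 20240101, "amount": 9.99},
    {"product_key": 2, "date_key": 20240101, "amount": 4.50},
]

def denormalize(fact_rows):
    """Join each fact row to its dimensions, as a star-schema query would."""
    out = []
    for row in fact_rows:
        out.append({**row,
                    "product": dim_product[row["product_key"]]["name"],
                    "year": dim_date[row["date_key"]]["year"]})
    return out
```

Each query against the star joins the central fact table out to the dimensions it needs, which is why the shape stays simple even as measures accumulate.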

3.

What is the importance of partitioning a session?

Answer»

Parallel data processing improves PowerCenter performance through the Informatica PowerCenter Partitioning Option. With the partitioning option, a large data set can be divided into smaller parts that are processed in parallel, which improves overall performance. In addition to optimizing sessions, it helps improve server performance and efficiency.
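As a rough analogy (not PowerCenter's actual implementation), "divide a large data set into smaller parts and process them in parallel" can be sketched with Python's standard library; the per-partition work here is a stand-in:

```python
from concurrent.futures import ThreadPoolExecutor

def process_partition(rows):
    """Stand-in for a session's per-partition work: here, square each value."""
    return [r * r for r in rows]

def partitioned_run(data, n_partitions=4):
    # Divide the large data set into smaller parts...
    parts = [data[i::n_partitions] for i in range(n_partitions)]
    # ...and process the parts in parallel.
    with ThreadPoolExecutor(max_workers=n_partitions) as pool:
        results = pool.map(process_partition, parts)
    # Recombine the partition outputs.
    return sorted(x for part in results for x in part)
```

The benefit comes from the partitions being independent: each one can run on its own thread (or, in PowerCenter's case, its own processing thread managed by the DTM) without waiting on the others.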

4.

Explain complex mapping and write its features.

Answer»

A mapping is considered complex when it contains many requirements that rest on too many dependencies. A mapping can be complex even with just a few transformations; it doesn't need hundreds of them. A mapping becomes complex when the requirement carries many business rules and constraints. Complex mapping also encompasses slowly changing dimensions. Complex mapping has the following three features:

  • Large and complex requirements
  • Complex business logic
  • Several transformations
5.

What do you mean by incremental loading in Informatica?

Answer»

Unlike full data loading, where all of the data is processed every time it is loaded, incremental data loading involves loading only selected data (either updated or newly created) from the source to the target system. This method provides the following benefits:

  • ETL (extract, transform, and load) process overhead can be reduced by selectively loading data, reducing overall runtime.
  • Several factors can lead to an ETL load process failing or causing errors. Selectively processing the data reduces the likelihood of such risks.
  • Data accuracy is preserved in the historical record, so it becomes easy to determine the amount of data processed over time.
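A common way to implement the "only updated or newly created" selection is a watermark column. The sketch below assumes source rows carry a hypothetical `updated_at` field; it is an illustration of the pattern, not Informatica's mechanism:

```python
def incremental_load(source_rows, target, last_watermark):
    """Load only rows created or updated after the previous run's watermark.

    `source_rows` is a list of dicts with hypothetical 'id' and
    'updated_at' fields; `target` is a dict keyed by id.
    """
    new_watermark = last_watermark
    for row in source_rows:
        if row["updated_at"] > last_watermark:   # selective: skip unchanged rows
            target[row["id"]] = row              # insert or update in the target
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark  # persisted for the next incremental run
```

Persisting the returned watermark between runs is what keeps each load small: the next run starts where the previous one left off.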
6.

What are the different types of tasks in Informatica?

Answer»

The Workflow Manager allows you to create the following types of tasks so that you can design a workflow:

  • Assignment task: A value is assigned to a workflow variable via this task type.
  • Command task: This task executes a shell command during workflow execution.
  • Control task: It halts or aborts workflow execution.
  • Decision task: It describes a condition to be evaluated.
  • Email task: This is used during workflow execution to send emails.
  • Event-Raise task: This task notifies the Event-Wait task about the occurrence of an event.
  • Event-Wait task: It waits for an event to complete before executing the next task.
  • Session task: This task is used to run mappings created in the Designer.
  • Timer task: This task waits for an already timed event to occur.
7.

Describe workflow and write the components of a workflow manager.

Answer»

A workflow in Informatica is typically seen as a set of interconnected tasks that all need to execute in a specific order, or proper sequence. In every workflow, a start task, as well as the other tasks linked to it, is triggered when the workflow is executed. A workflow represents a business's internal routine practices, generates output data, and performs routine management tasks. The Workflow Monitor is the component of Informatica used to see how well a workflow is performing. You can create workflows both manually and automatically in the Workflow Manager.

To help you develop a workflow, the Workflow Manager offers the following tools:

  • Task Developer: This tool allows you to create workflow tasks.
  • Worklet Designer: The Worklet Designer is an option in the Workflow Manager that groups multiple tasks together to form a worklet, i.e., an object that bundles several tasks. Unlike workflows, worklets don't include scheduling information. It is possible to nest worklets inside workflows.
  • Workflow Designer: This tool creates workflows by connecting tasks with links. While developing a workflow, you can also create tasks in the Workflow Designer.
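The core idea of a workflow — tasks linked so that each runs only after its predecessors — can be sketched as a tiny dependency-ordered runner. Task names below are made up, and a real Workflow Manager does far more (scheduling, monitoring, recovery):

```python
def run_workflow(tasks, deps):
    """Return an execution order where every task runs after its dependencies.

    `deps` maps task -> set of prerequisite tasks (a small topological sort).
    """
    done, order = set(), []
    while len(done) < len(tasks):
        # A task is ready once all of its linked predecessors have finished.
        ready = [t for t in tasks if t not in done and deps.get(t, set()) <= done]
        if not ready:
            raise ValueError("cycle in workflow links")
        for t in ready:
            order.append(t)   # a real engine would execute the task here
            done.add(t)
    return order
```

The start task has no prerequisites, so it always runs first, matching the description above.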
8.

Explain what is DTM (Data transformation manager) Process.

Answer»

The PowerCenter Integration Service (PCIS) starts an operating system process, known as the DTM (Data Transformation Manager) process or pmdtm process, to run sessions. Its primary role is creating and managing the threads responsible for carrying out session tasks. Among the tasks performed by the DTM are:

  • Read the session information
  • Form dynamic partitions
  • Create partition groups
  • Validate code pages
  • Run the processing threads
  • Run post-session operations
  • Send post-session email
9.

Write difference between stop and abort options in workflow monitor.

Answer»
Stop option:

  • It stops the Integration Service from reading data from the source, but lets it finish processing and writing the data already read.
  • The data can be written and committed to the targets with this option.
  • It doesn't kill any processes; it waits for the services to complete their current work.

Abort option:

  • It fully terminates the currently running task.
  • It has a 60-second timeout: it waits up to 60 seconds for the DTM (Data Transformation Manager) process to finish processing, then kills the DTM process and terminates the active session.
  • There is no guarantee that processed data is committed to the targets.
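The contrast can be modeled as a toy state machine. This is purely illustrative — the `session` dict, its fields, and the functions below are invented for the example and do not correspond to any Informatica API:

```python
def stop(session):
    """Stop: quit reading from the source, but commit rows already processed."""
    session["reading"] = False
    session["committed"] = list(session["processed"])  # written data is committed
    return session

def abort(session, elapsed_seconds):
    """Abort: terminate immediately; after the 60-second timeout the DTM
    process is killed, and there is no guarantee of committed data."""
    session["reading"] = False
    if elapsed_seconds >= 60:
        session["killed"] = True       # DTM process terminated
    session["committed"] = []          # no indication of commitment
    return session
```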
10.

What is the difference between SQL Override and Lookup Override?

Answer»
Lookup Override:

  • It limits the number of lookup rows, avoiding a scan of the whole table and saving time and cache.
  • It applies the ORDER BY clause by default.
  • It supports only one kind of join, i.e., a non-equi join.
  • Even when multiple records match a single condition, it returns only one.

SQL Override:

  • It limits how many rows come into the mapping pipeline.
  • The ORDER BY clause must be added to the query manually when needed.
  • By writing the query, it can perform any kind of join.
  • It returns all matching records; restricting to one is not possible with SQL Override.
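The "one row versus all rows" difference can be illustrated with a toy in-memory table (the table and both functions are hypothetical stand-ins, not Informatica behavior verbatim):

```python
# Hypothetical lookup table as (key, value) pairs, with duplicate keys.
rows = [("A", 2), ("A", 1), ("B", 3)]

def lookup_style(key):
    """Lookup-Override-style: even if several rows match, return only one
    (sorted first, as a default ORDER BY would make deterministic)."""
    matches = sorted(v for k, v in rows if k == key)
    return matches[0] if matches else None

def sql_override_style(key):
    """SQL-Override-style: the query returns every matching row."""
    return sorted(v for k, v in rows if k == key)
```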
11.

Explain tracing level.

Answer»

In Informatica, tracing levels determine how much data is written to the session log as you execute a workflow. The tracing level is a very important component, as it aids error analysis and helps locate bugs in the process, and it can be set for every transformation; each transformation's properties window contains a tracing level option. There are four types of tracing levels:

  • Normal: Logs initialization and status information, errors encountered, and summaries of skipped rows.
  • Terse: Logs only initialization information, error messages, and notifications of rejected data.
  • Verbose Initialization: In addition to normal tracing, logs initialization details such as the names of index and data files used and detailed transformation statistics.
  • Verbose Data: In addition to verbose initialization tracing, logs each row that passes into the mapping.
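The mechanism — a per-transformation verbosity setting that filters what reaches the session log — is analogous to numeric log levels. The mapping and event tags below are an invented analogy, not Informatica's implementation:

```python
# Hypothetical mapping of Informatica tracing levels to increasing verbosity.
TRACING = {"Terse": 1, "Normal": 2, "Verbose Initialization": 3, "Verbose Data": 4}

def session_log(events, level):
    """Keep only events whose required verbosity is met by the chosen level.

    `events` is a list of (message, minimum_level_name) pairs.
    """
    threshold = TRACING[level]
    return [msg for msg, needed in events if TRACING[needed] <= threshold]
```

Raising the level never hides anything a lower level would show; it only adds detail, which is why Verbose Data produces the largest (and slowest) session logs.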

12.

What is the difference between Router and Filter?

Answer»

Router and Filter are types of transformations offered by Informatica. There are a few differences between them, as given below:

Router transformation:

  • Rows of data that don't meet the conditions are captured in a default output group.
  • It allows records to be divided into multiple groups based on the conditions specified.
  • It has a single input group and multiple output groups.
  • More than one condition can be specified.
  • Input rows and failed records are not blocked.

Filter transformation:

  • Data is tested for one condition, and rows that don't meet it are removed.
  • It doesn't divide records into groups.
  • It has a single input group and a single output group.
  • Only a single filter condition can be specified.
  • Records that fail the condition may be blocked (dropped).
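A compact way to see the difference is two small functions over plain rows. The group names and conditions are invented for the example; the key point is that the router keeps non-matching rows in a default group while the filter drops them:

```python
def router(rows, conditions):
    """Route each row to every group whose condition it meets; rows matching
    no group are captured in a 'DEFAULT' output group rather than dropped."""
    out = {name: [] for name in conditions}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, cond in conditions.items():
            if cond(row):
                out[name].append(row)
                matched = True
        if not matched:
            out["DEFAULT"].append(row)
    return out

def filter_transform(rows, condition):
    """Single condition; rows that fail it are simply removed."""
    return [r for r in rows if condition(r)]
```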
13.

What do you mean by mapplet in Informatica?

Answer»

A mapplet is a reusable object that contains a set of transformations and is usually created using the Mapplet Designer. Using it, you can reuse transformation logic across multiple mappings. Below are the two types of mapplets:

  • Active mapplet: This mapplet is created using an active transformation.
  • Passive mapplet: This mapplet is created using a passive transformation.
14.

What is the pmcmd command? How is it used?

Answer»

The Informatica features are accessed via four built-in command-line programs, as given below:

  • pmcmd: This command allows you to complete the following tasks:
    • Start workflows.
    • Start workflow from a specific task.
    • Stop, Abort workflows and Sessions.
    • Schedule the workflows.
  • infacmd: This command will let you access Informatica application services.
  • infasetup: Using this command, you can complete installation tasks such as defining a node or a domain.
  • pmrep: By using this command, you can list repository objects; create, edit, and delete groups; or restore and delete repositories. Overall, you can complete repository administration tasks.

In Informatica, the pmcmd command is used as follows:

  • Start workflows 
    • pmcmd startworkflow -service informatica-integration-Service -d domain-name -u user-name -p password -f folder-name -w workflow-name 
  • Start workflow from a specific task
    • pmcmd starttask -service informatica-integration-Service -d domain-name -u user-name -p password -f folder-name -w workflow-name -startfrom task-name 
  • Stop workflow and task
    • pmcmd stopworkflow -service informatica-integration-Service -d domain-name -u user-name -p password -f folder-name -w workflow-name 
    • pmcmd stoptask -service informatica-integration-Service -d domain-name -u user-name -p password -f folder-name -w workflow-name task-name 
  • Schedule the workflows 
    • pmcmd scheduleworkflow -service informatica-integration-Service -d domain-name -u user-name -p password -f folder-name -w workflow-name 
  • Aborting workflow and task
    • pmcmd abortworkflow -service informatica-integration-Service -d domain-name -u user-name -p password -f folder-name -w workflow-name 
    • pmcmd aborttask -service informatica-integration-Service -d domain-name -u user-name -p password -f folder-name -w workflow-name task-name 
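Since all of the invocations above share the same flag layout, a small helper that assembles the argument list makes the pattern explicit. This sketch only builds the list and mirrors the syntax shown above with the same placeholder values; it does not invoke pmcmd:

```python
def build_pmcmd(subcommand, service, domain, user, password, folder, workflow,
                extra=()):
    """Assemble a pmcmd invocation as an argument list.

    All values are placeholders; pass `extra` for trailing arguments such as
    ('-startfrom', 'task-name') or ('task-name',).
    """
    return ["pmcmd", subcommand,
            "-service", service, "-d", domain, "-u", user, "-p", password,
            "-f", folder, "-w", workflow, *extra]
```

Such a list could then be handed to a process runner (e.g., `subprocess.run`) on a machine where pmcmd is actually installed.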
15.

What is the difference between static and dynamic cache?

Answer»
Static cache:

  • A cache of this type is generated once and reused throughout a session.
  • It cannot be inserted into or updated during the session, so it remains unchanged.
  • Multiple matches can be handled.
  • It can be used with both flat-file and relational lookups.
  • It supports relational operators such as =, !=, <, and >.
  • It can be used in both connected and unconnected lookup transformations.

Dynamic cache:

  • During the session, data is continuously inserted into or updated in the cache.
  • The cache changes as data is added to or updated in the lookup, then passed on to the target.
  • Multiple matches can't be handled.
  • It can be used only with relational lookups.
  • It supports only the = operator.
  • It can be used only for connected lookups.
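The read-only versus insert-on-miss behavior can be illustrated with plain dicts. Both functions and the `NewLookupRow`-style flag strings are simplified stand-ins for illustration, not the PowerCenter API:

```python
def static_lookup(cache, key):
    """Static cache: built once, read-only for the rest of the session."""
    return cache.get(key)

def dynamic_lookup(cache, key, row):
    """Dynamic cache: if the key is missing, insert the row into the cache
    (in PowerCenter the row would also flow on to the target)."""
    if key not in cache:
        cache[key] = row          # the cache changes during the session
        return row, "inserted"
    return cache[key], "found"
```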
16.

What are different lookup caches?

Answer»

There are different types of Informatica lookup caches, such as static and dynamic. The following is a list of the caches:

  • Static Cache
  • Dynamic Cache
  • Persistent Cache
  • Shared Cache
  • Recache from database