Informatica Interview Questions


Informatica: Introduction

Informatica is best known for its PowerCenter product, which is widely used as an extraction, transformation, and loading (ETL) tool. It is used to build enterprise data warehouses. The components within Informatica help extract data from its sources so that it can be used for business requirements. Before the data can serve those requirements, however, it must first be transformed, and finally it is loaded into a target data warehouse.

Informatica: Career Prospects

Informatica is widely valued for efficient data processing, data partitioning, and bulk extraction. Together, these capabilities support high-quality work at scale in any large organization. Most business domains use Informatica tools, so career prospects in the field are extensive. But to land a good job in Informatica, you need to successfully crack the Informatica interview questions.

Cracking Informatica Interview Questions

Cracking Informatica interview questions is not rocket science. Since the web world is rapidly evolving, there are ever more challenges in the Informatica field, and you need to accept that you will never know it all. What you can do is brush up on your foundation: know and understand your basics. A clear grasp of the fundamentals of Informatica will give you an upper hand in the interview. Also, be open and honest about what you know and what you don't know; this gives interviewers a clear picture. At the same time, display a learning attitude. Make sure you tell the interviewer that you are willing to learn new things and pick up new skills that will further your personal and professional growth.

Following is a list of some Informatica interview questions and their answers.



We can generate sequence numbers by using a Sequence Generator transformation or an Expression transformation.
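Outside PowerCenter, the idea behind a Sequence Generator — handing out a monotonically increasing NEXTVAL for each row — can be sketched in plain Python. The `SequenceGenerator` class below is purely illustrative and not part of any Informatica API:

```python
import itertools

class SequenceGenerator:
    """Toy analogue of a Sequence Generator transformation:
    emits the next value of the sequence for each incoming row."""
    def __init__(self, start=1, increment=1):
        self._counter = itertools.count(start, increment)

    def nextval(self):
        return next(self._counter)

# Attach a surrogate key to each row as it passes through.
seq = SequenceGenerator(start=100, increment=1)
rows = [{"name": n, "id": seq.nextval()} for n in ["a", "b", "c"]]
print(rows)  # ids 100, 101, 102
```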
We can change a non-reusable transformation into a reusable transformation by selecting the transformation in the Navigator, dragging it into the mapping, and holding the Ctrl key just before releasing it into the mapping.
In a static cache, the cache memory is not refreshed even if records are inserted or updated in the lookup table; it is refreshed only on the next session run.
In a dynamic cache, the cache memory is refreshed as soon as a record is inserted or updated in the lookup table.
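The difference between the two cache modes can be sketched in Python. This is a simplified model, not Informatica's actual implementation: a dynamic cache inserts a missing row so that later lookups in the same run can see it, while a static cache stays frozen for the whole session run:

```python
class LookupCache:
    """Toy model of a lookup cache. In dynamic mode, a cache miss
    inserts the new row; in static mode the cache never changes."""
    def __init__(self, initial, dynamic=False):
        self.cache = dict(initial)
        self.dynamic = dynamic

    def lookup(self, key, row):
        if key in self.cache:
            return self.cache[key]
        if self.dynamic:
            self.cache[key] = row  # new row becomes visible to later lookups
        return None

static = LookupCache({1: "alice"}, dynamic=False)
dynamic = LookupCache({1: "alice"}, dynamic=True)
for c in (static, dynamic):
    c.lookup(2, "bob")  # miss: only the dynamic cache remembers it
print(2 in static.cache, 2 in dynamic.cache)  # False True
```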
Not directly; we cannot generate reports, but we can generate a metadata report by using the Informatica metadata-driven reporting tool.
Yes, there are difficulties I have found while working with flat files. Some of them are:
  1. We can't use a SQL override with flat files; we have to use transformations instead.
  2. Testing flat files is a very tedious job.
  3. We need to specify the correct path in the session and indicate whether the file is 'direct' or 'indirect', and we have to keep the file in the exact path mentioned in the session.
  4. If we miss the link to any column of the target, all the data will be placed in the wrong fields, and the missed column will not exist in the target data file.
ETL (Extract-Transform-Load) itself tells us what it does: extract, transform, and load data from source to destination for better decision making. Without ETL tools, we would have to do all of this manually by writing SQL code, which is not feasible for an end user; only expert programmers could do it. That process is tedious and cumbersome in many cases because it involves many resources and complex coding. ETL tools eliminate these difficulties because they are comfortable to use and offer many other advantages at every stage, such as visual flow, structured system design, operational resilience, impact analysis, data profiling and cleansing, and excellent performance.
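As a rough illustration of the three stages, here is a minimal extract-transform-load pipeline in Python. The in-memory CSV source and the cleansing rule (dropping negative amounts) are invented for the example:

```python
import csv
import io

# Extract: read raw rows from a (here, in-memory) CSV source.
source = io.StringIO("id,amount\n1,10\n2,-3\n3,25\n")

def extract(fh):
    return list(csv.DictReader(fh))

def transform(rows):
    # Transform: cast types and cleanse by dropping negative amounts.
    return [{"id": int(r["id"]), "amount": int(r["amount"])}
            for r in rows if int(r["amount"]) >= 0]

def load(rows, target):
    # Load: append the clean rows into the target store.
    target.extend(rows)

warehouse = []
load(transform(extract(source)), warehouse)
print(warehouse)  # row with amount -3 has been cleansed away
```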
The index cache stores the values of the ports used in the condition (for example, the lookup or join condition), while the data cache stores the values of the remaining connected output ports.
A blocking transformation is a multiple input group transformation that blocks incoming data: the Integration Service waits for a row from a different input group, because some multiple input group transformations require the Integration Service to block data at an input group. The Custom transformation and the Joiner transformation fall into this category.
DECODE is a special function that searches a port for a specified value. It is the better option when the number of conditions is large, because it is less costly than the IIF function, and it can be used in a SELECT statement, whereas the IIF function cannot.
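The value-matching style of DECODE versus a chain of IIF-like conditionals can be mimicked in Python. The `decode` helper below is a hypothetical analogue written for this example, not Informatica's implementation:

```python
def decode(value, *pairs, default=None):
    """Rough analogue of DECODE(value, s1, r1, s2, r2, ..., default):
    match the value against search/result pairs via one lookup."""
    mapping = dict(zip(pairs[::2], pairs[1::2]))
    return mapping.get(value, default)

def iif_chain(code):
    """The same logic written as nested IIF-style conditionals."""
    return ("East" if code == 1 else
            "West" if code == 2 else
            "Unknown")

print(decode(2, 1, "East", 2, "West", default="Unknown"))  # West
print(iif_chain(3))                                        # Unknown
```

With many conditions, the lookup-table form stays flat and readable, while the conditional chain grows one nesting level per condition — the readability point the answer above is making.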
A group of sessions executed either serially or in parallel by the Informatica server is called a batch.
There are two types of batches: a sequential batch runs its sessions one after another, while a concurrent batch runs its sessions at the same time.
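The two batch types can be modeled in Python as a sequential loop versus threads. This is a toy model — real sessions are heavyweight server-side jobs, not function calls:

```python
import threading

def run_session(name, log):
    """Stand-in for one session: just records its start and end."""
    log.append(f"start {name}")
    log.append(f"end {name}")

# Sequential batch: each session finishes before the next starts.
seq_log = []
for s in ["s1", "s2"]:
    run_session(s, seq_log)

# Concurrent batch: sessions run at the same time, so their
# start/end entries may interleave in any order.
con_log = []
threads = [threading.Thread(target=run_session, args=(s, con_log))
           for s in ["s1", "s2"]]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(seq_log)  # deterministic ordering
```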
The Rank transformation is an active and connected transformation. It is used to select the bottom or top rank of data, which means selecting the smallest or largest numeric value in a group or port. The rank index port is used by the Informatica server to store the rank position for each record; the Designer creates a rank index port automatically for each Rank transformation. For example, if you create a Rank transformation that ranks the top 10 products for each quarter, the rank index numbers the products from 1 to 10.
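The top-N-per-group behavior, with a rank-index-style position attached to each surviving row, can be sketched in Python. The `rank_top_n` helper and the sample data are invented for the example:

```python
from collections import defaultdict

def rank_top_n(rows, group_key, value_key, n=10):
    """Toy Rank transformation: keep the top n rows per group and
    attach a rank position (1 = largest value), like a rank index port."""
    groups = defaultdict(list)
    for r in rows:
        groups[r[group_key]].append(r)
    out = []
    for members in groups.values():
        ranked = sorted(members, key=lambda r: r[value_key], reverse=True)[:n]
        for i, r in enumerate(ranked, start=1):
            out.append({**r, "rankindex": i})
    return out

sales = [{"quarter": "Q1", "product": "A", "amt": 50},
         {"quarter": "Q1", "product": "B", "amt": 80},
         {"quarter": "Q1", "product": "C", "amt": 60}]
print(rank_top_n(sales, "quarter", "amt", n=2))  # B then C, ranks 1 and 2
```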
Polling means displaying updated data about a session, in a window, while the session is running. The Monitor window displays the status of each session when you poll the Informatica server.
Transaction Control is a connected and active transformation used to control the rollback and commit of transactions. We can define a transaction based on a varying number of input rows. It can be used at two levels: the mapping level and the session level.
The fact table is the centralized table in a star schema, which is why it is also called the central table. A fact table mainly contains facts, i.e., measurements (values), related to the data in the dimension tables. It typically has two types of columns: one type contains the facts (values) and the other contains foreign keys to the dimension tables. A composite key made up of all of its foreign keys acts as the primary key of the fact table.
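Here is a tiny in-memory illustration of that layout, with dimension tables keyed by surrogate keys and a fact table holding measure columns plus foreign keys into the dimensions. All table contents are made up for the example:

```python
# Two dimension tables, keyed by surrogate keys.
dim_product = {10: {"name": "Widget"}, 11: {"name": "Gadget"}}
dim_date = {20240101: {"quarter": "Q1"}}

# Fact table: measure columns (units, revenue) plus foreign keys.
# The combination (product_id, date_id) identifies each fact row.
fact_sales = [
    {"product_id": 10, "date_id": 20240101, "units": 5, "revenue": 50.0},
    {"product_id": 11, "date_id": 20240101, "units": 2, "revenue": 40.0},
]

# A star-join: resolve each foreign key against its dimension table.
report = [{"product": dim_product[f["product_id"]]["name"],
           "quarter": dim_date[f["date_id"]]["quarter"],
           "revenue": f["revenue"]}
          for f in fact_sales]
print(report)
```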
A data mart is a simple form of a data warehouse that is focused on a single subject or functional area, such as sales, finance, or marketing.
A data warehouse is a collection covering multiple functional areas; it is the central unit formed by combining all the data marts.