Informatica PowerCenter - concurrent target instances

0 votes
320 views

We have a situation where multiple instances of the same target, all loaded from a single source qualifier, execute in a different order across environments. When we promote a mapping from DEV to TEST and then run it in TEST, we see problems that do not occur in DEV.

For instance, we have a router with three groups (Insert, Update, and Delete), each followed by an update strategy that sets the row type accordingly, and then three target instances:

RTR ----> UPD_Insert -----> TGT_Insert
      \
       \__>   UPD_Update -------> TGT_Update
        \
         \__>  UPD_Delete --------->  TGT_Delete
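
Illustratively, the router group filter conditions in a mapping like this would be simple flag tests such as the following, where CHG_FLAG is a hypothetical port computed upstream:

    Insert group: CHG_FLAG = 'I'
    Update group: CHG_FLAG = 'U'
    Delete group: CHG_FLAG = 'D'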

When we test this with data that performs an insert, then an update, then a delete, all against the same primary key, the execution order in TEST differs from what we see with the same data in DEV.

Does anyone have any thoughts?

posted Jun 23, 2014 by Sachin


1 Answer

0 votes

You cannot control the load order as long as you have a single source. If you could separate the loads to use separate sources, the Target Load Order setting in the mapping could be used, or you could even create separate mappings for them. As it is now, you should use a single target instance and let the Update Strategy transformation determine the operation for each record passing through. You can then use a Sorter to define the order in which the different operations are applied to the physical table.
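
As a minimal sketch, assuming a hypothetical port CHG_FLAG that carries 'I', 'U', or 'D' from the upstream logic, the single Update Strategy transformation could flag each row with an expression like:

    IIF(CHG_FLAG = 'I', DD_INSERT,
    IIF(CHG_FLAG = 'U', DD_UPDATE,
    IIF(CHG_FLAG = 'D', DD_DELETE, DD_REJECT)))

A Sorter placed upstream (for example, sorting on the primary key plus a numeric operation sequence derived from CHG_FLAG, such as 1 for insert, 2 for update, 3 for delete) would then control the order in which the operations for the same key reach the table.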

answer Jun 26, 2014 by Shweta Singh
Similar Questions
+2 votes

I know Informatica Cloud has a JSON target connector, and I know you can build a JSON source in B2B DT Studio for PowerCenter, but I don't see in the version 9.0 documentation that PowerCenter can write directly to a JSON target. Is this a feature in later or future releases of PowerCenter?

+1 vote

In Informatica's PowerCenter, is it possible to set a parameter's value based on the output of a stored procedure? The parameter I want to set is a parameter I defined in a flat-file data object.

0 votes

I have some files that I would like to consolidate into a single database table. The files have similar but different formats. The files look something like this:

FileOne:
• ColA : string
• ColB : string
• ColC : string

FileTwo:
• ColAA : string
• ColBB : string
• ColCC : string

FileThree:
• Col01 : string
• Col02 : string
• Col03 : string

The destination table looks like this:

TableDestination:
• ColFirst : string
• ColSecond : string
• ColThird : string

I want to develop a mapping that ETLs these three files into this one database, but because the column names are different, it looks like I'll have to develop three different mappings, or three different sources, or three different somethings. The problem is that my example is contrived: I actually have many different files that all have different formats and column names, but the data is all very similar.

I would like to develop a single mapping or workflow that can handle all of this by only adding a table that holds the column mappings. Such a table would look like this based on the sample files and sample table above:

TableMappings:
• SourceFile : string
• SourceColumn : string
• DestinationColumn : string

with rows such as (FileOne, ColA, ColFirst), (FileTwo, ColAA, ColFirst), and (FileThree, Col01, ColFirst).

In this way, to edit a column mapping I only have to edit this TableMappings table. I wouldn't have to make any changes at all to the mapping or workflow, nor would I have to redeploy an application.

What would a mapping or workflow look like that could take advantage of something like this? I assume there'd be a flat-file source that reads files from a folder, something in the middle that uses this TableMappings table to map column names, and finally a relational data object that represents my destination table "TableDestination". I don't know how to put this together, though.
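
For the "something in the middle", here is a minimal conceptual sketch in Python of the metadata-driven renaming idea, outside PowerCenter itself; the file names and the TABLE_MAPPINGS contents are illustrative assumptions, and the print step stands in for the load into TableDestination:

    import csv

    # Illustrative stand-in for the TableMappings table:
    # source file -> {source column: destination column}.
    TABLE_MAPPINGS = {
        "FileOne.csv":   {"ColA": "ColFirst", "ColB": "ColSecond", "ColC": "ColThird"},
        "FileTwo.csv":   {"ColAA": "ColFirst", "ColBB": "ColSecond", "ColCC": "ColThird"},
        "FileThree.csv": {"Col01": "ColFirst", "Col02": "ColSecond", "Col03": "ColThird"},
    }

    def normalize(path):
        """Yield rows from path with columns renamed to the destination schema."""
        mapping = TABLE_MAPPINGS[path]
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # Rename each source column per the mapping; unmapped columns are dropped.
                yield {dest: row[src] for src, dest in mapping.items()}

    for path in TABLE_MAPPINGS:
        for row in normalize(path):
            print(row)  # a real flow would insert into TableDestination here

The point of the sketch is that handling a new file format only requires new rows in the mappings table, not a change to the mapping or workflow, which is the behavior the question is after.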

0 votes

What are the services which form the Informatica PowerCenter processing engine?

...