Tuesday, July 01, 2014

dynamic lookup cache enabled with 2 matched new conditions but newLookupRow is 0

The issue was resolved by following the tips below:-
Dynamic lookup error

Here are common mistakes when configuring a dynamic lookup for insert/update functionality:
  • Associated ports are not linked to the correct input port
  • Datatype mismatches between the dynamic lookup in PowerCenter and the database table
  • "Insert Else Update" is not selected on the dynamic lookup's Properties tab
  • The lookup/output ports in the dynamic lookup are not sent out of the lookup (usually to a Router).  Some people send the input ports out, or have the Router get the data from the preceding transformation.  Unless the dynamic lookup sends the lookup/output port out, it will not detect a change.
  • Input of a column that will always result in an "update" scenario.  For instance, if your target table has a column named "CREATE_DATE" that is populated with SYSDATE at the time the record is loaded, do NOT send the CREATE_DATE column into the dynamic lookup.  The CREATE_DATE will, by definition, differ at run time from the data already loaded into the database.  Carry the CREATE_DATE from the transformation directly preceding the dynamic lookup into the Router, skipping over the dynamic lookup.


Nine times out of ten, the developer has simply misconfigured the dynamic lookup in one of the ways above.



In the scenario where the dynamic lookup always results in an update flag, check the following:
  • data type mismatches between the lookup and the database
  • associated ports are mismatched
  • input of a column that will always result in an "update."


In the unlikely scenario that all of the options above are configured correctly but you are still getting all "update" flags from newLookupRow, I would bet that you are doing some kind of mathematical calculation on a numeric column; frequently it is a dollar calculation. In some of these cases, a rounding error becomes the issue. For example, the database stores $25.58 in the REVENUE column and you believe the mapping is feeding $25.58 into the dynamic lookup, but in reality, because of the way the number is stored in the PowerCenter engine, the dynamic lookup is actually seeing $25.58000000000000001. Therefore, you get newLookupRow = 2. There are a couple of ways to fix this issue, and they can be found by searching my.informatica.com. Many times, enabling high-precision arithmetic will do the trick. Sometimes the issue is caused by an IIF statement that looks like this:

IIF(COUNTRY='USA', REVENUE, 0)

when in reality, it should be:

IIF(COUNTRY='USA', REVENUE, 0.00)



There are also some tricks using the scientific notation for zero (0e0) that you may want to look into.
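The comparison failure described above is easy to reproduce outside PowerCenter. Here is a small Python sketch (the dollar values are illustrative, not from any real mapping) showing how a value that displays as one amount can still fail an equality check, and two ways to make the comparison stable:

```python
from decimal import Decimal

# What the database holds for REVENUE, and what the mapping computed.
stored = 0.30
computed = 0.10 + 0.20          # displays as 0.3, but...

print(computed == stored)       # False: binary floats carry hidden digits
print(repr(computed))           # 0.30000000000000004

# Option 1: round both sides to the column's scale before comparing.
print(round(computed, 2) == round(stored, 2))   # True

# Option 2: do the arithmetic in decimal, the analogue of enabling
# high-precision arithmetic in the PowerCenter session.
print(Decimal("0.10") + Decimal("0.20") == Decimal("0.30"))   # True
```

The same effect is why the dynamic lookup can see $25.58000000000000001 where you expect $25.58.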



I have been working with PowerCenter for many, many years and have always found alleged 'bugs' in the dynamic lookup to be attributable to misconfiguration or rounding errors.

Labels:

Monday, June 10, 2013

HOW TO FIND WHETHER A STRING IS NUMERIC OR NOT IN ORACLE


A regular expression is one option, e.g. REGEXP_LIKE(my_col1, '^[0-9]+$') to test for digits only (note that '[[:alnum:]]' matches letters as well as digits, so it is not a numeric test by itself).
Alternatively, to test a string for numeric characters, you can use a combination of the LENGTH, TRIM, and TRANSLATE functions built into Oracle.

You can use the following command:

LENGTH(TRIM(TRANSLATE(string1, ' +-.0123456789', ' ')))

string1 is the string value that you are testing

This expression returns NULL if string1 is numeric.
It returns a value greater than 0 if string1 contains any non-numeric characters.

For example,

LENGTH(TRIM(TRANSLATE('123b', ' +-.0123456789',' '))); would return 1
LENGTH(TRIM(TRANSLATE('a123b', ' +-.0123456789',' '))); would return 2
LENGTH(TRIM(TRANSLATE('1256.54', ' +-.0123456789',' '))); would return null
LENGTH(TRIM(TRANSLATE('-56', ' +-.0123456789',' '))); would return null
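The TRANSLATE trick can be mimicked outside the database when you want to test the logic quickly; here is a Python sketch (the function name is my own) with Oracle's NULL-for-empty-string behaviour modelled as None:

```python
def non_numeric_count(string1):
    """Mimic Oracle's LENGTH(TRIM(TRANSLATE(string1, ' +-.0123456789', ' '))).

    TRANSLATE keeps spaces and deletes '+-.0123456789'; TRIM strips
    leading/trailing spaces; LENGTH('') is NULL in Oracle, i.e. None here.
    """
    leftover = string1.translate(str.maketrans("", "", "+-.0123456789")).strip()
    return len(leftover) or None

print(non_numeric_count("123b"))      # 1
print(non_numeric_count("a123b"))     # 2
print(non_numeric_count("1256.54"))   # None (numeric)
print(non_numeric_count("-56"))       # None (numeric)
```

One caveat that applies equally to the Oracle version: the character set admits '+', '-', '.', and spaces anywhere, so a string like '1.2.3' still passes as "numeric" under this test.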


HOW TO CHECK FOR A TABLE IN A PARTICULAR TABLESPACE BEFORE DROPPING IN ORACLE


select
  tab.table_name, tab.tablespace_name, tab.avg_row_len ROWCHN,
  obj.created CREATEDON, obj.last_ddl_time DDLCHNGON,
  tab.last_analyzed ANALYSEDON, obj.timestamp TIMESTAMP
from all_tables tab, all_objects obj
where
  tab.tablespace_name = 'REGN_BIG1_DAT_LM'
  and obj.object_type = 'TABLE'
  and obj.object_name = tab.table_name
  and obj.owner = tab.owner


HOW TO ESTIMATE THE SIZE OF A TABLE IN BYTES IN ORACLE

To find the number of bytes used by, and allocated to, a table:

sql > analyze table emp compute statistics;

sql > select num_rows * avg_row_len "Bytes Used" from dba_tables where table_name = 'EMP';

Bytes Used
-----------
560

sql > select bytes "Bytes Allocated" from dba_segments where segment_name = 'EMP';


Bytes Allocated
----------
524288

The result of the queries shows that the emp table is using 560 of the 524,288 bytes allocated to it. (On recent Oracle versions, DBMS_STATS.GATHER_TABLE_STATS is preferred over ANALYZE for gathering statistics.)


HOW TO FIND ALL GRANTS GIVEN TO A SPECIFIC USER (GRANTEE_USER) IN ORACLE

select * from dba_tab_privs where grantor = 'GRANTOR_USER'
and grantee = 'GRANTEE_USER';

Drop the grantor predicate to list grants from all grantors.



How to DROP multiple columns in Oracle

ALTER TABLE MY_TABLE
DROP (
MY_COL1,
MY_COL2
);


How to search in a CLOB field in Oracle

SELECT * FROM MYTABLE WHERE dbms_lob.INSTR(CLOBDATA, 'C00001880386') >0


How to use CASE in WHERE clause Oracle

SELECT * FROM parts p
WHERE
case when slogic = 'begins' and partnum like sPartStrp || '%'
     then 1
     when slogic = 'contains' and partnum like '%' || sPartStrp || '%'
     then 1
     when slogic = 'equals' and partnum = sPartStrp
     then 1
     when partnum like sPartStrp || '%'
     then 1
     else 0
 end = 1
and p.stock_type = 1


Wednesday, April 03, 2013

How to check whether PowerExchange Listener is running or not from PowerExchange Navigator?


It can be checked using the DTLREXE utility.

c:\Informatica\PowerExchange9.1.0>dtlrexe loc=MainFrameDEV prog=ping
PWX-00750 DTLREXE Input LOC=MainFrameDEV, PROG=PING, PARMS=, UID=<>.
PWX-33304 10.*.*.*:* : taskid: 0, operation timeout: (none) secs, heartbeat interval: 30 secs
PWX-00755 DTLREXE Command OK!

Pick up the loc value from dbmover.cfg.


How to get the filename associated with various data maps in mainframe using Informatica PowerExchange?



Create dtlurdmo.ini as follows:-
OUTPUT D:\MY_PWX_SCRIPTS\TGT;
USER mypwxuser;
PWD mypwxpassword;
TARGETUSER mypwxusertgt;
TARGETPWD mypwxpasswordtgt;
SOURCE MVSDEV;
TARGET local;
DETAIL;
VALIDATE;
DM_COPY;

Run it as:-
c:\Informatica\PowerExchange9.1.0>dtlurdmo D:\MY_PWX_SCRIPTS\dtlurdmo.ini

Check Detail.logs in PWX installation directory for the filenames.


Monday, January 21, 2013

HowToOracle:- How to insert an XML file into a CLOB field in an Oracle table

Step 1:- Create a sample table as follows:-
create table my_clob_table ( id VARCHAR2(32), data CLOB);

Step 2:-
Create a control file as follows:-
test.ctl
-----------------

LOAD DATA
INFILE *
APPEND
INTO TABLE my_clob_table
FIELDS TERMINATED BY ','
(
  id            CHAR(32),
  clob_filename FILLER CHAR,
  data          LOBFILE(clob_filename) TERMINATED BY EOF
)
BEGINDATA
eeeeee8a525e4e0e4e1,C:\TESTFOLDER\1810358.xml

Run sqlldr as:-
sqlldr userid=dbuser/dbpassword@orcl control=C:\TESTFOLDER\test.ctl

** Here C:\TESTFOLDER\1810358.xml is the XML file that you want to insert into the my_clob_table.data field.



Monday, August 20, 2012

Source System Analysis

Step – 1 - Go through the BRS (business requirement specification) and then:-
1. Determine whether the attributes the business is asking for already exist in the data warehouse. Sometimes you will find that you are already extracting that attribute into the data warehouse.
2. Determine your most relevant source. Most of the time you will get the required attributes from a single source, or some from source A and some from source B. But sometimes you may find that an attribute can be extracted from source A as well as source B. In that case, you need to decide which source to use; you may prefer a source that already feeds your data warehouse.

Step – 2 – Meet the source system team to learn the following:-
1. Which source tables are needed, and the primary key of each table. If more than one table is needed to fetch the data for a specific flow, what is the relation between the tables (PK-FK)?
2. The total volume of data the source receives in the source tables. Are there updates or not? Which column determines whether a record in the table is an update or an insert? Which date field tells when that insertion/update happened?
3. What will be the volume of data in the data warehouse for an incremental load and for the initial load?
4. What is the source system's retention period for the table data?
5. The business meanings of the entities. Collect the data dictionary if they have one. Find out the lists of values, if any.
6. Do they receive data in batches?

Step – 3 – Check back against your requirements:-
1. Did you get all the attributes from the source system as per the business requirement specification? If not, complete Step 1 once again. When you are satisfied, after meeting with the source system team, that there is no more information to extract from the source system, you can send your gap analysis to the business team. The gap analysis lists the attributes that are not present in the source system, so that business personnel are aware of what they can expect to get.

Step – 4 – Hands-on with the tables and attributes.
1. Get access to the source tables you will extract from, via the source system DBAs.
2. Do some random checks on the data, such as duplicates in date columns.
3. How many nulls or junk values are in certain attributes? Are the attributes drawn from the LOVs or not, etc.?
4. Relationships among the tables. This is basically verifying whether the information you got from the source system is correct. In case of any doubt anywhere, perform Step 2 and Step 3 again.
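The random checks in Step 4 can be scripted before you build any mappings. Below is a minimal sketch using an in-memory SQLite table; SRC_CUSTOMER and its columns are hypothetical stand-ins for your real source table:

```python
import sqlite3

# Hypothetical source table with one duplicate key and one null attribute.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE SRC_CUSTOMER (cust_id INTEGER, email TEXT, load_date TEXT);
    INSERT INTO SRC_CUSTOMER VALUES
        (1, 'a@x.com', '2012-08-01'),
        (1, 'a@x.com', '2012-08-01'),  -- duplicate key
        (2, NULL,      '2012-08-02');  -- null attribute
""")

# Duplicate check on the supposed primary key.
dupes = conn.execute("""
    SELECT cust_id, COUNT(*) FROM SRC_CUSTOMER
    GROUP BY cust_id HAVING COUNT(*) > 1
""").fetchall()
print(dupes)            # [(1, 2)]

# Null profile for an attribute.
nulls, total = conn.execute("""
    SELECT SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END), COUNT(*)
    FROM SRC_CUSTOMER
""").fetchone()
print(nulls, total)     # 1 3
```

The same two queries run as-is against Oracle once you point them at the real source table.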

Step -5 – Start the high-level design of the staging tables:-
1. First, determine whether you require a new staging table or can accommodate the new attributes in an existing table.
2. If you are going to amend an existing table with the new attributes, assess the impact on the loads that already use it.
3. In the case of a new staging table:-
a. Are you going to truncate and re-load?
b. Or will the table be append-only?
c. Which datatypes do you prefer for the fields?
d. Do you want any constraints there?
e. What about PK/FK or indexes?
f. Are you going to pull all columns as-is from the source, or only the fields you need per the BRS?
g. Did you think about a surrogate key?
Recommended: keep your staging table as similar as possible to the source flat file/table.

Step -6 – Start the high-level design of the DW tables:-
1. Put the most frequently accessed columns at the beginning.
2. Try to keep the date columns together, in date format.
3. Always try to assign a default value instead of leaving a column value as NULL.
4. Determine your partitioning and indexing strategy according to the queries that will be run on the table.
5. Determine the retention time for the table.
6. To make the DW table richer and reporting/BI easier, replace coded values with their LOV descriptions.
