Wednesday, September 25, 2013

How to Unit Test OBIA Informatica Changes

I've started getting into doing a bit of development with Informatica on our OBIA install (for reference, we are using OBIA 7.9.6.3 + Extension pack). Part of the "best practices" setup by our Oracle consultant was to create a separate sandbox folder any time we made a change to an Informatica mapping. The difficulty I had was how to test the workflow after making my changes: I can't run my change via DAC without creating a separate execution plan, and I can't test directly in Informatica, as it references a parameter file that exists on the server.

The solution was to extract the command that DAC would run (pmcmd), create a temporary parameter file, and then run my new mapping from my sandbox folder. And as always, I've created a helpful script to automate this!

Read on for an explanation of what my script does, and a copy of said script.


Note, this is probably fairly specific to our OBIA setup, but it should be tweakable to work under yours.

First, the script, infaUnitTest.sh


Now for the explanation.

The script takes the following parameters:

-u    Infa user name that owns the sandbox folder.
-p    Password for the infa user that owns the sandbox folder.
-s    Name of the Informatica sandbox folder where your files are stored.
-w    Name of the Informatica workflow to run.
-t    Truncate the target table(s) specified in the workflow.
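A minimal sketch of how the option parsing might look. The flag letters come from the table above; the variable names and the demo values passed to `parse_args` are my own assumptions, not taken from the actual script.

```shell
# Parse the script's flags into shell variables (names are assumptions).
parse_args() {
  TRUNCATE=false
  while getopts "u:p:s:w:t" opt; do
    case "$opt" in
      u) INFA_USER="$OPTARG" ;;
      p) INFA_PASS="$OPTARG" ;;
      s) SANDBOX="$OPTARG" ;;
      w) WORKFLOW_NAME="$OPTARG" ;;
      t) TRUNCATE=true ;;
      *) echo "Usage: $0 -u user -p pass -s sandbox -w workflow [-t]" >&2
         return 1 ;;
    esac
  done
}

# Example invocation (all values illustrative):
parse_args -u Administrator -p secret -s SDE_Sandbox -w SDE_ORA_SalesOrderFact -t
```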

Parameter File Creation


The script will then go into the DacParameters directory, in my case this is:

/apps/applinfa/infa_910/server/infa_shared/DacParameters

It will then search for any parameter files that mention your workflow name within the file.
If it finds one, it copies it to a new file (prefixed with your sandbox folder name), then updates the new parameter file, replacing the existing ORAR1213 folder references with your sandbox folder name.
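The parameter-file step could be sketched like this. The directory here is a throwaway stand-in for the real DacParameters path, and the workflow/sandbox names and demo file content are illustrative assumptions, not taken from the actual script.

```shell
# Stand-in for .../infa_shared/DacParameters (real script uses the server path)
DAC_PARAM_DIR=$(mktemp -d)
SANDBOX="SDE_Sandbox"
WORKFLOW_NAME="SDE_ORA_SalesOrderFact"

# Fake DAC parameter file, just for this demo
cat > "$DAC_PARAM_DIR/ORAR1213.SDE_ORA_SalesOrderFact.txt" <<EOF
[ORAR1213.SDE_ORA_SalesOrderFact]
\$DBConnection_OLTP=ORA_R12
EOF

# Find a parameter file that mentions the workflow, then write a copy
# prefixed with the sandbox name, swapping the folder references as we go.
SRC_FILE=$(grep -l "$WORKFLOW_NAME" "$DAC_PARAM_DIR"/* | head -1)
NEW_FILE="$DAC_PARAM_DIR/${SANDBOX}.$(basename "$SRC_FILE")"
sed "s/ORAR1213/$SANDBOX/g" "$SRC_FILE" > "$NEW_FILE"
```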

Truncate Tables

If you supplied the truncate table option, the script performs some additional steps.

First it will determine the target tables by querying the Informatica repository for your workflow. It does this with the following SQL:

select distinct opb_targ.target_name
from opb_subject join opb_task on opb_subject.subj_id = opb_task.subject_id
     join (
               select workflow_id, instance_id, task_id, task_type, instance_name, max (version_number)
               from opb_task_inst sess
               where sess.task_type = 68
               group by workflow_id, instance_id, task_id, task_type, instance_name
          ) tasks on opb_task.task_id = tasks.workflow_id
     join (
               select session_id, mapping_id, max (version_number)
               from opb_session
               group by session_id, mapping_id
          ) sessions on tasks.task_id = sessions.session_id
     join opb_mapping on sessions.mapping_id = opb_mapping.mapping_id
     join opb_widget_inst on opb_mapping.mapping_id = opb_widget_inst.mapping_id
     join opb_targ on opb_widget_inst.widget_id = opb_targ.target_id
where opb_task.is_visible = 1
  and opb_task.task_type = 71
  and opb_mapping.is_visible = 1
  and opb_targ.is_visible = 1
  and opb_widget_inst.version_number = (select max(owi.version_number) from opb_widget_inst owi where owi.mapping_id = opb_mapping.mapping_id)
  and opb_subject.subj_name = '$SANDBOX'
  and opb_task.task_name = '$WORKFLOW_NAME';
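A sketch of how that query might be fed to SQL*Plus and the table list captured. The repository connect string is a placeholder, and for brevity the query body is stubbed with a comment here; in the real script the full query above (with `$SANDBOX` and `$WORKFLOW_NAME` expanded) goes in its place.

```shell
# Build a throwaway SQL*Plus script holding the repository query
SANDBOX="SDE_Sandbox"
WORKFLOW_NAME="SDE_ORA_SalesOrderFact"
SQL_FILE=$(mktemp)

cat > "$SQL_FILE" <<EOF
set heading off feedback off pagesize 0
-- (full repository query from above goes here; the last two predicates
--  expand to:)
--   and opb_subject.subj_name = '$SANDBOX'
--   and opb_task.task_name = '$WORKFLOW_NAME';
exit;
EOF

# On the server, something like:
#   TARGET_TABLES=$(sqlplus -s infa_rep/password@REPDB @"$SQL_FILE")
```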

The script then uses this target table list to:

  • Truncate the list of tables.
  • Determine if there are any indexes on those tables, and if there are:
    • Extract the DDL for said indexes into a script
    • Drop the indexes
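The steps above could be sketched as follows, assuming `$TARGET_TABLES` holds the list returned by the repository query. The warehouse schema (`DWH`), table name, and connect details are placeholders; `dbms_metadata.get_ddl` spools out the CREATE INDEX statements so the indexes can be reapplied after the load.

```shell
TARGET_TABLES="W_SALES_ORDER_F"   # demo value; normally from the repo query
DDL_FILE=$(mktemp)                # will hold CREATE INDEX DDL for later
SQL_FILE=$(mktemp)

for TABLE in $TARGET_TABLES; do
  cat >> "$SQL_FILE" <<EOF
-- Save CREATE INDEX DDL so the indexes can be reapplied afterwards
spool $DDL_FILE append
select dbms_metadata.get_ddl('INDEX', index_name, owner) || ';'
  from all_indexes
 where table_name = '$TABLE' and owner = 'DWH';
spool off

-- Drop the indexes, then truncate the table
begin
  for idx in (select index_name from all_indexes
               where table_name = '$TABLE' and owner = 'DWH') loop
    execute immediate 'drop index DWH.' || idx.index_name;
  end loop;
end;
/
truncate table DWH.$TABLE;
EOF
done

# On the server:  sqlplus -s dwh_user/password@DWHDB @"$SQL_FILE"
```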
Next, the script executes pmcmd to run the workflow. It will then pause and wait until the workflow completes.
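The pmcmd call might be assembled along these lines. The Integration Service name (`INFA_IS`), domain (`Domain_infa`), and parameter-file path are placeholders for whatever your environment uses; `-wait` is what makes pmcmd block until the workflow finishes.

```shell
# Demo values; a real run would take these from the script's flags
INFA_USER="Administrator"; INFA_PASS="secret"
SANDBOX="SDE_Sandbox"; WORKFLOW_NAME="SDE_ORA_SalesOrderFact"
PARAM_FILE="/apps/applinfa/infa_910/server/infa_shared/DacParameters/${SANDBOX}.${WORKFLOW_NAME}.txt"

# Assemble the pmcmd arguments (service/domain names are assumptions)
PMCMD_ARGS="startworkflow -sv INFA_IS -d Domain_infa \
-u $INFA_USER -p $INFA_PASS -f $SANDBOX \
-paramfile $PARAM_FILE -wait $WORKFLOW_NAME"

# On the server this would be run as:
#   pmcmd $PMCMD_ARGS
echo "pmcmd $PMCMD_ARGS"
```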

Finally, if you had previously truncated the tables, it will run the saved DDL to reapply the indexes.

You should be able to monitor the run using Workflow Monitor, and then query the tables when it completes to do data validation.
