5/29/12

Sandy's DataStage Notes
datastage-notes.blogspot.in

This blog is dedicated to everyone who shared the information that has helped us a lot. Most of this material is collected from notes, documents, forums and blogs too numerous to credit one by one, because its main purpose is to serve as my personal notes on every document I found while learning this great stuff. BIG thanks for all the knowledge that has been shared, and success to all of you.

FRIDAY, FEBRUARY 3, 2012

Ten Reasons Why You Need DataStage 8.5


Source: it.toolbox.com - Vincent. I have taken a look through the new functions and capabilities of DataStage 8.5 and come up with a top ten list of reasons to upgrade to it. Information Server 8.5 came out a couple of weeks ago and is currently available on IBM Passport Advantage for existing customers and from IBM PartnerWorld for IM partners. The XML pack described below is available as a separate download from the IBM Fix Central website. This is a list of the ten best things in DataStage 8.5. Most of these are improvements in DataStage Parallel Jobs only, while a couple of them will help Server Job customers as well.

1. IT'S FASTER
Faster, faster, faster. A lot of tasks in DataStage 8.5 are at least 40% faster than in 8.1: starting DataStage, opening a job, running a Parallel job and runtime performance have all improved.

2. IT IS NOW AN XML ETL TOOL


Previous versions of DataStage were mediocre at processing XML. DataStage 8.5 is a great XML processing tool. It can open, understand and store XML schema files. I did a longer post about just this pack in New Hierarchical Transformer Makes DataStage a Great XML Tool, and if you have XML files without schemas you can follow a tip at the DataStage Real Time blog: The new XML Pack in 8.5 - generating XSDs. The new XML read and transform stages are much better at reading large and complex XML files and processing them in parallel:


3. TRANSFORMER LOOPING


The best Transformer yet. The DataStage 8.5 parallel transformer is the best version yet, thanks to new functions for looping inside a transformer and performing transformations across a grouping of records. With looping inside a Transformer you can output multiple rows for each input row. In this example a record has a company name and four revenue sales figures for four regions; the loop goes through each column and outputs a row for each value if it is populated:
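The screenshot that followed this paragraph is not reproduced here, but the looping idea can be sketched outside the tool. This is a minimal awk sketch, not DataStage syntax: it assumes a comma-separated input with a company name followed by four regional revenue columns, and emits one output row per populated value, much as the Transformer loop does with @ITERATION:

```shell
# Sketch only: emulates Transformer looping outside DataStage.
# Input: company,rev1,rev2,rev3,rev4 (a blank field = not populated)
printf 'Acme,100,,300,50\nGlobex,,,,\n' |
awk -F, '{
  for (i = 2; i <= 5; i++)        # loop over the four revenue columns
    if ($i != "")                 # only output populated values
      print $1 "," (i-1) "," $i   # company, region number, revenue
}'
# Acme,1,100
# Acme,3,300
# Acme,4,50
```

Note that the unpopulated Globex row produces no output at all, just as the Transformer loop condition would suppress it.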


Transformer Remembering. The DataStage 8.5 Transformer has remembering and key change detection, which is something that ETL experts have been manually coding into DataStage for years using some well known workarounds. A key change in a DataStage job involves a group of records with a shared key where you want to process that group as a type of array inside the overall recordset. I am going to make a longer post about that later, but there are two new cache objects inside a Transformer, SaveInputRecord() and GetSavedInputRecord(), where you can save a record and retrieve it later on to compare two or more records inside a Transformer. There are new system variables for looping and key change detection: @ITERATION; LastRow(), which indicates the last row in a job; and LastTwoInGroup(InputColumn), which indicates that a particular column value will change in the next record. Here is an aggregation example where rows are looped through and an aggregate row is written out when the key changes:
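The aggregation screenshot is missing from this copy, but the key-change pattern it showed can be sketched outside the tool. A minimal awk sketch, assuming the input is already sorted on the key (as it would be on the Transformer's input link); this illustrates the pattern, not the actual SaveInputRecord()/LastTwoInGroup() syntax:

```shell
# Sketch only: emit an aggregate row each time the group key changes.
# Input must be sorted on the key column.
printf 'A,10\nA,20\nB,5\nB,7\nB,8\n' |
awk -F, '
  NR > 1 && $1 != key { print key "," sum; sum = 0 }  # key changed: write aggregate
  { key = $1; sum += $2 }                             # remember key, accumulate
  END { if (NR) print key "," sum }                   # flush the final group
'
# A,30
# B,20
```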

4. EASY TO INSTALL
Easier to install and more robust. DataStage 8.5 has the best installer of any version of DataStage ever. Mind you, I jumped aboard the DataStage train in version 3.6, so I cannot vouch for earlier installers, but 8.5 has the best wizard, the best pre-requisite checking and the best recovery. It also has the IBM Support Assistant packs for Information Server that make debugging and reporting of PMRs to IBM much easier. There is also a Guide to Migrating to InfoSphere Information Server 8.5 that explains how to migrate from most earlier versions. See my earlier blog post Why Information Server 8.5 is Easier to Install than Information Server 8.1. Patch Merge - that's right, patch merge. The new installer has the ability to merge patches and fixes into the install for easier management of patches and fixes.

5. CHECK IN AND CHECK OUT JOBS


Check in and Check out version control. DataStage 8.5 Manager comes with direct access to the source control functions of CVS and Rational ClearCase in an Eclipse workspace. You can send artefacts to the source control system and replace a DataStage component from out of the source control system.

DataStage 8.5 comes with out of the box menu integration with CVS and Rational ClearCase, but for other source control systems you need to use the Eclipse source control plugins.

6. HIGH AVAILABILITY EASIER THAN EVER


High Availability: the version 8.5 installation guide has over thirty pages on Information Server topologies, including a bunch of high availability scenarios across all tiers of the product. On top of that there are new chapters for the high availability of the metadata repository, the services layer and the DataStage engine:

Horizontal and vertical scaling and load balancing. Cluster support for WebSphere Application Server. Cluster support for the XMETA repository: DB2 HADR/Cluster or Oracle RAC. Improved failover support on the engine.

7. NEW INFORMATION ARCHITECTURE DIAGRAMMING TOOL


InfoSphere Blueprint Director: DataStage 8.5 comes with a free new product for creating diagrams of an information architecture and linking elements in the diagram directly into DataStage jobs and Metadata Workbench metadata. Solution Architects can draw a diagram of a data integration solution including sources, warehouses and repositories.

8. VERTICAL PIVOT
There are people out there who have been campaigning for vertical pivot for a long time - you know who you are! It is now available, and it can pivot multiple input rows with a common key into output rows with multiple columns. It offers key based groups, columnar pivot and aggregate functions. You can also do this type of vertical pivoting in the new Transformer using the column change detection and row cache, but the Vertical Pivot stage makes it easier as a specialised stage.
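As a rough illustration of what the stage does (not its actual configuration), a vertical pivot can be sketched in awk, assuming rows arrive grouped by key and pivoted values are simply appended in arrival order:

```shell
# Sketch only: pivot multiple rows sharing a key into one row per key,
# with one column per input value. Input must be grouped on the key.
printf 'A,10\nA,20\nB,5\nB,7\n' |
awk -F, '
  $1 != key { if (NR > 1) print row; key = $1; row = $1 }  # new key: flush, restart row
  { row = row "," $2 }                                     # append value as a new column
  END { if (NR) print row }                                # flush the last key
'
# A,10,20
# B,5,7
```

A real Vertical Pivot stage also handles ragged groups and aggregate functions; this sketch only shows the row-to-column reshaping.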

9. Z/OS FILE STAGE


Makes it easier to process complex flat files by providing native support for mainframe files. Use it for VSAM files (KSDS, ESDS, RRDS) and sequential QSAM, BDAM and BSAM files, with fixed and variable length records and single or multiple record type files.

10. BALANCED OPTIMIZER COMES HOME


In DataStage 8.5 the Balanced Optimizer has been merged into the Designer, and it has a number of usability improvements that turn DataStage into a better ETLT or ELT option. Balanced Optimizer looks at a normal DataStage job and comes up with a version that pushes some of the steps down onto a source or target database engine; i.e. it balances the load across the ETL engine and the database engines. Version 8.5 has improved logging, improved impact analysis support and easier management of optimised versions of jobs in terms of creating, deleting, renaming, moving, compiling and deploying them.


Posted by sandy.ph at 10:08 AM Labels: datastage 8.5


WEDNESDAY, SEPTEMBER 21, 2011

UNIX Script to execute DataStage job


From Kenneth Bland:
#!/bin/ksh
################################################################################
#### FILE: MasterControl.ksh
####
#### DESCRIPTION: Starts a DataStage MasterControl type job passing
####              all runtime parameter values
####
#### Date       Version Developer   Description
#### ---------- ------- ----------- ------------------------------------------
#### 2002-05-15 1.0     Ken Bland   Initial release
#### 2002-06-27 2.2     Ken Bland   FileSetDate/ProcessDate modifications
#### 2002-07-10 2.2     Steve Boyce Added ProcessDate as 14th parameter
#### 2002-08-16 2.3     Steve Boyce Now calls stored procedure
####                                GET_NEW_BATCH_NBR in datastage schema
####                                instead of deriving it and inserting
####                                it here.
####                                Uses comSQLPlus.ksh and comPLSQL.ksh
####                                instead of SQLPlusStub.ksh
################################################################################

PROG=`basename ${0}`
EXIT_STATUS=0
NOW=`date`
echo "${NOW} ${PROG} Initialization..."
echo

#### CONFIGURATION #############################################################

if [ ${#} -ne 14 ] ; then
   echo "${NOW} ${PROG} : Invalid parameter list."
   echo "${NOW} ${PROG} : The script needs 14 parameters:"
   echo "${NOW} ${PROG} :    JobName"
   echo "${NOW} ${PROG} :    ParameterFile"
   echo "${NOW} ${PROG} :    FileSetDate (YYYY-MM-DD)"
   echo "${NOW} ${PROG} :    BatchNumber"
   echo "${NOW} ${PROG} :    JobHierarchyFile"
   echo "${NOW} ${PROG} :    SourceSystemList"
   echo "${NOW} ${PROG} :    SubjectAreaList"
   echo "${NOW} ${PROG} :    ClearWorkArea"
   echo "${NOW} ${PROG} :    StartingMilestone"
   echo "${NOW} ${PROG} :    EndingMilestone"
   echo "${NOW} ${PROG} :    DebugMode"
   echo "${NOW} ${PROG} :    JobLinkStatisticChecksFile"
   echo "${NOW} ${PROG} :    ResurrectLogFile"
   echo "${NOW} ${PROG} :    ProcessDate (NULL|YYYY-MM-DD HH24:MI:SS)"
   exit 99
fi

JobName="${1}"
ParameterFile="${2}"
FileSetDate="${3}"
BatchNumber="${4}"
JobHierarchyFile="${5}"
SourceSystemList="${6}"
SubjectAreaList="${7}"
ClearWorkArea="${8}"
StartingMilestone="${9}"
EndingMilestone="${10}"
DebugMode="${11}"
JobLinkStatisticChecksFile="${12}"
ResurrectLogFile="${13}"
ProcessDate="${14}"

echo "${NOW} ${PROG} JobName ${JobName}"
echo "${NOW} ${PROG} ParameterFile ${ParameterFile}"
echo "${NOW} ${PROG} FileSetDate ${FileSetDate}"
echo "${NOW} ${PROG} BatchNumber ${BatchNumber}"
echo "${NOW} ${PROG} JobHierarchyFile ${JobHierarchyFile}"
echo "${NOW} ${PROG} SourceSystemList ${SourceSystemList}"
echo "${NOW} ${PROG} SubjectAreaList ${SubjectAreaList}"
echo "${NOW} ${PROG} ClearWorkArea ${ClearWorkArea}"
echo "${NOW} ${PROG} StartingMilestone ${StartingMilestone}"
echo "${NOW} ${PROG} EndingMilestone ${EndingMilestone}"
echo "${NOW} ${PROG} DebugMode ${DebugMode}"
echo "${NOW} ${PROG} JobLinkStatisticChecksFile ${JobLinkStatisticChecksFile}"
echo "${NOW} ${PROG} ResurrectLogFile ${ResurrectLogFile}"
echo "${NOW} ${PROG} ProcessDate ${ProcessDate}"
echo

# Below will look in the parameter.ini file to determine the directory path etc.
UserID=`whoami`
BinFileDirectory=`cat /.dshome`/bin
LogFileDirectory=`grep -w LogFileDirectory ${ParameterFile} | cut -d= -f2`
TempFileDirectory=`grep -w TempFileDirectory ${ParameterFile} | cut -d= -f2`
CommonScriptFileDirectory=`grep -w CommonScriptFileDirectory ${ParameterFile} | cut -d= -f2`
CommonLogFileDirectory=`grep -w CommonLogFileDirectory ${ParameterFile} | cut -d= -f2`
LogFileName=${CommonLogFileDirectory}/${PROG}_${JobName}.log
TEMPBATCHNBRLOG=${TempFileDirectory}/${PROG}_${JobName}_start.log
DATASTAGEPROJECT=`grep -w DATASTAGEPROJECT ${ParameterFile} | cut -d= -f2`
DSSERVER=`grep -w DSSERVER ${ParameterFile} | cut -d= -f2`
DSUSERID=`grep -w DSUSERID ${ParameterFile} | cut -d= -f2`
DSPASSWORD=`grep -w DSPASSWORD ${ParameterFile} | cut -d= -f2`

NOW=`date`
echo "${NOW} ${PROG} UserID ${UserID}"
echo "${NOW} ${PROG} BinFileDirectory ${BinFileDirectory}"
echo "${NOW} ${PROG} LogFileDirectory ${LogFileDirectory}"
echo "${NOW} ${PROG} TempFileDirectory ${TempFileDirectory}"
echo "${NOW} ${PROG} CommonScriptFileDirectory ${CommonScriptFileDirectory}"
echo "${NOW} ${PROG} CommonLogFileDirectory ${CommonLogFileDirectory}"
echo "${NOW} ${PROG} LogFileName ${LogFileName}"
echo "${NOW} ${PROG} TEMPBATCHNBRLOG ${TEMPBATCHNBRLOG}"
echo "${NOW} ${PROG} DATASTAGEPROJECT ${DATASTAGEPROJECT}"

echo "${NOW} ${PROG} DSSERVER ${DSSERVER}"
echo "${NOW} ${PROG} DSUSERID ${DSUSERID}"
echo "${NOW} ${PROG} DSPASSWORD *Protected*"
echo

#### PARAMETER BUILD (without batch number) ####################################

if [ "${ProcessDate}" = "NULL" ] ; then
   StartTimestamp=`date '+%Y-%m-%d %H:%M:%S'`
else
   StartTimestamp="${ProcessDate}"
fi

ParamList="-param ParameterFile=${ParameterFile}"
ParamList="${ParamList} -param ProcessDate=\"${StartTimestamp}\""
ParamList="${ParamList} -param FileSetDate=${FileSetDate}"
ParamList="${ParamList} -param JobHierarchyFile=${JobHierarchyFile}"
ParamList="${ParamList} -param SourceSystemList=${SourceSystemList}"
ParamList="${ParamList} -param SubjectAreaList=${SubjectAreaList}"
ParamList="${ParamList} -param ClearWorkArea=${ClearWorkArea}"
ParamList="${ParamList} -param StartingMilestone=${StartingMilestone}"
ParamList="${ParamList} -param EndingMilestone=${EndingMilestone}"
ParamList="${ParamList} -param DebugMode=${DebugMode}"
ParamList="${ParamList} -param JobLinkStatisticChecksFile=${JobLinkStatisticChecksFile}"
ParamList="${ParamList} -param ResurrectLogFile=${ResurrectLogFile}"

#### Get Batch Number and create ETL_BATCH_AUDIT record ########################

echo "${NOW} ${PROG} About to get new BATCHNBR and insert it into ETL_BATCH_AUDIT..."
${CommonScriptFileDirectory}/comPLSQL.ksh ${ParameterFile} "IRDSN" "IRUserID" \
   GET_NEW_BATCH_NBR \
   "${JobName}" \
   "${StartTimestamp}" \
   "${UserID}" \
   "${SourceSystemList}" \
   "${SubjectAreaList}" \
   "${ParamList}" \
   "${FileSetDate}" > ${TEMPBATCHNBRLOG}
SQL_EXIT_STATUS=$?
cat ${TEMPBATCHNBRLOG}
if [ "${SQL_EXIT_STATUS}" != 0 ] ; then
   NOW=`date`
   echo "${NOW} ${PROG} Failure to connect/insert into ETL_Batch_Audit table!"
   exit ${SQL_EXIT_STATUS}
fi

#### Get BATCHNBR from batch number log file ###################################

BatchNumber=`grep -w BATCHNBR ${TEMPBATCHNBRLOG} | cut -d= -f2`
if [ -z "${BatchNumber}" ] ; then
   NOW=`date`
   echo "${NOW} ${PROG} Failure to retrieve BATCHNBR from ${TEMPBATCHNBRLOG}"
   exit ${SQL_EXIT_STATUS}
fi

#### Add batch number to list of parameters ####################################

ParamList="${ParamList} -param BatchNumber=${BatchNumber}"
NOW=`date`
echo
echo ${NOW} ${PROG} Parameter list: ${ParamList}
echo

#### DataStage EXECUTION #######################################################

NOW=`date`
echo "${NOW} ${PROG} Executing DataStage dsjob program..."
echo

echo "${BinFileDirectory}/dsjob -server ${DSSERVER} -user ${DSUSERID} -password ${DSPASSWORD} -run -wait ${ParamList} ${DATASTAGEPROJECT} ${JobName} 2>&1 >> ${LogFileName}"
echo
eval ${BinFileDirectory}/dsjob -server ${DSSERVER} -user ${DSUSERID} -password ${DSPASSWORD} -run -wait ${ParamList} ${DATASTAGEPROJECT} ${JobName} 2>&1 >> ${LogFileName}

jobwaiting=`grep "Waiting for job..." ${LogFileName}`
if [ "${jobwaiting}" != "Waiting for job..." ] ; then
   NOW=`date`
   echo ${NOW} ${PROG} "DataStage failed to start the job"
   failedstart=1
else
   NOW=`date`
   echo ${NOW} ${PROG} "DataStage successfully started the job"
   failedstart=0
fi

NOW=`date`
echo ${NOW} ${PROG} "Retrieving job information"
${BinFileDirectory}/dsjob -server ${DSSERVER} -user ${DSUSERID} -password ${DSPASSWORD} -jobinfo ${DATASTAGEPROJECT} ${JobName} >> ${LogFileName}

#### CHECK STATUS ##############################################################

ERROR=`grep "Job Status" ${LogFileName}`
ERROR=${ERROR#*\(}
ERROR=${ERROR%\)*}

if [ "${failedstart}" != 0 ] ; then
   NOW=`date`
   echo ${NOW} ${PROG} "The job failed to start"
   AuditStatus="FAILURE"
   Comments="MasterControl aborted"
   EXIT_STATUS=1
else
   if [ "${ERROR}" = 1 -o "${ERROR}" = 2 ] ; then
      NOW=`date`
      echo ${NOW} ${PROG} "The job completed successfully"
      AuditStatus="SUCCESS"
      Comments=""
      EXIT_STATUS=0
   else
      NOW=`date`
      echo ${NOW} ${PROG} "The job aborted"
      AuditStatus="FAILURE"
      Comments="MasterControl aborted"
      EXIT_STATUS=1
   fi
fi

FailedJobCount=`grep -i FAILED ${LogFileDirectory}/${JobName}.log | wc -l | cut -b1-9`
FailedJobCount=`expr ${FailedJobCount} + 0`
echo ${NOW} ${PROG} The number of failed jobs is [${FailedJobCount}]
if [ "${FailedJobCount}" != 0 ] ; then
   NOW=`date`
   echo ${NOW} ${PROG} "The job had failed processes"
   AuditStatus="FAILURE"
   Comments="MasterControl had ${FailedJobCount} failed processes"
   EXIT_STATUS=1
fi

StoppedJobStreamCount=`grep "JOB STREAM STOPPED" ${LogFileDirectory}/${JobName}.his | wc -l | cut -b1-9`
StoppedJobStreamCount=`expr ${StoppedJobStreamCount} + 0`
if [ "${StoppedJobStreamCount}" != 0 ] ; then
   NOW=`date`
   echo ${NOW} ${PROG} "The job stream was STOPped or KILLed"
   AuditStatus="FAILURE"
   Comments="MasterControl job stream was STOPped or KILLed"
   EXIT_STATUS=1
fi

#### AUDIT #####################################################################

echo
echo "${NOW} ${PROG} About to update ETL_BATCH_AUDIT with status information..."
EndTimestamp=`date '+%Y-%m-%d %H:%M:%S'`
SQLString="UPDATE ETL_BATCH_AUDIT A \
   SET A.END_DATETIMESTAMP=TO_DATE('${EndTimestamp}','YYYY-MM-DD HH24:MI:SS'), \
       A.STATUS='${AuditStatus}', \
       A.COMMENTS='${Comments}', \
       A.RUNTIMESETTINGS='${ParamList}' \
   WHERE (A.BATCHNBR = ${BatchNumber});"
NOW=`date`
echo ${NOW} ${PROG} Audit SQL ${SQLString}
SQLScriptFileName=${TempFileDirectory}/${PROG}_${JobName}_end.sql
echo ${SQLString} > ${SQLScriptFileName}
${CommonScriptFileDirectory}/comSQLPlus.ksh ${ParameterFile} IRDSN IRUserID ${SQLScriptFileName}
SQL_EXIT_STATUS=$?
if [ "${SQL_EXIT_STATUS}" != 0 ] ; then
   NOW=`date`
   echo ${NOW} ${PROG} Failure to connect/update into ETL_Batch_Audit table!
   exit ${SQL_EXIT_STATUS}
fi

#### EXIT ######################################################################

NOW=`date`
echo ${NOW} ${PROG} Complete, exiting with status [${EXIT_STATUS}]
exit ${EXIT_STATUS}

Posted by sandy.ph at 6:29 PM
Labels: datastage, unix

Running DataStage from outside of DataStage


Another good article from Vincent McBurney:

This is a followup from comments on my parameter week post on 101 uses of job parameters. This post is about calling DataStage jobs and the range of job control options. The go-to command for interacting with DataStage from the command line, from scripts or from other products is the dsjob command. The documentation for dsjob is buried in the Server Job Developer's Guide; it is cunningly placed there to keep Enterprise users, who would never think to read the Server Edition guide, in a state of perpetual bewilderment. I was born in a state of bewilderment, so I am in my zone. I am not going to go into the job control API or mobile device job control; refer to your documentation for those options! I will cover the more commonly used methods.

Sequence Jobs and the DataStage Director
The easiest out of the box job control comes from the DataStage Director product and the Sequence Job. The Sequence Job puts jobs in the right order and passes them all a consistent set of job parameters. The DataStage Director runs the Sequence Job according to the defined schedule and lets the user set the job parameters at run time. A lot of additional stages within the Sequence Job provide dynamic parameter setting, after-job notification, conditional triggers to control job flow, looping, waiting for files and access to the DataStage BASIC programming language.

Third Party Scheduling and Scripting
DataStage comes with a scheduling tool, the Director. It provides a front end for viewing jobs, running jobs and looking at job log results. Under the covers it adds scheduled jobs to the operating system scheduler. The main advantage of it over a third party scheduling tool is the job run options screen that lets you enter job parameter values when you schedule the job. In third party scheduling tools you need to set job parameters as you run the job in some type of scripting language. Jobs are executed by scheduling tools using the dsjob command.
This command can require a lot of arguments, so it is often run via a script or batch file. The mother of all DataStage run scripts can be found in this dsxchange thread. Written by Ken Bland and Steve Boyce, it

will start jobs, set run time parameters from a parameter ini file, check the status of finished jobs, service your car, solve the Da Vinci code and run an audit process after the job has finished. This script is run from a scheduling tool to make the setup of the scheduling easier.

The mother of all job run scripts sets parameters that are saved in an ini file. Parameters can also be saved in a database table, with a job extracting the settings to an ini file before a batch run. They can also be stored as environment parameters in a user's .profile file. These environment parameters can be passed into the job via a script, or they can be accessed directly in the job by adding environment job parameters and setting the value to the magic word $ENV. They can also be stored as project specific environment parameters, as we saw during the exhilarating job parameter week, where we brought job parameters to life and struggled to come up with a good theme motto. These job parameters are much like environment parameters but use the magic word $PROJDEF.

Job Control Code and Old School DataStage
Old school DataStage programmers, those who know who Ardent are and remember the days when you only needed one Developer Guide, will be accomplished at writing job control code. This uses a BASIC programming language based on the Universe database code to prepare, execute and audit jobs. The DataStage BASIC language has better access to jobs than operating system scripts. While an external script has to do everything through the dsjob and dsadmin commands, the BASIC language has access to a much larger number of DataStage commands. Like dsjob, these commands are cunningly hidden in the Server Job Developer's Guide. Before the days of sequence jobs (DataStage 5?), and before sequence jobs became quite useful in version 7.5, this job control code was far more prevalent and easier to code than job control in external scripts.

It was extremely useful at putting jobs in the right sequence, retrieving job parameters from files, checking the results of jobs and shelling out to execute operating system commands. Job control code is still widely used even when external scripts or sequence jobs are in use. It fills in gaps of functionality by providing job auditing, setting dynamic calculated parameter values, checking for files etc. It is a very powerful language. Also from the dsxchange forum we can find examples of job control code. This time from Arnd:

GetJobParameter(ParameterName)
EQUATE ProgramName TO 'GetJobParameter'
OPENSEQ 'ParameterFile' TO InFilePtr ELSE CALL DSLogFatal('Oh No, cannot open file',ProgramName)
Finished = 0
Ans = ''
READNEXT InRecord FROM InFilePtr ELSE Finished = 1
LOOP UNTIL Finished
   FileParameterName = TRIM(FIELD(InRecord,'=',1))
   FileParameterValue = TRIM(FIELD(InRecord,'=',2,99))
   IF (FileParameterName=ParameterName) THEN
      Finished = 1
      Ans = FileParameterValue
   END
   READNEXT InRecord FROM InFilePtr ELSE Finished = 1
REPEAT
IF NOT(Ans) THEN CALL DSLogFatal('Could not find value for "':ParameterName:'".',ProgramName)
CLOSESEQ InFilePtr

What are you comfortable with?
People from a Unix background are most comfortable with Unix scheduling tools, .profile environment parameters and running and auditing of jobs from within Unix scripts using the dsjob command. People from database backgrounds like to have parameters in database tables and may even put an entire job schedule into a table with dependencies and sequencing. They need a bridge between the database and DataStage, so they still need a layer of either Unix scripts or job control code to run the jobs. People from programming backgrounds will be very comfortable with the DataStage BASIC programming language and find it can do just about anything regarding the starting, stopping and auditing of jobs. They can retrieve settings and parameters from files or databases.

The method I currently prefer is Sequence Jobs for all job dependencies; project specific environment variables for most slowly changing job parameters; some job control routines for job auditing, dynamic parameters and external operating system commands; and a dsjob script for starting Sequence Jobs from a third party scheduling tool or from the command line. What I like about project specific environment parameters is that the job can be called up from anywhere without requiring any parameter settings. It can be called up from within the Designer by developers, from ad hoc testing scripts by testers and from third party scheduling tools in production.
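A thin dsjob wrapper of the kind described above can be sketched in a few lines of shell. The server, project and job names below are made-up placeholders, and the command line is only assembled and printed, not executed, since actually running dsjob needs a live DataStage server:

```shell
# Sketch only: assemble a dsjob command line for a scheduler to run.
DSSERVER=myserver          # assumption: your engine host name
DSPROJECT=myproject        # assumption: your project name
JOBNAME=seqLoadWarehouse   # assumption: a Sequence Job to start
PARAMS="-param ProcessDate=2011-09-21 -param DebugMode=N"

# -run starts the job, -wait blocks until it finishes so the
# scheduler sees the real exit status.
CMD="dsjob -server $DSSERVER -run -wait $PARAMS $DSPROJECT $JOBNAME"
echo "$CMD"
```

In a real wrapper you would eval the command, then call dsjob -jobinfo to audit the finished run, as the MasterControl script above does.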
Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.
Posted by sandy.ph at 6:26 PM

Labels: datastage, tips, vincent_mcburney

THURSDAY, SEPTEMBER 15, 2011

IBM WebSphere DataStage known problems and workarounds


Source: IBM Public Library

At time of publication, the following information describes the currently known limitations, problems, and workarounds for WebSphere DataStage. Any limitations and restrictions might or might not apply to other releases of the product.

Starting WebSphere DataStage clients fails when the Java Virtual Machine is not loaded (eCase 131388)
WebSphere DataStage clients do not start when the Java Virtual Machine is not loaded. An error message that begins with the following text is displayed:

DataStage Repository Error: Unable to create COM ASB services

A common cause for this error is that the Java Virtual Machine is not able to allocate the memory that it is initially configured to use.
Workaround: Change the memory allocation of the JVM by editing the Information_Server_install_dir\ASBNode\conf\proxy.xml file on the client system. Reduce the values of the InitialHeapSize and MaximumHeapSize settings, save the file, and restart the IBM Information Server console.

Connectivity requires binding the executable to DB2 to use DB2Z in WebSphere DataStage (eCase 126562)
After the IBM Information Server installation is complete, perform these post-installation commands to use DB2Z in DataStage. On the server layer, type these commands to bind the executable to DB2:

db2 connect to dbname user dbuser using dbpassword
db2 bind db2zsqlbd.bnd datetime iso blocking all
db2 terminate

Use the DataStage Administrator on the WebSphere DataStage client layer to add sqllib/lib to the LIBPATH.

XML report is not generated with the enablegeneratexml option (eCase 126768)
Operating systems: All
The XML report is not generated with the following command:

dsadmin -enablegeneratexml TRUE project_name

Workaround: Contact IBM customer support and request client side patch 126768.

Datasets from previous versions are read-only after upgrading WebSphere DataStage (eCase 118650)
Operating systems: All
If you upgrade WebSphere DataStage to any other version, you cannot modify data sets that were produced before the upgrade. Parallel datasets created in earlier versions of WebSphere DataStage can be read and deleted. No other operations, such as appending or overwriting, are supported.
Workaround: Full compatibility is available only between releases of the same version.

OCI8TO9.B utility is not supported (eCase 125155)
Operating systems: All
The WebSphere DataStage Connectivity Guide for Oracle Databases incorrectly states that the OCI8TO9.B utility can be run on WebSphere DataStage release 6.0 and later. The OCI8TO9.B utility is supported for IBM Information Server versions up to version 7.5.3.
Workaround: If you need to convert an Oracle 8 project to Oracle 9 or 10, use the OCI8TO9.B utility prior to installing IBM Information Server 8.0.1.

Documentation incorrectly lists a Windows directory for the UNIX, Linux user privileges for running parallel jobs (eCase 125997)
Operating systems: UNIX, Linux
The IBM Information Server Installation, Configuration, and Planning Guide incorrectly includes the /ishome/Server/Datasets


/ishome/Server/DSEngine/uvtemp in the list of directories that require read, write, and modify access to run parallel jobs. Workaround The correct list of UNIX and Linux directories that require read, write, and modify user access privileges is: /ishome/Server/Scratch /ishome/Server/Datasets /tmp Existing WebS phere DataS tage Japanese projects are not visible in the Project tab after IBM Information S erver 8.0.1 is installed (eCase 111937) WebSphere DataStage projects with Japanese messages are not supported. When using the Teradata connector, the PrimaryKey/PrimaryIndex attribute is not imported (eCase 126375) When using the Teradata connector, the PrimaryKey/PrimaryIndex attribute is not imported when the Share metadata when importing from Connector is disabled. Workaround Use another table import option, such as Plug-in M etadata Definitions or Orchestrate S chema Definitions. Teradata connector jobs with Load Type = S tream option end with error (eCase 125768) Teradata connector jobs with Load Type = Stream option end with the following error: rcie sga SGU eevd inl IBS Workaround Use one of the other load methods, or use the Teradata M ultiLoad plug-in with the TPump option. When you specify the sort option in the Aggregator stage and the data to be sorted is not in order, the job ends without displaying an error. (eCase 120088) WebSphere DataStage WAVES jobs do not run on SUSE Linux for PowerPC . (eCase 120082) In the DataStage parallel canvas, Java Pack jobs fail with the null pointer error. (eCase 120773) M odels that are created by the resource estimation tool are not automatically updated when you modify a job, even if you recompile the job. If you request a projection based on an existing model after a job is modified, the system might generate a runtime error. Click the Continue button to navigate to other models. (eCase 101786) WebSphere DataStage parallel jobs give mapping errors for these characters: 301C, 2014, 2016, 2212, or 00A6. 
If you require mapping for any of these characters, select the 1999 mapping. (eCase 119793)
The dssearch command line utility is not supported.
In a Flat File stage job, you must specify a complete path name for the surrogate key state file. If you only specify the name of the surrogate key state file, the job fails with a permission error. (eCase 120830)
The dsjob -import option is not supported. (eCase 94401)
The WebSphere DataStage version control component is not supported. Released jobs are not supported.
Limitation on support for the Turkish locale. For the WebSphere DataStage server canvas, a memory management problem occurs during casing operations when the CType locale category name is set to TR-TURKISH. Do not use this setting. For the WebSphere DataStage parallel canvas, locale support is available only for collation operations. Locale-based case operations are not supported for any locale. For casing operations in the Turkish locale, you cannot set the CType locale category name to tr_TR for parallel jobs. The special casing rules for the Turkish letters dotted-I and dotless-I cannot be enforced.
The automatic stage validation functionality in the Designer client (enabled by the Show stage validation errors button on the Toolbar) has been dropped.
ClickPack is no longer supported. Existing jobs are not supported at runtime.
XML Pack 1.0 is no longer supported. Users of XML Pack 1.0 must migrate to XML Pack 2.0. Information about compatibility with earlier versions is provided below: XML Pack 2.0 is not compatible with XML Pack 1.0. Consequently, table definitions that were created by the previous version of the XML Pack cannot be reused with the new version. Create new table definitions by using the new XML Meta Data Importer (installed by default). Automatic migration between XML Pack 1.0 and XML Pack 2.0 is not provided because each pack supports a different type of XML metadata.
The 1.0 Pack supports XML DTDs, an XML legacy definition, while the 2.0 Pack supports XSDs. Use a third-party tool to generate an XSD-compliant definition from your existing DTD.
GB18030 restrictions in version 8.0.1 (eCases 105680, 105675, 107838, and 107609)
GB18030 is a standard for encoding Chinese character data. The DataStage server runtime environment processes a subset of the full phase 1 GB18030 characters. The DataStage parallel runtime environment supports the full range of required phase 1 GB18030 characters. The DataStage and QualityStage clients (Designer, Director, and Administrator) include a large number of UI components that cannot handle GB18030 character data. The unsupported character data is incorrectly mapped within these controls. These UI controls can only handle characters that are supported in the Microsoft 936 (Simplified Chinese) code page, and all other characters are mapped to ? (question mark) characters. As a result, the original character encoding is lost. This limitation affects a large part of the design environment (Designer), including stage property editors, the design canvas (naming of stages and links), and other windows and dialog boxes. Because of this limitation, unsupported characters cannot be entered or used in any programming logic that relies on values entered via the UI. For example, you cannot use the unsupported characters in a literal string in a transformer expression. Therefore, you cannot compare a data field in a row being processed at runtime with a user-entered literal in the unsupported range. The DataStage Director and Administrator client applications are also similarly affected.
Possible failure connecting DataStage clients to a remote WebSphere Metadata Server or remote DataStage server
The DataStage components can be installed across different systems in different subnets or networks.
For example, a site might install the WebSphere Metadata Server on one dedicated server and the DataStage runtime server on a different dedicated server. The DataStage clients can be installed on desktop workstations in a different subnet from the servers. For a DataStage client to connect to the WebSphere Metadata Server or the DataStage runtime server, the client system must be able to resolve the host names of these servers. The DataStage runtime server must be able to resolve the host name of the WebSphere Metadata Server. DataStage uses

unqualified host names (no domain). If the DataStage and QualityStage client is unable to address the servers by the unqualified host name (host short name), the domain name of each server must be added as a DNS suffix to the client's TCP/IP properties settings in the network configuration of the DataStage and QualityStage client. In some network configurations, the DNS server might not be able to resolve the host name to the IP address of the server computer (because each computer is in a different subnet or network, or they are just unknown to the DNS server), causing the DataStage connection to fail. In these configurations, you must modify the hosts file on the DataStage runtime server computer and every DataStage and QualityStage client computer to complete name resolution successfully.
1. Modifying the DNS suffix settings:
a. Open the Network Connection properties. Go to Control Panel > Network Connections. Right click the LAN connection you use and click Properties.
b. Select Internet Protocol (TCP/IP) and click Properties.
c. In the Internet Protocol (TCP/IP) Properties window, click Advanced under the General tab.
d. In the Advanced TCP/IP Settings window, click the DNS tab.
e. Select the Append these DNS suffixes (in order) option and click Add.
f. Type the domain suffix and click Add. Add additional suffixes, if different, for the remaining servers. If any of the suffixes are subsets of other suffixes in the list, use the arrow buttons to order these so that the longest ones are above the shorter ones.
g. Click OK in each open window and close the Network Connections properties window.
2. Modifying the hosts file:
a. You must add entries for both the WebSphere Metadata Server computer and the DataStage runtime server computer to the hosts file. The reason is that DataStage and QualityStage client computers must be able to resolve host names for both the WebSphere Metadata Server computer and the DataStage runtime server computer.
The example below shows the entries that must be added to the \Windows\system32\drivers\etc\hosts file of each client computer:
<IP Address> <Name of MetaDataServer>
For example:
193.200.200.1 MetaDataServer
<IP Address> <Name of DSRuntimeServer>
For example:
193.200.200.2 DSRuntimeServer
b. The DataStage runtime server computer must be able to resolve the host name of the Metadata server computer. The following entry must be added to the \Windows\system32\drivers\etc\hosts file on the DataStage runtime server computer:
<IP Address> <Name of MetaDataServer>
For example:
193.200.200.1 MetaDataServer
c. In some network configurations, a computer might be known by different names, such as the local name and the name listed in the DNS server for that computer. In this case, you must include both host names in the hosts file. Below is an example of such entries:
<IP Address> <Name of MetaDataServerDNS>
For example:
193.200.200.1 MetaDataServerDNS
<IP Address> <Name of MetaDataServerLocal>
For example:
193.200.200.1 MetaDataServerLocal
<IP Address> <Name of DSRuntimeServerDNS>
For example:
193.200.200.2 DSRuntimeServerDNS
<IP Address> <Name of DSRuntimeServerLocal>
For example:
193.200.200.2 DSRuntimeServerLocal
Oracle direct path load compatibility
It is not possible to perform an Oracle direct path load using an Oracle 10.2 client to an Oracle 9 server. Starting with Oracle 9i, the client version must be the same as or earlier than the server version. If you upgrade from an earlier version of WebSphere DataStage and have jobs that use the Oracle Enterprise stage or the Oracle OCI Load plug-in stage, these jobs might not work correctly for a direct path load unless the Oracle client and server version requirements are met. An alternative is not to use the direct path load feature.
For the Oracle Enterprise stage, configure the APT_ORACLE_LOAD_OPTIONS environment variable, for example:
APT_ORACLE_LOAD_OPTIONS='OPTIONS(DIRECT=FALSE,PARALLEL=TRUE)'
Multi-Client Manager only supported on DataStage and QualityStage client installations
The Multi-Client Manager can switch between versions 7 and 8 of the DataStage and QualityStage client on a single system. As such, the Multi-Client Manager does not expect a DataStage server to be installed on the same computer as the clients. The server must be on a separate computer. If you use the Multi-Client Manager on a computer that has both client and server software installed, the Multi-Client Manager will not switch between clients correctly.
Data type restrictions in SQL Builder
The following functions are not supported:

Oracle:
SUBSTR2, SUBSTR4, NCHAR, LENGTH2, LENGTH4, INSTR2, INSTR4, CAST, NEW_TIME, RPAD, MONTHS_BETWEEN, and functions having an OVER clause
Teradata Enterprise stage:
EXTRACT, OCTET_LENGTH, CURRENT_DATE, CURRENT_TIME, CURRENT_TIMESTAMP
Sybase Enterprise stage:
CAST, CORR, SUBSTRING, CURRENT_DATE, CURRENT_TIME, CURRENT_TIMESTAMP
ODBC Enterprise stage:
CURRENT_DATE, CURRENT_TIME, CURRENT_TIMESTAMP


The CONVERT and CORR functions in SQL Builder for the SQL Server Enterprise stage are not supported. If you try to use these functions, the "The row is invalid" error is generated. (eCase 116121)
The WebSphere TX Map stage fails when used with the Java class adapter. (eCases 111030 and 111160)

WebSphere DataStage general operation
When you install the WebSphere MQ plug-in in the IBM Information Server console mode, the navigation controls are not available. You must use numbers to select the corresponding options. Press 1 for Next, 2 for Previous, 3 for Cancel, and 5 to Redisplay. (eCase 120401)
When using WebSphere MQ CC 6.0, you might encounter an error when creating a queue manager. If you encounter this error, apply Fix Pack 6.0.2 for WebSphere MQ. (eCase 119818)
The default Chinese locale of zh_CN.EUC is not supported. You must change the locale setting to use the zh_CN.GBK locale setting. (eCase 117549)
Problem displaying parts of the Advanced Find window or the expanded repository view in the Designer client on some systems
Symptoms include missing column headings in the Advanced Find window and missing object names in the expanded repository view in the Designer client.
Workaround
Upgrade the graphics drivers to the newest versions. Reduce the graphics hardware acceleration midway, which disables all DirectDraw and Direct3D accelerations, as well as all cursor and advanced drawing accelerations. Select Display Properties > Advanced > Troubleshoot tab on the client system. (eCase 83211)
Problems can occur if you import or create table definitions from a shared table in which the database, schema, table, or column names include characters that do not conform to DataStage naming constraints (eCase 93585): The first character must be an underscore ( _ ) or alpha (A-Z). Subsequent characters can be underscore, alphanumeric, $ (dollar sign), or period. Do not use the following characters: | (vertical bar), # (number sign), / (forward slash), and quotation marks.
There is a problem importing DataStage export files in XML format that contain parameter sets.
Workaround
Export these objects using the .dsx format. (eCase 115457)
If running in a single-node configuration, a job might hang or fail to run with the error "A cycle was found in the combined graph.
Failed describing Operator Combinations." Workaround



Set the APT_NO_ONE_NODE_COMBINING_OPTIMIZATION environment variable. (eCase 116847)
ITAG is supported in DataStage version 8.0.1. However, when you log into the DataStage and QualityStage Administrator for the first time, you must manually specify the DataStage server port number for the tagged instance in conjunction with the server name in this form: "HostName:PortNumber". (eCase 115789)
When the DataStage ODBC stage is used as a lookup, the parameters are bound in the order that the keys are defined in the generated SQL. If you want to reorder the key positions, for example to get better performance, you need to use user-defined SQL. (eCase 100695)
Workaround
Specify the user-defined environment variable ODBCBindingOrder = 1. If ODBCBindingOrder = 1 is not defined, the parameters are bound according to the generated SQL. For example, a job using an ODBC stage lookup processes four columns, two of which are keys with order DB2ADMIN.A1.COL1, DB2ADMIN.A1.UCN. The generated SQL is as follows:
SELECT DB2ADMIN.A1.COL1, DB2ADMIN.A1.COL2, DB2ADMIN.A1.UCN, DB2ADMIN.A2.R_UCN FROM A1, A2 WHERE (DB2ADMIN.A1.COL1 = ? AND DB2ADMIN.A1.UCN = ?);
The user-defined SQL (with the order of the keys switched by the user) is as follows:
SELECT DB2ADMIN.A1.COL1, DB2ADMIN.A1.COL2, DB2ADMIN.A1.UCN, DB2ADMIN.A2.R_UCN FROM A1, A2 WHERE (DB2ADMIN.A1.UCN = ? AND DB2ADMIN.A1.COL1 = ?);
To run the user-defined SQL, add the ODBCBindingOrder environment variable, and set the value to 1.
On the reference link of the ODBC stage, a LEFT OUTER JOIN SQL statement with parameter markers is not supported. (e85505) For example, the following SQL does not work:
SELECT testdb.dbo.Table2.f1, testdb.dbo.Table2.f2, testdb.dbo.Table2.f4 FROM testdb.dbo.Table2 LEFT OUTER JOIN testdb.dbo.Table1 ON (testdb.dbo.Table2.f1 = ? AND testdb.dbo.Table2.f2 = ?);
Workaround
A LEFT OUTER JOIN SQL statement without parameter markers can be used.
For example:
SELECT testdb.dbo.Table2.f1, testdb.dbo.Table2.f2, testdb.dbo.Table2.f4 FROM testdb.dbo.Table2 LEFT OUTER JOIN testdb.dbo.Table1 ON (testdb.dbo.Table2.f1 = testdb.dbo.Table1.f1 AND testdb.dbo.Table2.f2 = testdb.dbo.Table1.f2)
Problems using the scheduler on systems with languages other than English
If you run DataStage on a system with a language other than English, you might encounter problems when scheduling jobs to run on specific days of the week.
Workaround
Localize the days of the week for each project. (The AT command, which performs the Windows scheduling, accepts day names only in the local language.) To localize the day names:
1. Go to the project directory for your first project. This directory is on the DataStage server, by default in the folder \IBM\InformationServer\Server\Projects.
2. Edit the DSParams file in a text editor.
3. Add the localized days of the week to the end of the file. The following is an example of what you might add for a French system:
[SCHEDULER]
MONDAY=L
TUESDAY=M
WEDNESDAY=ME
THURSDAY=J
FRIDAY=V
SATURDAY=S
SUNDAY=D
You might need to experiment with which day names the local AT command accepts. If in doubt, enter the full name (for example, LUNDI, MARDI, and so on).
4. Repeat the process for each of your projects.
You might receive an error message stating that there are no entries in the list when you use the scheduler on a system with a language other than English. This message is output by the AT command and passed on by the Director client. To prevent this message from being displayed:
1. Identify a unique part of the message that the AT command outputs (for example, est vide in French).
2. For each project, add the following line to its DSParams file:
NO ENTRIES=est vide
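The manual steps above can be scripted. The following sketch appends localized day names to a DSParams file; the scratch directory stands in for the real project directory, and the full French day names are an assumption (the AT command on your system may accept different forms):

```shell
# Append localized day names to a project's DSParams file
# (demonstrated against a scratch copy, not a real project)
PROJECT_DIR=$(mktemp -d)   # stand-in for \IBM\InformationServer\Server\Projects\<project>
touch "$PROJECT_DIR/DSParams"
cat >> "$PROJECT_DIR/DSParams" <<'EOF'
[SCHEDULER]
MONDAY=LUNDI
TUESDAY=MARDI
WEDNESDAY=MERCREDI
THURSDAY=JEUDI
FRIDAY=VENDREDI
SATURDAY=SAMEDI
SUNDAY=DIMANCHE
EOF
grep MONDAY "$PROJECT_DIR/DSParams"
```

In practice the same append would be repeated once per project directory, as step 4 of the workaround describes.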



The AT command typically accepts keywords other than the days of the week in English. If your system does not, you can add localized versions of the additional keywords NEXT, EVERY, and DELETE to your projects as follows:
1. Edit the DSParams file for each project.
2. Add a line of the form:
KEYWORD=localized_keyword
For example: NEXT=Proxima
Incorrect number of rows displayed for the parallel Complex Flat File stage
When you draw multiple output links for the Complex Flat File stage on the parallel canvas, the number of rows that are shown as processed is not depicted correctly. Only the first output link that is drawn shows the correct number of rows. Other output links incorrectly show zero rows. In addition, if the output link is configured for de-normalization of arrays or a constraint, the number of rows shown on the link is the number of imported rows before de-normalization or filtering for a constraint. (eCase 111912)
With Internet Explorer version 7, running a difference comparison of two jobs and attempting to follow a link from the report to a stage causes the DataStage and QualityStage Designer to hang.
Workaround
Apply a fix from Microsoft. To apply the fix, see Microsoft knowledge base article number 930828. (eCase 110705)
Quick find or advanced find does not correctly find matching objects when the search text contains the German sharp-S character. Specifically, objects with the German sharp-S character in the name are not found. (eCase 109056)
Quick find and advanced find do not consider the characters ue and ü to be equivalent. For example, searching for Duerst does not match an object called Dürst, and vice-versa. (eCase 109667)
If you use parameter sets with the same name but different case, conflicts can occur (for example, each parameter set shares the value files of the other). (eCase 96682)
Parameter sets do not work correctly when used in a data connection.
(eCase 90504)
The create table definition function from a shared table skips any tables that are deleted from the shared metadata repository by another user after the list of tables for the create table operation was created. (eCase 107844)
When the DataStage engine and the project are on different drives, errors can occur when you view or compile parallel jobs.
Workaround
Create a TMP directory on each separate drive or use the TMPDIR environment variable to point to an existing directory. (eCase 106782)
The performance data file for collecting performance analysis data is not generated at run time.
Workaround
Request job performance data at the time the job is compiled. (eCase 101790)
Surrogate key jobs end abnormally for a DB2 database type if the source name contains a Japanese character. (eCase 106968)
A stage property that uses a backslash (\) before a job parameter does not get resolved correctly, for example, a file name in a Sequential File stage that has a backslash before a job parameter. (eCase 106636)
Do not use the UTF16 character map in jobs on the parallel canvas.
Workaround
Use the UTF16LE or UTF16BE maps. (eCase 109459)
Sometimes the job monitor for the parallel engine does not start when Windows starts up. Jobs run without row counts.
Workaround
1. Make sure that all DataStage client programs are disconnected. You can confirm whether the job monitor is running by opening a DOS command window and issuing the command:
ps -ef | grep JobMon
The system displays the process ID of the JobMon process.
2. Manually start the job monitor by opening the DataStage control panel.
3. Stop all the services.
4. Restart the services. (eCase 110757)
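Step 1 of that workaround can be wrapped in a small check; this is a sketch using the JobMon process name from the note:

```shell
# Report whether the parallel job monitor process is running
if ps -ef | grep -v grep | grep -q JobMon; then
  echo "JobMon is running"
else
  echo "JobMon is not running; restart the DataStage services"
fi
```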



Communication between the DataStage and QualityStage client and the DataStage server for the WebSphere TX Map stage
When the DataStage and QualityStage client is installed on a different computer than the DataStage server, the WebSphere TX Map stage editor can copy files to and from the computer on which the DataStage server is installed. If the DataStage server is on UNIX, the files are copied by using FTP. If the DataStage server is on Windows, files are copied by using the operating system of the computer with the DataStage client installation. Use UNC file paths to refer to the files and directories on the computer with the DataStage server installation. For this to work, certain conditions must be fulfilled:
For a DataStage server on Windows: On the DataStage and QualityStage client computer, log on to Windows by using a domain or workgroup logon that is also valid for the DataStage server computer.
For a DataStage server on UNIX: Log on to the DataStage and QualityStage client by using a user name and password that are also valid for logging into the FTP server on the DataStage server. (eCase 115349)
Running a Where used... (deep) analysis on a table definition does not display the paths in the output link constraint of a job when viewed in the dependency viewer.
Workaround
Perform a Where used...? (deep) and then examine the paths. (eCase 115039)
You can set a default value for date, time, and timestamp on the Job Properties - Default tab of a WebSphere DataStage job. The default value that you set might be different from the default value set by your system. If you set this default value, you must not enclose the pattern string within single quotation marks ( ' ' ), because the system does not flag this as an error or generate a warning message. If you want to quote the string, you must use double quotation marks ( " " ).
(eCase 115596)
Job sequences from releases before DataStage 7.5 that use jobs which rely on job parameters do not run following import into a post-7.5.1 system.
Workaround
Edit the job activity stages in the job sequence. Enter the parameter names again where specified. The sequence runs after re-compilation. (eCase 114789)
When running the DataStage and QualityStage Designer and performing a find or impact analysis operation, you might encounter the following message:
QueryInterface failed for _Collection or VBA._Collection failed
This error is caused by a problem with the registration of a Visual Basic runtime.
Workaround
Register the .dll file again from a command window by using the following command:
regsvr32 C:\WINDOWS\system32\msvbvm60.dll
(eCases 117025 and 89472)
Several character maps are provided by the parallel canvas (PX) to describe record or field string data encoded in the UTF-16 character set, depending on whether the data is stored in big endian or little endian format, and whether a Byte Order Mark (BOM) appears at the beginning of the record or field (the endian format of the data can be determined from the BOM). When UTF-16BE is specified as the character set for a record or field, data is assumed to be stored in big endian UTF-16 format. When UTF-16LE is specified as the character set for a record or field, data is assumed to be stored in little endian UTF-16 format. No BOM appears at the beginning of the record or field data on input, and no BOM is written on the output. When UTF-16 is specified as the character set, a BOM might optionally appear at the beginning of the record or field on the input to indicate the endian format of the data. On the output, a BOM is always written at the beginning of the data stream for the specified record or field. Since field data typically does not contain UTF byte order marks, use either the UTF-16BE or UTF-16LE map instead of UTF-16 in most situations.
The analyst must determine whether the UTF-16 data is stored in big endian or little endian format. The behavior of the UTF-16, UTF-16BE, and UTF-16LE character maps is inherited from ICU (International Components for Unicode). (eCase 109459)
In an MPP environment, the DataStage parallel engine must be able to run the remote shell command (rsh) without a password on all processing nodes. The parallel engine searches the following paths on a processing node, in the order shown, to find the remote shell command:
$APT_ORCHHOME/etc/remsh (if it exists)
/usr/ucb/rsh
/usr/bin/remsh
/bin/remsh
/usr/bin/rsh
where $APT_ORCHHOME is the directory in which the DataStage parallel engine is installed. On Solaris 2.10, the remsh provided by the system might not run successfully within the DataStage parallel engine. This situation can lead to errors such as "rsh issued, no response received".
Workaround
To specify the location of the rsh command, copy or rename the file $APT_ORCHHOME/etc/remsh.example supplied by the DataStage parallel engine to $APT_ORCHHOME/etc/remsh. The file contains the following shell script:

#!/bin/sh
# Example apt/etc/remsh
exec /usr/bin/rsh "$@"


As written, this shell script invokes /usr/bin/rsh. Edit the last line of the script, exec /usr/bin/rsh "$@", to invoke your specific remote shell command. All users should be able to run the script. To ensure this, use chmod:
chmod 755 $APT_ORCHHOME/etc/remsh
(eCase 117475)
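The copy-and-chmod steps can be sketched as follows. The scratch directory stands in for $APT_ORCHHOME/etc, and /usr/bin/echo stands in for the real remote shell command purely so the sketch is runnable:

```shell
# Create and install an example remsh wrapper
# (scratch directory instead of $APT_ORCHHOME/etc; echo instead of rsh)
ETC=$(mktemp -d)
cat > "$ETC/remsh" <<'EOF'
#!/bin/sh
# Example apt/etc/remsh
exec /usr/bin/echo "$@"
EOF
chmod 755 "$ETC/remsh"
"$ETC/remsh" node1 uptime
```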

DataStage connectivity
The following are known problems with DataStage connectivity:
The Informix, Teradata, Classic Federation, and Netezza enterprise stages are not available on SUSE Linux for PowerPC. The Netezza enterprise stage is not available on HP-UX 11i v2 on Intel Itanium.
BCPLoad stage (eCase 119124)
BCPLoad jobs do not run on the Sybase ASE client version 12.5.4. However, BCPLoad jobs work with Sybase ASE client version 12.5.2 and Sybase IQ version 12.6.
Netezza Enterprise
When you specify an incorrect close command, the job completes and displays the status as OK. The job should end with status=failed. (eCase 79755)
For the nzload utility to work on a SUSE Linux system, you must replace the libstdc++-3-libc6.2-2-2.10.0.so file in the /usr/lib directory with the most current libstdc++-3-libc6.2-2-2.10.0.so file. Otherwise, nzload fails with the following error:
undefined symbol __dynamic_cast_2

To obtain the most current libstdc++-3-libc6.2-2-2.10.0.so file, contact Netezza customer support. (eCase 85585) Note: This known issue is specific to SUSE Linux for System x.
Informix Enterprise (eCase 117177)
To prevent termination of Informix enterprise stage HPL jobs, set LDR_CNTRL=MAXDATA=0 in the environment parameters of the job.
DB2 Enterprise (eCase 116556)
If a job contains a DB2 stage that uses user-defined SQL and a job parameter that also contains a single quote, then these single quotes are stripped out and the SQL statement becomes non-valid.
Sybase Enterprise
Lookup fails for the microseconds timestamp data type for Sybase IQ. (eCase 110464)
The Sybase enterprise stage fails to append for a primary key constraint for Sybase IQ. (eCase 100132)
The BigInt, unsigned Int, unsigned bigInt, and unsigned smallInt data types are not supported for Sybase ASE version 15.0.
A parallel job with a Sybase enterprise stage that selects a table with a long name or a table with long column names, greater than 30 characters, finishes but is not successful.
Workaround
If you run the same job with table and column names that are shorter than 30 characters, the job completes successfully. (eCase 104960)
A parallel job using the Sybase ASE stage ends abnormally when using Sybase ASE database version 15.0. The naming convention of the libraries in $SYBASE/$SYBASE_OCS/dll changed. Now the library names begin with "libsyb" instead of "lib" as in the previous versions. A script in the $SYBASE/$SYBASE_OCS/scripts directory creates links to the libraries in $SYBASE/$SYBASE_OCS/dll of Sybase ASE 15.0. For Windows, the script is copylibs.bat, and for UNIX it is lnsyblibs. Run the script before connecting to the Sybase ASE 15.0 server from a Sybase ASE 15.0 client. Instructions on how to run the script are provided in the "New Features Open Server 15.0 and SDK 15.0 for Windows, Linux and UNIX" documentation on the Sybase Web site. (eCase 109024)

iWay Enterprise Sparse Lookup fails to create values for the following DB2 data type boundary values: Varchar minimum value, bigInt minimum and maximum values, decimal minimum and maximum values, and float minimum and maximum values. (eCase 105044) A lookup operation fails when

You select Lookup Failure = Reject, You select Lookup type = Sparse and


You drag into the output stage any column from the reference link without dragging that column from the primary link.
pFTP Enterprise (eCase 110196)
The pFTP operator (in sFTP mode) fails when a wildcard "?" (question mark) is used in the URI field. (eCase 102854)
On Windows Server 2003, Service Pack 1, the error "Rename Temp File calloc: File exists" might be encountered when trying to get a file and save it on a different drive.
Workaround
Apply Microsoft fix KB899679. (eCase 91302)
The restart feature in the FTP enterprise stage does not support the restarting of a job that experienced a network failure. (eCase 81730)
The pFTP enterprise stage does not support the Internet Protocol version 6 FTP client (IPv6)
The default Internet protocol for the FTP client installed on SUSE version 10 is IPv6. This FTP client is not compatible with the pFTP enterprise stage. If the FTP client is configured with IPv6 as the default Internet protocol, you receive the following error message:
"FTP_Enterprise_1,0: error: ftp returned 500 ftp msg: 500 'EPSV': command not understood."
Workaround
Activate the -disable IPv6 option when you install the FTP client. (eCase 120483)

Teradata Enterprise
FLOAT, REAL, and DOUBLE PRECISION data types do not work with Teradata write jobs. (eCase 106763)
If a table contains a single column of CHAR data type, a write job works correctly. If a table contains a column of CHAR data type with columns of other data types, a write job gives unrecoverable errors, and an empty table is created. This error occurs only with data other than English. (eCase 106860)
ODBC Enterprise
To enable NcharSupport in an Oracle database, modify the odbc.ini file of the Oracle data sources and set EnableNcharSupport=1. (eCase 101140)
ODBC Connector
Reading data from tables containing LOB columns is supported (for SQL Server) only if the LOB columns are the last columns in the SELECT statement. (eCase 105289)
On Linux systems, jobs that use the DB2 enterprise stage together with access to the DB2 system through the ODBC connector and the DB2 wire driver do not run successfully.
Workaround
Configure jobs such that they use the DB2 enterprise stage, or the ODBC connector or DB2 wire driver, but not both. (eCase 103387)
Rejecting records due to check constraint violations might not function correctly with the IBM DB2 Wire Driver. The driver reports the message "Unknown error: SQLCODE -545". (eCase 101228)
In the connector stage editor, if the properties tree is highlighted, pressing F1 does not display help information. Use the Help button to display help information. (eCase 97244)
The .odbc.ini file on AIX contains an incorrect file name for the Teradata driver. The file lists the file name as ivtera22.so. The correct name is VMTera22.so.
Workaround
When you set up a data source entry in .odbc.ini for Teradata, modify the "Driver" entry as follows:
[Teradata]
Driver=/export/aix52qa01sand0/IBM/InformationServer/Server/branded_odbc/lib/VMTera22.so
(eCase 117034)
Running jobs on a Linux server using the ODBC connector on a SQL Server 2005 database using the SQL Server wire driver can result in unexpected errors from the driver. (eCase 115699)
You cannot import table definitions from a UTF-8 DB2 database using the ODBC connector and the DB2 wire

driver when the server is installed on a Japanese AIX system. (eCase 116769)
An issue exists with accessing the Teradata driver through the ODBC connector from the connector stage editor or connector import wizard. The ASBAgent can stop, requiring it to be restarted manually. (eCase 115699)
There is a known problem when multiple DataStage Designer or WebSphere Information Analyzer clients attempt to access the ODBC connector or WebSphere MQ connector simultaneously. The problem can occur when accessing the connectors for any purpose, such as to browse database tables or message queues, test connections, or import table definitions. A message similar to the following might be displayed:
com.ascential.asb.cas.shared.ConnectorServiceException: An exception occurred while trying to receive the response from the handler: Error unmarshaling return header: java.io.EOFException
The workaround to permit simultaneous access to connectors from these clients:
1. Go to the ASBNode/bin directory in your Information Server installation.
2. Export the following environment variable:
export CC_MSG_LEVEL=6
3. Restart the agent daemons by running the command ./NodeAgents.sh restart
ODBC driver issues
Teradata driver on Windows and Linux (eCase 109492)
When you use the ODBC connector with Teradata 6.1, a COMMIT WORK unrecoverable error is generated. This error causes the job to end when using the Teradata driver.
Workaround
Use the Teradata ODBC driver.
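The three-step simultaneous-access workaround above can be sketched in shell; the install path is an assumption and the restart command is left commented out so the sketch has no side effects:

```shell
# Raise the connector message level before restarting the agent daemons
# cd /opt/IBM/InformationServer/ASBNode/bin   # assumed install path
export CC_MSG_LEVEL=6
echo "CC_MSG_LEVEL is now $CC_MSG_LEVEL"
# ./NodeAgents.sh restart                     # restart the agent daemons
```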

DB2 ODBC driver
When you import tables with the connector metadata import wizard using the native DB2 ODBC driver for DB2 9.1, you cannot select a table and use its Related Tables hyperlink to select associated tables. (eCase 111730)
Workaround
Modify the client configuration file (db2cli.ini in the root directory of the DB2 installation). For the relevant data source, add ReturnAliases=0.

Microsoft text driver (eCase 80352)
The Microsoft text driver is not supported.
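As an illustration, the db2cli.ini change can be scripted. The [SAMPLE] data source name is hypothetical, and the file written here is a local copy, not the real db2cli.ini in the DB2 root:

```shell
#!/bin/sh
# Append ReturnAliases=0 under the relevant data source section.
# [SAMPLE] is a hypothetical DSN; edit your real db2cli.ini instead.
cat > db2cli.ini <<'EOF'
[SAMPLE]
ReturnAliases=0
EOF
grep 'ReturnAliases' db2cli.ini
```

Per the workaround above, re-run the connector metadata import after making the change.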

Oracle Enterprise
To use the Oracle OCI or Oracle Enterprise stages, users in the primary dstage group must have read and execute permissions on the libraries in $ORACLE_HOME/lib and $ORACLE_HOME/bin, and read permissions on all files in $ORACLE_HOME. Otherwise, users might experience problems using the Oracle OCI and Oracle Enterprise stages to connect to Oracle 10.1.0.5.
For Oracle 10g operators that use local Oracle 10g servers on AIX and Solaris, you must disable signal handling in Oracle Net 8. To disable signal handling, set bequeath_detach=yes in the $ORACLE_HOME/network/admin/sqlnet.ora file. (eCase 119712)

SQL Server Enterprise (eCase 116355)
The combination of the close statement and any one of the upsert statements that has a single query (such as insert only, delete only, or update only) does not work.

Third-party directories are not found by DataStage Engine and ASB Agent
The WebSphere MQ library directory is not in the library path environment variable. If the WebSphere MQ connector cannot connect to WebSphere MQ in client or server mode, you receive the following error in the log:

IIS-CONN-WSMQ-000005

Windows:
An exception occurred while trying to receive the response from the handler:
An exception was received from the handler:
System call LoadLibraryEx() failed with OS error 126
(The specified module could not be found.)

UNIX:
An exception occurred while trying to receive the response from the handler:
An exception was received from the handler:
System call dlopen() failed with OS error 2
(No such file or directory)

Plug-in jobs terminate with the following error:



main_program: Fatal Error: Fatal: Shared library (mqs.so) failed to load: errno=(2), system message=(Unable to find library 'libimqi23ah_r.so'.)

Workaround
Make the WebSphere MQ libraries accessible to the connector. By default, the libraries are in the following directories. Contact your WebSphere MQ administrator if you cannot find the libraries in these paths.

Windows: C:\Program Files\IBM\WebSphere MQ\bin
UNIX (32-bit): /opt/mqm/lib
UNIX (64-bit): /opt/mqm/lib64

After you identify the directory location of the WebSphere MQ libraries, modify the library path environment variable to include the appropriate directory for the user that runs the connector or plug-in process. One way to achieve this is to update the dsenv script as described below. The library path is held in one of the environment variables in the following table:

Table 1. Environment variables by operating system

Operating system                    Environment variable
Windows                             Path
AIX                                 LIBPATH
HP-UX on PA-RISC (32-bit)           SHLIB_PATH
HP-UX on Intel Itanium (64-bit)     LD_LIBRARY_PATH
Solaris/Linux                       LD_LIBRARY_PATH

Add the WebSphere MQ library directory to the library path environment variable to use WebSphere MQ with the WebSphere MQ connector in a DataStage parallel job:
1. Add the WebSphere MQ library directory to the corresponding library path environment variable in the dsenv script in /opt/IBM/InformationServer/Server/DSEngine.
2. Log in as root.
3. Source the dsenv script:
. /opt/IBM/InformationServer/Server/DSEngine/dsenv
4. Restart the DataStage Engine and Agent services:
cd /opt/IBM/InformationServer/Server/DSEngine/bin
./uv -admin -stop
./uv -admin -start
cd /opt/IBM/InformationServer/ASBNode/bin
. ./NodeAgents_env_DS.sh
./NodeAgents.sh stopAgent
./NodeAgents.sh start

Workaround for MQSeries Plug-in and MQ Connector on HP-UX 11i v2 on Intel Itanium and SUSE Linux Enterprise Server 10 for IBM zSeries
Set the following variables in $DSHOME/dsenv or in the WebSphere DataStage Administrator environment variables:

LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/mqm/lib64; export LD_LIBRARY_PATH
PATH=$PATH:/opt/mqm/bin; export PATH

(eCases 120292, 124837, and 126854)

Miscellaneous stages
The Command stage fails to copy a file containing special characters such as & (ampersand). (eCase 108820)
The XML Input stage does not support namespace variable usage when the XPath expression is provided with the column description. (eCase 108470)
The UniData stage is displayed on the server canvas on Linux even though it is not supported on this platform. (eCase 111345)

Unable to run the XML Input stage on AIX (eCase 122076)
Jobs terminate because they fail to load the xmli.so library file. Both server and parallel jobs are terminated.
Workaround
Contact IBM Support.

When using the Teradata connector, the Metadata Importer (Connector Wizard) method of importing metadata might not work for Asian language character sets (eCase 123832)
Workaround
1. Click DataStage Designer > Import > Table Definitions.
2.
Click Plug-in Metadata Definitions or click Orchestrate Schema Definitions to import table definitions from a variety of data sources.

When using the Teradata connector, the WebSphere DataStage 4-byte float is not the same as the Teradata 8-byte float, which causes write jobs to insert incorrect float values (eCase 123831)
If you write a float value and then read that value back, the result might not be identical to the original value that was written. This is because WebSphere DataStage uses a 4-byte float, whereas Teradata uses an 8-byte float.
Workaround
Modify the job to use a double float. WebSphere DataStage uses an 8-byte double float, which matches the Teradata float.
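The 4-byte/8-byte mismatch is easy to see by comparing the two widths. This is only an illustration: awk computes in 8-byte doubles, and the 4-byte representation of 0.1 (0.100000001490116119384765625) is hard-coded for contrast:

```shell
#!/bin/sh
# Neither width stores 0.1 exactly, and the nearest 4-byte float differs
# from the nearest 8-byte double -- which is why a 4-byte DataStage float
# written into an 8-byte Teradata FLOAT does not round-trip exactly.
awk 'BEGIN {
  printf "as double : %.17g\n", 0.1
  printf "as float32: %.17g\n", 0.100000001490116119384765625
}'
```

Printing both at 17 significant digits shows two different values, which is exactly the discrepancy the workaround (use a double in the job design) removes.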



View Data exception when Generate SQL = Yes (eCase 123370)
View Data does not work from within the connector stage editor when the Generate SQL property is set to Yes; clicking the View Data hyperlink results in an error.
Workaround
Set Generate SQL to No and enter the statement manually, or use the SQL Builder. Alternatively, View Data is available from the Connector Import Wizard.

ODBC connector problem on AIX when writing CLOB data (eCase 117097)
The job can hang indefinitely during the write operation.
Workaround
Use an alternate stage.

DB2 API extract and date handling of type 0001-01-01 (eCase 118621)
When DB2 date values are represented by the default value of 0001-01-01, these values are incorrectly read as the integer equivalent -718432 (instead of 0). Parallel jobs reading these values produce job log warnings similar to:

APT_CombinedOperatorController,0: Resource bundle corresponding to message key DSTAGE-TODC-00013 not found! Check that DSHOME or APT_RESPATH is set. (DB2_UDB_API,0)

This causes zero records to be processed.
Workaround
Set the Date column to CHAR.

Error loading orchoracle when using the Oracle Enterprise stage (eCase 115580)
Workaround
Load the Oracle library correctly by setting PX_DBCONNECTHOME and rerunning the library installation script:

PX_DBCONNECTHOME=/scratch/IBM/InformationServer/Server/DSComponents; export PX_DBCONNECTHOME
./install.liborchoracle

Parallel job with Java Client stage does not work (eCase 118191)
Workaround
Set the DSHOME, DATASTAGE_JRE, and DATASTAGE_JVM environment variables as follows (adjust appropriately per install):

For Windows operating systems:
1. Set DATASTAGE_JRE:
DATASTAGE_JRE=$ASBNode/apps/jre
2. Set DSHOME:
DSHOME=C:\IBM\InformationServer\Server\DSEngine
3. Modify LIBPATH inside dsenv so that $ASBHOME/jre/bin/sovm appears before $ASBNode/apps/jre/bin.
4. Do not set DATASTAGE_JVM.

For Linux, Solaris, and UNIX operating systems:
1. Set DATASTAGE_JRE:
DATASTAGE_JRE=$ASBNode/apps
2. Set DSHOME:
DSHOME=/opt/IBM/InformationServer/Server/DSEngine
3. Set DATASTAGE_JVM:
DATASTAGE_JVM=jre/bin/j9vm

Chinese locale zh_CN.EUC on Solaris is not supported (eCase 117549)
Workaround
On Solaris, set the locale to zh_CN.GBK, which is a superset of the older zh_CN.EUC locale.

Chinese installation not supported in the Linux environment (eCase 114895)
The UTF-8 locale on Chinese Linux systems is not supported. Only zh_CN.GB2312 is supported.

Shared Metadata Management allows multiple databases to be created under the same host using the same database name and instance (eCase 115223)

In SQL Builder, selected columns are not dragged to the update column selection grid for Upsert Mode/Order as Update then Insert (eCase 115234)

In SQL Builder, the update values are lost when Validation is ON (eCase 116587)
In the update column grid, the update values are lost when values are added to the respective select columns. This problem occurs when the Validation button on the toolbar is turned ON before building the query.
Workaround
Turn the Validation button OFF before building the query.

Note: The default state of the Validation is OFF.


In SQL Builder, certain Oracle 9i functions display an error (eCase 115989)
These functions are not supported.

Oracle Enterprise: building a DELETE statement after making an upsert statement opens SQL Builder in upsert mode (eCase 116968)
Building a DELETE query after making an upsert causes the wrong SQL Builder window to open, with the Insert, Update, and SQL tabs.
Workaround
To build a DELETE query after making an upsert, cancel, return to the Stage Editor, and then invoke SQL Builder for DELETE.

Creating a project using the Administrator is very slow with the default kernel parameters (eCase 116480)
Workaround
Use the recommended settings to configure the kernel parameters on Solaris:
1. Make a backup copy of /etc/system before you modify it.
2. Update /etc/system with the suggested kernel parameters and values.
3. Restart the computer.

Suggested kernel values for installing WebSphere Application Server on Solaris:

set shmsys:shminfo_shmmax=4294967295
set shmsys:shminfo_shmseg=1024
set shmsys:shminfo_shmmni=1024
set semsys:seminfo_semaem=16384
set semsys:seminfo_semmni=1024
set semsys:seminfo_semmap=1026
set semsys:seminfo_semmns=16384
set semsys:seminfo_semmsl=100
set semsys:seminfo_semopm=100
set semsys:seminfo_semmnu=2048
set semsys:seminfo_semume=256
set msgsys:msginfo_msgmap=1026
set msgsys:msginfo_msgmax=65535

Suggested kernel values for db2osconf:

set msgsys:msginfo_msgmni=2560
set semsys:seminfo_semmni=3072
set shmsys:shminfo_shmmax=5747768524
set shmsys:shminfo_shmmni=3072

Parallel job not supported when running jobs from WebSphere Information Services Director (eCase 117012)
It is not possible to run a parallel job that contains either a Server Shared Container or a Basic Transformer stage when those stages are connected to a WebSphere Information Services Director input stage. The Parallel engine produces a warning similar to the following:

Operator nnn is not wave aware; the operator is reset and rerun on each wave if multiple waves are present.

Unable to import metadata from the IBM WebSphere Metadata Server database (eCase 117747)
Importing metadata from the IBM WebSphere Metadata Server database does not work.

IBM XL C/C++ Enterprise Edition for AIX, V9.0 compiler is not supported (eCase 122662)
This function is not supported.

Invalid Search Expression alert is displayed sequentially when you click the Refresh button (eCase 115991)
An error message is displayed when an invalid search expression is entered in the SQL Builder utility.
Workaround
Click the Refresh icon on the toolbar.

Errors result when switching between client and server (eCase 124080)
When using the MQ Connector, switching between client and server modes and then selecting a queue or testing the connection might result in an error.
Workaround
Restart the agent.

Connection problem occurs between ITAG installations on WebSphere DataStage when using the Admin Client, after installing a second ITAG version (eCase 116898)

Informix XPS Load stage fails on the server and parallel canvas on HP-UX 11i v2 on PA-RISC (eCase 127034)
Workaround
Obtain and install the patch available from IBM Support.

MQ plug-ins terminate for jobs run on the server canvas on HP-UX 11i v2 on PA-RISC (eCase 127036)
The queue manager fails on WebSphere MQ on HP-UX 11i v2 on PA-RISC when the Java heap size is larger than 1024.
Workaround
Install WebSphere MQ V6.0 Fix Pack 6.0.2.1: http://www-1.ibm.com/support/docview.wss?rs=171&uid=swg1IY88573

Sybase server connection fails with a "Login fails with invalid packet size" error message (eCase 127022)
When connecting from the SybaseOC plug-in to the Sybase server, the login fails with an invalid packet size. The plug-in stage property SOURCE_TABLE is not defined for input links.

Plug-in jobs terminate on HP-UX 11i v2 on Intel Itanium (eCase 126733)



Informix plug-in jobs end after creating the table successfully. The jobs end with the following error:

Abnormal termination of stage informix.
Informix_CLI_0.IDENT1 detected.

Bulk load jobs end when used from the DRS plug-in. The jobs end with the following library loading error:

Could not load drsdb2bl.so

XPS plug-in jobs end when run in manual mode
Workaround
Obtain the patch from IBM customer support to use the XPS plug-in.

Configure WebSphere DataStage and WebSphere QualityStage for use with PAM (eCase 127126)
Operating system: Linux Enterprise Server 10 for IBM zSeries
Add the following entries to the /etc/pam.d/dsepam file to enable WebSphere DataStage to use PAM (dsepam) services on Linux Enterprise Server 10 for IBM zSeries:

#%PAM-1.0
auth     required pam_unix2.so nullok #set_secrpc
password required pam_unix2.so nullok #set_secrpc
account  required pam_unix2.so nullok #set_secrpc
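The PAM entries above can be staged in a scratch file and checked before being copied to /etc/pam.d/dsepam as root (the staging file name dsepam.stage is illustrative):

```shell
#!/bin/sh
# Stage the dsepam PAM configuration locally for review; copy the result
# to /etc/pam.d/dsepam as root once it is verified.
cat > dsepam.stage <<'EOF'
#%PAM-1.0
auth     required pam_unix2.so nullok #set_secrpc
password required pam_unix2.so nullok #set_secrpc
account  required pam_unix2.so nullok #set_secrpc
EOF
grep -c 'pam_unix2.so' dsepam.stage
```

Staging keeps a typo in the PAM stack from locking DataStage logins out; only install the file once all three management groups (auth, password, account) are present.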
Posted by sandy.ph at 11:25 AM
Labels: datastage, faq, repository

sandy prasetya @ 2011.
