DAC repository
I have a DAC repository instance and run loads from it. My DAC client connects to the DAC server on Unix. Everything runs perfectly; no problems, I'm happy. If I create a new DAC repository and want to use it, what changes should I make on the DAC server side to point it at the new repository? What configuration steps are needed?
A DAC server is married to a single repository! All you have to do is: in the new DAC repository, change the DAC server host under Setup > DAC System Properties, then run serverSetupPrompt.sh on the server machine to repoint the server to the new repository you created, and restart the server. That's all there is to it.
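A minimal server-side sketch of that answer, assuming a standard DAC 10g layout (the install path is an assumption, and serverSetupPrompt.sh itself is interactive; the QServer process name is the one that appears in the replies further down this page):

```shell
# Sketch: repoint a DAC server to a new repository (paths are assumptions).
# Step 1 happens in the DAC client GUI: in the NEW repository, set the DAC
# server host/port under Setup > DAC System Properties.
DAC_HOME=${DAC_HOME:-/opt/oracle/dac/bifoundation/dac}
LOG=/tmp/dacserver_$(date +%Y%m%d).log

if [ -d "$DAC_HOME" ]; then
  cd "$DAC_HOME"
  # Step 2: interactive prompt; enter the NEW repository's DB connection details.
  ./serverSetupPrompt.sh
  # Step 3: stop any running DAC server process, then start it against
  # the new repository.
  ps -ef | grep -v grep | grep com.siebel.etl.net.QServer \
    | awk '{print $2}' | xargs -r kill
  nohup ./startserver.sh > "$LOG" 2>&1 &
fi
echo "DAC server log: $LOG"
```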
Tags: Business Intelligence
Similar Questions
-
Unable to connect to the DAC server from a new DAC repository
Hello
I created a new DAC repository on the same machine to analyze customization of DAC objects. I could connect to the former repository with the DAC server, but with the new DAC repository open, when I click Tools > DAC Server Management > DAC Server Log, I get the error message: unable to connect to the DAC server.
I can see the red icon in the DAC client, which suggests that the DAC server connection has not been established.
Please help solve the problem.
Hello,
This issue is because of the DAC server port number. For the first repository you gave port 3141, and the server already started with that number. When you create the new repository, you must change the DAC server port to a different number above 1024, then stop and start the server. That should solve your problem.
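The port clash can be checked from the shell before starting a second server; a small sketch (3141 is the port from this thread, and which tool is available varies by platform):

```shell
# Sketch: see whether the DAC server port is already bound on this machine
# before starting another server instance against it.
PORT=3141
if command -v ss >/dev/null 2>&1; then
  ss -ltn 2>/dev/null | grep -q ":$PORT " && STATUS=in-use || STATUS=free
else
  netstat -an 2>/dev/null | grep -q "[.:]$PORT .*LISTEN" && STATUS=in-use || STATUS=free
fi
echo "port $PORT is $STATUS"
```

If the port is in use, pick a different (above-1024) port in the new repository's system properties, then stop and start the server.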
Regards,
Sundar -
Error when creating a new DAC connection using the MSSQL connection type
Hello
I am trying to create a DAC connection, i.e. a new DAC repository, in a SQL Server 2008 database.
DAC version: 10.1.3.4.1
Database: SQL Server 2008
I downloaded the sqljdbc4.jar file from the link below and placed it in the D:\orahome\10gR3_1\bifoundation\dac\lib folder.
[http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=11774]
I entered all the details correctly: database name, database host, database port. I created a new authentication file.
I get the error below when I try to test the connection.
MESSAGE: MSSQL driver not available!
EXCEPTION CLASS: java.lang.IllegalArgumentException
com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:512)
com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
CAUSE:
MESSAGE: com.microsoft.sqlserver.jdbc.SQLServerDriver
EXCEPTION CLASS: java.lang.ClassNotFoundException
java.net.URLClassLoader$1.run(URLClassLoader.java:200)
java.security.AccessController.doPrivileged(Native Method)
java.net.URLClassLoader.findClass(URLClassLoader.java:188)
java.lang.ClassLoader.loadClass(ClassLoader.java:306)
sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)
java.lang.ClassLoader.loadClass(ClassLoader.java:251)
java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
java.lang.Class.forName0(Native Method)
java.lang.Class.forName(Class.java:169)
com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:510)
com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
The error seems to be a SQL Server connectivity problem. Am I using the correct jar file?
Please help me solve this problem. I appreciate the help given on this forum earlier.
Thank you.
Add
.\lib\sqljdbc4.jar
at the end of the line beginning with SQLSERVERLIB in the config.bat file. Please mark as correct.
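For illustration, the classpath line in config.bat might end up looking like this (a hypothetical excerpt; the exact contents of the variable vary by DAC version and install):

```bat
rem Hypothetical excerpt from DAC's config.bat: the fix appends the Microsoft
rem JDBC driver jar to the SQL Server classpath variable.
set SQLSERVERLIB=.\lib\sqljdbc4.jar
```

The point of the fix is that the jar must be on the classpath the DAC client launches with; otherwise Class.forName on the driver class fails with exactly the ClassNotFoundException shown above.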
-
Hello
I have the DAC 10.1.3 server installed on a Linux machine and the DAC client on a Windows machine.
When I try to connect to the DAC repository via the Windows DAC client, it does not connect to the server (the server connection symbol is red).
Hi friends,
I checked the server configuration in DAC and it has the correct host and port details for the DAC repository (I used the Test Connection option and it says connected). I tried restarting the server from the DAC client machine, but it still does not connect after login (the server symbol remains red), although I can see the DAC repository details from the source settings I entered last time.
A strange thing happened:
When I try to ping the DAC repo DB host there is no response, but DAC still says connected!
I tried connecting on the Linux DAC server itself, and when I stopped and restarted the server there (startserver.sh), after a while I get an error message such as:
January 11, 2012 09:18:32 com.siebel.etl.engine.core.ETLUtils logException
SEVERE:
FAULT INFO: Error in processing messages from the client.
EXCEPTION CLASS: java.io.EOFException
java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2280)
java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2749)
java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:779)
java.io.ObjectInputStream.<init>(ObjectInputStream.java:279)
com.siebel.etl.net.QServer.loop(QServer.java:313)
com.siebel.etl.net.QServer.run(QServer.java:288)
java.lang.Thread.run(Thread.java:662)
I don't know if I need to do anything else, since on the Linux machine the DAC client does connect to the server.
Please advise...
Thanks in advance...
Yes.
First start the DAC server on the Linux machine, for example:
export TMP=/tmp
export ORACLE_HOME=/opt/infa/product/11.2.0/client_1
export PATH=$ORACLE_HOME/bin:$PATH
cd /opt/infa/product/10.1.3/dac_1/bifoundation/dac
nohup ./startserver.sh > $TMP/dac_1_startserver_$(date +%Y%m%d).log.txt 2>&1 &
tail -f $TMP/dac_1_startserver_$(date +%Y%m%d).log.txt
Then check that it came up:
ps -ef | grep -v 'grep' | grep 'com.siebel.etl.net.QServer'
Then start your DAC client on your Windows machine:
C:\opt\infa\product\10.1.3\dac_1\bifoundation\dac\startclient.bat
(You do not need to start the DAC server on your Windows client machine.) You must also have defined, under Setup > DAC System Properties:
DAC Server Host, e.g. myhost.local
DAC Server Port, default 3141
Make sure port 3141 is open on any firewall between your Windows client machine and the Linux server.
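If telnet is not installed on the client, bash alone can make the same reachability check; a sketch (host and port are placeholders for your DAC server):

```shell
# Sketch: test TCP reachability of the DAC server port using bash's /dev/tcp.
check_port() {
  # $1 = host, $2 = port; prints "open" or "closed"
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}
RESULT=$(check_port 127.0.0.1 3141)   # substitute your DAC server host
echo "DAC port: $RESULT"
```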
(you can test this using telnet myhost.local 3141) -
Hello friends,
I installed the DAC client on my local machine and the DAC server on a Linux machine.
I need to migrate the DAC client from my local computer to a Windows server; can someone give me an idea how?
Thank you.
Hi, hope this helps:
1. Back up your DAC metadata to the */.../bifoundation/dac/export* folder.
2. Copy the entire dac folder structure */.../bifoundation/dac/...* onto the Windows server machine.
3. If you are using an Oracle DB, make sure you have the Informatica and Oracle clients installed on your Windows server box.
4. Set up config.bat (Java path, DAC home) and dac_env.bat (Informatica path, INFA_DOMAINS_FILE).
5. Start the DAC client and connect to your DAC repository (same as you did on your local computer).
6. Re-check all of your configurations (server setup, DAC properties, data sources, Informatica servers).
7. Ensure that the DAC client (on Windows) and the DAC server (Linux) are connected.
Regards
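Steps 1-2 can be sketched on the current machine before copying to the Windows server (all paths below are placeholders; the metadata export in step 1 is done from the DAC client under Tools > DAC Repository Management):

```shell
# Sketch: archive the whole dac folder tree (including the export folder from
# step 1) so it can be copied to the Windows server. Paths are placeholders.
DAC_DIR=${DAC_DIR:-/opt/oracle/bifoundation/dac}
DEST=${DEST:-/tmp/dac_backup}
mkdir -p "$DEST"
if [ -d "$DAC_DIR" ]; then
  tar -czf "$DEST/dac_$(date +%Y%m%d).tar.gz" \
      -C "$(dirname "$DAC_DIR")" "$(basename "$DAC_DIR")"
fi
echo "backup written under: $DEST"
```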
Published by: user11173172 on October 1, 2010 10:08
-
Regarding the Installation of Oracle BI Applications
Hi all
I was going through the documentation for Oracle BI Applications, probably an old version.
I. Was DAC previously installed as one of the components along with the Oracle BI Applications installation?
Now I have installed Oracle BI Applications version 7.9.6.1 and DAC is not installed with it. It is available as a separate package from the OTN site.
The old documentation says: "DAC installation is part of the *Oracle BI Analytics installation*". From what I've learned, there are:
1. Oracle infrastructure installation
2. Oracle BI Applications installation
3. Informatica PowerCenter installation
4. DAC installation, packaged separately
II. What is meant by *"installing Oracle BI Analytics"*? Is it part of all the above-mentioned installations, or a separate installation?
Kind regards,
Anthony.
I. Was DAC previously installed as one of the components along with the Oracle BI Applications installation?
This is a separate installation.
Now I have installed Oracle BI Applications version 7.9.6.1 and DAC is not installed with it. It is available as a separate package from the OTN site.
Correct, you must install it separately.
II. What is meant by "installing Oracle BI Analytics"? Is it part of all the above-mentioned installations, or a separate installation?
The OBIA documentation is not very good; there are a lot of ambiguities like the one you quote.
http://rnm1978.WordPress.com/2009/07/30/what-is-Obia/
Basically, you need to install:
Informatica Server & Client
DAC Server & Client
OBIEE
Informatica metadata (repository)
DAC metadata (repository)
OBIEE metadata (RPD and Web Catalog repositories)
Data Warehouse schema -
DAC as scheduler tool for Informatica workflows (not for the OBI Apps repository)
I have to use DAC as an administration / scheduling tool for Informatica workflows. Earlier I set up the same thing for OBI Apps, and it works well. Now I am not referring to Oracle_BI_DW_Base.rep, the built-in repository provided with OBI Apps; I have to run my own workflows through DAC. I went through the steps listed below:
(1) Created a new user.
(2) Used that user to configure the DAC connection as well as the DAC repository tables (not sure if that is a good idea or not).
(3) Created a new source system container (no containers were there initially).
(4) I have an Informatica folder named 'MyRep', so I created logical and physical folders with the same name in DAC (Tools > Seed Data > ...).
(5) Created a new subject area, tables, tasks; synchronized the tasks.
(6) Configured Informatica servers and physical data sources. Tested, no failures.
(7) Added a new execution plan.
(8) Assigned the subject area to it.
(9) Clicked the "Generate" button in the parameters section of the execution plan. No parameters were generated. Now I got confused.
(10) I could not find my tasks in the ORDERED TASKS tab.
(11) Clicked BUILD, then got the error message below:
MESSAGE: No tasks found to build this execution plan.
Please let me know which step went wrong here.
Thank you.
Did you correctly assemble the subject area in question? Did you confirm that the subject area has the correct tasks assigned before you added it to the execution plan?
-
Size recommendations for the DAC / Informatica repositories
I'm starting a new installation... I have the installation PDF from the Oracle site and also the SRSP for OBI Apps...
I wonder if anyone has a size estimate for the DAC and Informatica repository databases? I understand that they are not big, but...
In addition, I understand that they could reside in the same instance (separate from the OBAW DB)... Is this a valid assumption?
TXS.
Antonio
Our DAC repo is currently about 600 MB and our Informatica repo about 800 MB.
There is no reason why you should not host them on the same instance.
-
Could not import the DAC repository
Hi all
I am installing BIA 7.9.5 on R12.
Now I want to configure DAC, but when I go to:
Tools > DAC Repository Management > Import
and choose the Oracle R12 application
and truncate tables,
when I press OK nothing happens.
I still see the same screen, no error, nothing else. It looks like the button does not work.
I configured config.bat with JAVA_HOME and DAC_HOME correctly.
Does anyone have any idea why this happens?
I had the same problem: on my client machine I had installed the latest version of Java, because the requirements said JDK 1.5 or higher. However, it seems you get this "button does nothing" bug with 1.6. I installed the final release of 1.5 and updated my config file.
On your client, change \DAC\config.bat:
REM set JAVA_HOME=C:\app\Java\jdk1.6.0_12
set JAVA_HOME=C:\app\Java\jdk1.5.0_17
The idea for this fix came from this post:
DAC 7.9.5.1 client is not compatible with Java SDK 1.6.0_12 - use 1.5.0_17
Re: DAC 7.9.5.1 Client is not compatible with Java SDK 1.6.0_12 - use 1.5.0_17 -
DAC does not synchronize the tables of Informatica
Hello
I'm new to OBIEE, although I have Informatica experience. I created a mapping and a workflow. In DAC I created new logical and physical folders, created a new task, and provided the task name, logical folder, and other details. When I try to "Synchronize tasks", I get a success message with the table names as expected. The DAC log file is below.
-The Sync task has begun-
Requested workflow is exported to C:\DAC\dac\repository\taskSync\.
New source/target tables are inserted in the DAC metadata.
Exported xml files are removed from C:\DAC\dac\repository\taskSync.
-Sync task completed.
There is no issue with DAC's communication with the Informatica repository or Integration Service, as the connection test results are OK.
At the Windows command prompt, pmcmd and pmrep work fine.
Can you please help me out here, as I'm stuck and not able to go further.
I checked the DAC server log. It shows a SEVERE error. I copy it here:
12 INFO Fri May 02 09:46:09 EDT 2014 Reading repository properties
13 INFO Fri May 02 09:46:09 EDT 2014 Reading repository properties
14 INFO Fri May 02 09:46:10 EDT 2014 Reading execution types
15 INFO Fri May 02 09:46:10 EDT 2014 Syncing executors
16 SEVERE Fri May 02 09:46:10 EDT 2014 Verify the localhost with the values defined in the DAC repository!
17 INFO Fri May 02 09:46:10 EDT 2014 Adding server name: localhost
18 INFO Fri May 02 09:46:10 EDT 2014 Adding overridden servername: Unspecified
19 SEVERE Fri May 02 09:46:12 EDT 2014 Unknown host Unspecified. Error message: Unspecified
20 INFO Fri May 02 09:46:12 EDT 2014 Bound to port 3141
21 INFO Fri May 02 09:46:12 EDT 2014 Creating ClientMessage dispatcher with 2 worker threads
22 INFO Fri May 02 09:46:12 EDT 2014 No concurrency; only one ETL can run at a time, and it will run in the same process space.
23 INFO Fri May 02 09:46:12 EDT 2014 SERVER_NETWORK_MESSAGE: Created ClientMessage dispatcher with 2 worker threads
24 INFO Fri May 02 09:46:12 EDT 2014 com.siebel.etl.net.ClientMessageDispatcher registered with HeartBeatManager
Thanks for your help,
Ananth
You can ignore any additional flat file as the source for the task in DAC.
Make sure of what you know on the Informatica side.
Please don't change the original message! If you want to add anything, you can reply instead.
I think you are ready to close this thread.
-
Cannot start the DAC server on Linux
We are installing OBI Apps 7.9.6.4 on Red Hat Linux servers...
Everything seems to be working OK... except now starting the DAC server... This is version 11g... standalone mode. We get "cannot find server.properties file."
The DAC client connects OK, the repository loads... etc., and we are following the 7.9.6.4 online installation manual...
Strangely, when looking in DAC > System Properties... I see no properties such as DAC server host, etc... I remember from previous versions that they existed there...
We tried several settings with JAVA_HOME, INFA_DOMAINS, etc., but still the same message...
TXS for any help.
Antonio
I've solved it... Yes, you must use standaloneserversetup.sh with option 5 to save the changes... and use the same encryption key that was used when we connected the DAC client; otherwise server.properties is created, but the server won't start due to a failure in the encryption key.
TXS for all the comments.
Antonio
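A quick sanity check after running the setup script; the conf-shared location is an assumption (it varies between DAC 11g installs), as is the install path:

```shell
# Sketch: confirm the DAC 11g standalone setup actually wrote server.properties.
DAC_DIR=${DAC_DIR:-/opt/oracle/dac11g/dac}   # placeholder install path
if [ -f "$DAC_DIR/conf-shared/server.properties" ]; then
  MSG="server.properties found"
else
  MSG="server.properties missing: rerun the setup script and save (option 5)"
fi
echo "$MSG"
```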
-
Hello
We are trying to run a full load in DAC, but it is failing.
The S_ASSET table is empty, and therefore we get the error:
Could not get column: ORA-01403: no data found
We were not facing this error in DAC 10g, but after upgrading to DAC 11g this task fails if the table is empty. How do we remove this restriction so that DAC runs a full load cleanly even when tables are empty?
DAC DETAILS
Oracle BI DW Administration Console
DAC build 11.1.1.6.4.20121119.2022, build date: November 19, 2012
Management console to setup, configure, and administer the Oracle Business Analytics Warehouse
Schema version: 47, Index version: 36, Repository version: 6, Seed data version: 16
INFORMATICA VERSION: 9.0.1 HF2
Here is the log of CHANGE CAPTURE: IMG_BUILD - S_ASSET when this task ran in DAC 10g. The task had no problem when the S_ASSET table was empty.
[code]
2013-12-23 15:34:29.644 CHANGE CAPTURE: IMG_BUILD - internal S_ASSET started.
prune days: 4
Change capture upto: 2013-12-19 12:23:19.332 in Full Mode.
2013-12-23 15:34:29.69 - Executing:
TRUNCATE TABLE S_ETL_I_IMG_6
2013-12-23 15:34:29.89 - Executed successfully.
2013-12-23 15:34:29.9 - Executing:
TRUNCATE TABLE S_ETL_R_IMG_6
2013-12-23 15:34:30.145 - Executed successfully.
2013-12-23 15:34:30.16 - Executing:
TRUNCATE TABLE S_ETL_D_IMG_6
2013-12-23 15:34:30.198 - Executed successfully.
2013-12-23 15:34:30.236 - Executing:
INSERT /*+ APPEND */ INTO S_ETL_R_IMG_6
(ROW_ID, MODIFICATION_NUM, LAST_UPD)
SELECT
ROW_ID,
MODIFICATION_NUM,
LAST_UPD
FROM
S_ASSET
WHERE
LAST_UPD > TO_DATE('2013-12-19 12:23:19', 'YYYY-MM-DD HH24:MI:SS')
2013-12-23 15:34:30.968 - Executed successfully.
2013-12-23 15:34:30.989 - Executing delete of duplicate rowids.
2013-12-23 15:34:31.01 - Successfully deleted duplicate rowids.
2013-12-23 15:34:31.27 - Executing:
DROP VIEW V_ASSET
2013-12-23 15:34:31.566 - Executed successfully.
2013-12-23 15:34:31.569 - Executing:
CREATE VIEW V_ASSET AS
SELECT
*
FROM
S_ASSET
2013-12-23 15:34:31.743 - Executed successfully.
2013-12-23 15:34:31.78 - Executing:
SELECT COUNT(*) FROM S_ASSET
The count is: 0
Successfully executed: CHANGE CAPTURE: IMG_BUILD - internal S_ASSET
(Number of retries: 1).
Executed commands:
IMG_BUILD - internal
2013-12-23 15:34:31.851 CHANGE CAPTURE: IMG_BUILD - internal S_ASSET has finished running with Completed status.
[/code]
The same task, IMG_BUILD - S_ASSET, fails when run in DAC 11g because the table is empty. Here is the log for the failed task on 11g.
[code]
2013-12-27 21:21:47.497 Acquiring resources
2013-12-27 21:22:20.33 Acquired resources
2013-12-27 21:22:20.787 CHANGE CAPTURE: IMG_BUILD - internal S_ASSET started.
prune days: 5760
Change capture upto: 2013-12-23 21:19:19.233 in Full Mode.
2013-12-27 21:22:21.211 - Executing:
TRUNCATE TABLE S_ETL_I_IMG_6
2013-12-27 21:22:22.384 - Executed successfully.
2013-12-27 21:22:22.389 - Executing:
TRUNCATE TABLE S_ETL_R_IMG_6
2013-12-27 21:22:22.532 - Executed successfully.
2013-12-27 21:22:22.534 - Executing:
TRUNCATE TABLE S_ETL_D_IMG_6
2013-12-27 21:22:22.546 - Executed successfully.
2013-12-27 21:22:22.55 - Executing:
INSERT /*+ APPEND */ INTO S_ETL_R_IMG_6
(ROW_ID, MODIFICATION_NUM, LAST_UPD)
SELECT
ROW_ID,
MODIFICATION_NUM,
LAST_UPD
FROM
S_ASSET
WHERE
LAST_UPD > TO_DATE('2013-12-23 21:19:19', 'YYYY-MM-DD HH24:MI:SS')
2013-12-27 21:22:22.614 - Executed successfully.
2013-12-27 21:22:22.616 - Executing delete of duplicate rowids.
2013-12-27 21:22:24.72 - Successfully deleted duplicate rowids.
Could not get column: ORA-01403: no data found
FAULT INFO: Error executing: CHANGE CAPTURE: IMG_BUILD - internal S_ASSET
MESSAGE: com.siebel.analytics.etl.execution.exceptions.ChangeCaptureTaskFailedException: Change capture failed.
EXCEPTION CLASS: java.lang.RuntimeException
com.siebel.analytics.etl.etltask.GenericTaskImpl.doExecuteWithRetries(GenericTaskImpl.java:536)
com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:372)
com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:253)
com.siebel.analytics.etl.etltask.GenericTaskImpl.run(GenericTaskImpl.java:655)
com.siebel.analytics.etl.taskmanager.XCallable.call(XCallable.java:63)
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
java.util.concurrent.FutureTask.run(FutureTask.java:138)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
java.util.concurrent.FutureTask.run(FutureTask.java:138)
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
java.lang.Thread.run(Thread.java:619)
CAUSE:
MESSAGE: Change capture failed.
EXCEPTION CLASS: com.siebel.analytics.etl.execution.exceptions.ChangeCaptureTaskFailedException
com.siebel.analytics.etl.etltask.ChangeCaptureTask.executeChangeCapture(ChangeCaptureTask.java:93)
com.siebel.analytics.etl.etltask.ChangeCaptureTask.doExecute(ChangeCaptureTask.java:46)
com.siebel.analytics.etl.etltask.GenericTaskImpl.doExecuteWithRetries(GenericTaskImpl.java:477)
com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:372)
com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:253)
com.siebel.analytics.etl.etltask.GenericTaskImpl.run(GenericTaskImpl.java:655)
com.siebel.analytics.etl.taskmanager.XCallable.call(XCallable.java:63)
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
java.util.concurrent.FutureTask.run(FutureTask.java:138)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
java.util.concurrent.FutureTask.run(FutureTask.java:138)
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
java.lang.Thread.run(Thread.java:619)
(Number of retries: 1).
IMG_BUILD - internal
2013-12-27 21:22:31.526 CHANGE CAPTURE: IMG_BUILD - internal S_ASSET has finished running with Failed status.
[/code]
The CREATE VIEW privilege was missing from the DAC database user; therefore the view-creation step failed. I granted CREATE VIEW and all is well.
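The fix boils down to one grant on the warehouse DB; a sketch (the schema name and the sqlplus connect string in the comment are placeholders, and the grant should be run by a DBA):

```shell
# Sketch: build the GRANT that gives the DAC warehouse user the missing
# CREATE VIEW privilege (DWH_USER is a placeholder schema name).
DAC_USER=DWH_USER
SQL_CMD="GRANT CREATE VIEW TO $DAC_USER;"
echo "$SQL_CMD"
# To apply it, run it through SQL*Plus as a DBA, e.g.:
#   sqlplus -s system@obiaDW1 <<< "$SQL_CMD"
```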
-
Starting the DAC server service hangs
Dear all,
We have recently installed DAC 10.1.3.4.1 with patch 14306642 on AIX 6.1 64 bit system.
Whenever we try to start the DAC server, it hangs at the info line "Finished polling for repository CONNECTION_ISSUE". The DAC server has started, but the command does not return; it stays stuck on this line.
Please see the excerpt from the log file:
3 SEVERE Tue Nov 19 05:22:44 CET 2013 Verify the localhost with the values defined in the DAC repository!
4 INFO Tue Nov 19 05:22:44 CET 2013 Adding server name: N02AST0010
5 INFO Tue Nov 19 05:22:44 CET 2013 Adding overridden servername: localhost
6 INFO Tue Nov 19 05:22:44 CET 2013 Bound to port 3141
7 INFO Tue Nov 19 05:22:44 CET 2013 Creating ClientMessage dispatcher with 2 worker threads
8 INFO Tue Nov 19 05:22:44 CET 2013 SERVER_NETWORK_MESSAGE: Created ClientMessage dispatcher with 2 worker threads
9 INFO Tue Nov 19 05:22:44 CET 2013 com.siebel.etl.net.ClientMessageDispatcher registered with HeartBeatManager
10 INFO Tue Nov 19 05:22:44 CET 2013 Finished polling for repository CONNECTION_ISSUE
Please let us know if something is wrong.
Thank you and best regards,
Amit Shah
I'm not sure you really read my message; I said clearly that you need to use nohup to keep the process running in the background even if you leave the session, press Ctrl+C, etc.
I think you might be starting it as ./startserver.sh and then, after a while, leaving the session or pressing Ctrl+C, which kills the DAC server process.
It is better to run the start as either:
1) nohup ./startserver.sh > dacserver.log 2>&1 &
or
2) nohup ./startserver.sh &
so that it can run in the background.
Check if this helps, and close the thread.
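The nohup pattern from the reply can be tried with a stand-in long-running command (sleep substitutes for startserver.sh here, and the log path is a placeholder):

```shell
# Sketch: launch a background process that survives the terminal, logging to
# a file, in the same shape the reply recommends for startserver.sh.
LOG=/tmp/demo_server.log
nohup sleep 2 > "$LOG" 2>&1 &
PID=$!
echo "started pid $PID, logging to $LOG"
```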
-
DAC: task fails during the ETL for Financials
I'm running my first ETL on OBIA 7.9.6.4.
I use Oracle EBS 12.1.1 as the source system.
The full ETL completes 314 tasks successfully, but it fails at the task named:
'SDE_ORA_GL_AR_REV_LinkageInformation_Extract'
DAC error log:
=====================================
STD OUTPUT
=====================================
Informatica (r) PMCMD, version [9.1.0 HotFix2], build [357.0903], 32-bit Windows
Copyright (c) Informatica Corporation 1994-2011
All rights reserved.
Invoked on Wed Sep 18 09:46:41 2013
Connected to Integration Service: [infor_int].
Folder: [SDE_ORAR1211_Adaptor]
Workflow: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
Instance: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
Mapping: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract]
Session log file: [C:\Informatica\server\infa_shared\SessLogs\SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full.ORA_R1211.log]
Source success rows: [0]
Source failed rows: [0]
Target success rows: [0]
Target failed rows: [0]
Number of transformation errors: [0]
First error code [4035]
First error message: [RR_4035 SQL Error [
ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
Database driver error...
Function Name: Execute
SQL Stmt: SELECT DISTINCT
DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
DLINK.SOURCE_DISTRIBUTION_TYPE TABLE_SOURCE,
AELINE.ACCOUNTING_CLASS_CODE,
GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
AELINE.AE_HEADER_ID AE_HEADER_ID,
AELINE.AE_LINE_NUM AE_LINE_NUM,
T.LEDGER_ID LEDGER_ID,
T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
JBATCH.NAME BATCH_NAME,
JHEADER.NAME HEADER_NAME,
PER.END_DATE,
AELINE.CODE_COMBINATI]
Task run status: [Failed]
Integration Service: [infor_int]
Integration Service Process: [infor_int]
Integration Service Grid: [infor_int]
----------------------------
Node Name: [node01_AMAZON-9C628AAE]
Preparation fragment
Partition: [Partition #1]
Transformation instance: [SQ_XLA_AE_LINES]
Transformation: [SQ_XLA_AE_LINES]
Applied rows: [0]
Affected rows: [0]
Rejected rows: [0]
Throughput (Rows/Sec): [0]
Throughput (Bytes/Sec): [0]
Last error code [16004], message [ERROR: Prepare failed.: []
ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
Database driver error...
Function Name: Execute
SQL Stmt: SELECT DISTINCT
DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
DLINK.SOURCE_DISTRIBUTION_TYPE TABLE_SOURCE,
AELINE.ACCOUNTING_CLASS_CODE,
GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
AELINE.AE_HEADER_ID AE_HEADER_ID,
AELINE.AE_LINE_NUM AE_LINE_NUM,
T.LEDGER_ID LEDGER_ID,
T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
JBATCH.NAME BATCH_NAME,
JHEADER.NAME HEADER_NAME,
PER.END_DATE,
AELINE.CODE_CO]
Start time: [Sat Oct 18 09:46:13 2013]
End time: [Sat Oct 18 09:46:13 2013]
Partition: [Partition #1]
Transformation instance: [W_GL_LINKAGE_INFORMATION_GS]
Transformation: [W_GL_LINKAGE_INFORMATION_GS]
Applied rows: [0]
Affected rows: [0]
Rejected rows: [0]
Throughput (Rows/Sec): [0]
Throughput (Bytes/Sec): [0]
Last error code [0], message [No errors encountered.]
Start time: [Sat Oct 18 09:46:14 2013]
End time: [Sat Oct 18 09:46:14 2013]
Disconnecting from Integration Service
Completed at Wed Sep 18 09:46:41 2013
-----------------------------------------------------------------------------------------------------
Informatica session log file:
DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter: [$DBConnection_OLAP].
DIRECTOR> VAR_27028 Use override value [ORA_R1211] for session parameter: [$DBConnection_OLTP].
DIRECTOR> VAR_27028 Use override value [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full.ORA_R1211.log] for session parameter: [$PMSessionLogFile].
DIRECTOR> VAR_27028 Use override value [26] for mapping parameter: [$$DATASOURCE_NUM_ID].
DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter: [$$FILTER_BY_LEDGER_ID].
DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter: [$$FILTER_BY_LEDGER_TYPE].
DIRECTOR> VAR_27028 Use override value [] for mapping parameter: [$$Hint1].
DIRECTOR> VAR_27028 Use override value [01/01/1970] for mapping parameter: [$$INITIAL_EXTRACT_DATE].
DIRECTOR> VAR_27028 Use override value [01/01/1990] for mapping parameter: [$$LAST_EXTRACT_DATE].
DIRECTOR> VAR_27028 Use override value [1] for mapping parameter: [$$LEDGER_ID_LIST].
DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter: [$$LEDGER_TYPE_LIST].
DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] at [Sat Oct 18 09:46:13 2013].
DIRECTOR> TM_6683 Repository Name: [infor_rep]
DIRECTOR> TM_6684 Server Name: [infor_int]
DIRECTOR> TM_6686 Folder: [SDE_ORAR1211_Adaptor]
DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] Run Instance Name: [] Run Id: [2130]
DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AR_REV_LinkageInformation_Extract [version 1].
DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
DIRECTOR> TM_6964 Date format for the Session is [DD/MM/YYYY HH24:MI:SS]
DIRECTOR> TM_6827 [C:\Informatica\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full].
DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
DIRECTOR> TM_6708 Using configuration property [DisableDB2BulkMode, Yes]
DIRECTOR> TM_6708 Using configuration property [OraDateToTimestamp, Yes]
DIRECTOR> TM_6708 Using configuration property [overrideMpltVarWithMapVar, Yes]
DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB, [APPS]@[54.225.65.108:1521:VIS] [DWH_REP2]@[AMAZON-9C628AAE:1521:obiaDW1]]
DIRECTOR> TM_6703 Session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] is run by 32-bit Integration Service [node01_AMAZON-9C628AAE], version [9.1.0 HotFix2], build [0903].
MANAGER> PETL_24058 Running Partition Group [1].
MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
MANAGER> PETL_24001 Parallel Pipeline Engine running.
MANAGER> PETL_24003 Initializing session run.
MAPPING> CMN_1569 Server Mode: [ASCII]
MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
MAPPING> TM_6151 The session sort order is [Binary].
MAPPING> TM_6156 Using low precision processing.
MAPPING> TM_6180 Deadlock retry logic will not be implemented.
MAPPING> TM_6187 Session target-based commit interval is [10000].
MAPPING> TM_6307 DTM error log disabled.
MAPPING> TE_7022 TShmWriter: Initialized
MAPPING> TE_7004 Transformation Parse Warning [IIF(EVENT_TYPE_CODE='RECP_REVERSE',
IIF(UPG_BATCH_ID>0,
SOURCE_TABLE || '~' || DISTRIBUTION_ID,
SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
SOURCE_TABLE || '~' || DISTRIBUTION_ID)
]; transformation continues...
MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
IIF(UPG_BATCH_ID>0,
SOURCE_TABLE || '~' || >>>>DISTRIBUTION_ID<<<<,
SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
SOURCE_TABLE || '~' || DISTRIBUTION_ID)
<<PM Parse Warning>> [||]: operand converted to a string
... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
IIF(UPG_BATCH_ID>0,
SOURCE_TABLE || '~' || DISTRIBUTION_ID,
SOURCE_TABLE || '~RECEIPTREVERSE~' || >>>>DISTRIBUTION_ID<<<<),
SOURCE_TABLE || '~' || DISTRIBUTION_ID)
<<PM Parse Warning>> [||]: operand converted to a string
... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
IIF(UPG_BATCH_ID>0,
SOURCE_TABLE || '~' || DISTRIBUTION_ID,
SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
SOURCE_TABLE || '~' || >>>>DISTRIBUTION_ID<<<<)
]; transformation continues...
MAPPING> TE_7004 Transformation Parse Warning [JE_HEADER_ID || '~' || JE_LINE_NUM]; transformation continues...
MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
... >>>>JE_HEADER_ID<<<< || '~' || JE_LINE_NUM <<PM Parse Warning>> [JE_LINE_NUM]: operand converted to a string
... JE_HEADER_ID || '~' || >>>>JE_LINE_NUM<<<<]; transformation continues...
MAPPING> TE_7004 Transformation Parse Warning [AE_HEADER_ID || '~' || AE_LINE_NUM]; transformation continues...
MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
... >>>>AE_HEADER_ID<<<< || '~' || AE_LINE_NUM <<PM Parse Warning>> [AE_LINE_NUM]: operand converted to a string
... AE_HEADER_ID || '~' || >>>>AE_LINE_NUM<<<<]; transformation continues...
MAPPING > TM_6007 DTM initialized successfully for the session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
DIRECTOR > PETL_24033 All DTM Connection Info: [<NONE>].
MANAGER > PETL_24004 Starting pre-session tasks. : (Wed Sep 18 09:46:13 2013)
MANAGER > PETL_24027 Pre-session task completed successfully. : (Wed Sep 18 09:46:13 2013)
DIRECTOR > PETL_24006 Starting data movement.
MAPPING > TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
READER_1_1_1 > DBG_21438 Reader: Source is [54.225.65.108:1521/VIS], user [APPS]
READER_1_1_1 > BLKR_16003 Initialization completed successfully.
WRITER_1_*_1 > WRT_8146 Writer: Target is database [AMAZON-9C628AAE:1521/obiaDW1], user [DWH_REP2], bulk mode [ON]
WRITER_1_*_1 > WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
WRITER_1_*_1 > WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS: SQL INSERT statement:
INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,GL_ACCOUNT_ID,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
WRITER_1_*_1 > WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
WRITER_1_*_1 > WRT_8003 Writer initialization complete.
READER_1_1_1 > BLKR_16007 Reader run started.
READER_1_1_1 > RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT DISTINCT
DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
AELINE.ACCOUNTING_CLASS_CODE,
GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
AELINE.AE_HEADER_ID AE_HEADER_ID,
AELINE.AE_LINE_NUM AE_LINE_NUM,
T.LEDGER_ID LEDGER_ID,
T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
JBATCH.NAME BATCH_NAME,
JHEADER.NAME HEADER_NAME,
PER.END_DATE,
AELINE.CODE_COMBINATION_ID,
AEHEADER.EVENT_TYPE_CODE,
NVL(XLA_EVENTS.UPG_BATCH_ID, 0) UPG_BATCH_ID
FROM XLA_DISTRIBUTION_LINKS DLINK
, GL_IMPORT_REFERENCES GLIMPREF
, XLA_AE_LINES AELINE
, GL_JE_HEADERS JHEADER
, GL_JE_BATCHES JBATCH
, GL_LEDGERS T
, GL_PERIODS PER
WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
('AR_DISTRIBUTIONS_ALL'
, 'RA_CUST_TRX_LINE_GL_DIST_ALL')
AND DLINK.APPLICATION_ID = 222
AND AELINE.APPLICATION_ID = 222
AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
AND JHEADER.LEDGER_ID = T.LEDGER_ID
AND JHEADER.STATUS = 'P'
AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
AND JHEADER.CREATION_DATE >=
TO_DATE('01/01/1970 00:00:00'
, 'MM/DD/YYYY HH24:MI:SS')
AND DECODE('', 'Y', T.LEDGER_ID, 1) = 1
AND DECODE('', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') = 'NONE']
READER_1_1_1 > RR_4049 SQL Query issued to database : (Wed Sep 18 09:46:13 2013)
WRITER_1_*_1 > WRT_8005 Writer run started.
WRITER_1_*_1 > WRT_8158
*** START LOAD SESSION ***
Load Start Time: Wed Sep 18 09:46:13 2013
Target tables:
W_GL_LINKAGE_INFORMATION_GS
READER_1_1_1 > CMN_1761 Timestamp Event: [Wed Sep 18 09:46:13 2013]
READER_1_1_1 > RR_4035 SQL Error [
ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
Database driver error...
Function Name : Execute
SQL Stmt : SELECT DISTINCT
DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
AELINE.ACCOUNTING_CLASS_CODE,
GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
AELINE.AE_HEADER_ID AE_HEADER_ID,
AELINE.AE_LINE_NUM AE_LINE_NUM,
T.LEDGER_ID LEDGER_ID,
T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
JBATCH.NAME BATCH_NAME,
JHEADER.NAME HEADER_NAME,
PER.END_DATE,
AELINE.CODE_COMBINATION_ID,
AEHEADER.EVENT_TYPE_CODE,
NVL(XLA_EVENTS.UPG_BATCH_ID, 0) UPG_BATCH_ID
FROM XLA_DISTRIBUTION_LINKS DLINK
, GL_IMPORT_REFERENCES GLIMPREF
, XLA_AE_LINES AELINE
, GL_JE_HEADERS JHEADER
, GL_JE_BATCHES JBATCH
, GL_LEDGERS T
, GL_PERIODS PER
WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
('AR_DISTRIBUTIONS_ALL'
, 'RA_CUST_TRX_LINE_GL_DIST_ALL')
AND DLINK.APPLICATION_ID = 222
AND AELINE.APPLICATION_ID = 222
AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
AND JHEADER.LEDGER_ID = T.LEDGER_ID
AND JHEADER.STATUS = 'P'
AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
AND JHEADER.CREATION_DATE >=
TO_DATE('01/01/1970 00:00:00'
, 'MM/DD/YYYY HH24:MI:SS')
AND DECODE('', 'Y', T.LEDGER_ID, 1) = 1
AND DECODE('', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') = 'NONE'
Oracle Fatal Error
Database driver error...
Function Name : Execute
SQL Stmt : SELECT DISTINCT
DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
AELINE.ACCOUNTING_CLASS_CODE,
GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
AELINE.AE_HEADER_ID AE_HEADER_ID,
AELINE.AE_LINE_NUM AE_LINE_NUM,
T.LEDGER_ID LEDGER_ID,
T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
JBATCH.NAME BATCH_NAME,
JHEADER.NAME HEADER_NAME,
PER.END_DATE,
AELINE.CODE_COMBINATION_ID,
AEHEADER.EVENT_TYPE_CODE,
NVL(XLA_EVENTS.UPG_BATCH_ID, 0) UPG_BATCH_ID
FROM XLA_DISTRIBUTION_LINKS DLINK
, GL_IMPORT_REFERENCES GLIMPREF
, XLA_AE_LINES AELINE
, GL_JE_HEADERS JHEADER
, GL_JE_BATCHES JBATCH
, GL_LEDGERS T
, GL_PERIODS PER
WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
('AR_DISTRIBUTIONS_ALL'
, 'RA_CUST_TRX_LINE_GL_DIST_ALL')
AND DLINK.APPLICATION_ID = 222
AND AELINE.APPLICATION_ID = 222
AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
AND JHEADER.LEDGER_ID = T.LEDGER_ID
AND JHEADER.STATUS = 'P'
AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
AND JHEADER.CREATION_DATE >=
TO_DATE('01/01/1970 00:00:00'
, 'MM/DD/YYYY HH24:MI:SS')
AND DECODE('', 'Y', T.LEDGER_ID, 1) = 1
AND DECODE('', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') = 'NONE'
Oracle Fatal Error].
READER_1_1_1 > CMN_1761 Timestamp Event: [Wed Sep 18 09:46:13 2013]
READER_1_1_1 > BLKR_16004 ERROR: Prepare failed.
WRITER_1_*_1 > WRT_8333 Rolling back all the targets due to fatal session error.
WRITER_1_*_1 > WRT_8325 Final rollback executed for target [W_GL_LINKAGE_INFORMATION_GS] at end of load
WRITER_1_*_1 > WRT_8035 Load complete time: Wed Sep 18 09:46:13 2013
LOAD SUMMARY
============
WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
WRT_8044 No data loaded for this target
WRITER_1_*_1 > WRT_8043 ***** END LOAD SESSION *****
MANAGER > PETL_24031
***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
MANAGER > PETL_24005 Starting post-session tasks. : (Wed Sep 18 09:46:14 2013)
MANAGER > PETL_24029 Post-session task completed successfully. : (Wed Sep 18 09:46:14 2013)
MAPPING > TM_6018 The session completed with [0] row transformation errors.
MANAGER > PETL_24002 Parallel Pipeline Engine finished.
DIRECTOR > PETL_24013 Session run completed with failure.
DIRECTOR > TM_6022
SESSION LOAD SUMMARY
================================================
DIRECTOR > TM_6252 Source Load Summary.
DIRECTOR > CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
DIRECTOR > TM_6253 Target Load Summary.
DIRECTOR > CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
DIRECTOR > TM_6023
===================================================
DIRECTOR > TM_6020 Session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] completed at [Wed Sep 18 09:46:14 2013].
------------------------------------------------------------------------------------------------------------------------------------------------------------------
* I ran a few queries against my source (Vision) database: the table XLA_EVENTS exists, and the column UPG_BATCH_ID exists as well
* I added XLA_EVENTS to the FROM clause and the query ran in SQL Developer
* in the SELECT clause I see a column named "AEHEADER.EVENT_TYPE_CODE", but there is no table aliased "AEHEADER" in the FROM clause,
so I added it manually; it most likely refers to XLA_AE_HEADERS
The final query looks like this:
SELECT DISTINCT
DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
AELINE.ACCOUNTING_CLASS_CODE,
GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
AELINE.AE_HEADER_ID AE_HEADER_ID,
AELINE.AE_LINE_NUM AE_LINE_NUM,
T.LEDGER_ID LEDGER_ID,
T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
JBATCH.NAME BATCH_NAME,
JHEADER.NAME HEADER_NAME,
PER.END_DATE,
AELINE.CODE_COMBINATION_ID,
AEHEADER.EVENT_TYPE_CODE,
NVL(XLA_EVENTS.UPG_BATCH_ID, 0) UPG_BATCH_ID
FROM XLA_DISTRIBUTION_LINKS DLINK
, GL_IMPORT_REFERENCES GLIMPREF
, XLA_AE_LINES AELINE
, GL_JE_HEADERS JHEADER
, GL_JE_BATCHES JBATCH
, GL_LEDGERS T
, GL_PERIODS PER
, XLA_AE_HEADERS AEHEADER
, XLA_EVENTS
WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
('AR_DISTRIBUTIONS_ALL'
, 'RA_CUST_TRX_LINE_GL_DIST_ALL')
AND DLINK.APPLICATION_ID = 222
AND AELINE.APPLICATION_ID = 222
AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
AND JHEADER.LEDGER_ID = T.LEDGER_ID
AND JHEADER.STATUS = 'P'
AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
AND JHEADER.CREATION_DATE >=
TO_DATE('01/01/1970 00:00:00'
, 'MM/DD/YYYY HH24:MI:SS')
AND DECODE('', 'Y', T.LEDGER_ID, 1) = 1
AND DECODE('', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') = 'NONE'
* When I run this query, it runs for a very long time without returning any results (the last time, it ran for 4 hours before I cancelled it)
My questions are:
- What is wrong with this query?
- How can I change the query in the workflow?
Could someone please help?
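The two symptoms are consistent with missing join conditions: the generated query references AEHEADER and XLA_EVENTS without listing them in the FROM clause (hence the ORA-00904), and adding those tables to FROM without any join predicates produces a Cartesian product, which would explain why the modified query runs for hours. As a sketch only, the predicates that would normally tie these tables in look like the following (the join keys assume the standard XLA schema for application 222 and are not the verified BI Apps fix; check them against your EBS release before use):

```sql
-- Sketch: assumed standard XLA join keys, not a verified fix.
-- Tie XLA_AE_HEADERS to the XLA_AE_LINES rows already in the query:
AND AELINE.AE_HEADER_ID = AEHEADER.AE_HEADER_ID
AND AEHEADER.APPLICATION_ID = 222
-- Tie XLA_EVENTS to its accounting header:
AND AEHEADER.EVENT_ID = XLA_EVENTS.EVENT_ID
AND XLA_EVENTS.APPLICATION_ID = 222
```

With predicates like these added to the WHERE clause, the DISTINCT no longer has to collapse a Cartesian product, and the query should return in a reasonable time.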
Check whether the session is reusable or non-reusable. If it is reusable, you may need to modify the SQL query in the tasks window.
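Before re-running a long query like this, it can save hours to check the execution plan for an accidental Cartesian join first. EXPLAIN PLAN and DBMS_XPLAN are standard Oracle features; the simplified query below is only an illustration of the pattern, not the full extract query:

```sql
-- Sketch: inspect the plan instead of executing the full query.
EXPLAIN PLAN FOR
SELECT DISTINCT DLINK.SOURCE_DISTRIBUTION_ID_NUM_1
FROM XLA_DISTRIBUTION_LINKS DLINK
   , XLA_AE_HEADERS AEHEADER  -- added without a join predicate, as in the modified query
WHERE DLINK.APPLICATION_ID = 222;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
-- A "MERGE JOIN CARTESIAN" operation in the plan output points to a missing join condition.
```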
-
Hello
I installed DAC (build 11.1.1.6.4.20121119.2022, build date: November 19, 2012) and OBIEE biapps_7963.
The database version is 11.2.0.1.0.
While importing the DAC repository (dwrep\dac_metadata\dac_client), I get the following error:
The version of the data that you are trying to import is not compatible with the current version of the dac.
According to the certification matrix, these versions appear to be compatible.
Please help.
Take a look at: The Version Of The Data That You Are Trying To Import Is Not Compatible With The Current Version Of The DAC (Doc ID 1086976.1)