Dynamo admin: method to start loading data?
Hello
The files get generated in the folder you want. Now I need to test that the data load takes place successfully, rather than wait until the next scheduled run.
Which method in the Dynamo admin do I call in order to start loading data?
I tried calling the loadAllAvailable() method under each of the loaders, but it keeps returning 0 even though there are a lot of log files in the folder.
Please let me know.
Thank you
Saud
Hello
Finally managed to find where I had made the mistake. Even though the ARF.base module had been added to the modules for the production build, it was not in the list of modules in the production instance's startup script.
Thus the LogRotationSink component was not enabled in dynamoMessagingSystem.xml.
Once the ARF.base module was included in the startup script, everything worked like a charm.
Thank you
Saud
Tags: Oracle Applications
Similar Questions
-
Start Calc Script before loading data
How can I start a calc script before loading data?
I want to clear a slice of the cube before loading data.
Example:
If I export data for the periods Jan and Feb, I have to erase data from the Essbase cube only for the periods Jan and Feb. If I export data for the period Mar, I want to erase data only for the period Mar.
I don't want to use MaxL, because in a MaxL script I would have to specify the username and password.
Thank you very much.
--
Gelo
Hello
The best way is to duplicate the KM 'IKM SQL to Hyperion Essbase (DATA)'.
In the duplicated KM, modify the command 'Load data into essbase': update
SQL = """select <%=od...
to
sql = ""
Now you can create an interface, set the staging area to 'SUNOPSIS MEMORY ENGINE', drop an Essbase data store on the target, and set the KM to the one you just updated. Then use the calc script options, so you end up with an interface that has no source and can run calc scripts.
Ok? FIX :)
Cheers
John
http://john-goodwin.blogspot.com/
-
Invalid username and password when executing the data load plan
I get the following error when I try to execute the load plan that loads data from my EBS server. I don't know which username and password it is claiming is invalid, or where to change the value. Any ideas where I can find it?
ODI-1519: Serial step "Start Load Plan (InternalID:4923500)" failed because child step "Global Variable Refresh (InternalID:4924500)" is in error.
ODI-1529: Refresh of variable 'BIAPPS.13P_CALENDAR_ID' failed:
select CASE WHEN 'Global Variable Refresh' in (select distinct group_code
            from C_PARAMETER_VALUE_FORMATTER_V
            where PARAM_CODE = '13P_CALENDAR_ID')
       THEN (select param_value
             from C_PARAMETER_VALUE_FORMATTER_V
             where PARAM_CODE = '13P_CALENDAR_ID'
             and group_code = 'Global Variable Refresh'
             and datasource_num_id = '#BIAPPS.WH_DATASOURCE_NUM_ID')
       ELSE (select param_value
             from C_GL_PARAM_VALUE_FORMATTER_V
             where PARAM_CODE = '13P_CALENDAR_ID'
             and datasource_num_id = '#BIAPPS.WH_DATASOURCE_NUM_ID')
       END
from dual
0:72000:java.sql.SQLException: ORA-01017: invalid username/password; logon denied
java.sql.SQLException: ORA-01017: invalid username/password; logon denied
at oracle.odi.jdbc.datasource.LoginTimeoutDatasourceAdapter.doGetConnection(LoginTimeoutDatasourceAdapter.java:133)
at oracle.odi.jdbc.datasource.LoginTimeoutDatasourceAdapter.getConnection(LoginTimeoutDatasourceAdapter.java:62)
at oracle.odi.core.datasource.dwgobject.support.OnConnectOnDisconnectDataSourceAdapter.getConnection(OnConnectOnDisconnectDataSourceAdapter.java:74)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.executeVariableStep(LoadPlanProcessor.java:3050)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.refreshVariables(LoadPlanProcessor.java:4287)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.AddRunnableScenarios(LoadPlanProcessor.java:2284)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.AddRunnableScenarios(LoadPlanProcessor.java:2307)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.SelectNextRunnableScenarios(LoadPlanProcessor.java:2029)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.StartAllScenariosFromStep(LoadPlanProcessor.java:1976)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.startLPExecution(LoadPlanProcessor.java:491)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.initLPInstance(LoadPlanProcessor.java:384)
at oracle.odi.runtime.agent.loadplan.LoadPlanProcessor.startLPInstance(LoadPlanProcessor.java:147)
at oracle.odi.runtime.agent.processor.impl.StartLoadPlanRequestProcessor.doProcessRequest(StartLoadPlanRequestProcessor.java:87)
at oracle.odi.runtime.agent.processor.SimpleAgentRequestProcessor.process(SimpleAgentRequestProcessor.java:49)
at oracle.odi.runtime.agent.support.DefaultRuntimeAgent.execute(DefaultRuntimeAgent.java:68)
at oracle.odi.runtime.agent.servlet.AgentServlet.processRequest(AgentServlet.java:564)
at oracle.odi.runtime.agent.servlet.AgentServlet.doPost(AgentServlet.java:518)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
at java.security.AccessController.doPrivileged(Native Method)
at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:324)
at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:460)
at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:163)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
Caused by: java.sql.SQLException: ORA-01017: invalid username/password; logon denied
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:397)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:389)
at oracle.jdbc.driver.T4CTTIfun.processError(T4CTTIfun.java:689)
at oracle.jdbc.driver.T4CTTIoauthenticate.processError(T4CTTIoauthenticate.java:455)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:387)
at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:814)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:418)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:678)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:234)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:34)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:567)
at oracle.odi.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriver(DriverManagerDataSource.java:410)
at oracle.odi.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriver(DriverManagerDataSource.java:386)
at oracle.odi.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriver(DriverManagerDataSource.java:353)
at oracle.odi.jdbc.datasource.DriverManagerDataSource.getConnection(DriverManagerDataSource.java:332)
at oracle.odi.jdbc.datasource.LoginTimeoutDatasourceAdapter$ConnectionProcessor.run(LoginTimeoutDatasourceAdapter.java:217)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Found the answer after 4 days of research. Open ODI Studio, go to Topology, and expand Technologies -> Oracle. Open BIAPPS_BIACOMP. In my system it had "NULLBIACM_IO"; I changed that to my correct prefix and it worked.
Now my data load gives me an error that the table W_LANGUAGES_G does not exist. At least I got further along.
-
Error while loading data into Planning
Failing to load data into Planning 11.1.2.3.200 using ODI 11.1.1.7.
Please find the errors from the logs below:
INFO [SimpleAsyncTaskExecutor-2]: Oracle Data Integrator Adapter for Hyperion Planning
INFO [SimpleAsyncTaskExecutor-2]: Connecting to planning application [xxxxxxx] on [xxxxxxxxxxx]:[xxxx] using username [admin].
INFO [SimpleAsyncTaskExecutor-2]: Successfully connected to the planning application.
INFO [SimpleAsyncTaskExecutor-2]: Loading the planning load options
Dimension Name: Account; Parent Child: false
Order load entries by parent: false
Refresh Database: false
INFO [SimpleAsyncTaskExecutor-2]: Beginning the load process.
DEBUG [SimpleAsyncTaskExecutor-2]: Number of columns in the source result set does not match the number of planning target columns.
INFO [SimpleAsyncTaskExecutor-2]: Load type is [load dimension member].
ERROR [SimpleAsyncTaskExecutor-2]: record [[A603010, null, null, null, null, null, null, null, null, null, null, null, xxxxx,-100, F3E0, C011, E7172_93275, FY17, Stage 1, Current Service Level, Jul, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null]] was rejected by the Planning Server.
ERROR [SimpleAsyncTaskExecutor-2]: record [[A601060, null, null, null, null, null, null, null, null, null, null, null, xxxxx,-250, F3E0, C011, E7172_93275, FY17, Stage 1, Current Service Level, Jul, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null]] was rejected by the Planning Server.
log.err
Account, Data Load Cube Name, Budget, Point-of-View, Error_Reason
A603010, xxxxx,-100, F3E0, C011, E7172_93275, FY17, Stage 1, Current Service Level, Jul, Cannot load dimension member; error message is: RemoteException occurred in server thread; nested exception is:
java.rmi.UnmarshalException: unrecognized method hash: method not supported by remote object
A601060, xxxxx,-250, F3E0, C011, E7172_93275, FY17, Stage 1, Current Service Level, Jul, Cannot load dimension member; error message is: RemoteException occurred in server thread; nested exception is:
java.rmi.UnmarshalException: unrecognized method hash: method not supported by remote object
FDMEE log:
: [AIF] error: no record exists for period "Pd 2 - 01-08-2014"
: [AIF] error: no record exists for period "Pd 3 - 2014-09-01"
FDMEE logging level is set to 5
The Planning patch you applied contains a new version of HspJS.jar, so that could be one way this error surfaced. Personally, I think you would be better off getting everything patched up to at least the 11.1.2.3.500 PSU before continuing, as this is a known problem in that version and the notes I mentioned previously help with the patching.
It is clear from the error that there is a version mismatch between the FDMEE ODI agent and the Planning server jar files. One thing you could try on this front would be to back up the current HspJS.jar in the FDMEE ODI home (C:\Oracle\Middleware\odi\oracledi.sdk\lib) and place there a copy of the same file from your Planning server (C:\Oracle\Middleware\EPMSystem11R1\products\Planning\lib folder, or equivalent).
I've not personally seen this error except where the 500 patch had not been applied. Which approach you take is really up to you, but I suggest patching to 500 if at all possible and going from there.
Regards
Craig
-
I was using CF7 to query data and load it into PDF files using the XPAAJ Java library from Ben Forta's blog. It was simple, and it worked very well. Then came CF8 and LiveCycle ES. XPAAJ was free, but LiveCycle ES is not. LC ES does much more than load data into a PDF and costs a bit, but it's probably worth the money. Because a small part of LC ES does the same thing XPAAJ did, the CF8 upgrade almost completely masks XPAAJ.
(I got it working by stopping the CF8 application service, starting the CF7 application service, and then restarting CF8 - I don't want to run like that for long.)
I can possibly get into LiveCycle, but for now, does anyone know of something like XPAAJ that can load data directly into a PDF file or a Form Data Format (FDF) file?
Thanks, Scott
If by PDF 'data' you mean XMP metadata, then yes: CF8 ships with a version of the iText Java library that includes classes for reading and writing XMP in PDFs. CF8's CFPDF tag uses iText under the hood, but it does not yet expose all of iText's features, so you can simply call the Java classes directly in order to access iText's XMP features.
Reading and writing XMP examples are given here:
http://www.Adobe.com/cfusion/webforums/Forum/MessageView.cfm?forumid=1&CATID=7&ThreadId=1328338
Also check Ray Camden's pdfUtils CFC on RIAForge: it contains a new readXMP() method. writeXMP() is coming - the code is included in the above thread in any case.
If you mean form data rather than XMP data, then iText can do that too. Check the iText documentation:
http://www.1t3xt.info/API/
Here is a small example of form filling:
http://www.peabrane.com/2007/3/5/PDF-templates-via-Rails
and cfSearching (who really knows iText in a CF context) has this example; note the use of the iText FdfReader class:
http://cfsearching.blogspot.com/2008/01/submit-PDF-form-as-FDF-with-ColdFusion.html
-
Error loading data with SQL*Loader in Oracle 10g
Hello
Can anyone suggest what the problem is with the control file below, used for loading data via SQL*Loader?
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
(SRNO INTEGER(7),
PROD_ID INTEGER(10),
PROMO_ID INTEGER(10),
CHANNEL_ID INTEGER(10),
UNIT_COST INTEGER(10),
UNIT_PRICE INTEGER(10)
)
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------
I'm trying to load the data into the SCOTT schema as user scott.
Why does it give this error? Please see the log file below.
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 14:43:35 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: D:\test\temt.ctl
Data File: D:\test\temt.txt
Bad File: test.bad
Discard File: test.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table "TEST", loaded from every logical record.
Insert option in effect for this table: INSERT
Column Position Len term Encl. Datatype name
------------------------------ ---------- ----- ---- ---- ---------------------
SRNO                               FIRST     7           INTEGER
PROD_ID                             NEXT    10           INTEGER
PROMO_ID                            NEXT    10           INTEGER
CHANNEL_ID                          NEXT    10           INTEGER
UNIT_COST                           NEXT    10           INTEGER
UNIT_PRICE                          NEXT    10           INTEGER
Record 1: Rejected - Error on table "TEST".
ORA-01460: unimplemented or unreasonable conversion requested
Record 2: Rejected - Error on table "TEST".
ORA-01460: unimplemented or unreasonable conversion requested
(Records 3 through 51 were rejected with the same ORA-01460 error.)
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table "TEST":
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 3648 bytes (64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 64
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Fri Mar 20 14:43:35 2009
Run ended on Fri Mar 20 14:43:43 2009
Elapsed time was: 00:00:07.98
CPU time was: 00:00:00.28
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Here is how I invoke SQLLDR, plus the table details:
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
SQL> desc test
Name Null? Type
----------------------- -------- ----------------
SRNO NUMBER (7)
PROD_ID NUMBER (10)
PROMO_ID NUMBER (10)
CHANNEL_ID NUMBER (10)
UNIT_COST NUMBER (10)
UNIT_PRICE NUMBER (10)
The sqlldr invocation is:
At the cmd prompt,
d:\> sqlldr scott/tiger control=D:\test\temt.ctl
SQL * Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 15:55:50 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 64
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
I even tried a few examples,
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
using the two control files below:
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-1
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
FIELDS TERMINATED BY ','
(SRNO INTEGER(7),
PROD_ID INTEGER(10),
PROMO_ID INTEGER(10),
CHANNEL_ID INTEGER(10),
UNIT_COST INTEGER(10),
UNIT_PRICE INTEGER(10)
)
-2
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(SRNO INTEGER(7),
PROD_ID INTEGER(10),
PROMO_ID INTEGER(10),
CHANNEL_ID INTEGER(10),
UNIT_COST INTEGER(10),
UNIT_PRICE INTEGER(10)
)
* For control file 1 I get the below mentioned error: *
D:\> sqlldr scott/tiger control=D:\test\temt.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:36 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 8.
Expecting "(", found "FIELDS".
FIELDS TERMINATED BY ','
^
* And for control file 2 I get the error below: *
D:\> sqlldr scott/tiger control=D:\test\temt.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:39:22 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 8.
Expecting "(", found "FIELDS".
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
^
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Hello
This will help you:
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INSERT INTO TABLE "TEST"
FIELDS TERMINATED BY ','
(SRNO INTEGER EXTERNAL,
PROD_ID INTEGER EXTERNAL,
PROMO_ID INTEGER EXTERNAL,
CHANNEL_ID INTEGER EXTERNAL,
UNIT_COST INTEGER EXTERNAL,
UNIT_PRICE INTEGER EXTERNAL
)
The key change is INTEGER EXTERNAL: a bare INTEGER tells SQL*Loader to read a binary integer of that byte length, while INTEGER EXTERNAL reads a number written as text, which is what a plain .txt data file contains.
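As a rough illustration of why the bare INTEGER datatype fails on a text file (this is plain Python, not SQL*Loader, and the 4-byte field "1234" is a made-up example): the same bytes give a sensible value when read as decimal text and a garbage value when read as a raw binary integer, which is what triggers ORA-01460.

```python
import struct

# The text file holds human-readable digits, e.g. a field containing "1234".
field = b"1234"

# INTEGER EXTERNAL: interpret the bytes as decimal text.
as_external = int(field.decode("ascii"))

# Bare INTEGER: interpret the same bytes as a raw binary integer
# (4-byte big-endian here, purely for illustration).
as_binary = struct.unpack(">i", field)[0]

print(as_external)  # the intended value, 1234
print(as_binary)    # a large garbage value built from the ASCII byte codes
```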
Thank you
-
The argument value for the Add method must be a date or UsiTimeDisp
I am writing code to allow users to define what to search for in the Navigator and then use the results for further analysis. But when I try to search by creation date, I get an error that says: "The argument value for the Add method must be either a date or a UsiTimeDisp." I have tried different ways of entering the dates (dd/mm/yyyy, mm.dd.yyyy, etc.) but none have worked.
What is this error, and how can I fix it?
The Add method does not accept a string, only a CDate or a UsiTimeDisp.
Creating a UsiTimeDisp:
Dim cdateVal : cdateVal = Now
Dim timedispVal : Set timedispVal = CreateTime(Year(cdateVal), Month(cdateVal), Day(cdateVal), Hour(cdateVal), Minute(cdateVal), Second(cdateVal))
In this case, I started with a CDate I created using Now.
How do you get a CDate from a string? Here we run into the locale problem:
English: 22/08/1970 10:22:30
German: 22.08.1970 10:22:30
which is also reflected when converting a CDate to a string.
Option Explicit
Dim cdateVal : cdateVal = Now
Dim strVal : strVal = CStr(cdateVal)
MsgBox strVal
Dim cDateVal2 : cDateVal2 = CDate(strVal)
MsgBox cDateVal2
This code will work. It is capable of converting a CDate to a CStr and then back into a CDate. This works as long as the string is formatted as expected by your locale.
CDate("22.08.1970 10:22:30")
will work in Germany but fail in the United States.
Dim diademtimeVal : diademtimeVal = TTR("22.08.1970 10:22:30", "dd.mm.yyyy hh:nn:ss")
Dim timedispVal : Set timedispVal = CreateTime(0, 0, 0, 0, 0, 0)
timedispVal.SecondsFrom0000 = diademtimeVal
This creates a UsiTimeDisp object using a fixed format string, which should solve your problem.
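The same locale pitfall exists outside DIAdem. As a sketch in Python (not DIAdem code, just an analogous illustration): parsing with an explicit format string, like the TTR call above, removes any dependence on the machine's locale settings.

```python
from datetime import datetime

# Locale-dependent parsing is fragile: "22.08.1970" is day-first in Germany
# but would be rejected or misread under a US month-first convention.
# An explicit format string removes the ambiguity.
stamp = "22.08.1970 10:22:30"
parsed = datetime.strptime(stamp, "%d.%m.%Y %H:%M:%S")

print(parsed.year, parsed.month, parsed.day)
print(parsed.strftime("%Y-%m-%d %H:%M:%S"))
```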
-
Hello all, I am trying to load data using FDMEE, but I get these errors and I do not understand what they mean:
"
2016-01-25 12:34:31,792 INFO [AIF]: FDMEE process start, process ID: 104
2016-01-25 12:34:31,792 INFO [AIF]: FDMEE logging level: 4
2016-01-25 12:34:31,793 INFO [AIF]: user:
2016-01-25 12:34:31,793 INFO [AIF]: Location: APC_Data_location (Partitionkey:11)
2016-01-25 12:34:31,793 INFO [AIF]: Period Name: Dec-2015 (Period Key: 12/31/15 12:00 AM)
2016-01-25 12:34:31,793 INFO [AIF]: Category Name: Actual (Category Key: 1)
2016-01-25 12:34:31,793 INFO [AIF]: Rule Name: Data_loadRule_1 (Rule ID:13)
2016-01-25 12:34:33,792 INFO [AIF]: FDM Version: 11.1.2.4.000
2016-01-25 12:34:33,792 INFO [AIF]: Log file encoding: UTF-8
2016-01-25 12:34:34,997 INFO [AIF]: - START IMPORT STEP -
2016-01-25 12:34:35,157 INFO [AIF]: Executing the following script: /u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py
2016-01-25 12:34:35,171 INFO [AIF]: FusionCloudAdapter.importDataFromFusion - START
2016-01-25 12:40:57,601 INFO [AIF]: Standard output: INFO: Starting FusionCloudAdapter script...
Configuring proxy for the production deployment
Starting FusionAdapter main program...
Execution mode: importDataFromFusion, pid: 104
FusionAdapter initialized during initialization.
fusionGlWebServiceWsdl: http://
fusionGlWebServiceUser: sysadmin
fusionProductType: GL
FusionAdapter main program complete.
INFO: The FusionCloudAdapter script failed as described above.
2016-01-25 12:40:57,601 INFO [AIF]: Standard error: com.sun.xml.ws.wsdl.parser.InaccessibleWSDLException: 2 counts of InaccessibleWSDLException.
java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect"
java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect"
at com.sun.xml.ws.wsdl.parser.RuntimeWSDLParser.tryWithMex(RuntimeWSDLParser.java:182)
at com.sun.xml.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:153)
at com.sun.xml.ws.client.WSServiceDelegate.parseWSDL(WSServiceDelegate.java:284)
at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:246)
at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:197)
at com.sun.xml.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:187)
at weblogic.wsee.jaxws.spi.WLSServiceDelegate.<init>(WLSServiceDelegate.java:86)
at weblogic.wsee.jaxws.spi.WLSProvider$ServiceDelegate.<init>(WLSProvider.java:632)
at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:143)
at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:117)
at weblogic.wsee.jaxws.spi.WLSProvider.createServiceDelegate(WLSProvider.java:88)
at javax.xml.ws.Service.<init>(Service.java:77)
at com.hyperion.aif.ws.client.ErpIntegrationService.ErpIntegrationService_Service.<init>(ErpIntegrationService_Service.java:74)
at com.hyperion.aif.fusion.FusionAdapter.<init>(FusionAdapter.java:177)
at com.hyperion.aif.fusion.FusionAdapter.main(FusionAdapter.java:85)
2016-01-25 12:40:57,602 ERROR [AIF]: The script failed to execute:
2016-01-25 12:40:57,605 FATAL [AIF]: Error in Comm.executeJythonScript
Traceback (most recent call last):
  File "<string>", line 528, in executeJythonScript
  File "/u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py", line 163, in <module>
    fusionCloudAdapter.importDataFromFusion()
  File "/u02/Oracle/Middleware/EPMSystem11R1/products/FinancialDataQuality/bin/FusionCloud/FusionCloudAdapter.py", line 60, in importDataFromFusion
    raise RuntimeError
RuntimeError
2016-01-25 12:40:57,692 FATAL [AIF]: Error in pre-import data COMM
2016-01-25 12:40:57,697 INFO [AIF]: FDMEE process end, process ID: 104
Thank you
The script attempts to connect to a WSDL URL and fails with the error
"java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 502 cannotconnect""
This suggests the connection details are wrong. Maybe you have not set the correct details for Fusion Cloud; are you sure you configured the appropriate WSDL URL in the Source Connection Configuration section in FDMEE?
Cheers
John
-
Hi friends,
I'm trying to load records into the rules table from the product table, with the following setup...
create table product (
prod_id varchar2 (20),
prod_grp varchar2 (20),
from_amt number (10),
to_amt number (10),
share_amt number (10)
);
Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10037', 'STK', 1, 18);
Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10037', 'NSTK', 1, 16.2);
Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10038', 'NSTK', 1, 5000, 12);
Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10038', 'STK', 5001, 10000, 16);
Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10038', 'STK', 10001, 20);
Insert into product (prod_id, prod_grp, from_amt, to_amt, share_amt) Values ('10039', 'NSTK', 1, 8000, 10);
Insert into product (prod_id, prod_grp, from_amt, share_amt) Values ('10039', 'STK', 8001, 12);
create table rules (
rule_id varchar2 (30),
rule_grp varchar2 (10),
rate_1 number (10),
point_1 number (10),
rate_2 number (10),
point_2 number (10),
rate_3 number (10),
point_3 number (10)
);
Criteria for loading into the rules table:
rule_id - 'RL' || product.prod_id
rule_grp - product.prod_grp
rate_1 - product.share_amt where from_amt = 1
point_1 - product.to_amt
rate_2 - if product.to_amt in point_1 is not NULL, then find product.share_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt of the current record (point_1) + 1
point_2 - if product.to_amt in point_1 is not NULL, then find product.to_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt of the current record (point_1) + 1
rate_3 - if product.to_amt in point_2 is not NULL, then find product.share_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt of the current record (point_2) + 1
point_3 - if product.to_amt in point_2 is not NULL, then find product.to_amt of the next record with the same rule_id/prod_id where from_amt (of the next record) = to_amt of the current record (point_2) + 1
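As a cross-check of the chaining rule above, here is a rough procedural sketch (plain Python, not the SQL*Loader approach being attempted; the helper name `build_rule` is invented for illustration, and it assumes tiers chain by from_amt = previous to_amt + 1, with the open-ended last tier having a null to_amt):

```python
def build_rule(prod_id, rows):
    """Follow the tier chain for one prod_id.

    rows: list of (prod_grp, from_amt, to_amt, share_amt) for this prod_id.
    The first tier starts at from_amt = 1; each next tier satisfies
    from_amt = previous to_amt + 1.
    """
    by_from = {r[1]: r for r in rows}        # index tiers by their from_amt
    rule = {"rule_id": "RL" + prod_id}
    tier, n = by_from.get(1), 1
    while tier is not None:
        grp, from_amt, to_amt, share_amt = tier
        rule.setdefault("rule_grp", grp)     # group of the first tier
        rule["rate_%d" % n] = share_amt
        rule["point_%d" % n] = to_amt        # None for the open-ended tier
        tier = by_from.get(to_amt + 1) if to_amt is not None else None
        n += 1
    return rule
```

For the 10038 sample rows this yields rate_1/point_1 = 12/5000, rate_2/point_2 = 16/10000 and rate_3 = 20 with point_3 null, matching the stated criteria.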
I tried to load the first columns (rule_id, rule_grp, rate_1, point_1, rate_2, point_2) via SQL*Loader.
SQL > select * from product;

PROD_ID              PROD_GRP               FROM_AMT     TO_AMT  SHARE_AMT
-------------------- -------------------- ---------- ---------- ----------
10037                STK                           1                    18
10037                NSTK                          1                    16
10038                NSTK                          1       5000         12
10038                STK                        5001      10000         16
10038                STK                       10001                    20
10039                NSTK                          1       8000         10
10039                STK                        8001                    12
product.dat
PROD_ID|PROD_GRP|FROM_AMT|TO_AMT|SHARE_AMT
'10037'|'STK'|1|18
'10037'|'NSTK'|1|16.2
'10038'|'NSTK'|1|5000|12
'10038'|'STK'|5001|10000|16
'10038'|'STK'|10001|20
'10039'|'NSTK'|1|8000|10
'10039'|'STK'|8001|12
product.ctl
OPTIONS (SKIP=1)
load data
into table rules
fields terminated by '|'
optionally enclosed by "'"
trailing nullcols
( rule_id POSITION(1) "'RL'||:rule_id"
, rule_grp
, from_amt BOUNDFILLER
, point_1
, share_amt BOUNDFILLER
, rate_1 "CASE WHEN :from_amt = 1 THEN :share_amt END"
, rate_2 EXPRESSION "(select pr.share_amt from product pr where :point_1 is not null and pr.prod_id = :rule_id and :point_1 = pr.from_amt + 1)"
, point_2 EXPRESSION "(select pr.to_amt from product pr where :point_1 is not null and pr.prod_id = :rule_id and :point_1 = pr.from_amt + 1)"
)
It does not populate any values in rate_2 and point_2... and no error either. Not sure if there is another method to do this.
Please give your suggestions. Thanks a lot for your time.
Hello
Thanks for posting the CREATE TABLE and INSERT instructions for the sample data; It's very useful!
Don't forget to post the exact results you want from this data in the sample, i.e. what you want the rule table to contain once the task is completed.
As has already been said, there is no point in using SQL*Loader to copy data from one table to another in the same database. Use INSERT, or perhaps MERGE.
2817195 wrote:
Thank you for your answers... I thought it would be easier to manipulate the data using SQL*Loader. I tried to use INSERT but don't know how to populate the rate_2, point_2, rate_3, point_3 columns. For example, when point_1 is not null, I need to find the next record with the same rule_id, and if point_1 = pr.from_amt + 1 for that record, then RATE_2 should be filled with pr.share_amt of that record...
SQL > insert into rules (
2 rule_id,
3 rule_grp,
4 rate_1,
5 point_1,
6 rate_2,
7 point_2,
8 rate_3,
9 point_3)
10 select
11 'RL' || pr.prod_id RULE_ID,
12 pr.prod_grp RULE_GRP,
13 CASE WHEN pr.from_amt = 1 THEN pr.share_amt END RATE_1,
14 pr.to_amt POINT_1,
15 (select pr.share_amt from product pr where point_1 is not null and rules.rule_id = pr.prod_id and point_1 = pr.from_amt + 1) RATE_2,
16 (select pr.to_amt from product pr where point_1 is not null and rules.rule_id = pr.prod_id and point_1 = pr.from_amt + 1) POINT_2,
17 (select pr.share_amt from product pr where point_2 is not null and rules.rule_id = pr.prod_id and point_2 = pr.from_amt + 1) RATE_3,
18 (select pr.to_amt from product pr where point_2 is not null and rules.rule_id = pr.prod_id and point_2 = pr.from_amt + 1) POINT_3
19 from product pr;
(select pr.share_amt from product pr where point_1 is not null and point_1 = pr.from_amt + 1) RATE_2,
*
ERROR at line 15:
ORA-00904: "POINT_1": invalid identifier
Help, please... Thank you very much
This is what causes the error:
The subquery on line 15 references only 1 table in its FROM clause, and that table is product. There is no point_1 column in product.
A scalar subquery like this can be correlated to a table in the outer query, but the only table in the outer FROM clause (line 19) is also product. Since the only table you are reading is product, the only columns you can reference are the columns of the product table.
You are using the same table alias (pr) to mean 5 different things. That's very confusing. Use a unique alias for each table in any one SQL statement. (Whatever you're trying to do, I bet you can do it without all these subqueries, anyway.)
-
Problem loading data using SQL*Loader into staging tables, and then into the main tables!
Hello
I'm trying to load data into our main database tables using SQL*Loader. The data will be provided in pipe-separated csv files.
I have developed a shell script to load the data and it works fine except for one thing.
Here are the details of a data to re-create the problem.
Structure of the staging tables into which the data will be loaded using SQL*Loader:
create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));
create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));
create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));
DATA in the csv file-
for stg_cmts_data-
cmts_map_03092015_1.csv
WNLB-CMTS-01-1|10.15.0.1
WNLB-CMTS-02-2|10.15.16.1
WNLB-CMTS-03-3|10.15.48.1
WNLB-CMTS-04-4|10.15.80.1
WNLB-CMTS-05-5|10.15.96.1
for stg_dhcp_data-
dhcp_map_03092015_1.csv
DHCP-1-1-1|10.25.23.10,25.26.14.01
DHCP-1-1-2|56.25.111.25,100.25.2.01
DHCP-1-1-3|25.255.3.01,89.20.147.258
DHCP-1-1-4|10.25.26.36,200.32.58.69
DHCP-1-1-5|80.25.47.369,60.258.14.10
for stg_link_data
cmts_dhcp_link_map_0309151623_1.csv
DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
DHCP-1-1-2|WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
DHCP-1-1-3|WNLB-CMTS-01-1
DHCP-1-1-4|WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
DHCP-1-1-5|WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
WNLB-DHCP-1-13|WNLB-CMTS-02-2
Now, after loading this data into the staging tables, I have to populate the main database tables:
create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));
create table link (link_nm varchar2 (50));
The SQL scripts I created to load the data are like this:
spool load_cmts.log
set serveroutput on
DECLARE
CURSOR c_stg_cmts IS SELECT *
FROM stg_cmts_data;
TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;
l_stg_cmts t_stg_cmts;
l_cmts_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_cmts;
FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
LOOP
SELECT COUNT (1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_cmts(i).cmts_token;
IF l_cmts_cnt < 1 THEN
INSERT
INTO SUBNTWK
(
subntwk_nm
)
VALUES
(
l_stg_cmts(i).cmts_token
);
DBMS_OUTPUT.put_line('Token has been added: ' || l_stg_cmts(i).cmts_token);
ELSE
DBMS_OUTPUT.put_line('Token is already present');
END IF;
EXIT WHEN l_stg_cmts.COUNT = 0;
END LOOP;
commit;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
for dhcp
spool load_dhcp.log
set serveroutput on
DECLARE
CURSOR c_stg_dhcp IS SELECT *
FROM stg_dhcp_data;
TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;
l_stg_dhcp t_stg_dhcp;
l_dhcp_cnt NUMBER;
l_cnt NUMBER;
l_cnt_1 NUMBER;
BEGIN
OPEN c_stg_dhcp;
FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
LOOP
SELECT COUNT (1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;
IF l_dhcp_cnt < 1 THEN
INSERT
INTO SUBNTWK
(
subntwk_nm
)
VALUES
(
l_stg_dhcp(i).dhcp_token
);
DBMS_OUTPUT.put_line('Token has been added: ' || l_stg_dhcp(i).dhcp_token);
ELSE
DBMS_OUTPUT.put_line('Token is already present');
END IF;
EXIT WHEN l_stg_dhcp.COUNT = 0;
END LOOP;
commit;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
for link -.
spool load_link.log
set serveroutput on
DECLARE
l_cmts_1 VARCHAR2 (4000 CHAR);
l_cmts_add VARCHAR2 (200 CHAR);
l_dhcp_cnt NUMBER;
l_cmts_cnt NUMBER;
l_link_cnt NUMBER;
l_add_link_nm VARCHAR2 (200 CHAR);
BEGIN
FOR r IN (
SELECT dhcp_token, cmts_to_add || ',' cmts_add
FROM stg_link_data
)
LOOP
l_cmts_1 := r.cmts_add;
l_cmts_add := TRIM (SUBSTR (l_cmts_1, 1, INSTR (l_cmts_1, ',') - 1));
SELECT COUNT (1)
INTO l_dhcp_cnt
FROM subntwk
WHERE subntwk_nm = r.dhcp_token;
IF l_dhcp_cnt = 0 THEN
DBMS_OUTPUT.put_line('DEVICE NOT FOUND: ' || r.dhcp_token);
ELSE
WHILE l_cmts_add IS NOT NULL
LOOP
l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;
SELECT COUNT (1)
INTO l_cmts_cnt
FROM subntwk
WHERE subntwk_nm = TRIM (l_cmts_add);
SELECT COUNT (1)
INTO l_link_cnt
FROM link
WHERE link_nm = l_add_link_nm;
IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
INSERT INTO link (link_nm)
VALUES (l_add_link_nm);
DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
ELSIF l_link_cnt > 0 THEN
DBMS_OUTPUT.put_line('Link is already present: ' || l_add_link_nm);
ELSIF l_cmts_cnt = 0 THEN
DBMS_OUTPUT.put_line('NO CMTS FOUND for device to create link: ' || l_cmts_add);
END IF;
l_cmts_1 := TRIM (SUBSTR (l_cmts_1, INSTR (l_cmts_1, ',') + 1));
l_cmts_add := TRIM (SUBSTR (l_cmts_1, 1, INSTR (l_cmts_1, ',') - 1));
END LOOP;
END IF;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
output
control files -
LOAD DATA
INFILE 'cmts_data.csv'
APPEND
INTO TABLE STG_CMTS_DATA
WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
and (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( cmts_token "RTRIM(LTRIM(:cmts_token))",
cmts_ip "RTRIM(LTRIM(:cmts_ip))" )
for dhcp.
LOAD DATA
INFILE 'dhcp_data.csv'
APPEND
INTO TABLE STG_DHCP_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
and (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( dhcp_token "RTRIM(LTRIM(:dhcp_token))",
dhcp_ip "RTRIM(LTRIM(:dhcp_ip))" )
for link -.
LOAD DATA
INFILE 'link_data.csv'
APPEND
INTO TABLE STG_LINK_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
and (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( dhcp_token "RTRIM(LTRIM(:dhcp_token))",
cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))" )
SHELL SCRIPT-
if [ ! -d ./log ]
then
mkdir log
fi
if [ ! -d ./done ]
then
mkdir done
fi
if [ ! -d ./bad ]
then
mkdir bad
fi
nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_cmts.sql
nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
time nohup sqlplus username/password@SID @load_dhcp.sql
nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
time nohup sqlplus username/password@SID @load_link.sql
mv *.log ./log
The problem I am facing is in loading data into the link table: I check if the DHCP is present in the subntwk table, otherwise log "device not found"; if the CMTS is not present, log another error.
Note that multiple CMTS can be associated with a single DHCP.
So the links get created, but for the last CMTS of each comma-separated list in stg_link_data it logs CMTS NOT FOUND.
e.g.
DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
Here I am supposed to link dhcp-1-1-1 with wnlb-cmts-01-1 and wnlb-cmts-02-2.
All this data is present in the subntwk table, but it still logs that wnlb-cmts-02-2 could NOT be FOUND, even though we have already loaded it into the subntwk table.
The same thing happens with every CMTS in stg_link_data that is last in its list (I think you get what I'm trying to explain).
But when I run the SQL scripts in SQL Developer separately, all the valid links are inserted into the link table.
It should create 9 rows in the link table, whereas now it creates only 5 rows.
I use COMMIT in my script too, but it doesn't help.
Run these scripts on your machine and let me know if you get the same behavior I get.
And please give me a solution; I have tried many things since yesterday, but it's always the same.
This is the link table log:
Link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
NO CMTS FOUND for device to create link: wnlb-cmts-02-2
Link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
Link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create link: wnlb-cmts-05-5
NO CMTS FOUND for device to create link: wnlb-cmts-01-1
NO CMTS FOUND for device to create link: wnlb-cmts-05-8
NO CMTS FOUND for device to create link: wnlb-cmts-05-6
NO CMTS FOUND for device to create link: wnlb-cmts-05-0
NO CMTS FOUND for device to create link: wnlb-cmts-03-3
Link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
Link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create link: wnlb-cmts-05-7
DEVICE NOT FOUND: wnlb-dhcp-1-13
IF YOU NEED MORE INFORMATION PLEASE LET ME KNOW
Thank you
I realized later that night that the files, having been prepared on a non-UNIX machine, kept a carriage return at the end of each line when loaded into the staging tables. That is why the last CMTS of every list was never found. I ran a DOS-to-UNIX conversion on the files and everything started working perfectly.
It was a dos2unix error!
Thank you all for your interest; I got to learn new things too, as I have barely 10 months of experience in PL/SQL and SQL.
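The dos2unix failure mode is easy to reproduce outside the database: a DOS-format file leaves a trailing carriage return on the last field of each line, so an exact string match fails even though the values look identical. A minimal sketch (plain Python, not the actual loader; the helper name `parse_link_line` is made up for illustration):

```python
def parse_link_line(line):
    """Split one 'DHCP|CMTS1,CMTS2' line, normalizing DOS line endings."""
    line = line.rstrip("\r\n")          # dos2unix in one call: drop CR and LF
    dhcp, cmts_list = line.split("|", 1)
    return dhcp, [c.strip() for c in cmts_list.split(",")]

# A line read from a DOS-format file keeps its '\r' once only '\n' is stripped:
raw = "DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2\r\n"
known = {"WNLB-CMTS-01-1", "WNLB-CMTS-02-2"}

# Without normalization, the last CMTS carries a trailing CR and the exact
# match fails -- exactly the "NO CMTS FOUND" symptom above.
broken_last = raw.rstrip("\n").split("|", 1)[1].split(",")[-1]
assert broken_last == "WNLB-CMTS-02-2\r"
assert broken_last not in known

# With normalization, the last CMTS matches.
dhcp, cmts = parse_link_line(raw)
assert cmts[-1] in known
```

Only the last field of each line is affected, which is why every CMTS except the last in each comma-separated list was found.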
-
How to load data into the MVDEMO sample app schema
Hi all
I'm a POC on Oracle Mapviewer and try to build some reports in OBIEE using MApviewer.
For this POC, I am using the Oracle MVDEMO sample data (11g). I think this sample data covers a few countries like the USA.
I need to do the POC for Brazil, so I downloaded Brazil map data from the site as shapefiles.
In this Brazil data I got files with 4 extensions: .csv, .dbf, .shp and .shx.
I need to know how I can load these files into my Oracle 11g DB. Should I load the data into the same mvdemo schema, and if yes, into which table?
Any help will be appreciated a lot.
Thank you
Amit
Use the Java shapefile converter utility (http://www.oracle.com/technetwork/database/options/spatialandgraph/downloads/index-093371.html), GDAL (gdal.org), FME (Safe), or MapBuilder.
Specify the target SRID (i.e. the SRID of the geometries loaded into Oracle) as 4326 or 8307.
Load into a new table named anything you want, e.g. brazil_gadm, with the geometry column named GEOMETRY.
Once it's loaded, verify that there is an entry for the table and column (BRAZIL_GADM, GEOMETRY) in user_sdo_geom_metadata.
Create a spatial index on brazil_gadm.geometry if the tool did not create one.
Add theme definitions for the country, state, or whatever admin areas exist in the dataset.
Import them as layers in OBIEE.
-
SQL Loader loading data into two Tables using a single CSV file
Dear all,
I have a requirement where I need to load the data into 2 tables using a single csv file.
So I wrote the following control file. But it loads only the first table, and there is also nothing about it in the log file.
Please suggest how to achieve this.
Examples of data
Source_system_code,Record_type,Source_System_Vendor_number,Vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Control file script
================
OPTIONS (errors=0, skip=1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code POSITION(1) char "ltrim(rtrim(:Source_system_code))",
Record_type char "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
Vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
when 1=1
fields terminated by ',' optionally enclosed by '"'
(
Vendor_name char "ltrim(rtrim(:Vendor_name))",
Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
Address_line1 char "ltrim(rtrim(:Address_line1))",
Address_line2 char "ltrim(rtrim(:Address_line2))",
Address_line3 char "ltrim(rtrim(:Address_line3))"
)

The problem here is it is loading only into the first table (table1).
Please guide me.
Thank you
Kumar
Since you do not provide a starting position for the first field in table2, it starts with the next field after the last one referenced in table1, so it starts with vendor_site_code instead of vendor_name. What you need to do instead is specify POSITION(1) for the first field in table2 and use FILLER fields to skip over the earlier fields. In addition, it doesn't like WHEN 1=1, and you don't need it anyway. See the example with the corrected control file below.
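The field-scanning behavior described here can be illustrated with a toy model (plain Python; this is an illustration of how delimited fields are consumed across INTO TABLE clauses, not SQL*Loader's actual implementation, and the function name `scan` is made up):

```python
def scan(fields, clauses):
    """Toy model of SQL*Loader delimited-field scanning across INTO TABLE clauses.

    Each clause is (resets_to_start, [column names]); columns whose name ends
    in '_FILLER' are consumed but not returned. Without a reset, a second
    clause keeps reading where the first one stopped.
    """
    results = []
    pos = 0
    for resets, columns in clauses:
        if resets:                  # models POSITION(1) on the clause's first field
            pos = 0
        row = {}
        for col in columns:
            value = fields[pos]
            pos += 1
            if not col.endswith("_FILLER"):
                row[col] = value
        results.append(row)
    return results

fields = ["Victor", "New", "Ven001", "Vinay", "Vin001", "abc", "def", "xyz"]

# Without a reset, table2's first column picks up field 5 ("Vin001"):
broken = scan(fields, [(True, ["src", "rec", "ven_no", "vendor_name"]),
                       (False, ["vendor_name", "site", "a1", "a2"])])
assert broken[1]["vendor_name"] == "Vin001"    # wrong field!

# With a reset plus three fillers, table2 starts over and skips 3 fields:
fixed = scan(fields, [(True, ["src", "rec", "ven_no", "vendor_name"]),
                      (True, ["f1_FILLER", "f2_FILLER", "f3_FILLER",
                              "vendor_name", "site", "a1", "a2", "a3"])])
assert fixed[1]["vendor_name"] == "Vinay"
```

This mirrors why the original control file silently loaded the wrong fields into table2 rather than raising an error.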
Scott@orcl12c > HOST TYPE test.dat
Source_system_code,Record_type,Source_System_Vendor_number,Vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Scott@orcl12c > HOST TYPE test.ctl
OPTIONS (errors=0, skip=1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code POSITION(1) char "ltrim(rtrim(:Source_system_code))",
Record_type char "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
Vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
fields terminated by ',' optionally enclosed by '"'
(
source_system_code FILLER POSITION(1),
record_type FILLER,
source_system_vendor_number FILLER,
Vendor_name char "ltrim(rtrim(:Vendor_name))",
Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
Address_line1 char "ltrim(rtrim(:Address_line1))",
Address_line2 char "ltrim(rtrim(:Address_line2))",
Address_line3 char "ltrim(rtrim(:Address_line3))"
)
Scott@orcl12c > CREATE TABLE table1
2 (Source_system_code VARCHAR2 (13),
3 Record_type VARCHAR2 (11),
4 Source_System_Vendor_number VARCHAR2 (27),
5 Vendor_name VARCHAR2 (11))
6 /
Table created.
Scott@orcl12c > CREATE TABLE table2
2 (Vendor_name VARCHAR2 (11),
3 Vendor_site_code VARCHAR2 (16),
4 Address_line1 VARCHAR2 (13),
5 Address_line2 VARCHAR2 (13),
6 Address_line3 VARCHAR2 (13))
7 /
Table created.
Scott@orcl12c > HOST SQLLDR scott/tiger CONTROL=test.ctl DATA=test.dat LOG=test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 26 01:43:30 2015
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1
Table TABLE1:
1 Row successfully loaded.
Table TABLE2:
1 Row successfully loaded.
Check the log file:
test.log
for more information about the load.
Scott@orcl12c > SELECT * FROM table1
2 /

SOURCE_SYSTEM RECORD_TYPE SOURCE_SYSTEM_VENDOR_NUMBER VENDOR_NAME
------------- ----------- --------------------------- -----------
Victor        New         Ven001                      Vinay

1 row selected.

Scott@orcl12c > SELECT * FROM table2
2 /

VENDOR_NAME VENDOR_SITE_CODE ADDRESS_LINE1 ADDRESS_LINE2 ADDRESS_LINE3
----------- ---------------- ------------- ------------- -------------
Vinay       Vin001           abc           def           xyz

1 row selected.

Scott@orcl12c >
-
Hi all
We had an EPMA-type HFM application whose dimensions were all local; it validated and deployed successfully.
We tried to load data into the HFM application and the data load was a success.
Then we decided to convert all the local dimensions of the HFM application mentioned above to shared dimensions. After successfully converting all the dimensions to shared dimensions, we get errors when loading data into the same HFM application (the app is valid and was deployed after the change).
The error log is below:
Load data started: 11/29/2014 10:53:15.
Line: 216, Error: Invalid cell for Period Oct.
ACTUAL;2014;Oct;YTD;E_2100;<Entity Currency>;89920000;[ICP None];CORP;[None];[None];FARM21000;11979
>>>>>>
Line: 217, Error: Invalid cell for Period Nov.
ACTUAL;2014;Nov;YTD;E_2100;<Entity Currency>;89920000;[ICP None];CORP;[None];[None];FARM21000;23544
>>>>>>
Line: 218, Error: Invalid cell for Period Dec.
ACTUAL;2014;Dec;YTD;E_2100;<Entity Currency>;89920000;[ICP None];CORP;[None];[None];FARM21000;58709
>>>>>>
Line: 219, Error: Invalid cell for Period Oct.
ACTUAL;2014;Oct;YTD;E_2100;<Entity Currency>;28050000;E_6000_20;[None];[None];[None];FARM21000;-11979
>>>>>>
Line: 220, Error: Invalid cell for Period Nov.
ACTUAL;2014;Nov;YTD;E_2100;<Entity Currency>;28050000;E_6000_20;[None];[None];[None];FARM21000;-11565
>>>>>>
Wanted to know if there is something I might have missed while converting the local dimensions to shared (whether there is a sequence to follow, or a constraint I am not aware of; the conversion itself looks good, as the application validates and deploys after the changes).
What can be the reason for the failed data load? Can anyone help?
Thank you
Rachid
Hello
You could look at the account properties for that account (89920000) and check the TopCustom1...4Member settings; there you will find the reason behind the invalid cells.
When you converted the local dimensions to shared, did you check the 'Dimension Association' of the Account and Entity dimensions?
The dimension association gets lost if a proper sequence is not followed.
Kind regards
S
-
ODI - loading only the most recent version of the data
Hello
I have a flat file with the structure below:
MRN NO.  DATE OF BIRTH
12345    12/04/1988
12345    13/06/1980
12345    21/05/1989
The requirement is to load the data into an Oracle table taking only the last row (meaning the Date of Birth from the last record for each MRN number must be picked; essentially the last record wins).
There is no versioning available on the data, meaning no insert_date or version number of any kind is available.
What is the best way to achieve this?
Any help will be appreciated.
Thanks and regards
Reshma
In this case, since you consider the last incoming record as the latest, you should be able to pick the latest row from the file using an Oracle sequence (either generated by default or created manually).
In the second case, while loading data into the staging table:
Create a sequence SNO_MRN_SEQ in the Oracle DB:
create sequence SNO_MRN_SEQ
START WITH 1
INCREMENT BY 1
NOCACHE;
Create an additional field (SNO_SEQ) in the staging datastore, to which, in the interface mapping, you map the sequence (schema_name.SNO_MRN_SEQ.NEXTVAL).
Check the staging table data to confirm the sequence is populated as expected.
While loading to the target table, filter on MRN_NO and use the query below, which essentially picks the max of the sequence number:
STGTABLE.SNO_SEQ =
(SELECT MAX (B.SNO_SEQ)
FROM STAGING_TABLE B
WHERE STGTABLE.MRN_NO = B.MRN_NO
)
Let me know if that helps!
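The "highest sequence number wins" idea can be sketched outside ODI as well (plain Python; the helper `latest_per_key` is invented for illustration and simply lets later rows in file order overwrite earlier ones, which is what filtering on MAX(SNO_SEQ) achieves):

```python
def latest_per_key(rows, key):
    """Keep only the last row seen for each key value, preserving file order.

    Mimics the sequence-number trick: the row with the highest 'sequence'
    (i.e. the latest position in the file) wins.
    """
    latest = {}
    for row in rows:            # later rows overwrite earlier ones
        latest[row[key]] = row
    return list(latest.values())

rows = [
    {"mrn_no": "12345", "dob": "12/04/1988"},
    {"mrn_no": "12345", "dob": "13/06/1980"},
    {"mrn_no": "12345", "dob": "21/05/1989"},
]
# Only the last record for MRN 12345 survives.
assert latest_per_key(rows, "mrn_no") == [{"mrn_no": "12345", "dob": "21/05/1989"}]
```

The staging-plus-sequence approach is the database equivalent: the sequence captures load order, and the MAX filter deduplicates to the latest record per MRN.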
-
Loading data into a table from an operating system script file
Hi all;
My DB version is 10.2.0.5.0 on LINUX.
I have the .sql files in path X11R6.
I am trying to load data by running the script load.sql, but I get an error message.
>> CONTENTS
$ cat sample_tab.sql
create table tab1 (id number,
name varchar2 (15),
qual varchar2 (15),
city varchar2 (15),
mobile number);
$ vi load.sql
begin
for i in 1..100000 loop
insert into tabl values (i, '*', 'MS', '*', 1234554321);
commit;
end loop;
end;
/
SQL > @sample.sql
Table created.
SQL > @load.sql
insert into tabl values (i, '*', 'MS', '*', 1234554321);
*
ERROR at line 3:
ORA-06550: line 3, column 19:
PL/SQL: ORA-00942: table or view does not exist
ORA-06550: line 3, column 1:
PL/SQL: SQL statement ignored
SQL > select * from tab;
TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
TAB1                           TABLE

SQL > desc tab1
Name                                      Null?    Type
----------------------------------------- -------- ----------------------------
ID                                                 NUMBER
NAME                                               VARCHAR2(15)
QUAL                                               VARCHAR2(15)
CITY                                               VARCHAR2(15)
MOBILE                                             NUMBER
Thanks in advance.
Hello
GTS (DBA) wrote:
Hi all;
My DB version is 10.2.0.5.0
I have the .sql files in path X11R6.
I am trying to load data by running the script load.sql, but I get an error message.
> CONTENTS
$ cat sample_tab.sql
create table tab1 (id number,
name varchar2 (15),
qual varchar2 (15),
city varchar2 (15),
mobile number);
...
Well, that creates a table called TAB1. The last character of the table name is the digit "1".
$ vi load.sql
begin
for i in 1..100000 loop
insert into tabl values (i, '*', 'MS', '*', 1234554321);
commit;
end loop;
end;
/
SQL > @sample.sql
Table created.
SQL > @load.sql
insert into tabl values (i, '*', 'MS', '*', 1234554321);
*
ERROR at line 3:
ORA-06550: line 3, column 19:
PL/SQL: ORA-00942: table or view does not exist
ORA-06550: line 3, column 1:
PL/SQL: SQL statement ignored
That refers to a different table. The last character of this table's name is the letter "L".
SQL > desc tab1
Name                                      Null?    Type
----------------------------------------- -------- ----------------------------
ID                                                 NUMBER
NAME                                               VARCHAR2(15)
QUAL                                               VARCHAR2(15)
CITY                                               VARCHAR2(15)
MOBILE                                             NUMBER
This is the table that you created (its name ends with the digit "1"), not the one used in the INSERT statement (its name ends with the letter "L").