Loading data from SQL Server to an Essbase database
Hello
I'm loading data from a SQL Server table to Essbase using a rules file. The number of rows in the source table is 7 million. I use the SUNOPSIS MEMORY ENGINE as the staging area, LKM SQL to SQL, and IKM SQL to Hyperion Essbase (DATA).
Question:
1. Can I use another LKM, such as LKM MSSQL to MSSQL (BCP), to load data into staging instead of LKM SQL to SQL? What would I have to change about the staging area then? Loading data with LKM SQL to SQL seems quite slow.
2. If it is mandatory to use LKM SQL to SQL, can someone please tell me what parameters I can change to make it quicker?
3. Is it compulsory to use the SUNOPSIS MEMORY ENGINE when loading data from SQL Server to Essbase?
Thank you...
(1) Yes, I would highly recommend using a KM that uses native database technology; these will usually be more efficient than the generic KMs (like LKM SQL to SQL), especially when large volumes of data are involved. Your staging area will change depending on where you stage the data: for example, if you use a SQL Server specific KM such as LKM MSSQL to MSSQL (BCP), you must have a staging area available on a MSSQL database and have access to the BCP utility.
(2) It is not mandatory to use this KM; you can use any KM supported by your database technology.
(3) It is absolutely not obligatory to use the SUNOPSIS MEMORY ENGINE. It should only be used when you have relatively small amounts of data, since it keeps all processing in memory, or when you have no other relational technology to perform the staging on. In your case, where you are processing large volumes of data, staging should be done on a physical database such as SQL Server or Oracle if one is available.
Tags: Business Intelligence
Similar Questions
-
ODI error when loading data from Oracle to Essbase
Hello and happy new year to everyone!
I can't load data from an Oracle table into an Essbase database. The Oracle table has a column for each member of the Essbase database's dimensions, plus a column for the data.
I use IKM SQL to Hyperion Essbase (DATA) as the IKM, but I can't choose an LKM in the dropdown, even though I imported LKM SQL to SQL.
Error thrown is:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
  File "<string>", line 23, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 100
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData (unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (unknown Source)
at java.lang.reflect.Method.invoke (unknown Source)
at org.python.core.PyReflectedFunction.__call__ (PyReflectedFunction.java)
at org.python.core.PyMethod.__call__ (PyMethod.java)
at org.python.core.PyObject.__call__ (PyObject.java)
at org.python.core.PyInstance.invoke (PyInstance.java)
at org.python.pycode._pyx10.f$0(<string>:23)
at org.python.pycode._pyx10.call_function(<string>)
at org.python.core.PyTableCode.call (PyTableCode.java)
at org.python.core.PyCode.call (PyCode.java)
at org.python.core.Py.runCode (Py.java)
at org.python.core.Py.exec (Py.java)
at org.python.util.PythonInterpreter.exec (PythonInterpreter.java)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.k.a (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession (unknown Source)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand (unknown Source)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute (unknown Source)
at com.sunopsis.dwg.cmd.e.i (unknown Source)
at com.sunopsis.dwg.cmd.g.y (unknown Source)
at com.sunopsis.dwg.cmd.e.run (unknown Source)
at java.lang.Thread.run (unknown Source)
Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 100
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.sendRecordArrayToEsbase (unknown Source)
... 32 more
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 100
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.k.a (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep (unknown Source)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession (unknown Source)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand (unknown Source)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute (unknown Source)
at com.sunopsis.dwg.cmd.e.i (unknown Source)
at com.sunopsis.dwg.cmd.g.y (unknown Source)
at com.sunopsis.dwg.cmd.e.run (unknown Source)
at java.lang.Thread.run (unknown Source)
I'll wait for any help.
Thank you!
First of all, I would highly recommend upgrading, because there were a few bugs with the Hyperion adapters.
If you set MAXIMUM_ERRORS_ALLOWED to 0 it becomes infinite, so the load will not stop because of rejected records. Once you set LOG_ENABLED and LOG_ERRORS to Yes, set LOG_FILENAME and ERROR_LOG_FILENAME as well, putting them in a known location, for example c:\temp on Windows. Ok?
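For reference, a hedged sketch of how those IKM SQL to Hyperion Essbase (DATA) options might be set on the interface's flow tab; the file paths are illustrative assumptions, not from the original post:

```
MAXIMUM_ERRORS_ALLOWED  0      <- 0 means no limit; the load no longer aborts on rejected records
LOG_ENABLED             Yes
LOG_ERRORS              Yes
LOG_FILENAME            c:\temp\essbase_load.log
ERROR_LOG_FILENAME      c:\temp\essbase_errors.log
```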
Cheers
John
http://John-Goodwin.blogspot.com/ -
load data from csv file into table
Hello
I'm working on Oracle 11g R2 on a UNIX platform.
We have a requirement to load data into a table from a flat file, with one condition: we need to compare on the primary key field; if the record from the file already exists we update another field, and if the record is not present we need to insert it.
How can we achieve this?
Use SQL*Loader to load the CSV file data into a staging table.
Then use the SQL MERGE command to insert/update the rows from the staging table into the target table.
Hemant K Collette
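The staging-plus-MERGE approach above can be sketched like this; the table and column names below are illustrative assumptions, not from the original question:

```sql
-- Step 1: SQL*Loader fills the staging table stg_emp from the CSV.
-- Step 2: MERGE upserts staging rows into the target table emp,
--         comparing on the primary key field.
MERGE INTO emp t
USING stg_emp s
   ON (t.emp_id = s.emp_id)
WHEN MATCHED THEN
  UPDATE SET t.emp_name = s.emp_name   -- key found: update the other field
WHEN NOT MATCHED THEN
  INSERT (emp_id, emp_name)            -- key not found: insert the record
  VALUES (s.emp_id, s.emp_name);

COMMIT;
```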
-
Extract data from the vFoglight SQL database to a Pivot Table/CUBE?
Our CIO requests hourly data so he can build a history of our virtual environment using the data collected from vFoglight, to present to the board so they can see how we are utilizing the new environment they allowed us to buy. I've created a report for him, but he doesn't want to copy and paste data into an Excel file every hour; in short:
It looks like this:
dnsName        Name    Use (%)
ESXi Server 1  Memory  42
ESXi Server 2  Memory  37
So is there a way to allow Excel to connect to SQL Server and extract this data so that it can organize itself? Or can we write a report that displays the data by hour, as follows?
I.E.
01:00 - 42%
02:00 - 39%
03:00 - 41%
Hi Morgan,
There are a few examples of extracting formatted metric data from Foglight using command-line scripts in the blog article "Foglight Reporting using Metric Queries or Groovy" at http://en.community.dell.com/techcenter/performance-monitoring/foglight-administrators/w/admins-wiki/5654.foglight-reporting-using-metric-queries-or-groovy
I hope this will give you some ideas.
Kind regards
Brian Wheeldon
-
How to load data from MS SQL to an Essbase database via a rules file and MaxL
Hi, everybody!
Pretty sure this kind of topic already exists on the forum, but unfortunately I can't find it.
I want to understand how to load data from a MS SQL database into an Essbase application database.
(I) So I have:
1. an application 'Society'
2. and its database 'Plan'
3. with a simple outline, which contains only two dimensions:
-The time period < 1 > (Label only)
-Total year (~)
-Qtr1 (+)
Jan (+)
Feb (+)
Mar (+)
-Qtr2 (+)
APR (+)
May (+)
Jun (+)
-Accounts < 1 > (Label only)
-Lost / based (~) (Label only)
-Revenues (+)
L1.1 (+)
L1.2 (+)
-Costs (-)
L2.1 (+)
L2.2 (+)
(II) Also, I created a rules file called "CO_DWH" and associated it with this outline.
It (the rules file) has 3 columns:
'period', 'account' and data.
The "load data" option is also checked.
(III) In MS SQL Server, I have a database "hypdb" and a login "essadmin" with password "password"
(IV) I executed my bat file:
C:\Hyperion\EssLoad.bat
There is only one line:
EssLoad.bat -
startMaxl.cmd EssLoad.txt
----------------------------------------------
EssLoad.txt -
login admin identified by password on erpserver;
import database Company.Plan data connect as essadmin identified by password using server rules_file 'CO_DWH' on error abort;
logout;
exit;
--------------------------------------------
Everything I copied from the ERP system worked well, but I don't understand: from which exact table in the MS SQL database is the data loaded into the Essbase db?
You have to do a few things:
1. On the Essbase server, set up a system ODBC connection for your MS SQL database.
2. In the load rule, go to the File menu, select "Open SQL data source", and enter your SQL statement as the data source. A few tips: where it says Select, don't put in the SELECT keyword, and where it says From, don't put in FROM; the system adds them for you. To enter a simple SQL statement, do it all in the Select box: if your SQL would normally be select * from myTable, just enter * from myTable in the Select box. Then click OK/Retrieve and fill in your connection information. That should bring the data back into the load rule. Save the load rule and use it.
-
SOS! - Error loading data from Oracle 11g to Essbase using ODI
Hi all.
I want to load data from an Oracle database to Essbase using ODI.
I have correctly set up the Hyperion Essbase physical and logical schemas in Topology Manager and reverse-engineered the structure of the Essbase DEMO application's BASIC database.
The problems are:
1. When I try to view data by right-clicking on the Essbase datastore:
java.sql.SQLException: driver must be specified
at com.sunopsis.sql.SnpsConnection.a (SnpsConnection.java)
at com.sunopsis.sql.SnpsConnection.testConnection (SnpsConnection.java)
at com.sunopsis.sql.SnpsConnection.testConnection (SnpsConnection.java)
at com.sunopsis.graphical.frame.b.jc.bE (jc.java)
at com.sunopsis.graphical.frame.bo.bA (bo.java)
at com.sunopsis.graphical.frame.b.ja.dl (ja.java)
at com.sunopsis.graphical.frame.b.ja.<init>(ja.java)
at com.sunopsis.graphical.frame.b.jc.<init>(jc.java)
I was told by Oracle support that this is OK, just ignore it. Then the second problem appears.
2. I create an interface between the Oracle database and Essbase, check the option "Staging area different from target" (which means the staging area is created on the Oracle database), use IKM SQL to Hyperion Essbase (METADATA), and run the interface:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
  File "<string>", line 61, in ?
com.hyperion.odi.essbase.ODIEssbaseException: invalid value specified [RULES_FILE] for the load option [null]
at com.hyperion.odi.essbase.ODIEssbaseMetaWriter.validateLoadOptions (unknown Source)
at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad (unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (unknown Source)
at java.lang.reflect.Method.invoke (unknown Source)
at org.python.core.PyReflectedFunction.__call__ (PyReflectedFunction.java)
at org.python.core.PyMethod.__call__ (PyMethod.java)
at org.python.core.PyObject.__call__ (PyObject.java)
at org.python.core.PyInstance.invoke (PyInstance.java)
at org.python.pycode._pyx1.f$0(<string>:61)
at org.python.pycode._pyx1.call_function(<string>)
at org.python.core.PyTableCode.call (PyTableCode.java)
at org.python.core.PyCode.call (PyCode.java)
at org.python.core.Py.runCode (Py.java)
at org.python.core.Py.exec (Py.java)
I am very confused by this. Can someone give me a solution or some docs?
Ethan.
Hello
You say that you are loading data, but you chose the KM for loading metadata?
If you are loading metadata then you must provide a rules file; if you are loading data then choose the KM IKM SQL to Hyperion Essbase (DATA). Ok?
Cheers
John
http://John-Goodwin.blogspot.com/ -
Loading data from SQL Server 2005 loads only the first 50 records with a query in Essbase
Hello
I have loaded data from SQL Server 2005 to Essbase through a query (on a fact table) and a rules file.
I have 2 problems:
1. It loads only 50 records (when I open the Data Prep Editor and open the SQL, by default, when I run the query, it loads only 50 records).
2. It does not show cube statistics for these 50 records.
(I suspect there is a problem with an outline property setting.)
Do we need to set a property to display all the records?
Please help me!
Kind regards
Satya
By default the Data Prep Editor shows only the first 50 records. You can change this (up to 500, I think) by going to Record -> View Record Count and changing it there. During the actual data load it loads all rows (unless you have a TOP in your SQL command).
It won't show you all the records unless you have fewer than 500, but you can choose which 500 you want by changing the starting record setting.
-
Question about loading data with SQL*Loader into staging tables, and then into the main tables!
Hello
I'm trying to load data into our main database tables using SQL*Loader. The data is provided in pipe-separated CSV files.
I have developed a shell script to load the data and it works fine except for one thing.
Here are the details of the data to re-create the problem.
Structures of the staging tables into which data will be loaded using SQL*Loader:
create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));
create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));
create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));
DATA in the CSV files -
for stg_cmts_data -
cmts_map_03092015_1.csv
WNLB-CMTS-01-1 | 10.15.0.1
WNLB-CMTS-02-2 | 10.15.16.1
WNLB-CMTS-03-3 | 10.15.48.1
WNLB-CMTS-04-4 | 10.15.80.1
WNLB-CMTS-05-5 | 10.15.96.1
for stg_dhcp_data -
dhcp_map_03092015_1.csv
DHCP-1-1-1 | 10.25.23.10, 25.26.14.01
DHCP-1-1-2 | 56.25.111.25, 100.25.2.01
DHCP-1-1-3 | 25.255.3.01, 89.20.147.258
DHCP-1-1-4 | 10.25.26.36, 200.32.58.69
DHCP-1-1-5 | 80.25.47.369, 60.258.14.10
for stg_link_data
cmts_dhcp_link_map_0309151623_1.csv
DHCP-1-1-1 | WNLB-CMTS-01-1,WNLB-CMTS-02-2
DHCP-1-1-2 | WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
DHCP-1-1-3 | WNLB-CMTS-01-1
DHCP-1-1-4 | WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
DHCP-1-1-5 | WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
WNLB-DHCP-1-13 | WNLB-CMTS-02-2
Now, after loading these data into the staging tables, I have to populate the main database tables:
create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));
create table link (link_nm varchar2 (50));
The SQL scripts that I created to load the data are like this:
spool load_cmts.log
set serveroutput on
DECLARE
  CURSOR c_stg_cmts IS
    SELECT *
    FROM stg_cmts_data;
  TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY PLS_INTEGER;
  l_stg_cmts t_stg_cmts;
  l_cmts_cnt NUMBER;
  l_cnt      NUMBER;
  l_cnt_1    NUMBER;
BEGIN
  OPEN c_stg_cmts;
  FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
  FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
  LOOP
    SELECT COUNT(1)
    INTO l_cmts_cnt
    FROM subntwk
    WHERE subntwk_nm = l_stg_cmts(i).cmts_token;
    IF l_cmts_cnt < 1 THEN
      INSERT INTO subntwk
        (subntwk_nm)
      VALUES
        (l_stg_cmts(i).cmts_token);
      DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
    ELSE
      DBMS_OUTPUT.put_line('token is already present');
    END IF;
    EXIT WHEN l_stg_cmts.COUNT = 0;
  END LOOP;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
END;
/
spool off
for dhcp
spool load_dhcp.log
set serveroutput on
DECLARE
  CURSOR c_stg_dhcp IS
    SELECT *
    FROM stg_dhcp_data;
  TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY PLS_INTEGER;
  l_stg_dhcp t_stg_dhcp;
  l_dhcp_cnt NUMBER;
  l_cnt      NUMBER;
  l_cnt_1    NUMBER;
BEGIN
  OPEN c_stg_dhcp;
  FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
  FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
  LOOP
    SELECT COUNT(1)
    INTO l_dhcp_cnt
    FROM subntwk
    WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;
    IF l_dhcp_cnt < 1 THEN
      INSERT INTO subntwk
        (subntwk_nm)
      VALUES
        (l_stg_dhcp(i).dhcp_token);
      DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
    ELSE
      DBMS_OUTPUT.put_line('token is already present');
    END IF;
    EXIT WHEN l_stg_dhcp.COUNT = 0;
  END LOOP;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
END;
/
spool off
for link -
spool load_link.log
set serveroutput on
DECLARE
  l_cmts_1      VARCHAR2(4000 CHAR);
  l_cmts_add    VARCHAR2(200 CHAR);
  l_dhcp_cnt    NUMBER;
  l_cmts_cnt    NUMBER;
  l_link_cnt    NUMBER;
  l_add_link_nm VARCHAR2(200 CHAR);
BEGIN
  FOR r IN (
    SELECT dhcp_token, cmts_to_add || ',' AS cmts_add
    FROM stg_link_data
  )
  LOOP
    l_cmts_1   := r.cmts_add;
    l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
    SELECT COUNT(1)
    INTO l_dhcp_cnt
    FROM subntwk
    WHERE subntwk_nm = r.dhcp_token;
    IF l_dhcp_cnt = 0 THEN
      DBMS_OUTPUT.put_line('device not found: ' || r.dhcp_token);
    ELSE
      WHILE l_cmts_add IS NOT NULL
      LOOP
        l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;
        SELECT COUNT(1)
        INTO l_cmts_cnt
        FROM subntwk
        WHERE subntwk_nm = TRIM(l_cmts_add);
        SELECT COUNT(1)
        INTO l_link_cnt
        FROM link
        WHERE link_nm = l_add_link_nm;
        IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
          INSERT INTO link (link_nm)
          VALUES (l_add_link_nm);
          DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
        ELSIF l_link_cnt > 0 THEN
          DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
        ELSIF l_cmts_cnt = 0 THEN
          DBMS_OUTPUT.put_line('no CMTS FOUND for device to create the link: ' || l_cmts_add);
        END IF;
        l_cmts_1   := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
        l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
      END LOOP;
    END IF;
  END LOOP;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
END;
/
spool off
control files -
LOAD DATA
INFILE 'cmts_data.csv'
APPEND
INTO TABLE stg_cmts_data
WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
 AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(cmts_token "RTRIM(LTRIM(:cmts_token))",
 cmts_ip    "RTRIM(LTRIM(:cmts_ip))")
for dhcp.
LOAD DATA
INFILE 'dhcp_data.csv'
APPEND
INTO TABLE stg_dhcp_data
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
 AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token "RTRIM(LTRIM(:dhcp_token))",
 dhcp_ip    "RTRIM(LTRIM(:dhcp_ip))")
for link -.
LOAD DATA
INFILE 'link_data.csv'
APPEND
INTO TABLE stg_link_data
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
 AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(dhcp_token  "RTRIM(LTRIM(:dhcp_token))",
 cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))")
SHELL SCRIPT-
if [ ! -d ./log ]
then
  mkdir log
fi
if [ ! -d ./done ]
then
  mkdir done
fi
if [ ! -d ./bad ]
then
  mkdir bad
fi
nohup time sqlldr username/password@SID CONTROL=load_cmts_data.ctl LOG=log/ldr_cmts_data.log BAD=log/ldr_cmts_data.bad DISCARD=log/ldr_cmts_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_cmts.sql
nohup time sqlldr username/password@SID CONTROL=load_dhcp_data.ctl LOG=log/ldr_dhcp_data.log BAD=log/ldr_dhcp_data.bad DISCARD=log/ldr_dhcp_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_dhcp.sql
nohup time sqlldr username/password@SID CONTROL=load_link_data.ctl LOG=log/ldr_link_data.log BAD=log/ldr_link_data.bad DISCARD=log/ldr_link_data.reject ERRORS=100000 DIRECT=TRUE PARALLEL=TRUE &
nohup time sqlplus username/password@SID @load_link.sql
mv *.log ./log
The problem I encounter is this: while loading data into the link table I check whether the DHCP is present in the subntwk table, and if not I log an error and continue. If the CMTS is not found I log another error instead of creating the link.
Note that multiple CMTS can be associated with a single DHCP.
So the links get created, but for the last iteration of the loop, where I take the last of the comma-separated CMTS values from the stg_link_data table, the log says the CMTS was not found.
for example
DHCP-1-1-1 | WNLB-CMTS-01-1,WNLB-CMTS-02-2
Here I expect to link dhcp-1-1-1 with wnlb-CMTS-01-1 and wnlb-CMTS-02-2.
All of these values are present in the subntwk table, but the log still says wnlb-CMTS-02-2 could NOT be FOUND, even though it was already loaded into the subntwk table.
The same thing happens with every CMTS that comes last in its list in stg_link_data (I think you see what I'm trying to explain).
But when I run the SQL scripts separately in SQL Developer, all valid links are inserted into the link table.
It should create 9 rows in the link table, whereas now it creates only 5.
I use COMMIT in my script too, but it does not help.
Run these scripts on your machine and let me know if you get the same behavior.
Please give me a solution; I have tried many things since yesterday, but it's always the same.
Here is the link table log:
link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
no CMTS FOUND for device to create the link: wnlb-CMTS-02-2
link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
no CMTS FOUND for device to create the link: wnlb-CMTS-05-5
no CMTS FOUND for device to create the link: wnlb-CMTS-01-1
no CMTS FOUND for device to create the link: wnlb-CMTS-05-8
no CMTS FOUND for device to create the link: wnlb-CMTS-05-6
no CMTS FOUND for device to create the link: wnlb-CMTS-05-0
no CMTS FOUND for device to create the link: wnlb-CMTS-03-3
link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
no CMTS FOUND for device to create the link: wnlb-CMTS-05-7
Device not found: wnlb-dhcp-1-13
IF YOU NEED MORE INFORMATION PLEASE LET ME KNOW
Thank you
I realized later that night that when the files were loaded onto the UNIX machine, each line kept a DOS carriage return at the end, which is why the last CMTS on each line was never found. I ran a DOS-to-UNIX conversion on the files and everything started to work perfectly.
It was a dos2unix error!
Thank you all for your interest; I learned new things, as I have only about 10 months of experience in PL/SQL and SQL.
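A minimal sketch of that fix, using tr as a portable stand-in for dos2unix; the file names reuse the examples above and are illustrative:

```shell
# Simulate a file saved with DOS line endings (\r\n), as on the original system
printf 'DHCP-1-1-1|WNLB-CMTS-02-2\r\n' > link_data.csv

# Strip the carriage returns (equivalent to: dos2unix link_data.csv)
tr -d '\r' < link_data.csv > link_data_unix.csv

# The last token no longer carries a hidden \r, so the subntwk lookup matches
last=$(cut -d'|' -f2 link_data_unix.csv)
[ "$last" = "WNLB-CMTS-02-2" ] && echo "clean"
```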
-
Error loading data from Oracle 10g to Excel
Hello gurus,
I get an error when loading data from an Oracle 10g database table to Excel.
I use IKM SQL to SQL Append.
The staging area is different from the target, and I use the same logical schema for source and staging.
Please help me...
Error is
0: S1000: sun.jdbc.odbc.JdbcOdbcBatchUpdateException: [Microsoft][ODBC Excel Driver] Operation must use an updateable query.
sun.jdbc.odbc.JdbcOdbcBatchUpdateException: [Microsoft][ODBC Excel Driver] Operation must use an updateable query.
at sun.jdbc.odbc.JdbcOdbcPreparedStatement.emulateExecuteBatch (unknown Source)
at sun.jdbc.odbc.JdbcOdbcPreparedStatement.executeBatchUpdate (unknown Source)
at sun.jdbc.odbc.JdbcOdbcStatement.executeBatch (unknown Source)
at com.sunopsis.sql.SnpsQuery.executeBatch (SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders (SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt (SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt (SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask (SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep (SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession (SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand (DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute (DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i (e.java)
at com.sunopsis.dwg.cmd.g.y (g.java)
at com.sunopsis.dwg.cmd.e.run (e.java)
at java.lang.Thread.run (unknown Source)
Kindly help me...
Thanks in advance.
Samuel
Hello
When you created the DSN for the Excel spreadsheet, did you leave the read-only box checked?
Make sure that the read-only checkbox is not checked, otherwise you will get an error when trying to insert data into the Excel worksheet (error: "[Microsoft][ODBC Excel Driver] Operation must use an updateable query.").
Thank you
Fati -
Hello!
I have a database with 5 empty tables and about 420 CSV files.
The tables look like:
Table1 (vtime, id1, id2, id3 ... id70)
Table2 (vtime, id71, id72, id73 ... id140)
The vtime value in each table goes from 1 to 22,368,145.
The CSVs look like:
CSV1 (vtime, id, x, value) [vtime is for example 1 to 53,250]
CSV420 (id, x, vtime, value) [vtime is for example 22,314,895 to 22,368,145]
All the IDs are present in each CSV file.
What is the fastest way to load the data from the CSVs into the tables?
I wanted to use SQL*Loader, but I can't specify something like:
"load data from all 420 csv files; if the value at position 2 is in (1-70) [71-140] {...}, load its value (position 3) into (table 1) [table 2] {...}, in the row where vtime in the table is the same as the vtime for the value (position 1)".
Can you help me with this please? Thank you!
What is the fastest way to load the data from the CSVs into the tables?
Use an external table.
In this way, you can have Oracle treat your files as if they were tables.
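A minimal external-table sketch for the first CSV layout above; the directory path and file name are assumptions for illustration:

```sql
-- A directory object the database can read the CSV files from
CREATE DIRECTORY csv_dir AS '/data/csv';

-- The file is now queryable as if it were a table
CREATE TABLE csv1_ext (
  vtime NUMBER,
  id    NUMBER,
  x     NUMBER,
  value NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('csv1.csv')
);

-- Load with plain SQL, filtering or joining as needed
INSERT /*+ append */ INTO table1 (vtime, id1)
SELECT vtime, value
FROM   csv1_ext
WHERE  id = 1;
```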
You will have many more possibilities this way.
http://asktom.Oracle.com/pls/Apex/f?p=100:11:0:P11_QUESTION_ID:6611962171229
http://www.Oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
http://asktom.Oracle.com/pls/Apex/f?p=100:11:0:P11_QUESTION_ID:2754967100346565998 -
Loading data from SQL Server into an Oracle database
I want to create a table in an Oracle db from a table in SQL Server. The table is huge: it has 97,456,789 records.
I created a heterogeneous services (HS) db link in the Oracle database that points to the SQL Server, and I can query the table over the link:
select * from "dbo"."T1"@dblink;
I ran the statement below to create the table:
create table t2 nologging parallel (degree 3) as select * from "dbo"."T1"@dblink;
and it is taking a long time... but it is running...
Is there any other method to do this and populate the table in the Oracle db faster?
Please advise. Thank you.
vhiware wrote:
create table t2 nologging parallel (degree 3) as select * from "dbo"."T1"@dblink;
and it is taking a long time... but it is running...
I doubt that parallel processing will be used, because it is specific to Oracle (using rowid ranges in general) and not SQL Server.
is there any other method to do this and populate the table in the Oracle db faster?
Part of the performance overhead is pulling that data from SQL Server to Oracle across the network link between them. This can be accelerated by compressing the data first, and only then transferring it over the network.
For example: use bcp to export the data on the SQL Server box to a CSV file, compress/zip the file, scp/sftp the file to Oracle, and then unzip it there. Parallel and direct-path processing can then be done using SQL*Loader to load the CSV file into Oracle.
If it is a basic Linux/Unix system, the uncompress/unzip process can run in parallel with the SQL*Loader process by creating a pipe between the two, where the uncompress process writes uncompressed data into the pipe and SQL*Loader reads and loads the data as it becomes available through the pipe.
Otherwise you can roll your own PQ. Assume that the data is in date ranges. You can create a procedure on Oracle that looks like this:
{code}
create or replace procedure copyday( p_day date ) as
begin
  insert /*+ append */ into local_tab
  select * from remote_tab@remotedb
  where col_day = p_day;
  -- add logging info, commit, etc.
end;
{code}
You can now start 10 or more of these for different days and run them in the background using DBMS_JOB.
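A hedged sketch of submitting several of those per-day copies as background jobs with DBMS_JOB; the ten-day range is an illustrative assumption:

```sql
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  -- one background job per day to copy
  FOR i IN 0 .. 9 LOOP
    DBMS_JOB.SUBMIT(
      job  => l_job,
      what => 'copyday(TRUNC(SYSDATE) - ' || i || ');'
    );
  END LOOP;
  -- jobs only start after the submitting session commits
  COMMIT;
END;
/
```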
-
Loading data from Microsoft Dynamics AX via FDM
Hello world
I was wondering if anyone is using FDM to load data from Microsoft Dynamics AX into HFM, and how this was done. We are currently on version 11.1.2.2.0; thanks for any ideas.
The easiest way would be to get the MS Dynamics team to provide a flat-file extract of the required data and then use FDM to import it into HFM. There are no out-of-the-box adapters to interface directly with MS Dynamics. Another option would be to build an integration script in FDM to interface directly with the data in the relevant MS Dynamics tables, but that would be more involved, would require knowledge of the underlying MS Dynamics database and of FDM scripting, and would be much more difficult for the average user to maintain.
-
Strange problem in Forms after transferring data from SQL Server 2000
I successfully transferred data from SQL Server 2000 to Oracle 10g R2 for my Forms application development...
Now when I create a sample form using this data block and run it, I get an error something like:
1. cannot run the query... don't know why... :(
2. when I query my table with select * from tblmaterial, all values are displayed,
but
when I run select matid from tblmaterial
it throws an error like "invalid identifier"... when the column is actually present in the table...
It's really driving me nuts.
Hello
You may be facing a problem similar to this one...
SQL> CREATE TABLE "AMD"
  2  ("TID" NUMBER,
  3   "Tname" VARCHAR2(10),
  4   "TLOC" VARCHAR2(10));

SQL> INSERT INTO "AMD"
  2  VALUES (1,'TEST','L1');

SQL> SELECT * FROM "AMD";

       TID Tname      TLOC
---------- ---------- ----------
         1 TEST       L1

SQL> SELECT Tname
  2  FROM "AMD";
SELECT Tname
       *
ERROR at line 1:
ORA-00904: "TNAME": invalid identifier

SQL> SELECT "Tname"
  2  FROM "AMD";

Tname
----------
TEST

SQL>
If so, when creating the table remove the double quotes around the column names and the table name.
-Clément
-
Loading data into the sample Planning application
Hello
I created a sample Planning application in Workspace (v11.1.1.3). By default, the 'Year' dimension has FY11 as the start year, whereas the data file 'SampleApp_data.txt' that comes with the installation starts at FY09. So when I load the data via EAS (free-form load), there is an error that member FY09 is not present. Does anyone know how I can load the sample data properly? I tried replacing FY09 with FY11, FY10 with FY12 and so forth in Notepad, but the process still hangs on the first attempt.
Thanks in advance,
Aerts
The easiest way is to untick the sample application option, change the start year to 2008 or 2009, go to the Finish tab, then return to the first screen, tick the sample application option again and finish; it will be created with the matching start year.
Cheers
John
http://John-Goodwin.blogspot.com/ -
How to load workflow data
How to load workflow data (POAPPRV.wft, QAPPRV.wft) in 11i apps?
Hello
Use WFLOAD to load the workflow data, i.e.:
WFLOAD apps/<apps_password> 0 Y FORCE xxxx.wft