Load data CSV with Dates
Hi all,
I have a CSV file that I am trying to load into an existing table, and I am struggling to load the dates into the date columns. The file has dates in a format like 2010-12-31 00:00:00, and APEX data loading will not accept them. It also seems not to accept something like December 31, 2010 00:00:00.
I can't reformat the CSV file, as it has 42,000 lines of data. Is it possible to make this work? I see a Format column in the data load screen, but I have found no info on what it is or how to use it.
Any help would be greatly appreciated!
TIA
Mark
Mark,
Try supplying a format mask such as DD/MONTH/YYYY HH24:MI:SS on the import...
Thank you
Tony Miller
Webster, TX
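For a value like 2010-12-31 00:00:00, the matching Oracle format mask would be YYYY-MM-DD HH24:MI:SS. As a rough illustration of how that mask converts the text (the table and column names below are made up, not from the thread):

```sql
-- Illustrative only: MY_STAGE, DATE_TXT and MY_TABLE are hypothetical names.
-- The mask string itself is the kind of value the Format column in the
-- APEX data load screen is asking for on a DATE column.
INSERT INTO my_table (date_col)
SELECT TO_DATE(date_txt, 'YYYY-MM-DD HH24:MI:SS')
FROM   my_stage;
```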
Tags: Database
Similar Questions
-
SQL Loader issue - CSV with commas and quotes IN the data
Hello, I have a dataset for a simple two-column table, like this:
Column 1, "it is given for"Column 2", with commas and quotes."
The data are delimited by commas and may be enclosed by double quotes. IN ADDITION, the data fields themselves may include commas and quotation marks. I CANNOT manipulate the data before sending it to SQL*Loader.
I set up my control file like this:
LOAD DATA
INFILE './TEST.dat'
BADFILE './TEST.BAD'
DISCARDFILE './TEST.DSC'
REPLACE INTO TABLE TEST
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
(
Col1 CHAR(50),
Col2 CHAR(500)
)
Now when I run this via SQLLDR, I get the following error in the log file:
Record 1: Rejected - Error on table TEST, column COL2.
no terminator found after TERMINATED and ENCLOSED field
What are my options to get the data loaded as presented above? I'm working on Oracle 11g (11.2.0.3.0) 64-bit on AIX 6.1.0.0.
Thank you!
In this case, there is no way it can tell which quotes are delimiters or enclosures and which are part of the data. As far as I know, there is no way you can load it into the appropriate columns.
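One caveat to the point above: SQL*Loader can cope with enclosure characters inside a field when they are escaped by doubling. The following is only a sketch under the assumption that the file could be regenerated with doubled quotes; it does not fix the file as posted:

```sql
-- Assumes embedded quotes are doubled in the data, e.g.
--   first value,"it is given for ""Column 2"", with commas and quotes."
LOAD DATA
INFILE './TEST.dat'
BADFILE './TEST.BAD'
DISCARDFILE './TEST.DSC'
REPLACE INTO TABLE TEST
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
(
  Col1 CHAR(50),
  Col2 CHAR(500)
)
```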
-
SQL Loader loading data into two Tables using a single CSV file
Dear all,
I have a requirement wherein I need to load data into 2 tables using a single CSV file.
So I wrote the following control file, but it loads only the first table, and there is also nothing in the debug log file.
Please suggest how to achieve this.
Sample data
Source_system_code,Record_type,Source_System_Vendor_number,Vendor_name,Vendor_site_code,Address_line1,Address_line2,Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Control file script
================
OPTIONS (errors = 0, skip = 1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code position(1) char "ltrim(rtrim(:Source_system_code))",
Record_type char "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
Vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
when 1 = 1
fields terminated by ',' optionally enclosed by '"'
(
Vendor_name char "ltrim(rtrim(:Vendor_name))",
Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
Address_line1 char "ltrim(rtrim(:Address_line1))",
Address_line2 char "ltrim(rtrim(:Address_line2))",
Address_line3 char "ltrim(rtrim(:Address_line3))"
)
The problem here is that it loads only into the first table (table1).
Please guide me.
Thank you
Kumar
When you do not provide a starting position for the first field in table2, it starts with the next field after the last one referenced in table1, so it starts with Vendor_site_code instead of Vendor_name. What you need to do instead is specify position(1) for the first field in table2 and use FILLER fields to skip the columns already read. In addition, it does not like the WHEN 1 = 1 clause, and it isn't needed anyway. See the example below, including the corrected control file.
Scott@orcl12c > HOST TYPE test.dat
Source_system_code, Record_type, Source_System_Vendor_number, Vendor_name, Vendor_site_code, Address_line1, Address_line2, Address_line3
Victor, New, Ven001, Vinay, Vin001, abc, def, xyz
Scott@orcl12c > HOST TYPE test.ctl
OPTIONS (errors = 0, skip = 1)
load data
replace
into table table1
fields terminated by ',' optionally enclosed by '"'
(
Source_system_code position(1) char "ltrim(rtrim(:Source_system_code))",
Record_type char "ltrim(rtrim(:Record_type))",
Source_System_Vendor_number char "ltrim(rtrim(:Source_System_Vendor_number))",
Vendor_name char "ltrim(rtrim(:Vendor_name))"
)
into table table2
fields terminated by ',' optionally enclosed by '"'
(
source_system_code filler position(1),
record_type filler,
source_system_vendor_number filler,
Vendor_name char "ltrim(rtrim(:Vendor_name))",
Vendor_site_code char "ltrim(rtrim(:Vendor_site_code))",
Address_line1 char "ltrim(rtrim(:Address_line1))",
Address_line2 char "ltrim(rtrim(:Address_line2))",
Address_line3 char "ltrim(rtrim(:Address_line3))"
)
Scott@orcl12c > CREATE TABLE table1
  2  (Source_system_code VARCHAR2(13),
  3   Record_type VARCHAR2(11),
  4   Source_System_Vendor_number VARCHAR2(27),
  5   Vendor_name VARCHAR2(11))
  6  /
Table created.
Scott@orcl12c > CREATE TABLE table2
  2  (Vendor_name VARCHAR2(11),
  3   Vendor_site_code VARCHAR2(16),
  4   Address_line1 VARCHAR2(13),
  5   Address_line2 VARCHAR2(13),
  6   Address_line3 VARCHAR2(13))
  7  /
Table created.
Scott@orcl12c > HOST SQLLDR scott/tiger CONTROL=test.ctl DATA=test.dat LOG=test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Thu Mar 26 01:43:30 2015
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 1
Table TABLE1:
  1 Row successfully loaded.
Table TABLE2:
  1 Row successfully loaded.
Check the log file:
  test.log
for more information about the load.
Scott@orcl12c > SELECT * FROM table1
  2  /
SOURCE_SYSTEM RECORD_TYPE SOURCE_SYSTEM_VENDOR_NUMBER VENDOR_NAME
------------- ----------- --------------------------- -----------
Victor        New         Ven001                      Vinay
1 row selected.
Scott@orcl12c > SELECT * FROM table2
  2  /
VENDOR_NAME VENDOR_SITE_CODE ADDRESS_LINE1 ADDRESS_LINE2 ADDRESS_LINE3
----------- ---------------- ------------- ------------- -------------
Vinay       Vin001           abc           def           xyz
1 row selected.
Scott@orcl12c >
-
APEX application example to load data from .csv
Hi all
Apex 4.2
I want a sample application that loads .csv data into a table, like the one provided in the APEX utility module.
I want the end user to log in using the application itself, not APEX as a developer.
Can someone give me a link to this kind of sample application, please?
Thank you very much
MK
MariaKarpa (MK) wrote:
Apex 4.2
I want a sample application that loads .csv data into a table, like the one provided in the APEX utility module.
I want the end user to log in using the application itself, not APEX as a developer.
Can someone give me a link to this kind of sample application, please?
APEX 4.2 includes a sample data loading packaged application showing how to create applications with data load wizards.
-
load data from csv file into table
Hello
I'm working on Oracle 11gR2 on a UNIX platform.
We have a requirement to load data from a flat file into a table, but with one condition: we need to compare on the primary key field; if the record is in the file, update another field, and if the record is not present, we need to insert it.
How can we achieve this?
Use SQL*Loader to load the CSV file data into a staging table.
Then use the SQL MERGE command to insert/update rows from the staging table into the target table.
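The staging-plus-MERGE approach described above can be sketched as follows (stage_tab, target_tab and the column names are hypothetical placeholders, not from the thread):

```sql
-- Upsert from the SQL*Loader staging table into the target:
-- update the non-key column when the primary key already exists,
-- insert a new row when it does not.
MERGE INTO target_tab t
USING stage_tab s
ON    (t.pk_col = s.pk_col)
WHEN MATCHED THEN
  UPDATE SET t.other_col = s.other_col
WHEN NOT MATCHED THEN
  INSERT (pk_col, other_col)
  VALUES (s.pk_col, s.other_col);
```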
Hemant K Collette
-
Hello
Please forgive me if this has been discussed in this forum already. If so, please point me to this discussion.
What I'm trying to do is a data merge in InDesign with a comma-delimited (CSV) file. I created the CSV file using the "Save As CSV" command in Office Excel 2013. The problem I have is with records that contain multiple lines in the same field. Reviewing the merged InDesign document, I noticed that the results are inaccurate for the rest of such a record. When it gets to the following record, some fields are filled with the remaining fields of the previous record.
I'm sorry if I'm not explaining this very well, but the results are strange and I hope others have similar results and can offer a solution.
Thank you
Greg
You can't have line breaks in a cell. The workaround is to use a placeholder code for the break, and then run find/replace after the merge to change the code into the break you want.
-
Problem loading data into a cube with duplicate members in different dimensions
I have a cube where duplicate members are allowed only in two dims (Period1 and Period2).
The outline saves without error, but when I try to load data, I get errors like:
"Member 2012-12 is a duplicate member in the outline.
A6179398-68BD-7843-E0C2-B5EE81808D0B 01011 cd00 st01 2905110000 EK fo0000 2012 NNNN-NN-NN-12 cust00000$ $$ 1 "
The two dims are similar periods. Should I change the member names and aliases a little in one of them (e.g. yyyy1-mm -> yyyy1-mm1)?
Users wouldn't like it...
Period1
yyyy1
yyyy1/q1
yyyy1-mm1
yyyy1-mm2
yyyy1-mm3
yyyy1/q2
....
....
Period2
yyyy1
yyyy1/q1
yyyy1-mm1
yyyy1-mm1-dd1
yyyy1-mm1-dd2
......
yyyy1-mm2
yyyy1-mm3
yyyy1/q2
....
....
Thanks
Yes, your input records must have fully qualified member names.
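As an illustration only (the exact qualified-name syntax can vary by Essbase release and outline settings), a data record would reference the duplicate month through a qualified name so Essbase knows which dimension's member is meant:

```text
[Period1].[2012-12]    rather than the ambiguous    2012-12
```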
-
Loading data with an attribute dimension.
I have a currency attribute dimension associated with the Entity dimension, and I would like to load data using the associated attribute dimension. I have a file with one column with the name of the entity and another column with the name of the attribute. Could someone tell me how to map the attribute column in the load rule?
Edited by: user5170363 on March 22, 2012 12:55
I guess attribute dimensions are not counted while you load data... The data file always has to have the standard dimension members...
Thus, you can filter your entities according to your attribute and load only against those entities...
-----
Vivek jerbi -
Error loading data with a load rule file in 11.1.2.1
When I do the dim build, the parent/child build and aliases will not load.
Reading rule SQL information for database [DP]
Reading rules from rule object for database [DP]
Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
DATAERRORLIMIT reached [1000]. Rejected records will no longer be logged.
No data values modified by this data file load
Data load elapsed time for [Accounts01.txt] with [ACCOUNT.rul]: [0.512] seconds
There were errors, look in C:\Oracle\Middleware\EPMSystem11R1\products\Essbase\eas\client\dataload.err
Database import completed ['RPIWORK'.'DP']
Output columns prepared: [0]
Outline = Measure dimension
Measure
Balance Sheet
Profit & Loss
The data file is a SQL query, exported to Excel and saved as text (tab delimited):
Account.txt
Parent | Child | Alias
Balance Sheet 10000 Total Assets
Balance Sheet 20000 Total Liabilities
Balance Sheet 30000 Total Owner's Equity
Profit & Loss 40000 Revenues
Profit & Loss 50000 Cost of Goods Sold
Profit & Loss 60000 S.G.&A.
Profit & Loss 70000 Other Operating Income
Profit & Loss 80000 Other Expenses
Profit & Loss 85000 Interest Income/Expense
Profit & Loss 90000 Taxes
Profit & Loss 99000 Capitalized Contra
10000 10100 Short Term Assets
10000 14000 Land, Timber and Roads
10000 17000 Property, Plant and Equipment
10000 19000 Other Long Term Assets
10000 19750 Deferred Tax Assets
20000 20001 Short Term Liabilities
20000 21200 Other Long Term Liabilities
20000 22000 Timber Contracts
20000 25100 Long Term Debt
20000 26000 Deferred Tax Liability
30000 30001 Subsidiary Equity
30000 30250 Voting Common Stock
30000 30350 Common Stock
30000 30450 Additional Paid-In Capital
30000 30550 Retained Earnings
30000 30610 Other Comprehensive Income Items
30000 30675 Tax Distribution to Shareholde
40000 40055 Wholesale Sales
40000 41500 Chip Revenues
40000 43000 Power Sales
40000 45000 Nursery Income
40000 46000 Transfer Logs
40000 47000 Intraco Sales
40000 49000 Sales Deductions
50000 51000 Labor
50000 52000 Raw Materials
50000 52800 Rental
50000 53000 Operating Expenses
50000 53900 Forestry Supplies
50000 53999 Capitalized Contra - Forestry
50000 54000 Maintenance
50000 55000 Fuels & Lubricants
50000 56000 Utilities
50000 57000 Direct Logging Costs
50000 57500 Custom Services
50000 57700 Depletion
50000 58000 Cost of Goods Sold Allocations
50000 59000 Fixed Costs
50000 59510 Changes in Inventories
60000 60100 Salaries
60000 60300 PC Hardware Maintenance
60000 60400 Other G&A
60000 61000 Licenses/Fees/Charges
60000 61400 Benefits
60000 61550 Furniture/Fixtures
60000 61750 Legal
60000 62000 Office Expenses
60000 62500 Professional Services
60000 63000 Pre & Post Employment Activities
60000 63200 Telecommunication Costs
60000 63550 Employee Activities
60000 63800 Sales & Promotions
60000 63900 Bank Charges
60000 64000 Admin Depreciation
60000 64500 Insurance and Property Taxes
60000 65000 S G & A Allocations
60000 66000 Outside Management
70000 70100 Rental Income
70000 70200 Disposals of Fixed Assets
70000 70400 Misc Income
80000 80200 Idle Plant
85000 85001 Interest Expense
85000 85200 Interest Income
90000 90100 Income Tax Expense
error file
------ Member 20000 not found in the database
20000 25100 Long Term Debt
------ Member 20000 not found in the database
20000 26000 Deferred Tax Liability
------ Member 30000 not found in the database
30000 30001 Subsidiary Equity
------ Member 30000 not found in the database
30000 30250 Voting Common Stock
------ Member 30000 not found in the database
30000 30350 Common Stock
------ Member 30000 not found in the database
30000 30450 Additional Paid-In Capital
------ Member 30000 not found in the database
30000 30550 Retained Earnings
------ Member 30000 not found in the database
30000 30610 Other Comprehensive Income Items
------ Member 30000 not found in the database
30000 30675 Tax Distribution to Shareholde
------ Member 40000 not found in the database
40000 40055 Wholesale Sales
------ Member 40000 not found in the database
40000 41500 Chip Revenues
------ Member 40000 not found in the database
40000 43000 Power Sales
------ Member 40000 not found in the database
40000 45000 Nursery Income
------ Member 40000 not found in the database
40000 46000 Transfer Logs
------ Member 40000 not found in the database
40000 47000 Intraco Sales
------ Member 40000 not found in the database
40000 49000 Sales Deductions
------ Member 50000 not found in the database
50000 51000 Labor
------ Member 50000 not found in the database
50000 52000 Raw Materials
------ Member 50000 not found in the database
50000 52800 Rental
------ Member 50000 not found in the database
50000 53000 Operating Expenses
------ Member 50000 not found in the database
50000 53900 Forestry Supplies
------ Member 50000 not found in the database
50000 53999 Capitalized Contra - Forestry
------ Member 50000 not found in the database
50000 54000 Maintenance
------ Member 50000 not found in the database
50000 55000 Fuels & Lubricants
------ Member 50000 not found in the database
50000 56000 Utilities
------ Member 50000 not found in the database
50000 57000 Direct Logging Costs
------ Member 50000 not found in the database
50000 57500 Custom Services
------ Member 50000 not found in the database
50000 57700 Depletion
------ Member 50000 not found in the database
50000 58000 Cost of Goods Sold Allocations
------ Member 50000 not found in the database
50000 59000 Fixed Costs
------ Member 50000 not found in the database
50000 59510 Changes in Inventories
------ Member 60000 not found in the database
60000 60100 Salaries
------ Member 60000 not found in the database
60000 60300 PC Hardware Maintenance
------ Member 60000 not found in the database
60000 60400 Other G&A
------ Member 60000 not found in the database
60000 61000 Licenses/Fees/Charges
------ Member 60000 not found in the database
60000 61400 Benefits
------ Member 60000 not found in the database
60000 61550 Furniture/Fixtures
------ Member 60000 not found in the database
60000 61750 Legal
------ Member 60000 not found in the database
60000 62000 Office Expenses
------ Member 60000 not found in the database
60000 62500 Professional Services
------ Member 60000 not found in the database
60000 63000 Pre & Post Employment Activities
------ Member 60000 not found in the database
60000 63200 Telecommunication Costs
------ Member 60000 not found in the database
60000 63550 Employee Activities
------ Member 60000 not found in the database
60000 63800 Sales & Promotions
------ Member 60000 not found in the database
60000 63900 Bank Charges
------ Member 60000 not found in the database
60000 64000 Admin Depreciation
------ Member 60000 not found in the database
60000 64500 Insurance and Property Taxes
------ Member 60000 not found in the database
60000 65000 S G & A Allocations
------ Member 60000 not found in the database
60000 66000 Outside Management
------ Member 70000 not found in the database
70000 70100 Rental Income
------ Member 70000 not found in the database
70000 70200 Disposals of Fixed Assets
------ Member 70000 not found in the database
70000 70400 Misc Income
------ Member 80000 not found in the database
80000 80200 Idle Plant
------ Member 85000 not found in the database
85000 85001 Interest Expense
------ Member 85000 not found in the database
85000 85200 Interest Income
------ Member 90000 not found in the database
90000 90100 Income Tax Expense
That's how I built my load rules file:
Create -> Rules file
File -> Open data file -> account01.txt
Field properties -> Dimension Build Properties
Dimension = Measure
Field 1: Measure; Type = Parent
Field 2: Measure; Type = Child
Field 3: Measure; Type = Alias
Click the Dimension Build field
Click Dimension Build Settings - Parent/Child
OK
Validate - rule is correct
Save as account
Load the data: file = Account.txt
Rule file = account
OK
Edited by: next level on December 11, 2011 06:24
Edited by: next level on December 11, 2011 06:25
Edited by: next level on December 11, 2011 06:27
Use the drop-down list in EAS when you right-click on the database and select Load Data. One question: is the Measure member already in the outline? If it isn't, you will have problems. You would have to add it, or have a line at the top of the load file with something like Account, Balance Sheet in it. Also, in the load rule, have you changed the dimension build settings for the Accounts dimension to Parent/Child? Quite often people don't realize they have to double-click the dimension name to make sure it gets set as the dimension that gets changed.
I'm sure your question is about loading the data rather than the dim build, but this might just be the first problem.
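The suggestion above about seeding a line at the top of the load file can be sketched like this, assuming the same tab-delimited Parent/Child/Alias layout as Account.txt (member names taken from the outline shown earlier):

```text
Measure	Balance Sheet
Measure	Profit & Loss
Balance Sheet	10000	Total Assets
...rest of the file unchanged...
```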
-
Member not found when loading data with SQL
Hello everyone:
I created a cube, extracting all the information with SQL statements, and it works perfectly. Now I'm trying to load data the same way, but I can't.
I created a view that returns data in this format:
Member of dimension 1 <tab> Member of dimension 2 <tab> ... <tab> Member of dimension 5 <tab> Measure 1 <tab> Measure 2 <tab> Measure 3
I designed a new rule, indicating the dimension for each column, and for each measure column which specific member of the Accounts dimension must be used. I have checked and everything is OK, but when I try to load data, it does not work and gives me this error:
"Data value [3.5] encountered before all selected dimensions, [1] records completed."
"Unexpected Essbase error 1003007"
If I enclose the member names in quotes in the SQL statement (because they contain spaces, which is perhaps the reason for the previous error, although the rule parses correctly), Essbase deletes those quotes on import. Must I use another symbol in SQL and have the rule change that symbol back to quotes? Is this normal? I know about this issue when importing formulas, but not here.
Why don't I have this problem with quotes in dimension builds?
And when changing the quote symbols, this error occurs:
"Member x not found in the database." But I checked the member, and it is in the outline. What's wrong?
Thanks in advance
Regards
Javier
Edited by: Javi M on 26-mar-2011 05:52
Yes, SQL data sources (of all kinds) support more than one column of data. As you noted, you just point each data column at the member it represents.
That said, I bet that if you look at your view/table and your load rule against your outline, you will find a dimension is mismapped. For example, you think column 4 points to Scenario but it really points to Product, and what you thought was column 1 isn't, or you missed a dimension. Happens to me all the time.
Kind regards
Cameron Lackpour
-
Error while loading a data file using a rules file through MaxL.
I think the functionality to generate error records in a .ERR file during a data load is supported only if a .RUL file is being used.
Is this right?
I tried to get .ERR files to generate by using the following statement:
import database PL_RPT.Reprting data from data_file
"E:\Hyperion\AnalyticServices\APP\PL\PL.txt" to load_buffer with buffer_id 17
on error write to "E:\Hyperion\Scripts\Pln\Logs\LoadlData.err";
When I run this, if there are errors, no error file is generated and the load stops.
I looked at the Essbase Technical Reference on MaxL ASO data loading, and the syntax diagram indicates that it is supported.
Any suggestions will be greatly appreciated.
Thank you
Hello,
Here are a few suggestions for trapping errors. I hope that one of them will meet your needs:
1._____________________________________________
spool to 'D:\logs\maxlresults.out';
iferror 'WRITE_ERRORS';
/* do stuff */
define label 'WRITE_ERRORS';
spool off;
spool to 'D:\logs\maxlerrors.out';
exit;
2._____________________________________________
spool stdout to 'D:\logs\maxlresults.out';
spool stderr to 'D:\logs\maxlerrors.out';
3._____________________________________________
essmsh script.msh 2> D:\logs\maxlresults.out
Robb
-
An error loading data from a .csv file using SQL*Loader
Hello
I wrote a control file to load data into a table.
The .csv (input) file contains a value that has a comma within it, so that value is enclosed in "" (double quotes).
The record that contains the quoted value is not loaded into the table.
example:
input file (.csv file)
001,apple,1,31 December 00
002,"abc, def & ltd",1,31 December 00
In the example above, the second row is not inserted into the table.
How can I insert all the rows into the table?
Any help is appreciated.
Thanks in advance...
RXG
What does your control file look like?
Try this:
fields terminated by ',' optionally enclosed by '"'
-
Problem loading data into Hyperion Planning with Thai characters/fonts?
Hello
Can someone help me? I am loading data into Hyperion Planning with ODI; however, my data contains Thai characters. When I view the data in Hyperion Planning, each Thai character is converted to ? instead. So, how can I load data without this character/font problem in ODI? Any experts, please help me with this.
In the Hyperion log, there is no error:
2009-03-16 15:14:47,847 [DwgCmdExecutionThread] INFO: Oracle Data Integrator Adapter for Hyperion Planning - Release 9.3.1.1
2009-03-16 15:14:47,847 [DwgCmdExecutionThread] INFO: Connecting to planning application [Budget] on [192.168.3.20]:[11333] using username [hypadmin].
2009-03-16 15:14:47,925 [DwgCmdExecutionThread] INFO: Successfully connected to the planning application.
2009-03-16 15:14:47,941 [DwgCmdExecutionThread] INFO: Planning load options:
Dimension name: Location; sort parent/child: true
Load order by input: true
Refresh database: false
2009-03-16 15:14:48,019 [DwgCmdExecutionThread] INFO: Load process starting.
2009-03-16 15:14:48,019 [DwgCmdExecutionThread] DEBUG: Number of columns in the source result set does not match the number of planning target columns.
2009-03-16 15:14:48,066 [DwgCmdExecutionThread] INFO: Load type is [Load dimension member].
2009-03-16 15:14:48,285 [DwgCmdExecutionThread] INFO: Possible circular reference detected, abandoning sort and continuing with load. 1368 possible circular reference records found.
2009-03-16 15:15:14,660 [DwgCmdExecutionThread] INFO: Load process complete.
2009-03-16 15:27:45,821 [DwgCmdExecutionThread] INFO: Oracle Data Integrator Adapter for Hyperion Planning - Release 9.3.1.1
2009-03-16 15:27:45,821 [DwgCmdExecutionThread] INFO: Connecting to planning application [Budget] on [192.168.3.20]:[11333] using username [hypadmin].
2009-03-16 15:27:45,883 [DwgCmdExecutionThread] INFO: Successfully connected to the planning application.
2009-03-16 15:27:45,899 [DwgCmdExecutionThread] INFO: Planning load options:
Dimension name: Location; sort parent/child: true
Load order by input: true
Refresh database: true
2009-03-16 15:27:45,962 [DwgCmdExecutionThread] INFO: Load process starting.
2009-03-16 15:27:45,962 [DwgCmdExecutionThread] DEBUG: Number of columns in the source result set does not match the number of planning target columns.
2009-03-16 15:27:45,993 [DwgCmdExecutionThread] INFO: Load type is [Load dimension member].
2009-03-16 15:27:46,165 [DwgCmdExecutionThread] INFO: Possible circular reference detected, abandoning sort and continuing with load. 1368 possible circular reference records found.
2009-03-16 15:28:14,540 [DwgCmdExecutionThread] INFO: Planning cube refresh initiated.
2009-03-16 15:28:22,993 [DwgCmdExecutionThread] INFO: Planning cube refresh operation completed successfully.
2009-03-16 15:28:22,993 [DwgCmdExecutionThread] INFO: Load process complete.
Hello,
I'm glad you find the blog useful.
I understand that you are on 9.3. To prove that it is not an ODI issue, log into Planning from the web, add a member manually with Thai characters, then refresh the application to push the information to Essbase, then look at the member in EAS.
Cheers
John
http://John-Goodwin.blogspot.com/ -
Error loading data with SQLLDR on Oracle 10g
Hello
Can anyone suggest what the problem is in the below-mentioned control file, used for loading data via SQL*Loader?
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
(SRNO INTEGER(7),
PROD_ID INTEGER(10),
PROMO_ID INTEGER(10),
CHANNEL_ID INTEGER(10),
UNIT_COST INTEGER(10),
UNIT_PRICE INTEGER(10)
)
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------
I'm trying to load the data into the SCOTT schema as user scott.
Why is it giving this error? Please see the attached log file.
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 14:43:35 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: D:\test\temt.ctl
Data File: D:\test\temt.txt
Bad File: test.bad
Discard File: test.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table 'TEST', loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name                    Position   Len   Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
SRNO                           FIRST      7               INTEGER
PROD_ID                        NEXT       10              INTEGER
PROMO_ID                       NEXT       10              INTEGER
CHANNEL_ID                     NEXT       10              INTEGER
UNIT_COST                      NEXT       10              INTEGER
UNIT_PRICE                     NEXT       10              INTEGER
Record 1: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 2: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 3: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 4: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 5: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 6: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 7: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 8: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 9: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 10: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 11: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 12: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 13: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 14: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 15: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 16: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 17: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 18: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 19: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 20: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 21: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 22: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 23: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 24: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 25: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 26: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 27: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 28: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 29: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 30: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 31: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 32: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 33: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 34: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 35: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 36: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 37: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 38: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 39: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 40: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 41: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 42: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 43: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 44: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 45: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 46: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 47: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 48: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 49: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 50: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
Record 51: Rejected - Error on table 'TEST'.
ORA-01460: unimplemented or unreasonable conversion requested
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.

Table "TEST":
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.

Space allocated for bind array: 3648 bytes (64 rows)
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 64
Total logical records rejected: 51
Total logical records discarded: 0

Run began on Fri Mar 20 14:43:35 2009
Run ended on Fri Mar 20 14:43:43 2009

Elapsed time was: 00:00:07.98
CPU time was: 00:00:00.28
--------------------------------------------------------------------
Here are the sqlldr invocation and the table details:
--------------------------------------------------------------------
SQL> desc test
 Name                     Null?    Type
 ------------------------ -------- ----------------
 SRNO                              NUMBER(7)
 PROD_ID                           NUMBER(10)
 PROMO_ID                          NUMBER(10)
 CHANNEL_ID                        NUMBER(10)
 UNIT_COST                         NUMBER(10)
 UNIT_PRICE                        NUMBER(10)

The sqlldr run, from the command prompt:

D:\> sqlldr scott/tiger control=D:\test\temt.ctl

SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 15:55:50 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Commit point reached - logical record count 64
--------------------------------------------------------------------
I also tried a couple of variations. Which of the control files below makes sense?
--------------------------------------------------------------------
-- 1
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
FIELD TERMINATED BY ','
(
  SRNO       INTEGER(7),
  PROD_ID    INTEGER(10),
  PROMO_ID   INTEGER(10),
  CHANNEL_ID INTEGER(10),
  UNIT_COST  INTEGER(10),
  UNIT_PRICE INTEGER(10)
)
-- 2
LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INTO TABLE "TEST"
INSERT
FIELD TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  SRNO       INTEGER(7),
  PROD_ID    INTEGER(10),
  PROMO_ID   INTEGER(10),
  CHANNEL_ID INTEGER(10),
  UNIT_COST  INTEGER(10),
  UNIT_PRICE INTEGER(10)
)
For control file 1 I get the error below:

D:\> sqlldr scott/tiger control=D:\test\temt.ctl

SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:36 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.

SQL*Loader-350: Syntax error at line 8.
Expecting "(", found "FIELD".
FIELD TERMINATED BY ','
^
And for control file 2 I get the error below:

D:\> sqlldr scott/tiger control=D:\test\temt.ctl

SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:39:22 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.

SQL*Loader-350: Syntax error at line 8.
Expecting "(", found "FIELD".
FIELD TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
^
--------------------------------------------------------------------
Hello,

This should work for you:

LOAD DATA
INFILE 'D:\test\temt.txt'
BADFILE 'test.bad'
DISCARDFILE 'test.dsc'
INSERT INTO TABLE "TEST"
FIELDS TERMINATED BY ','
(
  SRNO       INTEGER EXTERNAL,
  PROD_ID    INTEGER EXTERNAL,
  PROMO_ID   INTEGER EXTERNAL,
  CHANNEL_ID INTEGER EXTERNAL,
  UNIT_COST  INTEGER EXTERNAL,
  UNIT_PRICE INTEGER EXTERNAL
)

Thank you
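An editorial aside (not part of the original thread) on why the EXTERNAL forms help here: in a control file, a plain INTEGER(n) describes an n-byte binary field, so pointing it at character data in a text file triggers ORA-01460, whereas INTEGER EXTERNAL describes a number written as text digits, which is what a comma-delimited file actually contains. Schematically:

```
-- Sketch only: two descriptions of the same column, with different meanings
SRNO INTEGER(7)        -- read 7 raw binary bytes: wrong for a text CSV -> ORA-01460
SRNO INTEGER EXTERNAL  -- read the digit characters up to the ',' and convert them
```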
-
Problem loading data with SQL*Loader into staging tables, and then into the main tables!
Hello,
I'm trying to load data into our main database tables using SQL*Loader. The data will be provided in pipe-delimited CSV files.
I have developed a shell script to load the data, and it works fine except for one thing.
Here are the details needed to re-create the problem.
Structure of the staging tables into which the data will be loaded using SQL*Loader:

create table stg_cmts_data (cmts_token varchar2(30), cmts_ip varchar2(20));
create table stg_link_data (dhcp_token varchar2(30), cmts_to_add varchar2(200));
create table stg_dhcp_data (dhcp_token varchar2(30), dhcp_ip varchar2(20));
Data in the CSV files:

for stg_cmts_data - cmts_map_03092015_1.csv

WNLB-CMTS-01-1|10.15.0.1
WNLB-CMTS-02-2|10.15.16.1
WNLB-CMTS-03-3|10.15.48.1
WNLB-CMTS-04-4|10.15.80.1
WNLB-CMTS-05-5|10.15.96.1

for stg_dhcp_data - dhcp_map_03092015_1.csv

DHCP-1-1-1|10.25.23.10, 25.26.14.01
DHCP-1-1-2|56.25.111.25, 100.25.2.01
DHCP-1-1-3|25.255.3.01, 89.20.147.258
DHCP-1-1-4|10.25.26.36, 200.32.58.69
DHCP-1-1-5|80.25.47.369, 60.258.14.10

for stg_link_data - cmts_dhcp_link_map_0309151623_1.csv

DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
DHCP-1-1-2|WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
DHCP-1-1-3|WNLB-CMTS-01-1
DHCP-1-1-4|WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
DHCP-1-1-5|WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
WNLB-DHCP-1-13|WNLB-CMTS-02-2
Now, after loading the data into the staging tables, I have to populate the main tables:

create table subntwk (subntwk_nm varchar2(20), subntwk_ip varchar2(30));
create table link (link_nm varchar2(50));
The SQL scripts I created to load the data look like this:

spool load_cmts.log
set serveroutput on

DECLARE
  CURSOR c_stg_cmts IS
    SELECT *
      FROM stg_cmts_data;

  TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY pls_integer;

  l_stg_cmts t_stg_cmts;
  l_cmts_cnt NUMBER;
  l_cnt      NUMBER;
  l_cnt_1    NUMBER;
BEGIN
  OPEN c_stg_cmts;
  FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
  CLOSE c_stg_cmts;

  FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
  LOOP
    SELECT COUNT(1)
      INTO l_cmts_cnt
      FROM subntwk
     WHERE subntwk_nm = l_stg_cmts(i).cmts_token;

    IF l_cmts_cnt < 1 THEN
      INSERT INTO subntwk (subntwk_nm)
      VALUES (l_stg_cmts(i).cmts_token);
      dbms_output.put_line('Token has been added: ' || l_stg_cmts(i).cmts_token);
    ELSE
      dbms_output.put_line('Token is already present');
    END IF;

    EXIT WHEN l_stg_cmts.COUNT = 0;
  END LOOP;

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
spool off
For dhcp:

spool load_dhcp.log
set serveroutput on

DECLARE
  CURSOR c_stg_dhcp IS
    SELECT *
      FROM stg_dhcp_data;

  TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY pls_integer;

  l_stg_dhcp t_stg_dhcp;
  l_dhcp_cnt NUMBER;
  l_cnt      NUMBER;
  l_cnt_1    NUMBER;
BEGIN
  OPEN c_stg_dhcp;
  FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
  CLOSE c_stg_dhcp;

  FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
  LOOP
    SELECT COUNT(1)
      INTO l_dhcp_cnt
      FROM subntwk
     WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;

    IF l_dhcp_cnt < 1 THEN
      INSERT INTO subntwk (subntwk_nm)
      VALUES (l_stg_dhcp(i).dhcp_token);
      dbms_output.put_line('Token has been added: ' || l_stg_dhcp(i).dhcp_token);
    ELSE
      dbms_output.put_line('Token is already present');
    END IF;

    EXIT WHEN l_stg_dhcp.COUNT = 0;
  END LOOP;

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
spool off
For link:

spool load_link.log
set serveroutput on

DECLARE
  l_cmts_1      VARCHAR2(4000 CHAR);
  l_cmts_add    VARCHAR2(200 CHAR);
  l_dhcp_cnt    NUMBER;
  l_cmts_cnt    NUMBER;
  l_link_cnt    NUMBER;
  l_add_link_nm VARCHAR2(200 CHAR);
BEGIN
  FOR r IN (
    SELECT dhcp_token, cmts_to_add || ',' AS cmts_add
      FROM stg_link_data
  )
  LOOP
    l_cmts_1   := r.cmts_add;
    l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));

    SELECT COUNT(1)
      INTO l_dhcp_cnt
      FROM subntwk
     WHERE subntwk_nm = r.dhcp_token;

    IF l_dhcp_cnt = 0 THEN
      dbms_output.put_line('Device not found: ' || r.dhcp_token);
    ELSE
      WHILE l_cmts_add IS NOT NULL
      LOOP
        l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;

        SELECT COUNT(1)
          INTO l_cmts_cnt
          FROM subntwk
         WHERE subntwk_nm = TRIM(l_cmts_add);

        SELECT COUNT(1)
          INTO l_link_cnt
          FROM link
         WHERE link_nm = l_add_link_nm;

        IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
          INSERT INTO link (link_nm)
          VALUES (l_add_link_nm);
          dbms_output.put_line(l_add_link_nm || ' has been added.');
        ELSIF l_link_cnt > 0 THEN
          dbms_output.put_line('Link is already present: ' || l_add_link_nm);
        ELSIF l_cmts_cnt = 0 THEN
          dbms_output.put_line('NO CMTS FOUND for device to create the link: ' || l_cmts_add);
        END IF;

        l_cmts_1   := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
        l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));
      END LOOP;
    END IF;
  END LOOP;

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    dbms_output.put_line('ERROR ' || SQLERRM);
END;
/
spool off
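The SUBSTR/INSTR walk in the link script above can be sketched outside the database (a hypothetical shell rendering, just to show the intended token order): a trailing comma is appended, then each pass peels off the text before the first comma and advances past it.

```shell
list="WNLB-CMTS-01-1,WNLB-CMTS-02-2,WNLB-CMTS-05-5"

# Append a trailing comma, as the cursor query does with cmts_to_add || ','
rest="$list,"

# Peel one token per pass: the part before the first comma is the token
# (SUBSTR(l,1,INSTR(l,',')-1)); the remainder starts after it (INSTR(l,',')+1)
while [ -n "$rest" ]; do
  tok=${rest%%,*}
  rest=${rest#*,}
  echo "$tok"
done
# prints the three CMTS tokens, one per line, in their original order
```

Every token, including the last one, is visited exactly once, which is why the "last CMTS not found" symptom below points at the data rather than the loop.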
Control files:

LOAD DATA
INFILE 'cmts_data.csv'
APPEND
INTO TABLE STG_CMTS_DATA
WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
 AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  cmts_token "RTRIM(LTRIM(:cmts_token))",
  cmts_ip    "RTRIM(LTRIM(:cmts_ip))"
)
For dhcp:

LOAD DATA
INFILE 'dhcp_data.csv'
APPEND
INTO TABLE STG_DHCP_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
 AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  dhcp_token "RTRIM(LTRIM(:dhcp_token))",
  dhcp_ip    "RTRIM(LTRIM(:dhcp_ip))"
)
For link:

LOAD DATA
INFILE 'link_data.csv'
APPEND
INTO TABLE STG_LINK_DATA
WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
 AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  dhcp_token  "RTRIM(LTRIM(:dhcp_token))",
  cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))"
)
SHELL SCRIPT-
If [!-d / log]
then
Mkdir log
FI
If [!-d / finished]
then
mkdir makes
FI
If [!-d / bad]
then
bad mkdir
FI
nohup time sqlldr username/password@SID CONTROL = load_cmts_data.ctl LOG = log/ldr_cmts_data.log = log/ldr_cmts_data.bad DISCARD log/ldr_cmts_data.reject ERRORS = BAD = 100000 LIVE = TRUE PARALLEL = TRUE &
nohup time username/password@SID @load_cmts.sql
nohup time sqlldr username/password@SID CONTROL = load_dhcp_data.ctl LOG = log/ldr_dhcp_data.log = log/ldr_dhcp_data.bad DISCARD log/ldr_dhcp_data.reject ERRORS = BAD = 100000 LIVE = TRUE PARALLEL = TRUE &
time nohup sqlplus username/password@SID @load_dhcp.sql
nohup time sqlldr username/password@SID CONTROL = load_link_data.ctl LOG = log/ldr_link_data.log = log/ldr_link_data.bad DISCARD log/ldr_link_data.reject ERRORS = BAD = 100000 LIVE = TRUE PARALLEL = TRUE &
time nohup sqlplus username/password@SID @load_link.sql
MV *.log. / log
The problem I am facing is in loading the link table: I check whether the DHCP is present in the subntwk table; if not, I log an error and continue. If the CMTS is not found, I skip creating that link and log an error.
Note that multiple CMTSs can be associated with a single DHCP.
Links do get created in the link table, but for the last CMTS of each comma-separated list from the stg_link_data table, the log says the CMTS was not found.
For example:

DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2

Here, I expect to link dhcp-1-1-1 with wnlb-CMTS-01-1 and with wnlb-CMTS-02-2.
All of this data is present in the subntwk table, yet the log still says wnlb-CMTS-02-2 COULD NOT BE FOUND, even though it has already been loaded into the subntwk table.
The same thing happens with every CMTS that comes last in its list in the stg_link_data table (I think you see what I'm trying to explain).
But when I run the SQL scripts separately in SQL Developer, all the valid links are inserted into the link table.
The scripts should create 9 rows in the link table, whereas now they create only 5 rows.
I use COMMIT in my scripts as well, but it does not help.
If you run these scripts on your machine, let me know whether you get the same behavior.
Please give me a solution; I have tried many things since yesterday, but it's always the same.
This is the link-table log:

Link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
NO CMTS FOUND for device to create the link: wnlb-cmts-02-2
Link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
Link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create the link: wnlb-cmts-05-5
NO CMTS FOUND for device to create the link: wnlb-cmts-01-1
NO CMTS FOUND for device to create the link: wnlb-cmts-05-8
NO CMTS FOUND for device to create the link: wnlb-cmts-05-6
NO CMTS FOUND for device to create the link: wnlb-cmts-05-0
NO CMTS FOUND for device to create the link: wnlb-cmts-03-3
Link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
Link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
NO CMTS FOUND for device to create the link: wnlb-cmts-05-7
Device not found: wnlb-dhcp-1-13

IF YOU NEED MORE INFORMATION, PLEASE LET ME KNOW.
Thank you
I realized later that night that while loading into the staging tables from the UNIX machine, a carriage return was kept at the end of each line. That is why the last CMTS of each list was never found. I ran a DOS-to-UNIX conversion on the files and everything started working perfectly.
It was a dos2unix issue!
Thank you all for your interest; I got to learn new things, as I have only about 10 months of experience in PL/SQL and SQL.
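The fix can be seen in miniature with a quick shell experiment (an editorial sketch, not from the original post): a DOS-formatted line carries a trailing carriage return, so the last token in the list never compares equal until the CR is stripped, which is exactly what dos2unix does.

```shell
# Build a DOS-style line: printf keeps the trailing \r (command substitution
# strips only trailing newlines, not carriage returns)
line="$(printf 'DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2\r')"

# The last comma-separated token still ends in \r, so an exact match fails
last=${line##*,}
[ "$last" = "WNLB-CMTS-02-2" ] && echo "match" || echo "no match"   # no match

# Strip carriage returns, as dos2unix does, and the comparison succeeds
clean="$(printf '%s' "$line" | tr -d '\r')"
last=${clean##*,}
[ "$last" = "WNLB-CMTS-02-2" ] && echo "match" || echo "no match"   # match
```

Only the last field of each line is affected, which is why every CMTS except the final one in a list was matched correctly.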