FDMEE efficient data file loading, multiple accounts in columns
Hi FDMEE Experts.
I want to load data coming from the source system in CSV format, with the accounts in the header as columns. Apart from the Employee, Entity and Currency columns, the rest of the columns are accounts (most of them are Smart List values). Until now we were loading files using a rule file where we set up the accounts, did the mapping, and added the other application dimensions in the data file.
Employee ID | Entity | Period | Grade | Band | Employee Expected Transfer/Termination Month | Employee Expected Transfer/Termination Year | Employee Hire Month | Employee Hire Year | Employee Status | Employee Tax Location | Annual Comp Rate | Currency | Employee Target Bonus % | FTEs | Staff |
Now the question is:
1 - How can I deal with so many accounts in FDMEE, probably by adding the Account dimension to the import format?
2 - There is no separate data column; all the data comes in the respective account columns.
3 - The 12 periods are also coming in rows instead of columns.
Please suggest; I have no idea how to handle this. Much appreciated...
Thank you
VJ
Thank you, SH, for the idea to use the ATTR dimensions. I'm able to load my file now.
Solution: I imported the file into TDATASEG, then ran a SQL query to unpivot the ATTR account/amount columns, and then exported to Essbase.
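For anyone who finds this later, the unpivot step can be sketched roughly as below. This is a toy stand-in, not the real TDATASEG layout: the table, ATTR columns, and account names are made up, and SQLite is used only so the SQL is runnable end to end.

```python
import sqlite3

# Toy stand-in for TDATASEG: each ATTR column holds one account's amount.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tdataseg (entity TEXT, attr1 REAL, attr2 REAL)")
con.execute("INSERT INTO tdataseg VALUES ('E100', 10.0, 20.0)")

# Unpivot via UNION ALL: one SELECT per ATTR column, tagging the account name,
# so each wide row becomes one (entity, account, amount) row per account.
rows = con.execute("""
    SELECT entity, 'Annual_Comp_Rate' AS account, attr1 AS amount FROM tdataseg
    UNION ALL
    SELECT entity, 'Target_Bonus_Pct', attr2 FROM tdataseg
""").fetchall()
print(sorted(rows))  # [('E100', 'Annual_Comp_Rate', 10.0), ('E100', 'Target_Bonus_Pct', 20.0)]
```

The resulting account/amount rows are in the shape an Essbase export expects, one amount per account member.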
Thank you
Vivek
Tags: Business Intelligence
Similar Questions
-
Hello
If we try to load data where cells have ',' as a separator, for example 34,6788,666.7, will the data load correctly, or should we remove the separator?
Some cells have '-' for missing data; should I change it to #Missing or 0.00 in my data file?
Thank you!
Does a '-' get loaded as a zero in Essbase through a data load rule?
I know a '-' gets sent to Essbase from Excel if the formatting in Excel turns 0s into '-'s, but that is because there are real zeros behind the '-'s.
I must say that I have never tried to load a '-' through a data load rule; I have always made missing data #Missing.
If the data file cannot be changed at the source, you can use the data load rule file to replace the '-' with #Missing. I prefer to do as few manipulations as possible in the rule file, because it is a pain to maintain.
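If preprocessing the file outside Essbase is acceptable, the replacement Cameron describes can be sketched like this. This is a standalone helper, not Essbase or load-rule functionality, and the tab-separated layout is an assumption:

```python
def dash_to_missing(line, sep="\t"):
    """Replace cells that are a bare '-' with #Missing, leaving
    negative numbers (e.g. -0.5) and everything else untouched."""
    return sep.join("#Missing" if cell.strip() == "-" else cell
                    for cell in line.split(sep))

print(dash_to_missing("Sales\t100\t-\t34,678.9"))
```

Run over each line of the data file before handing it to the load rule.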
Kind regards
Cameron Lackpour
-
Error loading data with a rule file in 11.1.2.1
When I do the dim build, the parent/child and alias will not load.
Reading Rule SQL Information For Database [DP]
Reading Rules From Rule Object For Database [DP]
Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
DATAERRORLIMIT reached [1000]. Rejected records will no longer be logged.
No data values modified by load of this data file.
Data Load Elapsed Time for [Accounts01.txt] with [ACCOUNT.rul]: [0.512] seconds
There were errors, look in C:\Oracle\Middleware\EPMSystem11R1\products\Essbase\eas\client\dataload.err
Completed import data ['RPIWORK'.'DP']
Out of the prepared columns: [0]
Outline = Measure dimension
Measure
Balance Sheet
Profit & Loss
Data file is a SQL query exported to Excel and saved as text (tab delimited):
Account.txt
Parent | Child | Alias
Balance Sheet 10000 Total Assets
Balance Sheet 20000 Total Liabilities
Balance Sheet 30000 Total Owner's Equity
Profit & Loss 40000 Revenues
Profit & Loss 50000 Cost of Goods Sold
Profit & Loss 60000 S.G.&A.
Profit & Loss 70000 Other Operating Income
Profit & Loss 80000 Other Expenses
Profit & Loss 85000 Interest Income/Expense
Profit & Loss 90000 Taxes
Profit & Loss 99000 Capitalized Contra
10000 10100 Short Term Assets
10000 14000 Land, Timber and Roads
10000 17000 Property, Plant and Equipment
10000 19000 Other Long Term Assets
10000 19750 Deferred Tax Assets
20000 20001 Short Term Liabilities
20000 21200 Other Long Term Liabilities
20000 22000 Timber Contracts
20000 25100 Long Term Debt
20000 26000 Deferred Tax Liability
30000 30001 Subsidiary Equity
30000 30250 Voting Common Stock
30000 30350 Common Stock
30000 30450 Additional Paid-In Capital
30000 30550 Retained Earnings
30000 30610 Other Comprehensive Income Items
30000 30675 Tax Distribution to Shareholde
40000 40055 Wholesales
40000 41500 Chip Revenues
40000 43000 Power Sales
40000 45000 Nursery Income
40000 46000 Log Transfers
40000 47000 Intraco Sales
40000 49000 Sales Deductions
50000 51000 Labor
50000 52000 Raw Materials
50000 52800 Rental
50000 53000 Operating Expenses
50000 53900 Forestry Supplies
50000 53999 Capitalized Contra - Forestry
50000 54000 Maintenance
50000 55000 Fuels & Lubricants
50000 56000 Utilities
50000 57000 Direct Logging Cost
50000 57500 Custom Services
50000 57700 Depletion
50000 58000 Cost of Goods Sold Allocations
50000 59000 Fixed Costs
50000 59510 Inventory Change
60000 60100 Salaries
60000 60300 PC Hardware Maintenance
60000 60400 Other G & A
60000 61000 Licenses/Fees/Charges
60000 61400 Benefits
60000 61550 Furniture/Fixtures
60000 61750 Legal
60000 62000 Office Expenses
60000 62500 Professional Services
60000 63000 Pre & Post Employment Activities
60000 63200 Telecommunication Costs
60000 63550 Employee Activities
60000 63800 Sales & Promotions
60000 63900 Bank Charges
60000 64000 Admin Depreciation
60000 64500 Insurance and Property Taxes
60000 65000 S G & A Allocations
60000 66000 Outside Management
70000 70100 Rental Income
70000 70200 Fixed Asset Disposals
70000 70400 Misc Income
80000 80200 Idle Plant
85000 85001 Interest Expense
85000 85200 Interest Income
90000 90100 Income Tax Expense
error file
\\ Member 20000 Not Found In Database
20000 25100 Long Term Debt
\\ Member 20000 Not Found In Database
20000 26000 Deferred Tax Liability
\\ Member 30000 Not Found In Database
30000 30001 Subsidiary Equity
\\ Member 30000 Not Found In Database
30000 30250 Voting Common Stock
\\ Member 30000 Not Found In Database
30000 30350 Common Stock
\\ Member 30000 Not Found In Database
30000 30450 Additional Paid-In Capital
\\ Member 30000 Not Found In Database
30000 30550 Retained Earnings
\\ Member 30000 Not Found In Database
30000 30610 Other Comprehensive Income Items
\\ Member 30000 Not Found In Database
30000 30675 Tax Distribution to Shareholde
\\ Member 40000 Not Found In Database
40000 40055 Wholesales
\\ Member 40000 Not Found In Database
40000 41500 Chip Revenues
\\ Member 40000 Not Found In Database
40000 43000 Power Sales
\\ Member 40000 Not Found In Database
40000 45000 Nursery Income
\\ Member 40000 Not Found In Database
40000 46000 Log Transfers
\\ Member 40000 Not Found In Database
40000 47000 Intraco Sales
\\ Member 40000 Not Found In Database
40000 49000 Sales Deductions
\\ Member 50000 Not Found In Database
50000 51000 Labor
\\ Member 50000 Not Found In Database
50000 52000 Raw Materials
\\ Member 50000 Not Found In Database
50000 52800 Rental
\\ Member 50000 Not Found In Database
50000 53000 Operating Expenses
\\ Member 50000 Not Found In Database
50000 53900 Forestry Supplies
\\ Member 50000 Not Found In Database
50000 53999 Capitalized Contra - Forestry
\\ Member 50000 Not Found In Database
50000 54000 Maintenance
\\ Member 50000 Not Found In Database
50000 55000 Fuels & Lubricants
\\ Member 50000 Not Found In Database
50000 56000 Utilities
\\ Member 50000 Not Found In Database
50000 57000 Direct Logging Cost
\\ Member 50000 Not Found In Database
50000 57500 Custom Services
\\ Member 50000 Not Found In Database
50000 57700 Depletion
\\ Member 50000 Not Found In Database
50000 58000 Cost of Goods Sold Allocations
\\ Member 50000 Not Found In Database
50000 59000 Fixed Costs
\\ Member 50000 Not Found In Database
50000 59510 Inventory Change
\\ Member 60000 Not Found In Database
60000 60100 Salaries
\\ Member 60000 Not Found In Database
60000 60300 PC Hardware Maintenance
\\ Member 60000 Not Found In Database
60000 60400 Other G & A
\\ Member 60000 Not Found In Database
60000 61000 Licenses/Fees/Charges
\\ Member 60000 Not Found In Database
60000 61400 Benefits
\\ Member 60000 Not Found In Database
60000 61550 Furniture/Fixtures
\\ Member 60000 Not Found In Database
60000 61750 Legal
\\ Member 60000 Not Found In Database
60000 62000 Office Expenses
\\ Member 60000 Not Found In Database
60000 62500 Professional Services
\\ Member 60000 Not Found In Database
60000 63000 Pre & Post Employment Activities
\\ Member 60000 Not Found In Database
60000 63200 Telecommunication Costs
\\ Member 60000 Not Found In Database
60000 63550 Employee Activities
\\ Member 60000 Not Found In Database
60000 63800 Sales & Promotions
\\ Member 60000 Not Found In Database
60000 63900 Bank Charges
\\ Member 60000 Not Found In Database
60000 64000 Admin Depreciation
\\ Member 60000 Not Found In Database
60000 64500 Insurance and Property Taxes
\\ Member 60000 Not Found In Database
60000 65000 S G & A Allocations
\\ Member 60000 Not Found In Database
60000 66000 Outside Management
\\ Member 70000 Not Found In Database
70000 70100 Rental Income
\\ Member 70000 Not Found In Database
70000 70200 Fixed Asset Disposals
\\ Member 70000 Not Found In Database
70000 70400 Misc Income
\\ Member 80000 Not Found In Database
80000 80200 Idle Plant
\\ Member 85000 Not Found In Database
85000 85001 Interest Expense
\\ Member 85000 Not Found In Database
85000 85200 Interest Income
\\ Member 90000 Not Found In Database
90000 90100 Income Tax Expense
That's how I built my load rule file:
Create -> Rules file
File -> Open data file -> account01.txt
Field -> Dimension Build Properties
Dimension = Measure
Field 1: Measure; Type = Parent
Field 2: Measure; Type = Child
Field 3: Measure; Type = Alias
Click Dimension Build Settings
Set the dimension build method to Parent/Child
OK
Validate - rule is correct
Save as account
Load data file = Account.txt
Rule file = account
OK
Edited by: next level on December 11, 2011 06:24
Edited by: next level on December 11, 2011 06:25
Edited by: next level on December 11, 2011 06:27
Use the drop-down list in EAS when you right-click on the database and select Load Data. One question: is the member "Measure" already in the outline? If it isn't, you will have problems; you would have to add it, or have a line at the top of the load file with something like "Account, Balance Sheet" in it. Also, in the load rule, have you changed the dimension build settings so the Accounts dimension is set to Parent/Child? Often enough, people don't realize they have to double-click the dimension name to make sure it gets set as the dimension being changed.
I'm sure your question was about loading the data and not the dim build, but this might just be the first problem.
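The "Member not found" errors above all point at parents that do not exist yet when their children are loaded. A quick way to see which parents are referenced before they exist is to scan the parent/child file in order. This is just an illustrative helper, not part of Essbase or EAS; the sample records and outline members are made up:

```python
def check_parent_child(records, existing_members=()):
    """Given (parent, child) tuples in file order, return the parents that
    are referenced before they exist in the outline or as an earlier child."""
    known = set(existing_members)
    missing = []
    for parent, child in records:
        if parent not in known:
            missing.append(parent)
        known.add(child)
    return missing

rows = [("Measure", "Balance Sheet"), ("Balance Sheet", "10000"), ("20000", "25100")]
print(check_parent_child(rows, existing_members={"Measure"}))  # ['20000']
```

If this reports members like 20000 or 30000, the fix is the one described above: make sure those members are created (or listed as children earlier in the file) before the rows that use them as parents.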
-
Hello
I need assistance with the values in my data file. I've searched and cannot find info on it; I'm sure it's out there, but I've already spent three and a half hours researching.
Can someone help me?
I have three files:
a SQL file that creates a single table,
a control file,
and a data file.
The table contains a date column and a CREATED_BY column.
I have these columns in my control file with their positions.
My question is:
In the data file, can I use SYSDATE as the value for the date column?
If so, will it hold the actual date when it's loaded?
Second:
In my data file, can I use the USER name as a value?
I want the user (like me) who loads the data file to be able to verify that it worked; I should see my username in the CREATED_BY column.
Thanks for your help
kabrajo
If I understand correctly, you want to load the current date/time and the current user for each line.
Here's a recent thread similar to what you need - documentation links are there:
"user input windows OS when loading data using sql loader"
In that case the poster was interested in the OS user; your question is simpler. You can use lines like these in your control file for the username and datestamp columns:
..., datestamp EXPRESSION "sysdate", username EXPRESSION "user", ...
-
Loading multiple pieces of data from an XML file?
Hello
I have a problem loading multiple pieces of data from the same XML file. In this XML file I have a list of dishes I want to add to a menu. So how can I load data from different branches of the XML file into my text fields?
Thanks in advance,
Rafael Carignato
Found a way to do it. I created X text elements named "textbox1", "textbox2", etc. Here is the code:
var i = 1;
var message;
$.ajax({
    type: "GET",
    url: "books.xml",
    dataType: "xml",
    success: function (xml) {
        $(xml).find('Book').each(function () {
            var sTitle = $(this).find('Title').text();
            var sAuthor = $(this).find('Author').text();
            var sGenre = $(this).find('Genre').text();
            message = sTitle + ', ' + sAuthor + ', ' + sGenre;
            console.log("Message: " + message);
            console.log("i: " + i);
            sym.$("textbox" + i).html(message);
            console.log("");
            i++;
        });
    }
});
-
FDMEE requires ODI agent even for simple file loading?
Hello
My ODI agent isn't working because of an OS problem. I am trying to load a simple file using FDMEE, and when I try to run the data load rule it says the ODI agent has failed.
My question is:
Does FDMEE require the ODI agent even for a simple file load? FDM did not require the ODI agent for simple file loads.
Regards
Yes, as I mentioned in my previous post, FDMEE is built on top of ODI, so it runs ODI scenarios for all integration flows.
-
sqlldr to load two tables from a single data file in a single operation
Oracle 11.2.0.3 SE-One
Oracle Linux 5.6
I don't know if I need a second set of eyes or if I am missing something.
Problem: given a CSV text file with header and detail records (identified by the first field in the file), use SQL*Loader to load the header and detail tables in a single operation.
The header record loads fine, but the detail records are rejected as failing the WHEN clause.
More comments after reading through the exhibits.
Given these two tables:
SQL > desc EDSTEST_HEADER
Name Null? Type
----------------------------------------- -------- ----------------------------
EDSTEST_HEADER_ID NOT NULL NUMBER
REC_TYPE VARCHAR2 (10)
SOLD_TO_ACCOUNT VARCHAR2 (50)
SCAC_RECEIVER_ID VARCHAR2 (50)
FORMAT_TYPE VARCHAR2 (10)
CLIENT_NAME VARCHAR2 (100)
CUSTOMER_PICKUP_ADDRESS VARCHAR2 (100)
CUSTOMER_PICKUP_CITY VARCHAR2 (50)
CUSTOMER_PICKUP_STATE VARCHAR2 (10)
CUSTOMER_PICKUP_ZIP VARCHAR2 (50)
INSERT_USER VARCHAR2 (50)
INSERT_USER_DATE DATE
INSERT_STATUS_CODE VARCHAR2 (10)
SQL > desc EDSTEST_DETAIL
Name Null? Type
----------------------------------------- -------- ----------------------------
EDSTEST_DETAIL_ID NOT NULL NUMBER
EDSTEST_HEADER_ID NUMBER
REC_TYPE VARCHAR2 (10)
SHIP_TO_NAME VARCHAR2 (100)
SHIP_TO_ADDRESS VARCHAR2 (100)
SHIP_TO_CITY VARCHAR2 (50)
SHIP_TO_STATE VARCHAR2 (10)
SHIP_TO_ZIP VARCHAR2 (50)
STATUS_OR_APPT_REASON_CD VARCHAR2 (10)
EVENT_DESCRIPTION VARCHAR2 (50)
SHIPMENT_STATUS_CD VARCHAR2 (10)
SHIPMENT_EVENT_DATE VARCHAR2 (10)
SHIPMENT_EVENT_TIME VARCHAR2 (10)
EVENT_TIME_ZONE VARCHAR2 (30)
EVENT_CITY VARCHAR2 (100)
EVENT_STATE VARCHAR2 (50)
EVENT_ZIP VARCHAR2 (50)
CUSTOMER_CONFIRM VARCHAR2 (100)
DELIVERY_CONFIRM VARCHAR2 (100)
TRACKING_NUMBER VARCHAR2 (50)
MAIL_REC_WEIGHT VARCHAR2 (20)
MAIL_REC_WEIGHT_CD VARCHAR2 (10)
MAIL_RED_QTY VARCHAR2 (10)
INSERT_USER VARCHAR2 (50)
INSERT_USER_DATE DATE
INSERT_STATUS_CODE VARCHAR2 (10)
Given this data file:
oracle:mydb$ cat eds_edstest.dat
HDR,0005114090,MYORG,CSV,MY COMPANY NAME,123 ELM ST,STUCKYVILLE,OH,12345
DTL,TOADSUCK,NC,27999,NS,ARRIVED AT UNIT,X4,20140726,063100,ET,TOADSUCK,NC,27999,12345,23456,.3861,LBS,1
DTL,TOADSUCK,NC,27999,NS,SORTING COMPLETE,X6,20140726,080000,ET,TOADSUCK,NC,27999,12345,23456,.3861,LBS,1
DTL,TOADSUCK,NC,27999,NS,DELIVERED,D1,20140726,121800,ET,TOADSUCK,NC,27999,12345,23456,.3861,LBS,1
Given this SQL*Loader control file:
oracle:mydb$ cat eds_edstest_combined.ctl
LOAD DATA
INFILE '/xfers/oracle/myapp/data/eds_edstest.dat'
BADFILE '/xfers/oracle/myapp/data/eds_edstest.bad'
DISCARDFILE '/xfers/oracle/myapp/data/eds_edstest.dsc'
APPEND
INTO TABLE estevens.edstest_header
WHEN (rec_type = 'HDR')
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(rec_type CHAR,
sold_to_account CHAR,
scac_receiver_id CHAR,
format_type CHAR,
client_name CHAR,
customer_pickup_address CHAR,
customer_pickup_city CHAR,
customer_pickup_state CHAR,
customer_pickup_zip CHAR,
insert_user "1",
insert_user_date sysdate,
insert_status_code CONSTANT 'I')
INTO TABLE estevens.edstest_detail
WHEN (rec_type = 'DTL')
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(rec_type CHAR,
ship_to_name CHAR,
ship_to_address CHAR,
ship_to_city CHAR,
ship_to_state CHAR,
ship_to_zip CHAR,
status_or_appt_reason_cd CHAR,
event_description CHAR,
shipment_status_cd CHAR,
shipment_event_date CHAR,
shipment_event_time CHAR,
event_time_zone CHAR,
event_city CHAR,
event_state CHAR,
event_zip CHAR,
customer_confirm CHAR,
delivery_confirm CHAR,
tracking_number CHAR,
mail_rec_weight CHAR,
mail_rec_weight_cd CHAR,
mail_red_qty CHAR,
insert_user "1",
insert_user_date sysdate,
insert_status_code CONSTANT 'I')
-- END CONTROL FILE
And the run-time log:
SQL*Loader: Release 11.2.0.3.0 - Production on Tue Jul 29 07:50:04 2014
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Control File: /xfers/oracle/myapp/control/eds_edstest_combined.ctl
Data File: /xfers/oracle/myapp/data/eds_edstest.dat
Bad File: /xfers/oracle/myapp/data/eds_edstest.bad
Discard File: /xfers/oracle/myapp/data/eds_edstest.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 10000 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Silent options: FEEDBACK
Table ESTEVENS.EDSTEST_HEADER, loaded when REC_TYPE = 0X484452(character 'HDR')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
REC_TYPE FIRST * , O(") CHARACTER
SOLD_TO_ACCOUNT NEXT * , O(") CHARACTER
SCAC_RECEIVER_ID NEXT * , O(") CHARACTER
FORMAT_TYPE NEXT * , O(") CHARACTER
CLIENT_NAME NEXT * , O(") CHARACTER
CUSTOMER_PICKUP_ADDRESS NEXT * , O(") CHARACTER
CUSTOMER_PICKUP_CITY NEXT * , O(") CHARACTER
CUSTOMER_PICKUP_STATE NEXT * , O(") CHARACTER
CUSTOMER_PICKUP_ZIP NEXT * , O(") CHARACTER
INSERT_USER NEXT * , O(") CHARACTER
    SQL string for column : "1"
INSERT_USER_DATE SYSDATE
INSERT_STATUS_CODE CONSTANT
    The value is 'I'
Table ESTEVENS.EDSTEST_DETAIL, loaded when REC_TYPE = 0X44544c(character 'DTL')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
REC_TYPE NEXT * , O(") CHARACTER
SHIP_TO_NAME NEXT * , O(") CHARACTER
SHIP_TO_ADDRESS NEXT * , O(") CHARACTER
SHIP_TO_CITY NEXT * , O(") CHARACTER
SHIP_TO_STATE NEXT * , O(") CHARACTER
SHIP_TO_ZIP NEXT * , O(") CHARACTER
STATUS_OR_APPT_REASON_CD NEXT * , O(") CHARACTER
EVENT_DESCRIPTION NEXT * , O(") CHARACTER
SHIPMENT_STATUS_CD NEXT * , O(") CHARACTER
SHIPMENT_EVENT_DATE NEXT * , O(") CHARACTER
SHIPMENT_EVENT_TIME NEXT * , O(") CHARACTER
EVENT_TIME_ZONE NEXT * , O(") CHARACTER
EVENT_CITY NEXT * , O(") CHARACTER
EVENT_STATE NEXT * , O(") CHARACTER
EVENT_ZIP NEXT * , O(") CHARACTER
CUSTOMER_CONFIRM NEXT * , O(") CHARACTER
DELIVERY_CONFIRM NEXT * , O(") CHARACTER
TRACKING_NUMBER NEXT * , O(") CHARACTER
MAIL_REC_WEIGHT NEXT * , O(") CHARACTER
MAIL_REC_WEIGHT_CD NEXT * , O(") CHARACTER
MAIL_RED_QTY NEXT * , O(") CHARACTER
INSERT_USER NEXT * , O(") CHARACTER
    SQL string for column : "1"
INSERT_USER_DATE SYSDATE
INSERT_STATUS_CODE CONSTANT
    The value is 'I'
value used for ROWS parameter changed from 10000 to 30
Record 2: Discarded - failed all WHEN clauses.
Record 3: Discarded - failed all WHEN clauses.
Record 4: Discarded - failed all WHEN clauses.
Table ESTEVENS.EDSTEST_HEADER:
1 Row successfully loaded.
0 Rows not loaded due to data errors.
3 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Table ESTEVENS.EDSTEST_DETAIL:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.
4 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 247800 bytes(30 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 4
Total logical records rejected: 0
Total logical records discarded: 3
Run began on Tue Jul 29 07:50:04 2014
Run ended on Tue Jul 29 07:50:04 2014
Elapsed time was: 00:00:00.07
CPU time was: 00:00:00.01
This runs on Linux, and the data file comes from a Windows system, but by the time we get to it, it's in *nix format, with a simple x'0A' as the line terminator.
If, in the control file, I comment out the INTO TABLE block for the header table, the detail records insert just fine.
If, in the control file (back to the original two-table load), I change the line
INFILE '/xfers/oracle/myapp/data/eds_edstest.dat'
to read
INFILE '/xfers/oracle/myapp/data/eds_edstest.dat' "str '\n'"
the logged result becomes:
Table ESTEVENS.EDSTEST_HEADER:
1 Row successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Table ESTEVENS.EDSTEST_DETAIL:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.
1 Row not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
I'm trying to help the developer on this, and he resists changing to external tables. Even if I can overcome that, I now have a puzzle I want to solve, just to add to my knowledge. Plus, I have some concern at this point that whatever I'm missing here could also come into play if I convert to external tables.
Ed,
Are you sure you put the POSITION(1) in the right place? It needs to go in the first field definition of each INTO TABLE clause after the first one; it is optional in the first INTO TABLE clause. When I use the following with what you have provided, it loads 1 record into the header table and 3 records into the detail table. Is that what you actually did, or did you tell your developer to do it and assume he understood and put it in the right place?
LOAD DATA
INFILE 'eds_edstest.dat'
BADFILE 'eds_edstest.bad'
DISCARDFILE 'eds_edstest.dsc'
APPEND
INTO TABLE edstest_header
WHEN (rec_type = 'HDR')
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(rec_type CHAR,
sold_to_account CHAR,
scac_receiver_id CHAR,
format_type CHAR,
client_name CHAR,
customer_pickup_address CHAR,
customer_pickup_city CHAR,
customer_pickup_state CHAR,
customer_pickup_zip CHAR,
insert_user "1",
insert_user_date sysdate,
insert_status_code CONSTANT 'I')
INTO TABLE edstest_detail
WHEN (rec_type = 'DTL')
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(rec_type POSITION(1) CHAR,
ship_to_name CHAR,
ship_to_address CHAR,
ship_to_city CHAR,
ship_to_state CHAR,
ship_to_zip CHAR,
status_or_appt_reason_cd CHAR,
event_description CHAR,
shipment_status_cd CHAR,
shipment_event_date CHAR,
shipment_event_time CHAR,
event_time_zone CHAR,
event_city CHAR,
event_state CHAR,
event_zip CHAR,
customer_confirm CHAR,
delivery_confirm CHAR,
tracking_number CHAR,
mail_rec_weight CHAR,
mail_rec_weight_cd CHAR,
mail_red_qty CHAR,
insert_user "1",
insert_user_date sysdate,
insert_status_code CONSTANT 'I')
-
Error when loading: field in data file exceeds maximum length
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for Solaris: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production
I am trying to load a small table (110 rows, 6 columns). One of the columns, called NOTES, throws an error when I run the load, saying that the column size exceeds the max limit. As you can see here, the table column is 4000 bytes:
CREATE TABLE NRIS.NRN_REPORT_NOTES
(
NOTES_CN VARCHAR2(40 BYTE) DEFAULT sys_guid() NOT NULL,
REPORT_GROUP VARCHAR2(100 BYTE) NOT NULL,
ZIPCODE VARCHAR2(50 BYTE) NOT NULL,
ROUND NUMBER(3) NOT NULL,
NOTES VARCHAR2(4000 BYTE),
LAST_UPDATE TIMESTAMP(6) WITH TIME ZONE DEFAULT systimestamp NOT NULL
)
TABLESPACE USERS
RESULT_CACHE (MODE DEFAULT)
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 80K
NEXT 1M
MINEXTENTS 1
MAXEXTENTS UNLIMITED
PCTINCREASE 0
BUFFER_POOL DEFAULT
FLASH_CACHE DEFAULT
CELL_FLASH_CACHE DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;
I did a little investigating, and it does not add up.
When I run
select max(lengthb(notes)) from NRIS.NRN_REPORT_NOTES
I get a return of
643
.
Which tells me that the largest value in this column is only 643 bytes. But EVERY insert fails.
Here is the header of the file loader and first couple of inserts:
DOWNLOAD THE DATA
INFILE *.
BADFILE '. / NRIS. NRN_REPORT_NOTES. BAD'
DISCARDFILE '. / NRIS. NRN_REPORT_NOTES. DSC"
ADD IN THE NRIS TABLE. NRN_REPORT_NOTES
Fields ended by '; '. Eventually framed by ' |'
(
NOTES_CN,
REPORT_GROUP,
Zip code
ALL ABOUT NULLIF (R = 'NULL'),
NOTES,
LAST_UPDATE TIMESTAMP WITH TIME ZONE ' MM/DD/YYYY HH24:MI:SS. FF9 TZR' NULLIF (LAST_UPDATE = 'NULL')
)
BEGINDATA
|E2ACF256F01F46A7E0440003BA0F14C2|;|DEMOGRAPHIC DATA|;|A01003|;3;|Demographic results show that 46% of visits are made by women. Among racial and ethnic minorities, the most often encountered are Native American (4%) and Hispanic/Latino (2%). The breakdown by age shows that the Bitterroot has a relatively low proportion of children under 16 (14%) in the visit population. People over 60 represent about 22% of visits. Most of the visitation comes from the region. More than 85% of the visits come from people who live within 50 miles.|;07/29/2013 16:09:27.000000000 -06:00
|E2ACF256F02046A7E0440003BA0F14C2|;|VISIT DESCRIPTION|;|A01003|;3;|Most visits to the Bitterroot are relatively short. More than half of the visits last less than 3 hours. The median duration for overnight site visits is about 43 hours, or about 2 days. The average Wilderness visit lasts only about 6 hours, although more than half of these visits are shorter than 3 hours. Most of the visits come from frequent visitors. Over thirty percent are made by people who visit between 40 and 100 times a year. Another 8% of visits are from people who say they visit more than 100 times a year.|;07/29/2013 16:09:27.000000000 -06:00
|E2ACF256F02146A7E0440003BA0F14C2|;|ACTIVITIES|;|A01003|;3;|The most often reported main activity is hiking (42%), followed by alpine skiing (12%) and hunting (8%). More than half of the visits report participating in relaxing and viewing scenery.|;07/29/2013 16:09:27.000000000 -06:00
Here's the start of the loader log, ending after the first row. (They ALL give the same error.)
SQL*Loader: Release 10.2.0.4.0 - Production on Thu Aug 22 12:09:07 2013
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: NRIS.NRN_REPORT_NOTES.CTL
Data File: NRIS.NRN_REPORT_NOTES.CTL
Bad File: ./NRIS.NRN_REPORT_NOTES.BAD
Discard File: ./NRIS.NRN_REPORT_NOTES.DSC
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table NRIS.NRN_REPORT_NOTES, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
NOTES_CN FIRST * ; O(|) CHARACTER
REPORT_GROUP NEXT * ; O(|) CHARACTER
ZIPCODE NEXT * ; O(|) CHARACTER
ROUND NEXT * ; O(|) CHARACTER
    NULL if ROUND = 0X4e554c4c(character 'NULL')
NOTES NEXT * ; O(|) CHARACTER
LAST_UPDATE NEXT * ; O(|) DATETIME MM/DD/YYYY HH24:MI:SS.FF9 TZR
    NULL if LAST_UPDATE = 0X4e554c4c(character 'NULL')
Record 1: Rejected - Error on table NRIS.NRN_REPORT_NOTES, column NOTES.
Field in data file exceeds maximum length.
I don't see why this should fail.
Hello,
the problem is SQL*Loader's default field datatype: CHAR(255)... very helpful, I know...
You need to tell sqlldr that the data is longer than that.
So change NOTES to NOTES CHAR(4000) in your control file and it should work.
Cheers,
Harry
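Harry's point can be checked up front: scan the data file for the widest value in each field, and give any field over 255 bytes an explicit CHAR(n) in the control file. A rough standalone sketch (the sample rows and the ';' delimiter are assumptions, not the actual file):

```python
def max_field_lengths(lines, sep=";"):
    """Return {field_index: widest value in bytes} for a delimited file,
    to spot fields that exceed SQL*Loader's default CHAR(255)."""
    widths = {}
    for line in lines:
        for i, field in enumerate(line.rstrip("\n").split(sep)):
            widths[i] = max(widths.get(i, 0), len(field.encode("utf-8")))
    return widths

sample = ["a;bb;ccc", "dddd;e;ff"]
print(max_field_lengths(sample))  # {0: 4, 1: 2, 2: 3}
```

Any field whose width comes back above 255 is a candidate for an explicit CHAR(n) clause.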
-
How to load multiple files into multiple tables using one control file?
Hello all,
I have four different tables with similar structures, getting data from four different data files. I would like to use one control file to load the data from the four different files into the four different tables.
Here's the DDL for the tables:
CREATE TABLE Product_Sales( Year_of_Sale NUMBER, Product_Type VARCHAR2(25 CHAR), Product_Group VARCHAR2(25 CHAR), Category_Type VARCHAR2(25 CHAR), Category_Group VARCHAR2(10 CHAR), Product_Count NUMBER, Product_Amount NUMBER(19,2), Category_Count NUMBER, Category_Amount NUMBER(19,2) )
CREATE TABLE Retail_Sales( Year_of_Sale NUMBER, Product_Type VARCHAR2(25 CHAR), Product_Group VARCHAR2(25 CHAR), Category_Type VARCHAR2(25 CHAR), Category_Group VARCHAR2(10 CHAR), Product_Count NUMBER, Product_Amount NUMBER(19,2), Category_Count NUMBER, Category_Amount NUMBER(19,2) )
-
Option to capture the name of the data file in SQL*Loader
Hello
I have a requirement to capture the name of the data file in the staging table. Is it possible to capture it in SQL*Loader, or is there any other way to do it?
Need suggestions from experts, please.
Thank you
Genoo
Hi Genoo,
Please use the basename command in Linux to capture the name of the file currently being processed. I don't have access to a Linux machine at the moment, so please check at your end.
BTW, since your data file names vary and are not consistent, you need to change the data file to include the file name. Unless that is approved by your management I don't see alternatives, since the data file is what fills the source tables. If it is approved, then you can do the following:
sed -i "s/$/,$(basename $0)/"
So, for example, if you have a data file named 01test.csv and it contains data such as:
11,AAA,ABC
22,BBB,BCD
the command sed -i "s/$/,01test.csv/" 01test.csv will turn it into:
11,AAA,ABC,01test.csv
22,BBB,BCD,01test.csv
Then use SQL*Loader to load the file.
This is the closest solution I can think of. There is no way to achieve your goal with SQL*Loader features alone; you have to use Linux techniques as a workaround.
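The same preprocessing can be done in Python instead of sed, if that is easier to schedule on your side. This is an illustrative sketch (the file name and comma-delimited layout are hypothetical):

```python
import os

def append_filename_column(path):
    """Append the data file's own base name as a last column on every row,
    so SQL*Loader can load it into the staging table like any other field."""
    name = os.path.basename(path)
    with open(path) as f:
        rows = [line.rstrip("\n") for line in f]
    with open(path, "w") as f:
        for row in rows:
            f.write(f"{row},{name}\n")
```

After running it on 01test.csv, each row ends in ",01test.csv", and the control file just needs one extra column for it.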
Best regards
-
SQL Loader failed to load the same source sometimes data file
I encounter different kinds of data loading errors when trying to load data. What makes it really annoying is that I'm not able to identify the real culprit, since the success of the load depends on the number of lines in the source data file, not on its content. I use Toad's dataset export feature to create the delimited data set. When I take only the first 50 lines, the data loads into the target table successfully. When I take the first 150 lines, the load fails, indicating:
Record 13: Rejected - Error on table ISIKUD, column SYNLAPSI. ORA-01722: invalid number
From the .bad file I can see that the same line was loaded successfully when the 50-row data file was used. The content of that column for this particular row is NULL (an empty string).
I suspect that Toad generates a faulty delimited text file when taking 150 lines. For confidentiality reasons I can't show the data file. What can I do, and how can I investigate this problem further? The table that must be loaded by SQL*Loader is almost 600 MB. I use Windows XP.
Edited by: totalnewby on June 30, 2012 10:17
I do not believe the 11g sqlldr client lets you load into a 10g database. Can you run sqlldr on the 10g database server itself? Please also post the rest of the information I asked for.
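One way to narrow down the culprit row without sharing the confidential file is to scan the suspect column for values that are not plain numbers. A minimal sketch, assuming a comma-delimited file and assuming SYNLAPSI is the 5th field (both are guesses you would adjust; extend the pattern if decimals are valid):

```shell
#!/bin/sh
# Sample rows: field 5 should be numeric or empty (hypothetical data)
printf '1,a,b,c,10\n2,a,b,c,x7\n3,a,b,c,\n' > export.csv

# Print any record whose 5th field is neither empty nor all digits
awk -F',' '$5 !~ /^[0-9]*$/ { print "line " NR ": " $0 }' export.csv
# line 2: 2,a,b,c,x7
```

Running this over both the 50-row and the 150-row exports would show whether Toad really emits a malformed value past row 50.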
HTH
Srini
-
Problem with the rules file while loading data files
Hello
I'm having a problem trying to load data into an application that uses rules files. The data files were exported from another application using DATAEXPORT. I have two files for each entity: one for the current year and one for the next year. The source and target applications are different, so I do all the mapping using rules files.
The current-year data file contains data for &CurrMonth:Dec of &CurrYear. The next-year data file contains data for Jan:Dec of the next year. I made rules files for each entity to load the data files into the application, and did the mapping when creating these rules files assuming a full set of 12 month columns.
The problem I am facing is that I am able to load the next year correctly into the target application; my rules file maps the right month for each column. But for the current year I have just 2 columns of data (since &CurrMonth = Nov), so when I try to load the data file using the rules file, it maps the November and December columns as January and February and loads the data into those months of FY11, which is wrong.
I don't know how to solve this. The current-year data file will always change in terms of which months it contains, so how can I modify the rules file so that it recognizes which months are coming in and where to put the data?
Please let me know your suggestions. Any help will be appreciated.
Thank you!
~ Hervé
PS ~ I faced the same problem a few months back when I posted this question, and I followed the advice in that post, but I still face the issue.
Edited by: Gwen on December 20, 2011 08:25
Adella,
We faced a similar issue and were able to solve it by pasting the other dimension names in front of the month members (&currmonth:dec) on the first line of the export file, so that it becomes a usable header.
Suppose your export file is currmonth.txt and it contains Nov and Dec.
The first line of your export file should look like this:
"Version" "Year" "Scenario" "XYZ" "123" "ABC" "Account" "Nov" "Dec"
We created a batch file to add this to the original export file; it looks like this:
paste -d "\0" Header.txt "D:\Hyperion\products\Essbase\EssbaseServer\app\xyz\123\currmonth.txt" > "D:\Hyperion\products\Essbase\EssbaseServer\app\xyz\121\currmonthchange.txt"
Header.txt is a text file that contains: "Version" "Year" "Scenario" "XYZ" "123" "ABC" "Account"
Then change your rules file so that it reads the header from your source (currmonthchange.txt):
Rules file --> Data source properties --> Header --> Data source header records --> Record containing data load field names (set to 1)
I hope this will solve your problem
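The paste trick above can be sketched with throwaway files. The member names and values here are just the placeholders from this thread, and it assumes the export file's first line holds only the month names:

```shell
#!/bin/sh
# Header.txt: one line with the dimension names (placeholder members)
printf '"Version" "Year" "Scenario" "XYZ" "123" "ABC" "Account" \n' > Header.txt

# currmonth.txt: a DATAEXPORT-style file whose first line is just the months
printf '"Nov" "Dec"\n100 200\n' > currmonth.txt

# Join line by line with no delimiter, so the dimension names land in front
# of the months on the first line; later data lines are left untouched
paste -d "\0" Header.txt currmonth.txt > currmonthchange.txt

head -1 currmonthchange.txt
# "Version" "Year" "Scenario" "XYZ" "123" "ABC" "Account" "Nov" "Dec"
```

Because Header.txt has only one line, paste prepends it to the export file's first line and emits every subsequent data line unchanged, which is exactly what a "header record 1" load rule expects.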
-
Loading multiple files using the same interface in ODI
Hi all
We load multiple files using the same interface and get the error "java.sql.SQLException: ORA-00942: table or view does not exist" while inserting records into the staging table. It looks like the same temporary table is used when loading multiple files concurrently, which causes the error. Grateful if someone can offer a solution to avoid this error.
We use the following KMS:
(1) LKM SQL file
(2) IKM Oracle SQL COMMAND append.
A quick response would be appreciated.
Thank you
RP
Hello
See this http://odiexperts.com/interface-parallel-execution-a-new-solution
Thank you
Fati -
Need data files and rules files to build dimensions and load data for Sample.Basic
I urgently need data files to build dimensions dynamically, and also the rules files for loading data into the Sample.Basic application for Essbase 9. Kindly let me know if anyone can provide them, or a link from where I can get them.
Thanks in advance.
The outline and all the data (calcdat.txt) are included in the zip, so you don't need to rebuild anything with load rules.
Cheers
John
http://John-Goodwin.blogspot.com/ -
How does Photoshop decide which dict (tw10428.dat) file must be loaded?
I want to add some ZStrings to the tw10428.dat dictionary files. On my computer the file is located in the folder "C:\Program Files\Adobe\Adobe Photoshop CS5\Locales\XX_YY\Support Files", where XX_YY stands for a locale such as en_US, zh_TW, etc. How does Photoshop decide which folder's dict file to load? Through the installer settings, by reading locale information (i.e. the codepage) at runtime, or through the locale folder?
And another question: after I modify tw10428.dat, if a Photoshop update is installed, will this dict file be replaced? If so, the ZStrings I added would be lost; how can I keep them?
Thanks for all your advice.
There is no method to tell Photoshop to load a different dat file full of ZStrings.
I don't know of any way to determine which language Photoshop is running in without Photoshop running.