Batch loading of semantic data

I'm trying to load an N3 file into Oracle 11g. Basically, it's the same data as in the family example (here: http://tinyurl.com/yl9zqtl), which I put into an N3 file. I'm running a command like the one below (from: http://tinyurl.com/yjp4olv):


java -Ddb.user=<dbuser> -Ddb.password=<pass> -Ddb.host=127.0.0.1 -Ddb.port=11521 -Ddb.sid=oracle -cp sdordf.jar:ojdbc6.jar oracle.spatial.rdf.client.BatchLoader family.n3 family_rdf_data rdf_tablespace fam

The "family.n3" file looks like:

@prefix fam: <http://www.example.org/family.rdf#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .

fam:John rdf:type fam:Male ;
fam:fatherOf fam:Suzie ;
fam:fatherOf fam:Matt .

fam:Janice fam:motherOf fam:Suzie ;
fam:motherOf fam:Matt .

fam:Sammy fam:fatherOf fam:Cathy ;
fam:fatherOf fam:Jack .

...etc


Then, I get an error message in the BatchLoader program:

The batch loading has begun...
Subject: @prefix
Property: rdf:
Subject: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>

Maximum loading of rows = 2
java.lang.StringIndexOutOfBoundsException: String index out of range: -1
at java.lang.String.substring(String.java:1937)
at oracle.spatial.rdf.client.NTripleTokenizer.parse(NTripleTokenizer.java:71)
at oracle.spatial.rdf.client.NTripleConverter.loadNTriple(NTripleConverter.java:534)
at oracle.spatial.rdf.client.BatchLoader.main(BatchLoader.java:302)


If I remove the second line from the top, it then errors on the first line. Am I doing something wrong? I pasted the N3 file into an N3 validator (found here: http://www.rdfabout.com/demo/validator/) and it validates, so the file at least looks like well-formed N3.

Thank you

Ryan

Hi Ryan,

The batch loader in sdordf.jar does not accept the N3 format. You can try one of the following:
- use the Jena Adapter (which accepts N3, N-Triples and RDF/XML), or
- convert the N3 to N-Triples format and retry the batch load (see the sketch below).
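
For the second option, a minimal command-line sketch using the jena.rdfcat utility from the classic Jena distribution (an illustration only: it assumes jena.jar and its supporting jars are on the classpath, and that your Jena version supports the -out and -n flags):

java -cp jena.jar jena.rdfcat -out N-TRIPLE -n family.n3 > family.nt

You can then point the BatchLoader at family.nt instead of family.n3.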

Regards,

Zhe Wu

Tags: Database

Similar Questions

  • Update on-hand quantities through data loader

    Dear all,

    I need to update the on-hand quantities of a number of items (5000 items).

    I hope I can do this with data loader, but which screen do I need to update the stock? Is this the way, or is there another way I can accomplish this stock update...

    Please advise...

    Thanks in advance...

    Hello,
    Updating the quantity of 5000 items through data loader will not be a good option. Updating/loading data with data loader is good for small numbers of records only; when the volume is high it is preferable to use an interface program/API, and that way you can save a lot of your time and energy :)

    Why don't you use the approach below for updating on-hand quantity:

    1) Populate the on-hand quantity interface table (apps.mtl_transactions_interface) with data for the items whose on-hand quantity you want to change.
    2) Then use the API below to update the on-hand quantity.

    apps.inv_txn_manager_pub.process_transactions(
        p_api_version      => p_api_version,
        p_init_msg_list    => p_init_msg_list,
        p_commit           => p_commit,
        p_validation_level => p_validation_level,
        x_return_status    => x_return_status,
        x_msg_count        => x_msg_count,
        x_msg_data         => x_msg_data,
        x_trans_count      => x_trans_count,
        p_table            => p_table,
        p_header_id        => p_header_id);

    This is a back-end suggestion (the onhand API).

    Updating the items' on-hand quantity this way will be comparatively faster than data loader. Please take this as a suggestion; I am in no way challenging your data loader approach. :)
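
    As a rough sketch of step 1 (an illustration only: the IDs below are hypothetical placeholders, the standard WHO columns and several other required columns are omitted for brevity, and the transaction codes should be verified against your own instance):

    insert into apps.mtl_transactions_interface
        (transaction_interface_id, transaction_header_id,
         transaction_mode, process_flag,
         transaction_type_id, inventory_item_id, organization_id,
         subinventory_code, transaction_quantity, transaction_uom,
         transaction_date, source_code, source_header_id, source_line_id)
    values
        (mtl_material_transactions_s.nextval, 1001, -- hypothetical header id
         3,          -- 3 = process in background
         1,          -- 1 = ready for the transaction manager
         42,         -- e.g. miscellaneous receipt (verify on your instance)
         12345, 204, -- hypothetical item id and organization id
         'STORES', 10, 'Ea',
         sysdate, 'ONHAND-FIX', 1, 1);

    The transaction manager (or the Inv_Txn_Manager_Pub call above) then picks the rows up and applies them to on-hand stock.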

    Kind regards
    S.P DASH

  • Sorry, newbie data-load question about datafile / extent sizing

    Hi guys

    Sorry disturbing you - but I did a lot of reading and am still confused.

    I was asked to create a new tablespace:

    create tablespace xyz datafile '/oradata/corpdata/xyz.dbf' size 2048M
    extent management local uniform size 1023M;

    alter tablespace xyz add datafile '/oradata/corpdata/xyz.dbf' size 2048M;

    Despite being worried that I was given no information about the data to load or why the tablespace had to be sized that way, I was told to just 'do it'.

    Someone tried to load data - and there was a message in the alert log.

    ORA-1652: unable to extend temp segment by 65472 in tablespace xyz

    We do not use autoextend on datafiles, even though the person loading the data wishes we did (they are new to the environment).

    The database is on a nightly cold backup routine - we are between a rock and a hard place - we have no space on the server to use RMAN, and only 10 G left on the tape for the (Veritas) backup routine, and so we control space by not using autoextend.

    As far as I understand, the above error message means the tablespace is not large enough to hold the loaded data - but the person importing the data told me they had sized it correctly, and that it was something I did with the database create statement (although I cut and pasted from their instructions and adapted them to our environment - Windows 2003 SP2, 32-bit).

    The person called to say I had messed up their data load and was about to report me to their manager for failing to do my job - and they did, and my line manager said that I had failed to correctly create the tablespace.

    When this person asked me to create the tablespace, I asked why they thought the extents should be 1023M, and they said it was a large data load that had to fit within an extent.

    That sounds good... but I'm confused.

    1023M is a lot - it means you have only four extents in the tablespace before it reaches capacity.

    The load is GIS data - I have not been involved in the previous GIS data loads, other than monitoring and resizing the supporting tablespaces - previous people sized it right and I never had any comeback. Guess I'm a bit lazy - I just did as they asked.

    However, they only ever used 128K as an extent size, never 1023M.

    Can I ask: is 1023M normal for large data loads - or am I right to question it? It seems excessive unless you really have just one table and one index of 1023M each.

    Thanks for any idea or other research.

    Assuming a block size of 8 KB, 65472 blocks would be 511 MB (65472 x 8 KB = 523,776 KB). However, as it is a GIS database, my guess is that the database block size itself has been set to 16 KB, in which case 65472 blocks is 1023 MB.

    How is the data load being done? An Oracle Export dump? Does it include a CREATE INDEX statement?
    Export-Import does a CREATE TABLE and INSERT, so you could get an ORA-1652 there only while the table is being created.
    However, you will get an ORA-1652 on a CREATE INDEX because the target segment (i.e. the index) for that operation is initially created as a 'temporary' segment until the index build is complete, when it switches from being a 'temporary' segment to being an 'index' segment.

    Also, if parallelism is used, each parallel slave would attempt to allocate extents of 1023 MB. Therefore, even if the final index should have been only, say, 512 MB, a CREATE INDEX with a DEGREE of 4 would begin with 4 extents of 1023 MB each, and would not shrink to less than that!

    A 1023 MB extent size is, in my opinion, very bad. My guess is that they came up with an estimate of the size of the table and thought that the table should fit in 1 extent, and therefore specified 1023 MB in the script that was provided to you. And that is wrong.

    Even Oracle's AUTOALLOCATE goes only up to 64 MB extents once a segment passes the 1 GB mark.
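
    For illustration, the same tablespace with system-managed extent sizes would look like this (a sketch; the datafile path is the one from the original post):

    create tablespace xyz datafile '/oradata/corpdata/xyz.dbf' size 2048M
    extent management local autoallocate;

    With AUTOALLOCATE, Oracle starts with 64 KB extents and steps them up gradually as a segment grows, so no single operation demands 1023 MB up front.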

  • I lost ReminderFox... a lot of important dates etc... how do I get it back?

    I lost ReminderFox from Firefox. It had a lot of important dates. Can I get them back? Something shows up in the files when I search on the computer, but I can't work out how to add it to the profile. Windows also won't let me read it because it does not know how to open the file. I tried 'copying files between profile folders' but it won't let me right-click in the profile to paste, for some reason.

    Hi there... no, the extension did not show up anywhere in Firefox, although I could find a few mentions of 'reminderfox' when searching in Windows.

    This was driving me crazy all morning and half the afternoon (I'm one of those people who does not like to be beaten... especially by a computer). But now I have solved my problem.

    What I did was reinstall ReminderFox and then, after further research, simply load "reminderfox.ics.bak3" (assuming it was the backup) via the icon on the status bar, and voila! It worked.

    Thanks a lot for picking up on my question

  • Firefox has become very slow. It takes a lot of time to load any website. I reinstalled FF but the problem remains.

    Firefox has become very slow. It takes a lot of time to load any website. I reinstalled FF but the problem remains.
    As a test, I downloaded Chrome and it is much faster.
    But I love FF and would be very happy if you can help me solve this problem.

    Hi Reos,

    When you reinstall Firefox, it still uses your old profile with all the old things that you had. That said, you're in luck: there is a new feature that lets you "Reset" Firefox. This removes some of the pieces that might cause problems while keeping your bookmarks and other important things safe.

    Check it out and let me know if it works: Refresh Firefox - reset add-ons and settings

  • Forcing errors when loading non-leaf data into Essbase

    Hi all

    I was wondering if anyone has experience of forcing data load errors when a load rule tries to push data into non-leaf members of an Essbase cube.

    I can obviously add handling at the ETL level to prevent non-leaf members from reaching the load, but I would like the additional error handling as well (so, not just fixing the errors).

    I'd much prefer the errors to be rejected and shown in the log, rather than being silently overwritten by the aggregation in the background.

    Have you tried creating a security filter for the user performing the load that allows write access only at level 0 and nothing higher?

  • Data load: convert timestamp

    I am using APEX 5.0 and the Data Load wizard to transfer data from Excel.

    I defined Update_Date as timestamp(2), and the data in Excel is 2015/06/30 12:21:57.

    NLS_TIMESTAMP_FORMAT is HH12:MI:SSXFF AM DD/MM/RR

    I created the transformation rule for update_date as a PL/SQL function body, as below:

    declare
        l_input_from_csv varchar2(25) := to_char(:update_date, 'MM/DD/YYYY HH12:MI:SS AM');
        l_date timestamp;
    begin
        l_date := to_timestamp(l_input_from_csv, 'MM/DD/YYYY HH12:MI:SS AM');
        return l_date;
    end;

    I keep getting a transformation rule error. If I don't create a transformation rule, it complains of an invalid month.

    Please help me to fix this.

    Thank you very much in advance!

    Hi DorothySG,

    DorothySG wrote:

    Please test the data loading on my demo application.

    I pasted a few examples into the page instructions; you can use the copy and paste function, and it will give the same result.

    Please change the comma separator ',' to another one, otherwise you will get a "do not load" message.

    Check your application 21919. I changed the transformation rule:

    Data is loading properly.
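
    For comparison, a minimal transformation-rule body that parses the incoming text directly (a sketch only, not necessarily the exact rule used in application 21919; it assumes the spreadsheet value arrives as text like 2015/06/30 12:21:57):

    begin
        -- :UPDATE_DATE is the raw text from the file; parse it with a mask that
        -- matches the source format instead of converting it via TO_CHAR first
        return to_timestamp(:update_date, 'YYYY/MM/DD HH24:MI:SS');
    end;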

    Kind regards

    Kiran

  • Move rejected records to a table during a data load

    Hi all

    When I run my interfaces I sometimes get errors caused by invalid records. I mean, the contents of some fields are not valid.

    I would then like to move these invalid records to another table while the data load continues, so as not to interrupt the loading of data.

    How can I do that? Are there examples I could follow?

    Thanks in advance

    Regards,

    Hi Alvaro,

    Here you can find different ways to achieve this goal and choose according to your requirement:

    https://community.Oracle.com/thread/3764279?SR=Inbox

  • Data Load Wizard

    In Oracle APEX, is there any feasible way to change the default functionality?

    1. Can we change the Data Load wizard from insert-only to insert/update functionality based on the source table?

    2. Possibility of a validation - if Count of Records < target table is true, then the user should get a choice to continue with the insert or cancel the data load process.

    I use APEX 5.0

    Please advise on these 2 points.

    Hi Sudhir,

    I'll answer your questions below:

    (1) Yes, the data load can insert/update.

    That's the default behavior: if you choose the right columns for detecting duplicate records, you will be able to see which records are new and which are updates.

    (2) That will be a little tricky, but you can get there using the underlying collections. The data load uses several collections to perform its operations; on the first step we load all of the user's records into the collection "CLOB_CONTENT". By checking that count against the number of records in the underlying table, you can easily add a new validation before moving from step 1 to step 2 (see the sketch below).
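
    A minimal sketch of such a validation, as a "PL/SQL Function (returning Boolean)" (hedged: my_target_table is a hypothetical placeholder, and CLOB_CONTENT is simply the collection named above):

    declare
        l_uploaded pls_integer;
        l_existing pls_integer;
    begin
        -- rows the wizard parsed from the uploaded file
        l_uploaded := apex_collection.collection_member_count(p_collection_name => 'CLOB_CONTENT');
        select count(*) into l_existing from my_target_table; -- hypothetical target
        -- pass only when the upload has at least as many records as the target table
        return l_uploaded >= l_existing;
    end;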

    Kind regards

    Patrick

  • The data load has finished in ODI but some interfaces still show as running in the Operator tab

    Hi Experts,

    I'm working on a customization, and we run the incremental load every day. The data load completed successfully, but the status icons in the Operator tab show some interfaces as still running.

    Could you please explain why the interfaces still show as running in the Operator tab? Thanks in advance - your valuable suggestions are much appreciated.

    Kind regards

    REDA

    This is what we call a stale session; it can be removed by restarting the agent.

    You can also manually clean up the expired sessions in the Operator.

  • ASO data loads - ignore zero and missing values

    Hello

    There is an option to ignore zero values & missing values in the dialog box when loading data into an ASO cube interactively via EAS.

    Is there an option to specify the same in the MaxL import data command? I couldn't find one in the Tech Reference.

    I have 12 months in the columns in the data feed. At least 1/4 of my data is zeros. Ignoring zeros keeps the cube small and fast.

    We are on 11.1.2.2.

    Appreciate your thoughts.

    Thank you

    Ethan.

    The thing is, it's hidden in the alter database (aggregate storage) command, where you create the data load buffer. If you are not sure what a data load buffer is, see 'Loading Data Using Buffers'.

  • Data load error 10415 when exporting data to an EPMA Essbase app

    Hi Experts,

    Can someone help me solve this issue I am facing in FDM 11.1.2.3?

    I'm trying to export data from FDM to an EPMA Essbase application.

    Import and Validate worked fine, but when I click Export it fails.

    I am getting the error below:

    Failed to load data

    10415 - Data load errors

    Essbase API process: [EssImport] threw code: 1003029 - 1003029

    Encountered formatting error in spreadsheet file (C:\Oracle\Middleware\User_Projects\epmsystem1\EssbaseServer\essbaseserver1\app\Volv

    I have these dimension members:

    1. Account
    2. Entity
    3. Scenario
    4. Year
    5. Period
    6. Regions
    7. Products
    8. Acquisitions
    9. ServicesLine
    10. FunctionalUnit

    When I click the Export button, it fails.

    One more thing I checked: the .DAT file, but that file is empty.

    Thanks in advance

    Hello

    I was facing a similar problem too.

    In my case I was loading data to a Classic Planning application. When all the dimension members are ignored in the mapping for the combination you are trying to load, then when you click Export you will get the same message, and an empty .DAT file is created.

    You can check this.

    Thank you

    Praveen

  • FDMEE to Planning data loaded successfully but not able to see the data in Planning - export shows a gold fish in FDMEE

    Hi all

    We loaded data through FDMEE to Planning; the data loaded successfully, but we are not able to see the data in the Planning application.

    In the process log, I can see it says the data was loaded into the cube. Please advise on this.

    Thank you

    Roshi

    Two things:

    - I wasn't talking about the method you use to import data, but the one you use to export data. You are using the SQL method: go to Target Applications, select your Planning/Essbase application, and set the load method to file. Save your settings.

    2014-06-19 12:26:50,692 INFO [AIF]: Successfully locked rules file AIF0028

    2014-06-19 12:26:50,692 INFO [AIF]: Loading data into the cube by launching the rules file...

    2014-06-19 12:26:50,692 INFO [AIF]: Loading data into the cube using sql...

    2014-06-19 12:26:50,801 INFO [AIF]: The data has been loaded by the rules file.

    2014-06-19 12:26:50,801 INFO [AIF]: Unlocking rules file AIF0028

    2014-06-19 12:26:50,801 INFO [AIF]: Successfully unlocked rules file AIF0028

    - Then export again and review the .DAT file in the Outbox folder. Is it empty?

    - You need to add a new dimension to your import format (Add Dimension > Currency). Then add Local as the expression.

    - Import, validate and export the data again.

  • Data loader detects NUMBER instead of VARCHAR2

    Hello

    In my database, I have a table that stores information about vehicle components. I created a new data load process with the manufacturer and reference fields as the unique key.

    (For example: manufacturer 'BOSCH', '0238845' reference)

    When the system runs the data mapping, it detects the reference column as a NUMBER and deletes the leading zero character: '0238845' -> 238845.

    How can I solve this problem?

    Kind regards

    Have you tried a transformation in the data load to make it a character?

    Thank you

    Tony Miller

    LuvMuffin Software

  • ODI - SQL to Hyperion Essbase data load

    Hello

    We have created a view in SQL Server that contains our data. The view currently has every year and period from Jan 2011 to present. Each period is about 300,000 records. I want to load only one period at a time, for example May 2013. Currently we use ODBC through a data load rule, but the customer wants to use ODI to stay consistent with the dimension metadata versions. Here's the SQL on the view, which works very well. Is there a way I can run this SQL in an ODI interface so it pulls only what I declare in the WHERE clause? If yes, where can I do it?

    select CATEGORY, YEAR, LOCATION, SCENARIO, DEPT, PROJECT, EXPCODE, PERIOD, ACCOUNT, AMOUNT
    from PS_LHI_HYP_PRJ_ACT
    where YEAR >= '2013' and PERIOD = 'MAY'
    order by CATEGORY asc, FISCAL_YEAR asc, LOCATION asc, SCENARIO asc, DEPT asc,
             PROJECT asc, EXPCODE asc, PERIOD asc, ACCOUNT asc;

    Hello

    Simply use the following KM to load the data - IKM SQL to Hyperion Essbase (DATA) - in an ODI interface that has the view you created as the source model. You can add filters on the source that are driven dynamically by ODI variables to build the WHERE clause based on the month and year. Make sure you specify a load rule as the load method in the KM options.
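
    As an illustration (a sketch; #P_YEAR and #P_PERIOD are hypothetical ODI variables, not names from this thread), the filter expression on the source datastore could be:

    PS_LHI_HYP_PRJ_ACT.YEAR >= '#P_YEAR' and PS_LHI_HYP_PRJ_ACT.PERIOD = '#P_PERIOD'

    Refresh the variables in the package before the interface runs, and ODI substitutes the values into the generated WHERE clause.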
