Processing the XML file

I have a requirement to process a stream of XML files and load their content into Oracle as name-value pair rows. My table has three columns: ID, Detail_NM, Detail_Val. I have to extract the ID and the corresponding details and store them in this table. The XML looks like the sample below. I could use an extract SQL, but I am looking for the most effective way to do it. I would like to pick up the elements dynamically, because an element may or may not be present in every document. For example, the second document below has Address2, which isn't there in the first one. Even if new elements are introduced, I should be able to handle them without explicitly naming each element. Is there a better way to deal with this? Please share your ideas. Thanks in advance.

<Document>
  <Id>546534</Id>
  <Details>
    <City>New York</City>
    <Address>45 Rome Ave</Address>
    <Zip>10281</Zip>
  </Details>
</Document>

<Document>
  <Id>6785565</Id>
  <Details>
    <City>Dallas</City>
    <Address>56 locust ave</Address>
    <Address2>2nd floor</Address2>
    <Zip>07454</Zip>
  </Details>
</Document>

user626688 wrote:

Thanks again.  Yes, we can have a node root here.

- What is the average size of a single XML entity (number of nodes, size in bytes)?
- A file can be up to 2 MB in size. Each ID (each Document tag) can have anywhere from 0 to 200 elements under it.

Sorry, missed your last answer once again.

If there is a root node, it is even simpler.

And you can certainly use an intermediate binary XMLType staging table.
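As a minimal sketch, the staging table and directory object used in the block below could be created along these lines (the directory path is just a placeholder, and tmp_xml is the table name the block refers to):

-- placeholder path: point it at wherever the XML files actually live
create directory test_dir as '/path/to/xml_files';

-- binary XMLType staging table referenced as tmp_xml below
create table tmp_xml of xmltype
  xmltype store as binary xml;

With that in place, the load itself looks like this: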

SQL> begin
  2
  3    -- insert as Binary XMLType
  4    -- optional step for small files
  5    insert into tmp_xml
  6    values (
  7      xmltype(bfilename('TEST_DIR','sample2.xml'), nls_charset_id('AL32UTF8'))
  8    );
  9
10    -- parse and insert into the target table
11    insert into my_table (id, detail_nm, detail_val)
12    select x1.id
13         , x2.detail_nm
14         , x2.detail_val
15    from tmp_xml t
16       , xmltable('/Documents/Document'
17           passing t.object_value
18           columns id       number  path 'Id'
19                 , details  xmltype path 'Details'
20         ) x1
21       , xmltable('/Details/*'
22           passing x1.details
23           columns detail_nm  varchar2(256)  path 'local-name(.)'
24                 , detail_val varchar2(4000) path 'text()'
25         ) x2 ;
26
27  end;
28  /

PL/SQL procedure successfully completed.

SQL> select * from my_table;

        ID DETAIL_NM            DETAIL_VAL
---------- -------------------- ------------------------
    546534 City                 New York
    546534 Address              45 Rome Ave
    546534 Zip                  10281
   6785565 City                 Dallas
   6785565 Address              56 locust ave
   6785565 Address2             2nd Floor
   6785565 Zip                  07454

7 rows selected.

Tags: Database

Similar Questions

  • How to import an XML file into an Oracle table using a BPEL process

    Hi friends

    How can I import an XML file into an Oracle DB table using a BPEL process?

    (1) I have generated an XML file on my local system with one field
    (2) I created a temporary table in my Oracle DB with the same field as in the XML file
    (3) I want to import these XML files, which are on the local host, into my Oracle DB using a BPEL process
    (4) What steps should I follow for that? Please point me to a document if one exists.

    Thanks in advance
    AT

    Hi AT

    Here you go...

    http://blogs.Oracle.com/ajaysharma/2011/03/using_file_adapter_database_adapter_and_mediator_component_in_soa_11g.html

    I hope that helps!

    Thank you
    AJ

  • How can I auto-run a script that will process an XML file on opening?

    I am using FrameMaker 11. I have a saved script that watches for the Constants.FA_Note_PostOpenXML notification event. When this event is raised, the script is supposed to examine the root element (so that I only change the appropriate XML files) and then make some changes to the file. Specifically, I want to be able to delete empty pages, remove the side-head area, and correct table formatting (for example the left indent, which apparently I cannot control directly in the EDD or read/write rules). The code snippet that starts off the work does the following:


    Notification(Constants.FA_Note_PostOpenXML, true);

    function Notify(note, object, sparam, etriqu) {

        switch (note) {

            case Constants.FA_Note_PostOpenXML:
                doTheWork();
                break;
        }
    }

    function doTheWork() {

        var doc, flow, root, elemName, mPageAttrib, topicElem, topicElemName, allAttribs;

        doc = app.ActiveDoc;

        flow = doc.MainFlowInDoc;

        root = flow.HighestLevelElement; // will always get something even with an unstructured document

        while (root.ObjectValid()) { // only do something for structured documents

            elemName = getElementName(root);

            // ...more code to do things...
        }
    }

    I then check the root element name, and if it matches, I make the changes. This code works fine when run manually. But when it runs as a saved script, the app.ActiveDoc object is invalid, and there is no structure to edit. (For debugging I added the line if (!doc.ObjectValid()) { alert("invalid"); }, which is not included above.) It seems that the Constants.FA_Note_PostOpenXML event fires whenever the XML file opens, but BEFORE FrameMaker actually reads it.

    Does anyone have recommendations on how to get around this? Is there something else I could use instead of Constants.FA_Note_PostOpenXML? Is there another way to manipulate an XML file automatically when loading?

    Thanks in advance

    The Notification event receives four parameters: note, object, sparam and etriqu. For the event that you are using, object will be the Doc object of the FrameMaker document being opened. So you should be able to use this:

    doTheWork(object);
    

    Make sure that you update your doTheWork function to receive the Doc object:

    function doTheWork(doc) {
        var flow, root, elemName, mPageAttrib, topicElem, topicElemName, allAttribs;
        flow = doc.MainFlowInDoc;
        root = flow.HighestLevelElement; // will always get something even if unstructured document
        while(root.ObjectValid()) { //only do something for structured docs
            elemName = getElementName(root);
            // ...more code to do stuff...
        }
    }
    

    -Rick

  • Saving RSS feed content to an XML file in device memory

    Hello

    I am parsing an RSS feed with a SAX parser. Now I want to save that RSS content to an XML file in the device memory or on the SD card. Can someone guide me through this process?

    To write RSS content to an XML file, first create a file with the ".xml" extension using the FileConnection API, then write the content (the RSS data) into that file.

    You can use the code below to write content to a file. The path is passed as a String and the content as a byte array.

    public boolean writeFile(String path, byte[] data)
        {
            javax.microedition.io.Connection c = null;
            java.io.OutputStream os = null;
            try {
                c = javax.microedition.io.Connector.open("file:///" + path, javax.microedition.io.Connector.READ_WRITE);
                javax.microedition.io.file.FileConnection fc =
                        (javax.microedition.io.file.FileConnection) c;
                if (!fc.exists())
                    fc.create();
                else
                    fc.truncate(0);
                os = fc.openOutputStream();
                os.write(data);
                os.flush();
                return true;
            } catch (Exception e) {
                return false;
            } finally {
                try {
                    if (os != null)
                        os.close();
                    if (c != null)
                        c.close();
                } catch (Exception ex) {
                    ex.printStackTrace();
                }
            }
        }
    
  • Problem creating test instance, cannot run processSetup for configuration: cannot run createHOMObj for configuration: No GuestOS appearing in the XML file

    I got the following error when running tests on Workbench 2.1


    [February 19, 2014 16:27:10: TRANSPORT] [0] FRAME: Async command is monitored by the process of STAF 73

    [February 19, 2014 16:27:10: FACTORYIMP] SETTING [0]: insert in the container

    [February 19, 2014 16:27:10: TESTHASH] [0] INFO: VirtualMachine installation process

    [February 19, 2014 16:27:10: VIRTUALMAC] [0] FRAMEWORK: the Setup() method called

    [February 19, 2014 16:27:10: STAFBASE] SETTING [0]: command execution STAF: staf VTAF_VM localhost connect password of administrator agent 192.168.8.158 userid: 11:Infocore' 1 q ssl

    [February 19, 2014 16:27:10: STAFBASE] [0] FRAME: command execution STAF: staf localhost VTAF_VM getvms anchor 192.168.8.158:administrator

    [February 19, 2014 16:27:11: MULTITECH] [0] FRAME: called VTAF::TestLib:Sphere:Lib:STAFSDK:HostSystem:GetAllVMs (HostName = '192.168.8.150' password = 'infocore"username ="root") returned UNDEF

    [February 19, 2014 16:27:11: VIRTUALMAC] [0] FRAMEWORK: new creation vaaivm1-150 VM from scratch...

    [February 19, 2014 16:27:11: TESTHASH] [0] ERROR: cannot run processSetup for configuration: cannot run createHOMObj for configuration: No GuestOS appearing in the XML file

    [February 19, 2014 16:27:11: TESTHASH] WARN [0]: found objects that need to be cleaned

    [February 19, 2014 16:27:11: VIRTUALMAC] [0] INFO: cleaning of the virtual machine: vaaivm1-150

    [February 19, 2014 16:27:11: HOSTSYSTEM] [0] FRAME: HostSystem Cleanup() called

    [February 19, 2014 16:27:11: HOSTSYSTEM] [0] FRAMEWORK: the location of the swapfile to the directory of the VM on the host 192.168.8.150 restoration VM...

    [February 19, 2014 16:27:11: HOSTSYSTEM] SETTING [0]: setting VM Swapfile location to use the directory of the virtual machine

    [February 19, 2014 16:27:11: STAFBASE] SETTING [0]: command execution STAF: staf VTAF_Host localhost connect password of administrator agent 192.168.8.158 userid: 11:Infocore' 1 q ssl

    [February 19, 2014 16:27:11: STAFBASE] [0] FRAME: command execution STAF: staf localhost VTAF_Host setswapfilelocation anchor 192.168.8.158:administrator host 192.168.8.150

    [February 19, 2014 16:27:31: MULTITECH] [0] FRAME: called VTAF::TestLib:Sphere:Lib:STAFSDK:HostSystem:SetSwapFileLocation (HostName = '192.168.8.150' password = 'infocore"username ="root") '1' returned

    [February 19, 2014 16:27:31: HOSTSYSTEM] [0] FRAME: destruction of object 192.168.8.150...

    [February 19, 2014 16:27:31: LOGMANAGEM] COMMENTS [0]: recovery log file 192.168.8.150 host vmkernel.log

    [February 19, 2014 16:27:31: FILEUTILIT] [0] FRAME: PutTmpDirectory - called for destination host localhost

    The same problem was solved.

    Re: Hardware Certification-do can not find the storage50info.txt file to...

  • How to import an XML file into BCC?

    My input file to ATG is an XML file containing category or product assets.

    I want a scheduler to run that auto-creates a project and sends an e-mail to the customer for approval.

    When the client approves, the scheduler then runs again to load the data from the XML file into BCC.

    I am using SingletonSchedulableService for this.

    Please give me some direction on this.

    Thanks in advance.

    I want to implement this in BCC.

    Which listener will listen for the client's approval of the e-mail (that contains the XML to import) that we sent to him?

    You will need to do custom development to hook your customized workflow up to incoming e-mail. A send-email-notification workflow action is already there, and while creating its element within the ACC you can specify a JSP page for the email template. You can review the relevant documentation:

    Oracle ATG Web Commerce - Workflows

    Oracle ATG Web Commerce - Workflow action elements

    I recommend you consult the existing source of DeploymentEmailer in \Publishing\samples\Java and see the different DeploymentEvent types and states that can be used in your custom component, based on your requirement. See also the API Javadocs for the deployment process and the deployment events available:

    DeploymentEvent (ATG Java API)

    DeploymentServer (ATG Java API)

    ATG has a component /atg/dynamo/service/POP3Service to retrieve messages from a POP3 e-mail server, but it is mainly used to detect bounced email. Not too sure if it helps or fits your case, but you can still go through the details to know what is already there:

    Oracle ATG Web Commerce - Bounced E-mail

    You can also check the InboundEmail event and the API Javadocs for InboundEmailMessage, which are used in this bounced e-mail detection feature:

    Oracle ATG Web Commerce - InboundEmail event

    InboundEmailMessage (ATG Java API)

    Now, based on all this, if you extend and customize a lot of things it would be somewhat complex and tedious to detect an incoming approval mail based on its content and then advance your workflow. Another approach may be to put a URL in your notification e-mail template that the approver clicks; on the landing page you can authenticate the user and advance your workflow as needed.

  • How can I parse an XML file using an Oracle SQL query?

    Hi all
    I have an XML file that I must parse, displaying the result as in the following example.
    Can you please recommend an approach to get this result?

    For example, here is my XML:

    <?xml version="1.0" encoding="UTF-8"?>
    <pi:Extract_Employees xmlns:pi="urn:com.workday/picof">
      <pi:Employee>
        <pi:Additional_Information>
          <pi:Job_Title pi:PriorValue="">Intern - Master�s</pi:Job_Title>
        </pi:Additional_Information>
      </pi:Employee>
    </pi:Extract_Employees>

    Database:

    Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production

    SQL> SELECT * FROM NLS_DATABASE_PARAMETERS;

    NLS_LANGUAGE             AMERICAN
    NLS_TERRITORY            AMERICA
    NLS_CURRENCY             $
    NLS_ISO_CURRENCY         AMERICA
    NLS_NUMERIC_CHARACTERS   .,
    NLS_CHARACTERSET         WE8ISO8859P1
    NLS_CALENDAR             GREGORIAN
    NLS_DATE_FORMAT          DD-MON-RR
    NLS_DATE_LANGUAGE        AMERICAN
    NLS_SORT                 BINARY
    NLS_TIME_FORMAT          HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT     DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT       HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT  DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY        $
    NLS_COMP                 BINARY
    NLS_LENGTH_SEMANTICS     BYTE
    NLS_NCHAR_CONV_EXCP      FALSE
    NLS_NCHAR_CHARACTERSET   AL16UTF16
    NLS_RDBMS_VERSION        10.2.0.3.0



    The XML file above is in UTF-8, which is a multi-byte character set.

    But my database character set is WE8ISO8859P1, i.e. ISO-8859-1 (a single-byte character set).

    SQL> SELECT extractValue(value(x), '/pi:Employee/pi:Additional_Information/pi:Job_Title', 'xmlns:pi="urn:com.workday/picof"')
         FROM TABLE(XMLSequence(extract(XMLType(bfilename('XMLDIR','XML_Issue_227176.xml'), nls_charset_id('AL32UTF8')), '/pi:Employee', 'xmlns:pi="urn:com.workday/picof"'))) x;


    which gives the following error:

    Error:
    ORA-31011: XML parsing failed
    ORA-19202: Error occurred in XML processing
    LPX-00200: could not convert from encoding UTF-8 to ISO-8859-1
    Error at line 1
    ORA-06512: at "SYS.XMLTYPE", line 295
    ORA-06512: at line 1

    I also tried this:
    SQL> SELECT convert(extractValue(value(x), '/pi:Employee/pi:Additional_Information/pi:Job_Title', 'xmlns:pi="urn:com.workday/picof"'), 'WE8ISO8859P1', 'UTF8')
         FROM TABLE(XMLSequence(extract(XMLType(bfilename('XMLDIR','XML_Issue_227176.xml'), nls_charset_id('AL32UTF8')), '/pi:Employee', 'xmlns:pi="urn:com.workday/picof"'))) x;

    It fails with the same error as above.

    Please help in this regard.

    Thank you and best regards,
    Sandrine

    Do you know the actual character behind "�", or do you receive the file like that?

    For the record, "�" is the UTF-8 replacement character (0xEFBFBD), so the original character has already been replaced, and most probably the file was not encoded properly in the first place.

    As for resolving the problem, try another method of reading the file:

    SQL> select value from nls_database_parameters where parameter = 'NLS_CHARACTERSET';
    
    VALUE
    ----------------------------------------
    WE8ISO8859P15
    
    SQL> SELECT x.*
      2  FROM XMLTable(
      3         XMLNamespaces(default 'urn:com.workday/picof')
      4       , '/Extract_Employees/Employee'
      5         passing xmltype(
      6                   dbms_xslprocessor.read2clob(
      7                     'COP_DIR'
      8                   , 'XML_Issue_227176.xml'
      9                   , nls_charset_id('AL32UTF8')
     10                   )
     11                 )
     12         columns job_title varchar2(30) path 'Additional_Information/Job_Title'
     13       ) x
     14  ;
    
    JOB_TITLE
    ------------------------------
    Intern -  Master¿s
     
    
  • Retrieve / search a specific node of the XML file

    Hello

    I have a question about reading (searching) an XML file in PL/SQL:

    There is an XML file with the following structure:
    <root>
      <Hnode1 attr1="value1" />
      <Hnode2 attr1="value1">
         <node1_of_Hnode2 id="10" personname="Steven"/>
         <node2_of_Hnode2 id="20" personname="Christian"/>
         <node3_of_Hnode2 id="30" personname="Arnold"/>
         .
         .
      </Hnode2>
    </root>
    How is it possible to filter a node by one of its attribute values? For example:
    by the id of a person - say id = 20. In this case I would like to get the whole node (here node2_of_Hnode2), so that I can read personname => 'Christian'.
    Just like a SELECT statement (WHERE ID = 20).

    Is it possible to get this in a single statement, or does it have to loop over all the nodes in the document? Can you please provide an example for me?

    Thank you for your advice!

    The database version is 10.

    Sorry, that is not a version number.
    For example:

    SQL> select * from v$version;
    
    BANNER
    ----------------------------------------------------------------
    Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
    PL/SQL Release 10.2.0.5.0 - Production
    CORE     10.2.0.5.0     Production
    TNS for 64-bit Windows: Version 10.2.0.5.0 - Production
    NLSRTL Version 10.2.0.5.0 - Production
     
    

    Regarding the extraction of the values: is it better to do it in a SELECT statement, assigning directly into a variable, or by using the XMLQuery function?

    PL/SQL or SQL calls to XMLType methods such as getStringVal() or the extract() function should be equivalent (apart from the context switch).
    However, none of them is the right way to retrieve a scalar value from a given node.
    You should rather use the extractValue() function (SQL only) on releases < 11.2, and XMLCast/XMLQuery starting with 11.2.
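
    To tie this back to the attribute filter asked about above, here is a minimal sketch using XMLTable with an XQuery predicate; the table name xml_docs and its XMLType column xml_doc are assumptions made for the example, not something from this thread:

    -- assumption: xml_docs(xml_doc XMLType) holds the <root> document shown above
    select x.id
         , x.personname
    from   xml_docs t
         , xmltable('/root/Hnode2/*[@id="20"]'
             passing t.xml_doc
             columns id         number        path '@id'
                   , personname varchar2(30)  path '@personname'
           ) x;

    The predicate inside the XQuery expression plays the role of the WHERE ID = 20 clause, so there is no need to loop over the nodes in PL/SQL.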

    Another Question:

    I've seen a lot of options for dealing with XML files.

    In my case only querying data from XML files is relevant.

    I've seen so many APIs: XMLDOM, DBMS_XMLDOM, XMLTYPE...

    When should these APIs be used? Can you give me feedback on this, please?

    It is a broad topic.
    An appropriate answer will depend on your requirements.

    - Do you have to deal with large files, and if so what size?
    - Is performance/memory management a concern for you?
    - Do you want to extract single values or process the XML content as relational data?

    If you have any specific test cases to show us, please post them in the {forum: id = 34} forum and make sure to include all the necessary details (see the FAQ).

    Edited by: odie_63 on 23 Feb. 2012 11:35

  • Automating multiple Flash files using a 'run list' XML file

    I created a multi-file interactive Flash 'demo' for a client. It works much like a web site (although he doesn't use it as a web site) in that there is an index page which can direct you to product categories, which you can click to view. Each 'page' contains the product (or product category) information, photos, diagrams, etc. There is no real Flash animation.

    My client also wants to be able to set it up so that the demo runs automatically, driven by a configurable XML file. My ActionScript skills are fairly limited, but I was able to use ActionScript 3 to load and read the XML file. Using a 'for' loop, I am able to read the XML file and return the path of each Flash file and the time each file must be displayed (my client asked to be able to configure the duration of each page as well).

    I created a loader for each movie clip and a timer for the duration. Unfortunately, the thing that throws me off is that when I try to load the clip (as the last element in the for loop), it reads the entire XML file and returns all the clips and durations (I use trace to see this) before loading the clip. Thus, the only clip loaded is the last one read from the XML file.

    As I said, I'm not very good at ActionScript, so everything I've managed to put together came from internet tutorials.

    This is my code:

    var xmlLoader:URLLoader = new URLLoader();

    xmlLoader.addEventListener (Event.COMPLETE, xmlLoaded);

    var mcLoader:Loader = new Loader();
    mcLoader.contentLoaderInfo.addEventListener (Event.INIT, playClip);

    var mcCurrent:MovieClip = new MovieClip();
    stage.addChild (mcCurrent);

    var xml;

    function playClip(e:Event):void
    {
    trace("Function:playClip");
    stage.removeChild (mcCurrent);
    mcCurrent = MovieClip (mcLoader.content);
    mcLoader.unload ();
    stage.addChild (mcCurrent);
    }

    function xmlLoaded(e:Event):void
    {
    trace ("function: xmlLoaded");
    xml = new XML(e.target.data);

    var cl:XMLList = xml.clip;
    for (var i: uint = 0; i < cl.length (); i ++)
    {
    trace ("Begin"for"loop");
    trace ("variable" i "is" + i);

    clipPath var = cl[i].@path;
    var delay = cl[i].@duration;

    trace ("Clip path is" + clipPath);
    trace ("Display for" + delay);

    mcLoader.load (new URLRequest (clipPath));
    }
    }

    xmlLoader.load (new URLRequest ("filerunlist.xml"));


    I removed the timer event because, like everything else, it was repeated before anything got loaded, so I wanted to figure out how to get the clip loader (mcLoader) to load the clip for each file before I worried about where to put the timer to make sure each element is displayed for the appropriate duration. I suspect that the timer event needs to happen in the "playClip" function, but I would appreciate advice on that as well.

    Thanks to anyone who can help me understand this.

    All you need to do in the xml parsing function is store all of the data in an array.  You will then use one array element after another, through whatever timer control you set up, to access each file in the order they are stored in the array.  You do not load anything in this function, only store the data; when it finishes, the last line of this function, outside the loop, fires whatever function you have that begins to process the first element of the array.

    For example, to store the data objects in the array...

    var clipArray:Array = new Array();   // store your xml data here

    var clipToShow:uint = 0;             // used later in the showClip function

    function xmlLoaded(e:Event):void
    {
        trace("function: xmlLoaded");
        xml = new XML(e.target.data);

        var cl:XMLList = xml.clip;
        for (var i:uint = 0; i < cl.length(); i++)
        {
            clipArray.push({clipPath: cl[i].@path, delay: cl[i].@duration});
        }

        showClip();
    }

    function showClip():void
    {
        // load clipArray[clipToShow].clipPath however you load a clip,
        // and use clipArray[clipToShow].delay to start whatever timer is involved;
        // the timer handler calls this function again when it fires

        clipToShow += 1;   // increment the counter for the next clip in line
    }

    If you are not comfortable/familiar with the object approach, then you could also store the clip path and delay data in two separate arrays.

  • Is it possible to read localization content from an XML file rather than properties files in Flex?

    Is it possible to read localization content from an XML file rather than properties files in Flex?

    Strictly speaking, this is not really true. At runtime, the ResourceManager deals with IResourceBundles for localization, not with XML files or properties files. As long as you implement IResourceBundle, you can back it with XML, properties, JSON, AMF - whatever you want. The problem is that mxmlc will not build resource modules for you from anything other than properties files, and can only compile resource bundles into a .swf from properties files. If you build your own tooling to spit out IResourceBundles, this could work, but it is probably more trouble than it's worth.

    The best solution (assuming you compile the bundles into the application) is to write a script to pre-process the XML into properties files.

  • Validating the data in an XML file

    Hi gang...

    Just got a project (not) where I have to parse and validate an XML document... the job is as follows:

    1. gather the xml

    2. parse the data

    3. validate the data

    4. insert the data into the db

    5. send a message to the browser

    I don't know the best way to go about this problem productively, using everything that's out there.

    Any help would be appreciated.  (the xml file is included)

    Netpagino

    I don't know exactly what you mean by "validating the data in an XML file".  At first glance, I interpreted your request to mean you have to check that the XML file is well-formed... that is to say, conforms to the DTD or schema.  But on re-reading your post, I'm not so sure.

    Of course, you want to use ColdFusion's built-in XML processing and querying capabilities.  (You want to use an XPath search to find the nodes, for example...)  As for the later task of checking the data contained in those nodes against various business rules, well, that would take more thought.  But remember (a) that this is well-trodden ground (so "do not re-invent the wheel"), and (b) you have all of Java at your disposal if you need it.

  • "Failed to process the backup file" - bookmarks JSON file

    I deleted my AppData files without thinking and then emptied my trash. When I opened Firefox again all my bookmarks had disappeared. I used Piriform Recuva to find my deleted files and recovered 10 bookmark backup JSON files. When I try to restore my bookmarks, I get the message "Failed to process the backup file". The files are 95 KB in size, but when I open them with Notepad they are empty; however, when using the Restore tool it says "480 items", which I assume is the number of bookmarks I had, which means that Firefox can see data. The same happens for all ten backups that I recovered.

    Any help would be appreciated.

    What is the name of this file?

    Firefox can only look at the file name for the number of items.

    • bookmarks-YYYY-MM-DD_<number>_<hash>.json

    The files in the bookmarkbackups folder are currently in compressed .jsonlz4 format (i.e., they contain binary data, like a compressed ZIP archive) and cannot easily be unpacked to get at the JSON text format.
    Because the files are compressed, a single error in the file makes it impossible to unpack it.

    Have you checked whether any of the available JSON backups works?

    A manually created JSON backup is always uncompressed, so it can easily be inspected or opened in a text editor (Notepad > "Pretty Print"), but with a compressed file it is much more difficult.

    Recovering a file via an undelete utility is no guarantee that the space the file occupied has not been reused by another file, leaving it corrupt.

  • When I try to restore my bookmarks I get the message "Failed to process the backup file."

    I lost my bookmarks, and when I try to get them back I can see the backup files in the restore window, but when I try to restore them I get the message "Failed to process the backup file." Can you help? Thank you

    You can check for problems with the places.sqlite database file in the Firefox profile folder.

  • Extracting XML error. The XML file that is assigned in the configuration.xml is either incorrect or damaged.

    1. Some web pages give "Error extracting XML. The XML file that is assigned in the configuration.xml is wrong or corrupt."

    2. YouTube, after pressing the start button, gives the message "An error has occurred. Please try again later."
    The same YouTube page opens in IE with no problem.

    3. Scrolling pages in Mozilla Firefox only works with the scrollbar, not with the mouse wheel.

    Other user accounts on this computer do not have these problems.

    BR Juha

    This can be caused by a recent Flash 11.3 update.

    See:

  • iTunes repair - merging xml files or combining libraries

    I messed up my iTunes xml a little bit and now all my old songs have the same Date Added, whether they were added 4 years ago or on 02/07/15.

    Titles added after 02/07/15 have correct dates.

    I am trying to find the best (not necessarily the easiest) way to fix them.

    It is important to me because I sort my titles by Date Added.

    I have a huge library and can't change them one by one in the xml file.

    I think the ideal way to change the dates may be with a script that would take the correct dates from the old xml and replace the incorrect ones in the latest xml.

    However, I do not have such a script, and I don't know how to write one.

    So my other option is to merge libraries.

    I will do this on a separate drive in order not to spoil either the old or the new library.

    I have a bunch of backups, but had not noticed the problem until after all the backups had already been updated with the new, incorrect files.

    I also have a fully functional library on a drive removed from my MBP that died last year.

    So I now have several copies of a messed-up library and an obsolete library containing the correct information.

    If I start from the old library, I would then need to add all the new titles that were added in the last year on my new computer.

    I can sort and filter by Date Modified to clear the majority of them.

    I know that iTunes has an Add to Library feature, but before I use it I want to make sure it will accomplish what I am trying to do.

    I hope that someone with real experience in doing this reads my post.

    Helpful suggestions from anyone else are appreciated.

    Too bad (as if someone was actually going to answer).

    I have since discovered that Date Added cannot be changed.

    So now I have copied the old library to an external drive, made iTunes open this library, and then added everything in my current library which was not in the old library. I now have a library with the correct dates for the majority of the files, but all my files since August 2014 now have the current date as their Date Added.

    Better to have 16 months of incorrect dates than 6 years.
