APEX_JSON - get CLOB

Hello

I wonder how to deal with a clob field in a JSON object that is returned by a web service. The object consists of three fields - a number, a varchar2 and a clob - and looks like this:

{"MyNumber": 10}

'MyVarchar2': ' application/pdf ',.

{'MyClob': 'JVBERi0xLjQKJeLjz9MKMyAwIG9iago8PC9GaWx0ZXI'}

It's very easy to get the first two values using the built-in functions, but I have no idea how to extract the clob value. My code is as follows:

DECLARE
  v_response_clob CLOB; -- the entire web service response

  v_my_number     NUMBER;
  v_my_varchar2   VARCHAR2(100);
  v_my_clob       CLOB;

BEGIN
  -- select the clob from the APEX collection
  SELECT c.clob001
    INTO v_response_clob
    FROM apex_collections c
   WHERE c.collection_name = :PX_MY_COLLECTION_NAME;

  -- parse the response clob
  apex_json.parse(v_response_clob);

  -- get my number
  v_my_number := apex_json.get_number(p_path => 'MyNumber');

  -- get my varchar2
  v_my_varchar2 := apex_json.get_varchar2(p_path => 'MyVarchar2');

  -- now how to get the CLOB?

END;

Any help would be much appreciated.

Thank you very much

Pavel

Edit: I forgot to mention that I use 11g XE, so the new 12c JSON functions are not available, and APEX 5 (which is actually quite obvious given the APEX_JSON package).

Hi,

This is a known issue (well documented):

20974582 - apex_json.parse errors when parsing CLOB if DB version < 11.2.0.3

If your database version is 11.1.x, 11.2.0.1, 11.2.0.2 or 11g XE, apex_json.parse may report errors when parsing a CLOB.

Solution: There is a patch set exception for this available on My Oracle Support - search for bug number 20931298. If you use ORDS (APEX Listener), restart it after installing the patch.

Workaround: If you have not installed the patch, you can use the overloaded parse procedures that take a varchar2 or a table of varchar2 instead.

Here

Martin Giffy D'Souza on Oracle APEX: APEX_JSON.PARSE issue with CLOB and 11g

is a nice article on this subject.
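
For what it's worth, a minimal sketch of that workaround applied to the original question, assuming the whole response still fits into a 32K VARCHAR2 (apex_json.get_clob is used to read the large member; it is part of the APEX 5 API, but verify it against your exact version):

DECLARE
  v_response_clob CLOB;
  v_response_vc2  VARCHAR2(32767);
  v_my_clob       CLOB;
BEGIN
  SELECT c.clob001
    INTO v_response_clob
    FROM apex_collections c
   WHERE c.collection_name = :PX_MY_COLLECTION_NAME;

  -- sidestep the buggy CLOB overload by parsing a VARCHAR2 copy
  -- (only valid while the whole response is under 32767 bytes)
  v_response_vc2 := dbms_lob.substr(v_response_clob, 32767, 1);
  apex_json.parse(v_response_vc2);

  -- read the large member; for values under 32K, get_varchar2 would also work
  v_my_clob := apex_json.get_clob(p_path => 'MyClob');
END;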

Tags: Database

Similar Questions

  • Advice on using APEX_JSON with a CLOB in the body as a string

    Hello all and thanks for any help in advance.

    I have the task of interfacing with a 3rd party API via JSON to send information from our database. The first caveat is that the host is not up and running for us yet, so I have no way of properly testing this until tomorrow, but I wanted to ask this question to try to head off a potential problem.

    In the body of the JSON object we must add a chunk of XML (already created/stored, or it can be created on the fly), and it must be sent as one long string. The catch is that the string can be up to 20000 characters long, so when it is processed it is stored as a CLOB in the database (not designed by me).

    My early thought was to create a procedure that, given certain parameters, gets the relevant piece of data and the XML stored as a CLOB. Then - and this is where I may need some tips - follow the steps below.

    The host requires JSON in this format:

    {
      "batchcode": 1234562,
      "XML string": "THIS WILL BE A MASSIVE XML STRING",
      "bounding": "",
      "ID": "123456465",
      "field1":
      .
      .
      etc.
    }

    We currently have a few APEX applications, so I had a good look at the documentation on generating the required JSON:

    So, to collect the information from the input parameters:

    APEX_JSON.INITIALIZE_CLOB_OUTPUT;
    APEX_JSON.OPEN_OBJECT;
    APEX_JSON.WRITE('batchcode', 'value from database');
    APEX_JSON.WRITE('XML string', 'our xml string that is stored as a CLOB in the database');
    APEX_JSON.WRITE('bounding', 'value from database');
    APEX_JSON.WRITE('ID', 'value from database');
    .
    .
    .
    etc.
    APEX_JSON.CLOSE_OBJECT;

    Then post it to the endpoint:

    APEX_WEB_SERVICE.MAKE_REST_REQUEST (
      p_url         => 'URL of the endpoint',
      p_http_method => 'POST',
      p_username    => API_USER,
      p_password    => API_KEY,
      p_body        => apex_json.get_clob_output
    );

    Now, assuming my method is correct: will APEX_JSON.WRITE literally accept a CLOB as its input and handle it properly, or will it hit a maximum beyond which it will not actually write the whole string?

    Edit: I found that if I call APEX_JSON.WRITE('xml', <CLOB table column>) it does output the CLOB, but not in the format I want. This is because my CLOB contains characters like < and >, and the output escapes them as unicode, so for example <?xml version="1.0" encoding="UTF-8"?> as stored in the CLOB becomes \u003C?xml version=\"1.0\" encoding=\"UTF-8\"?\u003E\n in the output from APEX_JSON.WRITE.

    The only other way I can think of doing it is to split it into pieces as follows:

    SELECT ROWNUM AS XML_PIECE_NO,
           TO_CHAR(SUBSTR(a.xmldoc, (ROWNUM - 1) * 4000, 4000)) AS SECTION
      FROM ast_xmldata a
    CONNECT BY (ROWNUM - 1) * 4000 <= LENGTH(a.xmldoc)

    Then maybe concatenate the pieces into the long xml string field of the JSON.

    I hope I have explained what I want to do, but if not please ask; any help will be greatly appreciated.

    Regards

    Post edited by: K4E

    EDIT: I CAME UP WITH A SOLUTION

    Post edited by: K4E - found solution 02/04/2016

    Thanks for the response Christian, it makes sense to output it that way.

    Through some trial and error, I found a solution to my problem that seems effective enough, without having to worry about unicode-escaped characters in the output. There is an apparently undocumented procedure in the APEX_JSON package called apex_json.write_raw that does what it says. All I then had to do was escape characters like " in my CLOB myself, and I get valid JSON in the format the API I work with wants.
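
    For what it's worth, here is a minimal sketch of the whole flow using only documented calls (the URL and literal values are placeholders, and the comment marks where the undocumented write_raw mentioned above would slot in):

    DECLARE
      l_xml      CLOB;   -- the XML chunk, e.g. selected from the table that stores it
      l_response CLOB;
    BEGIN
      apex_json.initialize_clob_output;

      apex_json.open_object;
      apex_json.write('batchcode', 1234562);
      -- write() escapes < and > as \u003C / \u003E; this is the call the poster
      -- swapped for the undocumented apex_json.write_raw to keep the XML readable
      apex_json.write('XML string', l_xml);
      apex_json.write('ID', '123456465');
      apex_json.close_object;

      l_response := apex_web_service.make_rest_request(
                      p_url         => 'https://example.com/endpoint',   -- placeholder
                      p_http_method => 'POST',
                      p_body        => apex_json.get_clob_output);

      apex_json.free_output;
    END;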

    Hope this helps someone else in the same situation.

    K4E

  • Selecting XML data from a CLOB which has repeating XML tags - don't know how to get each individual tag

    This is an XML file that I store in a CLOB field in a table, Oracle 10g:

    <?xml version="1.0" encoding="utf-8"?>
    <MERIDIANMASTERSEND xsi:noNamespaceSchemaLocation="MeridianMasterSend.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <MASTER_SPEC_NUMBER>10217655</MASTER_SPEC_NUMBER>
      <MASTER_SPEC_REVISION>2</MASTER_SPEC_REVISION>
      <MASTER_SPEC_DESCRIPTION>CORNNUTS JALAPENO CHEDDAR 11.3 kg in bulk FS-40 x 1-MS</MASTER_SPEC_DESCRIPTION>
      <RELEVANT_ALLERGEN_DATA_PROVIDED>YES</RELEVANT_ALLERGEN_DATA_PROVIDED>
      <ALLERGEN_ATTRIBUTE>CASHEW NUTS</ALLERGEN_ATTRIBUTE>
      <ALLERGEN_LEVEL_OF_CONTAINMENT></ALLERGEN_LEVEL_OF_CONTAINMENT>
      <ALLERGEN_SPECIFICATION_AGENCY></ALLERGEN_SPECIFICATION_AGENCY>
      <ALLERGEN_SPECIFICATION_NAME></ALLERGEN_SPECIFICATION_NAME>
      <ALLERGEN_ATTRIBUTE>MILK</ALLERGEN_ATTRIBUTE>
      <ALLERGEN_LEVEL_OF_CONTAINMENT>CONTAINS</ALLERGEN_LEVEL_OF_CONTAINMENT>
      <ALLERGEN_SPECIFICATION_AGENCY>Health Canada and the CFIA</ALLERGEN_SPECIFICATION_AGENCY>
      <ALLERGEN_SPECIFICATION_NAME>Food and Drug B.01.010.1, B.01.010.2, B.01.010.3</ALLERGEN_SPECIFICATION_NAME>
      <ALLERGEN_ATTRIBUTE>SOY</ALLERGEN_ATTRIBUTE>
      <ALLERGEN_LEVEL_OF_CONTAINMENT>CONTAINS</ALLERGEN_LEVEL_OF_CONTAINMENT>
      <ALLERGEN_SPECIFICATION_AGENCY>Health Canada and the CFIA</ALLERGEN_SPECIFICATION_AGENCY>
      <ALLERGEN_SPECIFICATION_NAME>Food and Drug B.01.010.1, B.01.010.2, B.01.010.3</ALLERGEN_SPECIFICATION_NAME>
      <ALLERGEN_ATTRIBUTE>TREE_NUTS</ALLERGEN_ATTRIBUTE>
      <ALLERGEN_LEVEL_OF_CONTAINMENT></ALLERGEN_LEVEL_OF_CONTAINMENT>
      <ALLERGEN_SPECIFICATION_AGENCY></ALLERGEN_SPECIFICATION_AGENCY>
      <ALLERGEN_SPECIFICATION_NAME></ALLERGEN_SPECIFICATION_NAME>
    </MERIDIANMASTERSEND>

    I normally query the Oracle table that stores this XML in a CLOB column called MDMXML, using the following query to get the different fields:

    select a.part_no,
           a.revision,
           xmltype(a.mdmxml).extract('/MERIDIANMASTERSEND/MASTER_SPEC_DESCRIPTION/text()').getStringVal() "Master Spec Description"
      from interspc.atmdmdata a
     where a.master_part_no is not null

    How would I extract the cashew, milk, soy and tree nut values (attribute, level of containment, specification agency, specification name) in my SQL statement above?

    Otherwise, I'd be OK with a list.

    Glad to hear that, because your main requirement cannot be met.

    You cannot have a SELECT return an unknown number of columns.

    Separate rows make more sense anyway; that is what a relational database is for.

    select x.*
    from atmdmdata a
       , xmltable(
           'for $i in /MERIDIANMASTERSEND
              , $j in $i/ALLERGEN_ATTRIBUTE
            return element r {
              $j/following-sibling::ALLERGEN_LEVEL_OF_CONTAINMENT[1]
            , $j/following-sibling::ALLERGEN_SPECIFICATION_AGENCY[1]
            , $j/following-sibling::ALLERGEN_SPECIFICATION_NAME[1]
            , $j
            , $i/MASTER_SPEC_DESCRIPTION
            }'
           passing xmltype(a.mdmxml)
           columns
             MASTER_SPEC_DESCRIPTION varchar2(30) path 'MASTER_SPEC_DESCRIPTION'
           , ATTRIBUTE               varchar2(30) path 'ALLERGEN_ATTRIBUTE'
           , LEVEL_OF_CONTAINMENT    varchar2(30) path 'ALLERGEN_LEVEL_OF_CONTAINMENT'
           , SPECIFICATION_AGENCY    varchar2(30) path 'ALLERGEN_SPECIFICATION_AGENCY'
           , SPECIFICATION_NAME      varchar2(80) path 'ALLERGEN_SPECIFICATION_NAME'
         ) x
    ;

    MASTER_SPEC_DESCRIPTION        ATTRIBUTE    LEVEL_OF_CONTAINMENT  SPECIFICATION_AGENCY        SPECIFICATION_NAME
    ------------------------------ ------------ --------------------- --------------------------- ------------------------------------------------
    CORNNUTS JALAPENO CHEDDAR 11.3 CASHEW NUTS
    CORNNUTS JALAPENO CHEDDAR 11.3 MILK         CONTAINS              Health Canada and the CFIA  Food and Drug B.01.010.1, B.01.010.2, B.01.010.3
    CORNNUTS JALAPENO CHEDDAR 11.3 SOY          CONTAINS              Health Canada and the CFIA  Food and Drug B.01.010.1, B.01.010.2, B.01.010.3
    CORNNUTS JALAPENO CHEDDAR 11.3 TREE_NUTS

  • Get a CLOB from dynamic SQL in PL/SQL?

    Hello

    I need to call a function that returns a clob, but the name of this function is dynamic. That is, I get a user input, and which function to call depends on it.

    The problem is that this function also performs DML, so I can't just do

      EXECUTE IMMEDIATE 'select function_name_'||version||' from dual'
    
    

    And I don't like the idea of

    if .. then return function_nam_1
    elsif .. then return function_nam_2
    elsif .. then return function_nam_3
    elsif .. then ...
    
    

    So I am thinking of using the DBMS_SQL package.

    But what would this part look like exactly?

    create or replace function  execute_fkn_test(version number, p_input varchar2) return CLOB is
    -- does the SQL string need a change?
      v_sql varchar2(1000) := ' declare v_return_clob clob; begin v_return_clob:= test_function_'||version||'(p_param => '||p_input ||' ); end;';
      v_dummy INTEGER;
      v_result CLOB;
    BEGIN 
      EXECUTE IMMEDIATE v_sql;
      -- what to write here to get v_return_clob from dynamic SQL into v_result?
    return v_result;
    END;
    
    

    Good bye

    DPT

    I don't know the reason for this design, but to answer your question from an academic point of view: you should use bind variables, like this:

    create or replace function execute_fkn_test (version number, p_input varchar2) return clob
    is
      v_sql    varchar2(1000) := 'begin :1 := test_function_' || to_char(version) || '(p_param => :2); end;';
      v_result clob;
    begin
      execute immediate v_sql using out v_result, in p_input;
      return v_result;
    end;

  • Getting a string from a CLOB column

    Hi team,

    I have a problem: I have a table that contains a CLOB column, from which I need to extract exactly one word/string. I'll give an example of how it looks below.

    Say table T_OBJ has a WORK_LOG column that is a CLOB.

    The CLOB data is huge, but I need to extract the string containing the name that falls exactly after a given word.

    Create table T_OBJ (SNO number, WORK_LOG clob);

    Insert into T_OBJ values (1, '1263636000 AR_ESCALATOR. Modified By XXXXXXXXX XXXX. This is Auto Closed. Change was closed');

    That is what the sample data looks like. What I also found is that there are delimiters in the text; a delimiter comes exactly after the name, i.e. XXXXXXXX XXXX(delimiter).

    My output should be just the name, i.e. XXXXXXXX XXXX, which falls immediately after the text "Modified By".

    I tried this:

    Select dbms_lob.substr(WORK_LOG, 50, dbms_lob.instr(WORK_LOG, 'Modified By'))
      from T_OBJ;

    Can someone please help me? I'm using Oracle 11g.

    Hello

    Check if the query below matches your needs.

    SELECT dbms_lob.substr(WORK_LOG
                          ,((dbms_lob.instr(WORK_LOG,'.',dbms_lob.instr(WORK_LOG,'Modified By ')))-    --- get position of DELIMITER AFTER MODIFIED BY
                           (dbms_lob.instr(WORK_LOG,'Modified By ')+length('Modified By ')))           --- get position of MODIFIED BY
                          ,dbms_lob.instr(WORK_LOG,'Modified By ')+length('Modified By '))             --- get position of MODIFIED BY
    FROM t_obj;
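
    An alternative sketch, on the same assumption that the name ends at the next '.' delimiter (Oracle 11g regexp functions accept CLOB input), uses a capture group instead of position arithmetic:

    SELECT regexp_substr(WORK_LOG, 'Modified By ([^.]+)', 1, 1, NULL, 1) AS modified_by_name
    FROM   t_obj;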
    
  • Getting "string literal too long" when using a CLOB

    Hi, I am using Oracle 11.2.0.3 on Windows 2003 R2, and I have a procedure to extract various bits of XML and GML from an XMLType column. The GML for a record can be one or more geometries, and I need to derive a single SDO_GEOMETRY from them and convert it to WGS84. I created a function called MULTI_GML_TO_SDOGEOM into which I pass my GML as a CLOB. I can append it to a SQL_STMT variable which is also a CLOB. While processing 10,000 records this function worked fine for 8,000 of them, but then failed when it hit a record with more than 4,000 characters of GML (seven geometries), with an ORA-01704 string literal too long error. I debugged through each line of the function and found it fails on the open cursor statement
    OPEN c_geoms FOR sql_stmt;
    I don't understand why I get this error: the total length of sql_stmt for the failing record is only about 7,500 characters, and I am using a CLOB, which should be able to handle that length. I don't know whether I have used the CLOB incorrectly, or whether I need to use something from the DBMS_LOB package, but I can't find any decent examples and I don't really understand why it does not work in any case.
    Here's the function:
    CREATE OR REPLACE FUNCTION MULTI_GML_TO_SDOGEOM (
       geometry_components IN CLOB)
       RETURN sdo_geometry
    IS
    v_count             NUMBER;
    v_gml               XMLType;
    v_gml_rec           XMLType;
    v_gml_clob          CLOB;
    v_gml_clob_rec      CLOB;
    sql_stmt            CLOB;
    v_sdogeom           SDO_GEOMETRY;
    v_sdogeom_all       SDO_GEOMETRY;
    varray_sdogeom      SDO_GEOMETRY_ARRAY;
    
    
    TYPE t_ref_cursor  IS REF CURSOR;
    c_geoms         t_ref_cursor;
    
    BEGIN
    
    varray_sdogeom := SDO_GEOMETRY_ARRAY();
    
    IF geometry_components is not null THEN
    
      v_gml := XMLType ('<GeometryComponents xmlns:gml="http://www.opengis.net/gml/3.2">'||geometry_components||'</GeometryComponents>');
    
      v_gml_clob := v_gml.getClobVal();
    
      SELECT count(*) INTO v_count FROM XMLTable ('declare namespace gml="http://www.opengis.net/gml/3.2"; (: :)
                                                 //polygon' PASSING v_gml);
    
      If v_count > 0 THEN
    
        sql_stmt := 'WITH gml_input AS (SELECT XMLType ('''||v_gml_clob||''') as gmldata from dual)
                     select poly.spatial_location from gml_input,
                                                     xmltable (xmlnamespaces (''http://www.opengis.net/gml/3.2'' as "gml"),
                                                              ''GeometryComponents/polygon/gml:Polygon''
                                                               PASSING gmldata
                                                               COLUMNS
                                                               spatial_location XMLTYPE PATH ''//gml:Polygon'') poly
                     UNION ALL
                     select point.spatial_location from gml_input,
                                                     xmltable (xmlnamespaces (''http://www.opengis.net/gml/3.2'' as "gml"),
                                                              ''GeometryComponents/polygon/gml:Point''
                                                               PASSING gmldata
                                                               COLUMNS
                                                               spatial_location XMLTYPE PATH ''//gml:Point'') point';
    --    dbms_output.put_line (sql_stmt);
    
    
        OPEN c_geoms FOR sql_stmt;
    
        LOOP
    
          FETCH c_geoms INTO v_gml_rec;
          EXIT WHEN c_geoms%NOTFOUND;
    
          v_gml_clob_rec := v_gml_rec.getClobVal;
    
          sql_stmt := 'SELECT SDO_CS.TRANSFORM(SDO_UTIL.FROM_GML311GEOMETRY ('''||v_gml_clob_rec||'''), 8307) FROM dual';
    
          EXECUTE IMMEDIATE sql_stmt INTO v_sdogeom;
     
          varray_sdogeom.EXTEND;
    
          varray_sdogeom(varray_sdogeom.COUNT) := v_sdogeom;
    
        END LOOP;   -- c_geoms fetch
    
        CLOSE c_geoms;
    
        select SDO_AGGR_SET_UNION(varray_sdogeom, 0.005) INTO v_sdogeom_all from dual;
    
      END IF;  -- v_count > 0
    
    RETURN v_sdogeom_all;
    
    END IF;
    
    END MULTI_GML_TO_SDOGEOM;
    /
    
    show errors
    Unfortunately I can't include the data I'm processing as it is classified, but here's a dummy sample of the type of GML I'm parsing, although this one is short enough to work:
    <GeometryComponents xmlns:gml="http://www.opengis.net/gml/3.2">
    <polygon xmlns:gmd="http://www.isotc211.org/2005/gmd" xmlns:srv="http://www.isotc211.org/2005/srv"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:gco="http://www.isotc211.org/2005/gco" xmlns:gml="http://www.opengis.net/gml/3.2" xmlns:mgmp="http://www.mod.uk/mgmp" xmlns:smr="http://www.mod.uk/smr"
    xmlns:xlink="http://www.w3.org/1999/xlink"><gml:Polygon gml:id="bp2" srsName="EPSG:4326">
    <gml:exterior>
    <gml:LinearRing>
    <gml:posList srsDimension="2">175 -40 176 -40 176 -39 175 -39 175 -40</gml:posList>
    </gml:LinearRing>
    </gml:exterior>
    </gml:Polygon>
    </polygon>
    </GeometryComponents>
    And although this function is normally called from a procedure, here is a call from dual:
     select MULTI_GML_TO_SDOGEOM ('<GeometryComponents xmlns:gml="http://www.opengis.net/gml/3.2">
    <polygon xmlns:gmd="http://www.isotc211.org/2005/gmd" xmlns:srv="http://www.isotc211.org/2005/srv"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:gco="http://www.isotc211.org/2005/gco" xmlns:gml="http://www.opengis.net/gml/3.2" xmlns:mgmp="http://www.mod.uk/mgmp" xmlns:smr="http://www.mod.uk/smr"
    xmlns:xlink="http://www.w3.org/1999/xlink"><gml:Polygon gml:id="bp2" srsName="EPSG:4326">
    <gml:exterior>
    <gml:LinearRing>
    <gml:posList srsDimension="2">175 -40 176 -40 176 -39 175 -39 175 -40</gml:posList>
    </gml:LinearRing>
    </gml:exterior>
    </gml:Polygon>
    </polygon>
    </GeometryComponents>') from dual;
    Thanks in advance.

    Hello

    I don't see why you are using dynamic SQL here.
    As mentioned, you are doing a lot of wrong and unnecessary things, the first being not using bind variables.
    I also see a lot of serialization/construction of XMLType, which only adds load.
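
    Just to illustrate the bind-variable point on the statement that actually raises ORA-01704 (a sketch only, not the recommended rewrite): binding the GML instead of concatenating it keeps the value out of the SQL text, so the 4000-byte string literal limit no longer applies.

    -- inside the fetch loop, instead of concatenating v_gml_clob_rec into sql_stmt:
    sql_stmt := 'SELECT SDO_CS.TRANSFORM(SDO_UTIL.FROM_GML311GEOMETRY(:gml), 8307) FROM dual';
    EXECUTE IMMEDIATE sql_stmt INTO v_sdogeom USING v_gml_clob_rec;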

    Basically, the function can be simplified to:

    create or replace function multi_gml_to_sdogeom (
      geometry_components in clob
    )
    return sdo_geometry
    is
    
      v_sdogeom_all       SDO_GEOMETRY;
    
    begin
    
      select SDO_AGGR_SET_UNION(
               cast(
                 collect(
                   SDO_CS.TRANSFORM(SDO_UTIL.FROM_GML311GEOMETRY(spatial_location), 8307)
                 )
                 as sdo_geometry_array
               )
             , .005
             )
      into v_sdogeom_all
      from (
        select xmlserialize(content x.column_value) as spatial_location
        from xmltable(
               xmlnamespaces ('http://www.opengis.net/gml/3.2' as "gml")
             , '/GeometryComponents/polygon/(gml:Polygon|gml:Point)'
               passing xmlparse(document geometry_components)
             ) x
      ) ;
    
      return v_sdogeom_all;
    
    end;
    
    SQL> -- sample run against the dummy GML above (a gml:Polygon with posList
    SQL> -- "175 -40 176 -40 176 -39 175 -39 175 -40" plus a gml:Point "45.67, 88.56";
    SQL> -- the XML markup itself was lost in the post, only these fragments remain)
    SQL> select multi_gml_to_sdogeom('...')
      2  from dual ;

    MULTI_GML_TO_SDOGEOM('...
    

    Published by: odie_63 on 8 Jan. 2013 18:02

  • CLOB not getting saved

    Dear all,
    I have the following code. I'm trying to save the CLOB data, but it does not get saved.
    BindingContext cntx = BindingContext.getCurrent();
    JUApplication app =
        (JUApplication) cntx.get("AppModule_OutgoingEmailDataControl");
    ApplicationModuleImpl appimpl =
        (ApplicationModuleImpl) app.getApplicationModule();

    OutgoingEmail_VOImpl obj =
        (OutgoingEmail_VOImpl) appimpl.findViewObject("OutgoingEmail_VO1");
    OutgoingEmail_VORowImpl row = (OutgoingEmail_VORowImpl) obj.createRow();
    .
    .
    .
    row.setBody(new ClobDomain(nvl(getTxtBody().getValue())));

    where getTxtBody() returns the instance of the RichTextEditor.

    In the doDML() method of the EOImpl I call the following stored procedure:
    String sql =
        "{call pkg_template_mgmt_system_new.p_Add_Email(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)}";
    CallableStatement cs =
        getDBTransaction().createCallableStatement(sql, 1);
    cs.setString("I_FROM_EMAILID", getFromId());
    .
    .
    .
    cs.setClob("I_BODY", (CLOB) getBody().getData());

    Any ideas?

    Let me see... in the most recent JDBC versions, I think you can just build a String from the ClobDomain's character data and use setString() on the statement.
    The driver will handle the CLOB for you.
    You can also do it explicitly: use the Connection interface to create a Clob, then fill it with the data.

    Sascha

  • How to get the value of a particular tag from a (CLOB) column

    Hi friends,

    How do I get the value of a particular tag from a CLOB column which holds an XML value?

    Thanks in advance.

    Kind regards
    Sirot Chauvet
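
    A typical approach - sketched here with a hypothetical table MY_TAB whose CLOB column XML_COL holds something like <root><name>abc</name></root> - is to wrap the CLOB in XMLTYPE:

    -- single value
    select xmltype(t.xml_col).extract('/root/name/text()').getStringVal() as name_value
      from my_tab t;

    -- repeating tags, one row per occurrence
    select x.name_value
      from my_tab t,
           xmltable('/root/name'
                    passing xmltype(t.xml_col)
                    columns name_value varchar2(100) path '.') x;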

    You are welcome. If this answered your question, please mark it as such.

  • Getting ORA-22835: buffer too small for CLOB to CHAR or BLOB to RAW conversion in my ODI interface

    Hi all

    I am creating an ODI interface as explained below.

    I have a table A with a column 'Comments' and another table B with a column 'No Comments'. I create a view as shown below and use it as the source for my interface.

    SELECT * FROM WC_EQM_CSAP_FS_TMP WHERE TO_CHAR(QUESTION_COMMENT) IN (SELECT NO_COMMENTS FROM WC_EQM_NO_COMMENTS_LIST);

    But my interface fails with the error below:

    ORA-22835: Buffer too small for CLOB to CHAR or BLOB to RAW conversion (actual: 4108, maximum: 4000)

    22835. 00000 - "Buffer too small for CLOB to CHAR or BLOB to RAW conversion (actual: %s, maximum: %s)"

    *Cause:    An attempt was made to convert CLOB to CHAR or BLOB to RAW, where
               the LOB size was bigger than the buffer limit for CHAR and RAW types.
               Note that widths are reported in characters if character length
               semantics are in effect for the column, otherwise widths are
               reported in bytes.

    Here QUESTION_COMMENT is of data type CLOB. NO_COMMENTS is also a CLOB. The maximum length of the QUESTION_COMMENT data is 11710.

    Please tell me how to solve this.

    Thank you

    Dany

    IMHO it is not good to compare CLOBs; it may be possible to use a primary key for the comments instead.

    Do QUESTION_COMMENT and NO_COMMENTS have a primary key?

    Maybe the best solution is to create a vocabulary table of comments (if it does not already exist) and then compare by id.

    You can also simply use a left join and check for null in a CASE statement.

    Try this, but I'm not sure about it... as I said before, I don't have an Oracle DB to test on:

    SELECT INTEGRATION_ID, DATASOURCE_NUM_ID, QUESTIONNAIRE_ID, QUESTION_ID,
           CLIENT_ID, SITE_ID, CLIENT_REP, COMMITMENT, ENGAGEMENT_TYPE, TOP_CLIENT_FLG,
           INITIAL_PLANNED_DT, REVISED_PLANNED_DT, INTERVIEW_DT, RECEIVED_DT, QUESTION_SCORE,
           (CASE WHEN EXISTS (SELECT T2.NO_COMMENTS
                                FROM WC_EQM_NO_COMMENTS_LIST T2
                               WHERE DBMS_LOB.COMPARE(TMP.QUESTION_COMMENT, T2.NO_COMMENTS) = 0)
                 THEN NULL
                 ELSE QUESTION_COMMENT
            END) AS QUESTION_COMMENT,
           (CASE WHEN QUESTION_SCORE <= 5 THEN 1 ELSE 0 END),
           0 AS QWOCOMMENTS,
           CREATED_ON_DT, CHANGED_ON_DT
      FROM WC_EQM_CSAP_FS_TMP TMP

    P.S. Do not use TO_CHAR(LOB) - there is a possible error with it.

  • Get a number from a CLOB

    Hello

    I would like to search for specific text such as "AIIIMS treats # years old male patient" in the CLOB data, and from that text I have to extract the numeric data (i.e. #).

    The sex of the patient can be male or female.

    for example

    For ID 101, the CLOB contains text such as "AIIIMS treats 32 years old male patient", and I want to output 32.

    with master as
    (
      select 101 as id,
             'The ACR and ARHP have prepared information for patients on the many rheumatic diseases and conditions.
              AIIIMS treats 32 years old male patient. AIIMS studies patient age of 12 years;
              Patient suffered from asthma at the age of 21 years.' as clob from dual union all
      select 102,
             'Vasculitis is a term for a group of rare diseases
              which have in common the inflammation of the blood vessels.
              AIIIMS treats 42 years old patient. The
              patient dated 23 June is completely cured' from dual union all
      select 103,
             'The genetic factors (different genes)
              seem to be more or less important in the disease.
              AIIIMS does not treat patients aged 76 years. I
              think you should take risks' from dual
    )
    select * from master

    Output:

    ID   AGE
    101  32
    102  42
    103

    with master as
    (
      select 101 as id,
             'The ACR and ARHP have prepared information for patients on the many rheumatic diseases and conditions.
              AIIIMS treats 32 years old male patient. AIIMS studies patient age of 12 years;
              Patient suffered from asthma at the age of 21 years.' as clob from dual union all
      select 102,
             'Vasculitis is a term for a group of rare diseases
              which have in common the inflammation of the blood vessels.
              AIIIMS treats 42 years old patient. The
              patient dated 23 June is completely cured' from dual union all
      select 103,
             'The genetic factors (different genes)
              seem to be more or less important in the disease.
              AIIIMS does not treat patients aged 76 years. I
              think you should take risks' from dual union all
      select 104,
             'Genetic factors (different genes - 42 {TEST: number before the search string})
              seem to be more or less important in the disease.
              AIIIMS treats 76 years old patient. I
              think you should not take risks' from dual
    )

    select id,
           case when instr(clob, 'AIIIMS treats') > 0
                then regexp_substr(clob, '\d+', instr(clob, 'AIIIMS treats'), 1)
           end as age
      from master


    ID AGE
    101 32
    102 42
    103 -
    104 76

    Regards

    Etbin

  • Getting the table script using dbms_metadata.get_ddl for a table with a CLOB field

    So, Oracle 11g R2...
    I use dbms_metadata.get_ddl for table scripts and it works fine...

    Now I have a table with a CLOB field, and it does not work... I get an error "missing right parenthesis (ORA-00907)"...
    I could paste the script I got, but I don't think it would make much sense here...

    Does anyone have experience using this package on tables with CLOB columns?


    TNX

    See this code.

    DECLARE
      myddl clob;
      PROCEDURE print_clob(p_clob in clob) as
        l_offset number default 1;
      BEGIN
        loop
          exit when l_offset > dbms_lob.getlength(p_clob);
          dbms_output.put_line(dbms_lob.substr(p_clob, 255, l_offset));
          l_offset := l_offset + 255;
        end loop;
      END print_clob;
      FUNCTION get_metadata return clob is
        h   number;
        th  number;
        doc clob;
      BEGIN
        h := dbms_metadata.open('TABLE');
        dbms_metadata.set_filter(h, 'SCHEMA', 'HR');
        dbms_metadata.set_filter(h, 'NAME', 'EMPLOYEES');
        th := dbms_metadata.add_transform(h, 'MODIFY');
        th := dbms_metadata.add_transform(h, 'DDL');
        --dbms_metadata.set_transform_param(th,'SEGMENT_ATTRIBUTES',false);
        doc := dbms_metadata.fetch_clob(h);
        dbms_metadata.CLOSE(h);
        return doc;
      END get_metadata;
    BEGIN
      myddl := get_metadata;
      print_clob(myddl);
    END;
    

    I took this print_clob procedure from the documentation.
    Note the use of the SQL*Plus LONG setting: in the first example the output is truncated.

    SQL> SELECT dbms_metadata.get_ddl('TABLE','EMP','SCOTT') FROM dual;
    
    DBMS_METADATA.GET_DDL('TABLE','EMP','SCOTT')
    --------------------------------------------------------------------------------
    
      CREATE TABLE "SCOTT"."EMP"
       (    "EMPNO" NUMBER(4,0),
            "ENAME" VARCHAR2(10),
    
    SQL> set long 10000
    SQL> /
    
    DBMS_METADATA.GET_DDL('TABLE','EMP','SCOTT')
    --------------------------------------------------------------------------------
    
      CREATE TABLE "SCOTT"."EMP"
       (    "EMPNO" NUMBER(4,0),
            "ENAME" VARCHAR2(10),
            "JOB" VARCHAR2(9),
            "MGR" NUMBER(4,0),
            "HIREDATE" DATE,
            "SAL" NUMBER(7,2),
            "COMM" NUMBER(7,2),
            "DEPTNO" NUMBER(2,0),
             CONSTRAINT "PK_EMP" PRIMARY KEY ("EMPNO")
    
    DBMS_METADATA.GET_DDL('TABLE','EMP','SCOTT')
    --------------------------------------------------------------------------------
      USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS"  ENABLE,
             CONSTRAINT "FK_DEPTNO" FOREIGN KEY ("DEPTNO")
              REFERENCES "SCOTT"."DEPT" ("DEPTNO") ENABLE
       ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS"
    
    SQL> SET LINESIZE 132
    SQL> SET pagesize 0
    SQL> SET LONG 1000000
    SQL> /
    
      CREATE TABLE "SCOTT"."EMP"
       (    "EMPNO" NUMBER(4,0),
            "ENAME" VARCHAR2(10),
            "JOB" VARCHAR2(9),
            "MGR" NUMBER(4,0),
            "HIREDATE" DATE,
            "SAL" NUMBER(7,2),
            "COMM" NUMBER(7,2),
            "DEPTNO" NUMBER(2,0),
             CONSTRAINT "PK_EMP" PRIMARY KEY ("EMPNO")
      USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS"  ENABLE,
             CONSTRAINT "FK_DEPTNO" FOREIGN KEY ("DEPTNO")
              REFERENCES "SCOTT"."DEPT" ("DEPTNO") ENABLE
       ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS"
    
    SQL>
    
  • Qosmio G20 - fan gets loud, then it shuts down

    Hello.
    The fan on my G20 seems much noisier than when I got it; it runs very hard and then the machine stops.
    It gets very hot just to the left of the right speaker. I used to briefly get all sorts of errors onscreen ("failed to load powrprof.dll" was one of many).
    Then I would get the dreaded blue screen and it would just shut down. Now I get no message, just a noisy fan all the time, then it sounds like the power is cut before it stops. I would really appreciate it if someone could point me in the right direction.
    Thank you very much.

    Hello

    First of all, you should clean your ventilation system to remove dust that might clog the heatpipe system and the processor fan.
    Just get a can of compressed air (available in any electronics shop) and gently give some short blasts into the ventilation holes on the side/rear/bottom of your machine.

    If this does not help, I suggest you take your machine to a technician, or at least to an authorized Toshiba service partner. They can do a hardware check and maybe clean your cooling system.

    If you need a link to find the one closest to your country, then check this one out:

    http://EU.computers.Toshiba-Europe.com/cgi-bin/ToshibaCSG/generic_content.jsp?service=EU&ID=ASP_SUPPORT

    Greetings

  • ORA-22835: buffer too small for CLOB to CHAR or BLOB to RAW conversion (actual: 22960, maximum: 2000)

    Hi all

    I have the SQL below which returns the error in question:

    WITH c_file_imp_data
             
              AS
              (select t_data.src
                from sni_ar_file_import_blob t,
                    xmltable('/a/b'
                        passing xmltype('<a><b>'||replace(UTL_RAW.CAST_TO_VARCHAR2(t.file_blob),chr(10),'</b><b>')||'</b></a>')
                        columns src varchar2(4000) path '.'
                       ) t_data  )
             
             SELECT
             -- regexp_substr finds the position of the n-th occurrence of a tab delimiter
             -- the date format mask also recognizes masks without '-', e.g. 01feb2016 works as well as 01-Feb-2016
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 1), chr(9))), --RECORD_ID
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 2), chr(9))), --BILL_TO_CUSTOMER_NUMBER,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 3), chr(9))), -- BILL_TO_LOCATION,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 4), chr(9))), -- SHIP_TO_CUSTOMER_NUMBER,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 5), chr(9))), -- SHIP_TO_LOCATION,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 6), chr(9)), -- CURRENCY,
              to_date(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 7), chr(9)),
                      'DD-MON-YYYY'), -- GL_DATE,
              to_date(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 8), chr(9)),
                      'DD-MON-YYYY'), -- TRANSACTION_DATE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 9), chr(9)), -- TRANSACTION_TYPE_NAME,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 10), chr(9)), -- TRANSACTION_SOURCE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 11), chr(9)), -- TERMS,
              to_date(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 12), chr(9)),
                      'DD-MON-YYYY'), -- DUE_DATE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 13), chr(9)), -- PAYMENT_METHOD,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 14), chr(9))), -- SALESREP_NUMBER,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 15), chr(9)), -- ITEM,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 16), chr(9)), -- DESCRIPTION,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 17), chr(9))), -- QUANTITY,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 18), chr(9))), -- UNIT_SELLING_PRICE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 19), chr(9)), -- UNIT_OF_MEASURE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 20), chr(9)), -- WAREHOUSE,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 21), chr(9))), -- TAX_RATE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 22), chr(9)), -- TAX_CODE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 23), chr(9)), -- GL_ACCOUNT_STRING,
              to_number(rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 24), chr(9))), -- CURRENCY_EXCHANGE_RATE,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 25), chr(9)), -- LINE_TRX_DFF_CONTEXT_VAL,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 26), chr(9)), -- LINE_TRANSACTION_FIELD1,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 27), chr(9)), -- LINE_TRANSACTION_FIELD2,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 28), chr(9)), -- LINE_TRANSACTION_FIELD3,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 29), chr(9)), -- LINE_TRANSACTION_FIELD4,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 30), chr(9)), -- LINE_TRANSACTION_FIELD5,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 31), chr(9)), -- LINE_TRANSACTION_FIELD6,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 32), chr(9)), -- LINE_TRANSACTION_FIELD7,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 33), chr(9)), -- LINE_TRANSACTION_FIELD8,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 34), chr(9)), -- LINE_TRANSACTION_FIELD9,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 35), chr(9)), -- LINE_TRANSACTION_FIELD10,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 36), chr(9)), -- LINE_TRANSACTION_FIELD11,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 37), chr(9)), -- LINE_TRANSACTION_FIELD12,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 38), chr(9)), -- LINE_TRANSACTION_FIELD13,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 39), chr(9)), -- LINE_TRANSACTION_FIELD14,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 40), chr(9)), -- LINE_TRANSACTION_FIELD15,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 41), chr(9)), -- INV_LINE_INFO_DFF_CONT_VAL,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 42), chr(9)), -- NO_ANIMALS,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 43), chr(9)), -- MX_WEIGHT,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 44), chr(9)), -- MX_SLAUGHTER,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 45), chr(9)), -- REBILLED,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 46), chr(9)), -- NON_STAT,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 47), chr(9)), -- ATTRIBUTE12,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 48), chr(9)), -- ATTRIBUTE13,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 49), chr(9)), -- ATTRIBUTE14,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 50), chr(9)), -- ATTRIBUTE15,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 51), chr(9)), -- ATTRIBUTE8,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 52), chr(9)), -- ATTRIBUTE11,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 53), chr(9)), -- ATTRIBUTE2,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 54), chr(9)), -- ATTRIBUTE3,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 55), chr(9)), -- ATTRIBUTE4,
              rtrim(regexp_substr(src, '[^' || chr(9) || ']*' || chr(9) || '', 1, 56), chr(9)) -- ATTRIBUTE9
                      
               FROM c_file_imp_data
               ;
    
    
    
    
    
    

    The source table sni_ar_file_import_blob has the FILE_BLOB column of BLOB data type. The data inside is tab delimited and, as we can see, over the limit.

    Does anyone know how to get at this data and select it as stated above?

    Oracle version: 12c

    Kind regards

    Alex

    Hello

    You must use dbms_lob.writeappend

    There is an example here:

    https://Doganay.WordPress.com/2016/02/10/BLOB-to-CLOB/
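
    For reference, a minimal sketch of that dbms_lob.writeappend approach (the function name and chunk size are illustrative; it assumes single-byte data, which is fine for a tab-delimited ASCII file - for multi-byte data dbms_lob.converttoclob is safer). The resulting CLOB can then be fed to the tab-splitting query instead of calling UTL_RAW.CAST_TO_VARCHAR2 on the whole BLOB:

    create or replace function blob_to_clob (p_blob in blob) return clob
    is
      l_clob   clob;
      l_chunk  varchar2(32767);
      l_amount pls_integer := 8000;   -- bytes per chunk, safely below the RAW/VARCHAR2 limits
      l_offset pls_integer := 1;
      l_len    pls_integer := dbms_lob.getlength(p_blob);
    begin
      dbms_lob.createtemporary(l_clob, true);
      while l_offset <= l_len loop
        l_chunk  := utl_raw.cast_to_varchar2(dbms_lob.substr(p_blob, l_amount, l_offset));
        dbms_lob.writeappend(l_clob, length(l_chunk), l_chunk);
        l_offset := l_offset + l_amount;
      end loop;
      return l_clob;
    end;
    /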

  • How to force mixed case object names in a REST GET SQL service

    Hello

    ORDS version: 3.0.1.177.18.02

    Apex version: 5.0.1.00.06

    DB version: 12.1.0.2

    GlassFish 4.1 Community Edition

    I am creating a web service using method = GET and source type = query, with object names of mixed case. The web service is forcing my object names to lowercase.

    I know I can build the web service by changing the source type to PL/SQL and creating the output manually using htp.prn; mixed case works when I do that.

    But it is a huge SQL statement which, except for the forced lowercase, works exactly as required.

    Example:

    Create a simple test web service, type GET, source type query:

    select sysdate as "currentDate" from dual
    
    

    The result will be

    {"currentdate": "2015-11 - 23 T 12: 44:25Z '}

    and not the expected

    {"currentDate": "2015-11 - 23 T 12: 44:25Z '}

    Is there a way to tell the engine to keep the case and not force lowercase? apex_json is case sensitive, which makes this behavior odd.

    declare
       json   varchar2 (32767) := '{"firstName":"Olafur Tryggvason"}';
    begin
       apex_json.parse (json);
       dbms_output.put_line ('Mixed case: ' || apex_json.GET_VARCHAR2 ('firstName'));
       dbms_output.put_line ('Lowercase: ' || apex_json.GET_VARCHAR2 ('firstname'));
    end;
    
    

    Will display:

    Executed PL/SQL block

    Mixed case: Olafur Tryggvason

    Lowercase:

    Regards,

    Olafur,

    Post edited by: Olafur T - added version information

    Just got a response to an SR that I created: camelCase is not supported. The workaround is to create the handler in PL/SQL.
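
    A minimal sketch of that PL/SQL workaround (assuming an ORDS handler with source type PL/SQL; apex_json writes to the HTP buffer by default and preserves the case of the member names it is given):

    BEGIN
      apex_json.open_object;
      apex_json.write('currentDate', sysdate);   -- emitted exactly as "currentDate"
      apex_json.close_object;
    END;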

  • apex_json throws ORA-06502

    Hello

    I use APEX 5 and have to work with GeoJSON.

    If I test the following statement in SQL Developer, everything works fine:

    DECLARE
        s varchar2(32767) := '{ "a": 1, "b": ["hello", "world"]}';
    BEGIN
        apex_json.parse(s);
        sys.dbms_output.put_line('a is '||apex_json.get_varchar2(p_path => 'a'));
    END;
    

    But if I change the JSON data to the following:

    s varchar2(32767) := '{ "a": 1.1, "b": ["hello", "world"]}';
    

    I get this error:

    Error starting at line : 1 in command -
    DECLARE
        s varchar2(32767) := '{ "a": 1.1, "b": ["hello", "world"]}';
    BEGIN
        apex_json.parse(s);
        sys.dbms_output.put_line('a is '||apex_json.get_varchar2(p_path => 'a'));
    END;
    Error report -
    ORA-06502: PL/SQL: numeric or value error: character to number conversion error
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 367
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 519
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 566
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 756
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 774
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 811
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 839
    ORA-06512: at "APEX_050000.WWV_FLOW_JSON", line 852
    ORA-06512: at line 4
    06502. 00000 -  "PL/SQL: numeric or value error%s"
    *Cause:    
    *Action:
    

    The same thing happens with the get_number method:

    sys.dbms_output.put_line('a is '||apex_json.get_number(p_path => 'a'));
    

    Because I work with GeoJSON, there are decimal numbers in my JSON data that I have to deal with.

    I have not found any hint about this in the apex_json documentation.

    How can I solve this problem?

    Kind regards

    Christian

    Hello

    Your example works fine for me on 5.0.2. There were a few bugs that we fixed since 5.0. Some fixes may be available as patches, but maybe you need to upgrade to the latest version.

    Kind regards

    Christian
