Aggregating a dynamic string of more than 4000 bytes in length (Oracle 11g)

Hello,

We use Oracle 11g.

I have a problem with string concatenation results becoming too large. I looked around for a while,
and the suggestions were either custom functions (which is difficult due to restrictions in our DB environment)
or using a CLOB data type. However, I have not found a way to use a CLOB with the method below.
The number of rows I need to merge varies with each group IDNum, so it must be dynamic.

ORA-01489: result of string concatenation is too long

SELECT LISTAGG(MYBIGTEXT, ',') WITHIN GROUP (ORDER BY IDNum) FROM ourtablewithbigtext;

Thanks for the tips.
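For reference, one widely used workaround (not shown in this thread, so treat it as a sketch) is to aggregate with XMLAGG and pull the result out as a CLOB, which is not subject to the 4000-byte limit. Assuming the table and columns from the query above:

```sql
-- Sketch (assumption: aggregate MYBIGTEXT per IDNum group).
-- XMLAGG returns an XML fragment; getClobVal() extracts it as a CLOB,
-- so the result is not limited to 4000 bytes like LISTAGG.
SELECT IDNum,
       RTRIM(
         XMLAGG(XMLELEMENT(e, MYBIGTEXT || ',') ORDER BY IDNum)
           .EXTRACT('//text()')
           .getClobVal(),
         ',') AS mybigtext_list
FROM   ourtablewithbigtext
GROUP  BY IDNum;
```

One caveat: XMLELEMENT escapes XML entities such as & and < in the data, and raises ORA-31061 on characters that are illegal in XML, which is what the event 19119 experiments in the reply address.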

You have found a character in the data that is not allowed in XML documents.

See {message identifier: = 4076401}

Try this:

SQL> alter session set events = '19119 trace name context off'
/
Session altered.

SQL> select xmlelement("test",chr(0)) from dual
/
Error starting at line 3 in command:
select xmlelement("test",chr(0)) from dual
Error report:
SQL Error: ORA-31061: XDB error: special char to escaped char conversion failed.

SQL> alter session set events = '19119 trace name context forever, level 0x100000'
/
Session altered.

SQL> select xmlelement("test",chr(0)) from dual
/
XMLELEMENT("TEST",CHR(0))
--------------------------------------------------------------------------------
?                                                                   

SQL> alter session set events = '19119 trace name context forever, level 0x200000'
/
Session altered.

SQL> select xmlelement("test",chr(0)) from dual
/
XMLELEMENT("TEST",CHR(0))
--------------------------------------------------------------------------------
&#x0000;

SQL> alter session set events = '19119 trace name context forever, level 0x400000'
/
Session altered.

SQL> select xmlelement("test",chr(0)) from dual
/
XMLELEMENT("TEST",CHR(0))
--------------------------------------------------------------------------------
                                                                    

SQL> alter session set events = '19119 trace name context off'
/
Session altered.

Tags: Database

Similar Questions

  • External table - load a log file with more than 4000 bytes per column

    Hello
    I'm trying to import a log file into a database table that has a single column: txt_line.
    In this column, I'm trying to store one log entry per record. Each log entry is normally more than 4000 bytes, so in the external table it should be a CLOB.
    Below is an external table definition that works, but it truncates all entries after 4000 bytes. How is it possible to load the data directly into a CLOB column? All I've found are descriptions that load one whole file into one CLOB.
    Any help is appreciated
    Thank you



    Source file
    .. more than 4000 bytes...]] .. more than 4000 bytes...]] .. more than 4000 bytes.

    ]] is the record delimiter

    External table:

    create table TST_TABLE
    (
      txt_line varchar2(4000)
    )
    organization external
    (
      type ORACLE_LOADER
      default directory tmp_ext_tables
      access parameters (
        records delimited by ']]'
        fields (txt_line char(4000))
      )
      location ('test5.log')
    )
    reject limit 0;

    user12068228 wrote:

    I'm trying to import a log file into a database table that has a single column: txt_line.
    In this column, I'm trying to store one log entry per record. Each log entry is normally more than 4000 bytes, so in the external table it should be a CLOB.
    Below is an external table definition that works, but it truncates all entries after 4000 bytes. How is it possible to load the data directly into a CLOB column? All I've found are descriptions that load one whole file into one CLOB.
    Any help is appreciated
    ... etc ...

    And what did you expect, if you define both the source field and the target column as 4000 characters?

    Try this:

    CREATE TABLE tst_table
     (
       txt_line CLOB
     )
     ORGANIZATION EXTERNAL
     (TYPE oracle_loader
        DEFAULT DIRECTORY tmp_ext_tables
        ACCESS PARAMETERS (
           RECORDS DELIMITED BY ']]'
           FIELDS (txt_line CHAR(32000))
        )
      LOCATION ('test5.log')
     )
    REJECT LIMIT 0
    ;
    


  • Cause: java.sql.SQLException: ORA-22295: unable to bind more than 4000 bytes

    Hello

    When using Java/XML to insert, we got an error like:

    Cause: java.sql.SQLException: ORA-01461: can bind to a LONG value only for insert into a LONG column

    So we changed the column from VARCHAR2(4000) to the LONG data type:

    truncate table BAM_ACTIVITY;

    ALTER TABLE BAM_ACTIVITY MODIFY (CONTEXT LONG);


    Then we got a new error message:

    Cause: java.sql.SQLException: ORA-22295: cannot bind more than 4000 bytes of data to LOB and LONG columns in 1 statement


    I think the limit of the LONG data type is only 2 gigabytes. Do we need to change the data type of this column to
    CLOB, which can be as large as 4 GB?

    Or does anyone have experience with this kind of error?

    Thank you very much

    Roy

    Edited by: ROY123 on January 25, 2010 14:26

    Edited by: ROY123 on January 25, 2010 14:27

    LONG has been deprecated for (pardon the pun) a very long time.

    CLOB would be your best bet if you need to store more than 4000 bytes of information.

    If you need help with the insert via Java, this can provide useful information for you:

    [http://www.oracle.com/technology/sample_code/tech/java/codesnippet/jdbc/clob10g/handlingclobsinoraclejdbc10g.html]

  • Select list with a dynamic list of values whose query exceeds 4000 characters

    Hello

    I have an application where users can store SQL queries in a CLOB column. Such a query is then used to populate a dynamic select list item through an LOV. The following code returns the query for the dynamic LOV used to populate the select list. It works fine, except when the length of lv_sqlStatement exceeds 4000 characters; then the application crashes with "ORA-06502: PL/SQL: numeric or value error: character string buffer too small" when the select list item is rendered.

    Any ideas how to get around this problem? Any help is appreciated. Don't tell me to write queries shorter than 4000 characters, because I can't (it's an operational requirement).

    DECLARE
      lv_sqlStatement end_user_set.sql_statement%TYPE;
    BEGIN
      lv_sqlStatement := :P2_SQL_STATEMENT;
      RETURN 'select label, value from (' || lv_sqlStatement || ') t
              where to_char(t.value) not in
                (select value from end_user_set_member eusm
                 where eusm.EUSRSET_ID = ' || :P2_EUSRSET_ID || ')';
    END;

    I just blogged about this problem and posted a solution. See this post:

    http://www.deneskubicek.blogspot.de/2013/03/select-list-with-dynamic-lov-and-Ora.html

    Denes Kubicek
    -------------------------------------------------------------------
    http://deneskubicek.blogspot.com/
    http://www.Apress.com/9781430235125
    http://Apex.Oracle.com/pls/Apex/f?p=31517:1
    http://www.Amazon.de/Oracle-Apex-XE-Praxis/DP/3826655494
    -------------------------------------------------------------------

  • What data type should I use to store more than 4000 characters in a column?

    Hello friends,

    I am currently using the following Oracle version for my database:

    SQL> select * from v$version;

    BANNER
    --------------------------------------------------------------------------------
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0
    PL/SQL release 11.1.0.6.0

    SQL> create table clobexample (t1 clob);

    SQL> insert into clobexample values ('aaaaaaaaaaaaaaaaaaaa...');

    Error at Command Line: 2 Column: 8
    Error report:
    SQL error: ORA-01704: string literal too long
    01704 00000 - "string literal too long."
    * Cause: The string literal is longer than 4000 characters.
    * Action: Use a string literal of at most 4000 characters.
    Longer values can only be entered using bind variables.

    My question is: what data type can I use to enter more than 4000 characters into the table? I even tried with CLOB (example above), but it did not help.
    Is there another way of doing this?

    Please help me.
    Thank you in advance.
    Kind regards.

    Hello

    You can use CLOB for this, but you cannot insert the literal directly; you need to use PL/SQL.

    Try the method mentioned in this link.

    http://www.orafaq.com/Forum/t/48485/0/
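    The method in that link boils down to using PL/SQL, where the 4000-byte limit on SQL string literals does not apply. A minimal sketch, assuming the clobexample table from above (the 10,000-character value is just for illustration):

    ```sql
    DECLARE
      v_text CLOB;
    BEGIN
      -- Build a value longer than 4000 characters in a PL/SQL CLOB variable.
      v_text := RPAD(TO_CLOB('a'), 10000, 'a');
      -- Inserting via a variable (a bind), not a literal, avoids ORA-01704.
      INSERT INTO clobexample VALUES (v_text);
      COMMIT;
    END;
    /
    ```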

    Cheers,

    VT

  • Passing a parameter of more than 4000 characters

    A Java layer calls a stored procedure, EXEC_DDL, that runs EXECUTE IMMEDIATE on the VARCHAR2 string parameter passed in. When the Java layer passes a DDL statement of more than 4000 characters, we get a "cannot bind" error for it... What data type can I use for the parameter inside the stored EXEC_DDL procedure instead of VARCHAR2? Steven Feuerstein advises against using LONG in his PL/SQL book. I don't want to deal with objects inside or outside the database; I just want the Java code to pass a DDL string greater than 4000 characters. I want to use the simplest approach.

    Thank you in anticipation

    What version of Oracle are you using? One thing: you cannot bind a variable of more than 4,000 characters from SQL, so your best option is rather to pass a CLOB. If you're on 11g, you can pass a CLOB to an EXECUTE IMMEDIATE statement directly; otherwise you will need to copy the CLOB into a VARCHAR2 variable in the procedure, and then pass that variable to EXECUTE IMMEDIATE. In PL/SQL, you can store up to 32K in a VARCHAR2 variable.
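    As a sketch of the two options described, using the EXEC_DDL procedure name from the question (untested, so treat the details as assumptions):

    ```sql
    -- 11g and later: EXECUTE IMMEDIATE accepts a CLOB directly.
    CREATE OR REPLACE PROCEDURE exec_ddl (p_ddl IN CLOB) AS
    BEGIN
      EXECUTE IMMEDIATE p_ddl;
    END;
    /

    -- Before 11g: copy the CLOB into a PL/SQL VARCHAR2 (up to 32767 bytes)
    -- and run that instead.
    CREATE OR REPLACE PROCEDURE exec_ddl (p_ddl IN CLOB) AS
      v_stmt VARCHAR2(32767);
    BEGIN
      v_stmt := DBMS_LOB.SUBSTR(p_ddl, 32767, 1);
      EXECUTE IMMEDIATE v_stmt;
    END;
    /
    ```

    From JDBC, the Java layer would then presumably bind the statement text as a java.sql.Clob (e.g. via setClob) when calling the procedure.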

  • Cisco Nexus 6004 switch drops IP packets larger than 300 bytes

    Hi all

    We have a 10G Layer 3 wave circuit running between two Cisco Nexus 6004 switches. The circuit came online between our two data centers (in the same city) without problems.

    When attempting to ping the remote end's 10G interface, it works fine with 64-byte packets. CDP is enabled and we see the remote switch's CDP information. However, if we increase the ping size beyond 300 bytes, we lose 1 in every 20 packets.

    We verified the MTU settings, the 10G cable type, and the duplex settings.

    Level 3 tested the circuit clean, and we will move forward with more testing.

    Any ideas on the problem? We sent the carrier out to test end to end with their testers. But so far the circuit tests clean. I was not sure if it is something related to Cisco. I am at a loss at the moment.

    Thank you.

    Mike

    Hi Mike,

    There is nothing wrong with your switches or circuit.  Nexus devices have a default CoPP (control plane policing) policy that limits the size and amount of traffic that the CPU has to process.

    http://www.Cisco.com/en/us/docs/switches/Datacenter/SW/6_x/NX-OS/security/configuration/guide/b_Cisco_Nexus_7000_NX-OS_Security_Configuration_Guide__Release_6.x_chapter_011001.html

    HTH

  • Ping fails with more than 76 bytes on a Codian 4510

    Hi all

    While troubleshooting, I stumbled across what seems to be strange behavior on the Codian 4510. In two separate environments, I noticed that pinging the device with a data size of more than 76 bytes results in timeouts, as shown below. Does anyone know why this happens?

    localhost:~ jason$ ping -s 76 10.2.0.208

    PING 10.2.0.208 (10.2.0.208): 76 data bytes

    84 bytes from 10.2.0.208: icmp_seq=0 ttl=254 time=3.642 ms

    84 bytes from 10.2.0.208: icmp_seq=1 ttl=254 time=3.579 ms

    ^C

    --- 10.2.0.208 ping statistics ---

    2 packets transmitted, 2 packets received, 0.0% packet loss

    round-trip min/avg/max/stddev = 3.579/3.611/3.642/0.031 ms

    localhost:~ jason$ ping -s 77 10.2.0.208

    PING 10.2.0.208 (10.2.0.208): 77 data bytes

    Request timeout for icmp_seq 0

    Request timeout for icmp_seq 1

    ^C

    --- 10.2.0.208 ping statistics ---

    3 packets transmitted, 0 packets received, 100.0% packet loss

    This is normal behavior, to avoid problems caused by overly large ICMP packets.

  • Photo dimensions of more than 4000 x 4000 pixels

    Hello, I have a question for the staff of the Pixel Bender Photoshop plug-in. Is there a realistic chance, in the near future, of making the Pixel Bender Photoshop plug-in capable of applying filters to photos larger than 4000 x 4000 pixels? Most professional or semi-professional DSLRs today take photos with larger dimensions.

    I am going to buy the new Sony A65, and it will take pictures with the following dimensions:

    Still image 16:9 - L: 6000 x 3376 (20 M), M: 4240 x 2400 (10 M), S: 3008 x 1688 (5.1 M)
    Still image 3:2 - L: 6000 x 4000 (24 M), M: 4240 x 2832 (12 M), S: 3008 x 2000 (6 M)

    I don't think the solution for users of such modern DSLRs can be to take only small pictures, or to reduce the image dimensions in Photoshop to a maximum of 4000 pixels. I bought the Pixel Bender filter "PSKiss Edge Gear" to sharpen my photos (on the GPU of the graphics card), but it doesn't work on photos larger than 4000 pixels (I tested this on photos with different dimensions). I had e-mail contact with an employee of PSKiss, and he said it is because of the Pixel Bender plug-in's restriction to a maximum image size of 4000 x 4000 pixels.

    Today's computer hardware (CPU, GPU, RAM, VRAM) has no problem managing larger photos in graphics software like Photoshop, but I think the "bottleneck" for this problem is the size restriction of the Pixel Bender Photoshop plug-in. It would be great if you had a solution for this in the near future.

    The 4Kx4K limit in the Pixel Bender Photoshop CS4 plug-in was added for good reason: many drivers at the time reported the ability to manage 6Kx6K or even 8Kx8K textures but in practice failed well under 4Kx4K, although it really hurt us to add this limit. We reassessed it and subsequently removed the 4Kx4K limit in Photoshop CS5, replacing it with a size limit of 128 MB. This allows users to deal with many of the standard image sizes coming out of the digital SLR cameras of the day. Although I can't say whether it will be removed (the decision will be based on the results of our tests), we will reassess the 128 MB limit for the next version of the Pixel Bender plug-in. What version of Photoshop are you using?

  • Can we have more than one trusted source in Oracle Identity Manager?

    Can we have more than one trusted source in Oracle Identity Manager?

    Yes, we can.
    Assume that we are reconciling the user ID and certain attributes from a database, and e-mail, job code, and city from AD.

    We will then have two trusted resources. We can have any number of trusted resources, depending on the client's systems.

  • Reading from the NI 9870 module (more than 64 bytes)

    Hello

    I'm trying to use the NI 9870 module through the FPGA interface.

    When I send data I don't have a problem, but I'm having a lot of problems reading.

    If the data I try to read is less than 64 bytes, the program works without any problem, but when I have more data (> 64 bytes), I get overflow error 65575 in the FPGA.

    I tried changing the transmission speed so that the FIFO would not fill up so quickly, but it did not work. I have also added delays in the FPGA to increase the available time... but I found no solution.

    I'm very lost with this... Please, can someone help me with this problem?

    Thank you and best regards,

    Hello

    In case it helps someone: I solved it by creating, in the FPGA program, a parallel loop for reading, separate from the write configuration, and without using interrupts, because they cost me a read cycle and therefore caused problems.

  • column with more than 4000 characters

    Hello

    Version: 10.2.0.4.0

    I have a requirement to display more than 4,000 characters (CLOB and LONG data types) through SQL.

    Although this can be achieved through PL/SQL, I'm not able to get the output in SQL statements. Is it possible to do this through SQL?
    I can use PL/SQL processing if necessary.

    Thanks for your help.

    Preta wrote:
    Although this can be achieved through PL/SQL, I'm not able to get the output in SQL statements. Is it possible to do this through SQL?

    Yes. What have you tried? Hopefully not LONG:

    SQL> drop table t purge;
    
    Table dropped.
    
    SQL> create table t (c clob);
    
    Table created.
    
    SQL> insert into t values (rpad(to_clob('x'),4001, 'x'));
    
    1 row created.
    
    SQL> set long 5000
    SQL> set pages 100
    SQL> select c from t;
    
    C
    --------------------------------------------------------------------------------
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    x
    
    SQL>
    

    Regards
    Peter

  • How to get more information about users in Oracle 11g

    Hi all

    I need more information about users in Oracle 11g.

    1. How do I check when a user last connected to the database?
    2. If a user account is locked, why is it locked? If it was locked by attempts to log in with a wrong password, how do I check how many times the user tried with a wrong password, and the other possible reasons as well?
    3. When I try to select a user's expiry date, it shows an empty value. Can we check the expiry date with queries, or do we need to check the profile?
    4. How do I check when the user last reset his password?

    Please correct me if I am wrong. Thank you.

    Regards

    1. How do I check when a user last connected to the database?

    AUDIT

    2. If a user account is locked, why is it locked? If it was locked by attempts to log in with a wrong password, how do I check how many times the user tried with a wrong password, and the other possible reasons as well?

    Check that user's profile and its attributes.

    3. When I try to select a user's expiry date, it shows an empty value. Can we check the expiry date with queries, or do we need to check the profile?

    SQL> select EXPIRY_DATE from dba_users ;
    
    EXPIRY_DA
    ---------
    24-SEP-11
    24-SEP-11
    24-SEP-11
    24-SEP-11
    24-SEP-11
    24-SEP-11
    24-SEP-11
    23-MAY-13
    24-SEP-11
    24-SEP-11
    24-SEP-11
    

    Check whether EXPIRY_DATE in dba_users is null.

    4. How do I check when the user last reset his password?

    SELECT PTIME FROM SYS.USER$;

  • JavaScript: cannot load more than 4093 bytes into a variable; the string is truncated. In IE, everything works fine.

    I request data from the server.
    The server returns XML data, which I then process. When the variable size exceeds 4093, the string is truncated. Code:

    function handleGetVars (data) {
      var xmlResponse = data;
      if (xmlResponse == null) return;
      xmlRoot = xmlResponse.documentElement;
      if (!xmlResponse || !xmlRoot)
        throw ("Wrong XML document structure:\n" + xmlHttp.responseText);
      if (xmlRoot.nodeName == "parsererror")
        throw ("Wrong XML document structure:\n" + xmlHttp.responseText);
      numArray = xmlRoot.getElementsByTagName("num");
      nameArray = xmlRoot.getElementsByTagName("name");
      valueArray = xmlRoot.getElementsByTagName("value");
      var html = "<table>";
      var num = 0;
      var value = "";
      if (numArray.length)
      {
        for (var i=0; i<nameArray.length; i++)
        {
          num = numArray.item(i).firstChild.data;
          html += "<tr><td align='left'>";
          html += nameArray.item(i).firstChild.data + "</td>\n";
          html += "<td align='left'><div id=variable_" + num + ">";
          value = valueArray.item(i).firstChild.data;
          html += value + "</div></td>\n";
          html += "<td><div id = 'btnBlock_" + num + "'>";
          html += btnBlock(num);
          html += "</div></td></tr>\n";
        } // for
      } // if
      html += "</table>";
      $('#variables').html(html);
    } // handleGetVars

    This forum focuses on end-user support. You can find more help with web development on the mozillaZine Web Development board, which also handles code better than this one (using BBCode code tags).

  • How to store more than 4000 characters in a table

    I have a requirement to store a string of 4000+ characters in a table. CLOB and BLOB don't work for me because of the 4000-character limitation.

    Any suggestions please.

    Pentaho seems to be JDBC-based, so look for an example of a JDBC CLOB insertion.
    For example: http://www.oracle.com/technology/sample_code/tech/java/codesnippet/jdbc/clob10g/handlingclobsinoraclejdbc10g.html

    This will probably be a better approach than messing around with anonymous PL/SQL blocks, etc., which do not sound relevant to what you're really trying to achieve.

    This forum comment from the "Chief of Data Integration" at Pentaho made me smile:
    http://forums.Pentaho.com/showthread.php?62231-insert-a-string-in-a-CLOB

    It should work just fine. You probably need to swap your JDBC driver or something.
    Oracle can be mysterious in that dept. 
    
    xxx xxxxxx, Chief Data Integration
    Pentaho, Open Source Business Intelligence
    

    Reassuring.
