DATA_LENGTH in all_tab_columns in BYTES/CHAR?

Hello

in the Oracle 11gR2 or 10gR2 documentation, you will find the following description of ALL_TAB_COLUMNS.DATA_LENGTH:
DATA_LENGTH      NUMBER      NOT NULL      Length of the column (*in bytes*)
But what about VARCHAR2 columns that were created with
NLS_LENGTH_SEMANTICS='CHAR'
My observation is that in 10g, ALL_TAB_COLUMNS.DATA_LENGTH reflects whatever length semantics you used when you created the table. So if I create a column as VARCHAR2(4 CHAR), ALL_TAB_COLUMNS.DATA_LENGTH shows 4. In 11g, however, the column really does show the length in bytes: I observed DATA_LENGTH = 16 for a column defined as VARCHAR2(4 CHAR) on an 11gR2 database with NLS_CHARACTERSET = AL32UTF8.

Does anyone have a contrary observation or experience?

FYI: compare and contrast with the CHAR_LENGTH and CHAR_USED columns.
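For reference, a quick query that puts DATA_LENGTH next to CHAR_LENGTH and CHAR_USED (the table name is a placeholder):

```sql
-- MYTABLE is a placeholder; CHAR_USED is 'B' for BYTE semantics,
-- 'C' for CHAR semantics, NULL for non-character columns.
SELECT column_name,
       data_type,
       data_length,   -- bytes (version-dependent for CHAR semantics, as noted above)
       char_length,   -- declared length in characters
       char_used
FROM   all_tab_columns
WHERE  table_name = 'MYTABLE'
ORDER  BY column_id;
```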

Tags: Database

Similar Questions

  • Convert BYTE to CHAR

    Hello
    on 11g R2:

    If column1 is VARCHAR2(20 BYTE) and column2 is VARCHAR2(20 CHAR), which needs more disk space to store (let's assume we have just one row)?

    We are in non-unicode mode.

    If I have MYTABLE (column1 VARCHAR2(20 BYTE)), how can I convert it to MYTABLE (column1 VARCHAR2(20 CHAR))?

    Thank you.

    user522961 wrote:
    Thank you all.
    I wanted to understand the reason for setting the NLS_LENGTH_SEMANTICS parameter to CHAR.

    As explained above, character-length semantics are designed to define the size of a table column so that you do not get a space-allocation error on the column, such as:

     oerr ora 12899
    12899, 00000, "value too large for column %s (actual: %s, maximum: %s)"
    // *Cause: An attempt was made to insert or update a column with a value
    //         which is too wide for the width of the destination column.
    //         The name of the column is given, along with the actual width
    //         of the value, and the maximum allowed width of the column.
    //         Note that widths are reported in characters if character length
    //         semantics are in effect for the column, otherwise widths are
    //         reported in bytes.
    // *Action: Examine the SQL statement for correctness.  Check source
    //          and destination column data types.
    //          Either make the destination column wider, or use a subset
    //          of the source column (i.e. use substring).
    

    Whether you use it depends on your application: do it if your application requires it.

    Here's a long discussion on the pros and cons of setting NLS_LENGTH_SEMANTICS to CHAR: Re: language support of several
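    For the conversion part of the question, a minimal sketch (assuming the table and column names from the post; MODIFY only changes the declared limit, not stored data):

```sql
-- Convert one column from BYTE to CHAR semantics.
ALTER TABLE mytable MODIFY (column1 VARCHAR2(20 CHAR));

-- Or make CHAR semantics the default for subsequent DDL in this session.
ALTER SESSION SET NLS_LENGTH_SEMANTICS = 'CHAR';
```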

  • Read byte/char from socket inputstream

    Hey fellow coders,
    I ran into a problem coding an HTTP client driver for my assignment. I wrote a simple program that opens a connection to a server and requests a file to download. The socket InputStream is wrapped in a BufferedReader, and I can successfully retrieve the server's full text response via the reader. Unfortunately, every time I download an MP3 file, the data is corrupted. The problem is that the BufferedReader decodes the byte stream into a String as UTF-8. I can't fall back to the underlying socket InputStream because the BufferedReader has already read ahead and may have consumed the beginning of the content area. Converting the String back with getBytes("UTF-8") does not work either; the audio file is still damaged.
    My question is: is there any reader that can hand me data from a single stream both as chars and as bytes? Has anyone run into this problem and found a solution for it?

    Thank you.

    The socket InputStream is wrapped in a BufferedReader

    Why?

    and I can successfully retrieve the server's full text response via the reader.

    Why does the server send a text response followed by binary data?

    In this circumstance, I would use DataInputStream.readLine(), despite its deprecation, followed by calls to read() to get the binary bytes.
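    The suggestion above can be sketched like this: read the text part byte-by-byte (a hand-rolled readLine, avoiding the deprecated DataInputStream.readLine()) so no buffering Reader ever consumes bytes of the binary body. A ByteArrayInputStream stands in for the socket stream here:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class MixedStreamReader {
    // Read one header line up to '\n', byte by byte, so the stream
    // position stays exactly at the start of the binary body.
    public static String readLine(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        int b;
        while ((b = in.read()) != -1 && b != '\n') {
            if (b != '\r') sb.append((char) b);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        byte[] response = "Header: ok\n\u0001\u0002\u0003"
                .getBytes(StandardCharsets.ISO_8859_1);
        InputStream in = new ByteArrayInputStream(response);
        String header = readLine(in);     // text part, decoded as Latin-1
        byte[] body = in.readAllBytes();  // binary part, untouched
        System.out.println(header + " / body bytes: " + body.length);
    }
}
```

    Note the cast to char assumes a 1-byte-per-character header encoding (ASCII/ISO-8859-1), which is what HTTP headers use.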

  • Command to change a table column's length semantics from BYTE to CHAR

    DB version: 10gR2

    I know there is an ALTER TABLE tablename MODIFY command to change the length semantics of a table column.

    I need to change the ENAME and JOB columns of the EMP table to CHAR semantics. How can I do this with an ALTER command?
    SQL>col data_Type format a12
    SQL>col column_name format a10
    SQL>
    SQL>select COLUMN_NAME, DATA_TYPE,DATA_LENGTH, CHAR_LENGTH,CHAR_USED
      2  from dba_tab_columns
      3  where table_name='EMP' and owner = 'SCOTT';
    
    COLUMN_NAM DATA_TYPE    DATA_LENGTH CHAR_LENGTH C
    ---------- ------------ ----------- ----------- -
    EMPNO      NUMBER                22           0
    ENAME      VARCHAR2              10          10 B
    JOB        VARCHAR2               9           9 B
    MGR        NUMBER                22           0
    HIREDATE   DATE                   7           0
    SAL        NUMBER                22           0
    COMM       NUMBER                22           0
    DEPTNO     NUMBER                22           0
    
    8 rows selected.

    Try

    alter table emp modify (ename varchar2(10 char), job varchar2(9 char));
    
  • How to convert a byte array to a char array?

    Hello

    Can someone tell me how I can convert a byte[] to a char[]?

    Thank you

    What about:

    byte[] data = ...;

    char[] charArr = new String(data).toCharArray();

    Rab
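    Rab's one-liner uses the platform default charset, which can differ between machines; a slightly safer sketch pins the encoding explicitly (UTF-8 assumed here):

```java
import java.nio.charset.StandardCharsets;

public class ByteCharConversion {
    // Decode a byte array into a char array using an explicit charset,
    // so the result does not depend on the platform default encoding.
    public static char[] bytesToChars(byte[] data) {
        return new String(data, StandardCharsets.UTF_8).toCharArray();
    }

    public static void main(String[] args) {
        byte[] data = {72, 105};                // UTF-8 bytes for "Hi"
        System.out.println(bytesToChars(data)); // prints Hi
    }
}
```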

  • Incorrect DATA_LENGTH for columns with CHAR semantics in 10g

    Hello

    I was looking through a few databases at my workplace and noticed something unusual.

    Database server - Oracle 10g R2
    Database Client - Oracle 11g R1 (11.1.0.6.0 EA)
    Client OS - Win XP
    SQL>
    SQL> @ver
    
    BANNER
    ----------------------------------------------------------------
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    PL/SQL Release 10.2.0.4.0 - Production
    CORE    10.2.0.4.0      Production
    TNS for Linux: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    
    5 rows selected.
    
    SQL> --
    SQL> drop table t;
    
    Table dropped.
    
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    
    Table created.
    
    SQL> --
    SQL> desc t
     Name                                      Null?    Type
     ----------------------------------------- -------- ----------------------------
     A                                                  CHAR(3 CHAR)
     B                                                  CHAR(3)
     C                                                  CHAR(3 CHAR)      <= why does it show "CHAR" ? isn't "BYTE" semantics the default i.e. CHAR(3) = CHAR(3 BYTE) ?
     D                                                  VARCHAR2(3 CHAR)
     E                                                  VARCHAR2(3)
     F                                                  VARCHAR2(3 CHAR)  <= same here; this should be VARCHAR2(3)
    
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    
    TABLE_NAME   COLUMN_NAME  DATA_TYPE  DATA_LENGTH DATA_PRECISION DATA_SCALE
    ------------ ------------ ---------- ----------- -------------- ----------
    T            A            CHAR                12                               <= why 12 and not 3 ? why multiply by 4 ?
    T            B            CHAR                 3
    T            C            CHAR                12                               <= same here
    T            D            VARCHAR2            12                               <= and here
    T            E            VARCHAR2             3
    T            F            VARCHAR2            12                               <= and here
    
    6 rows selected.
    
    SQL>
    SQL>
    I think it multiplies the size by 4: user_tab_columns shows 16 when I change the size to 4.

    When I try this on an 11g R1 server, it looks correct.

    Database server - Oracle 11g R1
    Database Client - Oracle 11g R1 (11.1.0.6.0 EA)
    Client OS - Win XP
    SQL>
    SQL> @ver
    
    BANNER
    --------------------------------------------------------------------------------
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE    11.1.0.6.0      Production
    TNS for 32-bit Windows: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - Production
    
    5 rows selected.
    
    SQL> --
    SQL> drop table t;
    
    Table dropped.
    
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    
    Table created.
    
    SQL> --
    SQL> desc t
     Name                                      Null?    Type
     ----------------------------------------- -------- ----------------------------
     A                                                  CHAR(3 CHAR)
     B                                                  CHAR(3)
     C                                                  CHAR(3)
     D                                                  VARCHAR2(3 CHAR)
     E                                                  VARCHAR2(3)
     F                                                  VARCHAR2(3)
    
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    
    TABLE_NAME   COLUMN_NAME  DATA_TYPE    DATA_LENGTH DATA_PRECISION DATA_SCALE
    ------------ ------------ ------------ ----------- -------------- ----------
    T            A            CHAR                   3
    T            B            CHAR                   3
    T            C            CHAR                   3
    T            D            VARCHAR2               3
    T            E            VARCHAR2               3
    T            F            VARCHAR2               3
    
    6 rows selected.
    
    SQL>
    SQL>
    Is this a known bug? Unfortunately, I don't have access to Metalink.

    Thank you
    isotope

    Published by: isotope on March 3, 2010 06:46

    Hello

    Please read "v$nls_parameters" where I wrote "nls_parameter$v" in the post above.
    OK, so it seems your problem is solved.

    Now, about the char length being multiplied by 4: I suspect you are using a multibyte character set. You can check v$nls_parameters.

    Regards
    Anurag
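    As Anurag suggests, the character set can be checked directly; a quick query:

```sql
-- A 4x multiplier is expected with AL32UTF8, where one character
-- can take up to 4 bytes.
SELECT parameter, value
FROM   v$nls_parameters
WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_LENGTH_SEMANTICS');
```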

  • Sending bytes over a direct connection

    Hi guys,

    This code worked perfectly on the simulator, but does not work on a BlackBerry Bold (URL for the device: "socket://xxx.xxx.xxx.xxx:8008;deviceside=true;APN=internet;"):

    public static String URL = "socket://xxx.xxx.xxx.xxx:8008;deviceside=true;";
    StreamConnection conn = null;

    String str = "uo! E9<>";
    char c1 = str.charAt(0);
    byte bValue1 = (byte) c1;
    char c2 = str.charAt(1);
    byte bValue2 = (byte) c2;

    byte[] anArray = new byte[4];
    anArray[0] = bValue1;
    anArray[1] = bValue2;
    anArray[2] = bValue3;
    anArray[3] = bValue4;

    conn = (StreamConnection) Connector.open(URL);
    DataOutputStream os = conn.openDataOutputStream();
    os.write(anArray);
    os.flush();
    String stringToConvert = "blahblahblahyaddayaddayadda";
    byte[] theByteArray = stringToConvert.getBytes();
    os.write(theByteArray);
    os.flush();

    What is wrong with it?

    Thank you

    Get rid of the final semicolon at the end of the URL. There was a thread here recently about that: http://supportforums.blackberry.com/rim/board/message?board.id=java_dev&message.id=57620&query.id=16...

  • Change CHAR to VARCHAR2 for all tables

    Hello
    I need to change the data type from CHAR to VARCHAR2 globally, for all tables. For example, if four tables in the database contain one or more columns of type CHAR(128), they should be changed to VARCHAR2(128). Can I do it in a single statement?

    Thank you
    Sujnan

    Hello

    Try this...

    spool d:\column_type.sql

    select 'alter table ' || table_name || ' modify ' || column_name || ' varchar2(128);'
    from   all_tab_columns
    where  data_type = 'CHAR'
    and    table_name in (select table_name from user_tables);

    spool off

    You will get the SQL statements to change the data type from CHAR to VARCHAR2(128).
    Run the spooled file and you have completed the task.

    Regards

    You can also do this with a procedure: use a cursor to select the table name and column name from user_tab_columns or all_tab_columns, and then use EXECUTE IMMEDIATE to run each ALTER TABLE statement.

    Published by: user644725 on October 22, 2008 12:16 AM
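    The cursor-plus-EXECUTE IMMEDIATE approach mentioned above can be sketched as follows (untested; assumes the current schema owns the tables and that no dependent objects block the change):

```sql
-- Convert every CHAR column in the current schema to VARCHAR2
-- of the same declared length.
BEGIN
  FOR c IN (SELECT table_name, column_name, char_length
            FROM   user_tab_columns
            WHERE  data_type = 'CHAR')
  LOOP
    EXECUTE IMMEDIATE 'ALTER TABLE "' || c.table_name ||
                      '" MODIFY ("'  || c.column_name ||
                      '" VARCHAR2('  || c.char_length || '))';
  END LOOP;
END;
/
```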

  • Confusion about column data type length

    Hi all

    To find out my columns' data types and lengths, I ran the following query. However, when I check it with the desc table_name command, I see that the data lengths do not match. Any idea?

    > > select column_name || ' ' || data_type || '(' || data_length || ')' col_info from all_tab_columns where table_name = 'CUSTTABLE' and column_name = 'ACCOUNTNUM';

    > > ACCOUNTNUM NVARCHAR2 (24)

    > > desc CUSTTABLE

    > > ACCOUNTNUM 1 N NVARCHAR2 (12)

    Regards

    Charlie

    NightWing wrote:

    By the way, I couldn't quite follow what you were thinking in your explanation.

    I missed that your column is NVARCHAR2 and thought it was VARCHAR2. When you declare a VARCHAR2 column, the length semantics are specified either explicitly or implicitly: explicitly by suffixing the length with BYTE or CHAR, and implicitly when you specify the length alone, in which case the semantics come from the session's NLS_LENGTH_SEMANTICS value. So let's assume you are generating CREATE TABLE statements (and it seems you are, based on column_name || ' ' || data_type || '(' || data_length || ')'). Then it does not matter whether you use data_length or char_col_decl_length: the length will carry implied semantics regardless of which you use. When you run the CREATE TABLE statements, the length will be interpreted as bytes if the current NLS_LENGTH_SEMANTICS is BYTE, but as characters if it is CHAR. As you can see, in order to generate CREATE TABLE statements we should use explicit length semantics for each column; otherwise the generated statements can produce column lengths different from the source table.

    SY.
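    Following SY's point, DDL can be generated with explicit semantics so it is independent of the target session's NLS_LENGTH_SEMANTICS; a sketch using the table name from the thread:

```sql
-- CHAR_USED = 'C' means CHAR semantics, 'B' means BYTE semantics.
SELECT column_name || ' ' || data_type ||
       '(' || char_length ||
       DECODE(char_used, 'C', ' CHAR', ' BYTE') || ')' AS col_def
FROM   all_tab_columns
WHERE  table_name = 'CUSTTABLE'
AND    data_type IN ('CHAR', 'VARCHAR2', 'NCHAR', 'NVARCHAR2');
```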

  • Is there a way to find the length of a row, not a column?

    I'm trying to find the length of a particular row in a table... is that possible?
    I don't mean the length of a column value; I can get that using LENGTH(columnname).
    I don't mean the average row length either; I can get that from extended statistics or the trace file.
    Please let me know if there is a function that can calculate this, or any other solution.

    Thank you!

    >
    (1) select avg_row_len from dba_tables
    where table_name = 'ABC';
    I got the average row length as 65.
    Does that mean 65 bytes, KB, or MB? What exactly is 65?
    >
    It means 65 bytes; see ALL_TABLES in the database reference:
    http://docs.Oracle.com/CD/B19306_01/server.102/b14237/statviews_2105.htm#i1592091
    >
    (2) Also, a team member of mine uses this query:
    SELECT sum(data_length) FROM ALL_TAB_COLUMNS WHERE
    table_name = 'ABC';
    and divides the result by the number of rows in the table.

    I suppose that only gives the average declared column length... which is not what we need, is it?
    >
    At best, that would give you the MAXIMUM length a row COULD be. It will not help at all. For example, all columns defined as NUMBER (with any precision or scale) show up as 22 bytes, since that is the most they can take. And a VARCHAR2(4000) will be listed as 4000 even if it is NULL or if its average length is 2 bytes.
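    If an exact figure is needed for one table, the stored size of each row can be summed with VSIZE() per column; a sketch with placeholder names (this counts column data only, not row-header overhead):

```sql
-- ABC and COL1..COL3 are placeholders; list every column explicitly.
SELECT AVG( NVL(VSIZE(col1), 0)
          + NVL(VSIZE(col2), 0)
          + NVL(VSIZE(col3), 0) ) AS avg_row_bytes
FROM   abc;
```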

  • White spaces are weird

    I am trying to query SQL Server 2012 through an Oracle dblink... basic stuff works, like SELECT * and a couple of joins.

    But when I try more than a couple of joins, my output has white space between the text.

    I tried regular expressions and unicode conversion/delimiting, but nothing seems to work...

    What is even weirder: when I copy the result into Excel there is nothing there, but when I comment out the new joins and run the query, the text is there and can be copied and pasted.

    If I reset the dblink and run the exact same query with the additional joins commented out, the output is fine, but if I add an ORDER BY or anything else, the white spaces come back!


    I suspect it is somehow different text, but wtf?



    SELECT pr."PRequestId",
           pr."name" AS PFLEX,
           t."name" AS LCRIDE
    FROM "PProductionRequest"@TICdb_link pr
    JOIN "Detail"@TICdb_link d ON d."PRequestId" = pr."PRequestId"
    JOIN "T"@TICdb_link t ON t."TId" = d."TId"
    -- LEFT JOIN "PtProductionRule"@ticdb_link ppr ON pr."PProductionRuleId" = ppr."PProductionRuleId"


    I have seen this before with heterogeneous database links from Oracle to SQL Server.

    As I understand it, the text returned by SQL Server is in a multibyte character set (2 bytes per character in this case), and Oracle treats it as standard 1-byte ASCII. So the 2nd byte (unused by the 2-byte character) is presented as a white space.

    The likely reason this happens when you add joins: the joins change the character set in the SQL projection.

  • SQL Server to Oracle migration: column semantics

    I'm testing a migration from SQL Server 2012 to Oracle 12c using the SQL Developer 4.0.0.13 migration assistant. The problem I am facing is that all the generated table-creation scripts use CHAR semantics for character columns. I want BYTE semantics. As there are several thousand generated tables and columns, it is not feasible to change the semantics manually.

    CREATE TABLE account_dod (

    account_ID VARCHAR2 (16 CHAR) NOT NULL,

    I can't find any option to change the semantics during generation. Does anyone know of an option I can set so that the migration wizard generates all of the columns with BYTE semantics?

    Thank you very much

    Hello

    If you email me, I can provide a modified extension that uses BYTE units for CHAR and VARCHAR2 columns.

    My email address is my [email protected]

    Kind regards

    Dermot ONeill

    SQL development team

  • Preventing inserts of too-short values into a column, with a friendly notification

    Hi all

    I have a table with a 'mobile phone' column. I need to prevent users from inserting values that are too short (the normal length is 9). The user should also receive a friendly message explaining why he or she can't insert a value that is too short. How can I do that?

    Thanks in advance.

    CREATE TABLE "SCHEMENAME"."CELLPHONES"
    ( "ID"          CHAR(32 BYTE) DEFAULT sys_guid(),
      "MOBILEPHONE" NUMBER(9,0),
      "DESCR"       VARCHAR2(1000 BYTE),
      "STATE"       NUMBER(5,0) DEFAULT 0,
      "CREATED"     DATE DEFAULT sysdate,
      "CREATEDBY"   VARCHAR2(1000 BYTE),
      "UPDATED"     DATE,
      "UPDATEDBY"   VARCHAR2(1000 BYTE)
    );

    Hello

    You can use a trigger to achieve this, for example:

    CREATE OR REPLACE TRIGGER SCHEMENAME.TR_CELLPHONES
    BEFORE INSERT OR UPDATE ON SCHEMENAME.CELLPHONES
    FOR EACH ROW
    BEGIN
      IF length(:new.MOBILEPHONE) < 9 THEN
        RAISE_APPLICATION_ERROR(-20101, 'The phone number is too short!');
      END IF;
    END;
    /

    Regards
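    A declarative alternative to a trigger is a check constraint; the application can then map an ORA-02290 error on the constraint name to a friendly message (names below follow the table in the question and are otherwise assumptions):

```sql
-- MOBILEPHONE is NUMBER(9,0), so "length 9" means at least 100000000.
ALTER TABLE schemename.cellphones
  ADD CONSTRAINT ck_mobilephone_len
  CHECK (mobilephone >= 100000000);
```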

  • CHAR (and VARCHAR2) semantic attribute in catalog views

    Hello

    This may seem very basic, but I am struggling to find information about it.

    I created an object type with an attribute declared in the specification as char(8 CHAR).
    But where in the catalog views can I find the character semantics?

    For example, in the case of tables, I can look at the CHAR_USED column of the DBA_TAB_COLUMNS view. Unfortunately, none of the three views associated with type specifications (DBA_TYPES, DBA_TYPE_VERSIONS and DBA_TYPE_ATTRS) has such an indicator.

    Of course, I could do string parsing on the TEXT column of the DBA_TYPE_VERSIONS view, but that seems rather unnatural if there is a direct way...

    Any ideas?

    Best regards

    Philippe

    Solomon,

    Very nice! I had looked at this before and couldn't figure it out. I applied your solution to a test case I had set up at the time and found that I had to make a slight change (an additional join) for several columns. It is provided below.

    SCOTT@orcl_11gR2> create or replace type test_object as object
      2    (char_attribute     char(8 char),
      3       byte_attribute     char(8 byte));
      4  /
    
    Type created.
    
    SCOTT@orcl_11gR2> column type_name     format a11
    SCOTT@orcl_11gR2> column attr_name     format a14
    SCOTT@orcl_11gR2> column attr_type_name format a14
    SCOTT@orcl_11gR2> column char_semantic     format a13
    SCOTT@orcl_11gR2> select t.type_name,
      2           ta.attr_name,
      3           ta.attr_type_name,
      4           case
      5              when ta.attr_type_name in ('CHAR','VARCHAR2')
      6              then decode(bitand(a.properties,4096),0,'BYTE','CHAR')
      7           end char_semantic
      8  from   user_types t,
      9           user_type_attrs ta,
     10           sys.attribute$ a
     11  where  t.type_name = 'TEST_OBJECT'
     12  and    t.type_name = ta.type_name
     13  and    t.type_oid = a.toid
     14  and    ta.attr_no = a.attribute#
     15  /
    
    TYPE_NAME   ATTR_NAME      ATTR_TYPE_NAME CHAR_SEMANTIC
    ----------- -------------- -------------- -------------
    TEST_OBJECT CHAR_ATTRIBUTE CHAR           CHAR
    TEST_OBJECT BYTE_ATTRIBUTE CHAR           BYTE
    
    2 rows selected.
    
  • Batched PreparedStatement failures

    I'm trying to understand the failure mode of the .sendBatch() method (failure shown below) for batched PreparedStatements. It seems to fail intermittently on insert. I changed the batch size to see whether it still failed on the same insert into an empty table (the table is cleared before each run), but it does not. It is attempting to insert spatial data (with or without geometry) and seems to have no problem with that. It would be nice to have attached source for the Eclipse IDE to help understand the failure, or at least javadocs on the T4CTT methods, for the (11.1.0.7.0) odbc driver ojdbc5.jar talking to an Oracle 10.2 server. Any suggestions for resolving this? I hope I'm not missing something obvious; I have googled for solutions.


    Thread [GraphicsTranslator:0] (Suspended (exception NullPointerException))
    T4CTTIoac.init (OracleTypeADT, int, int) online: 338
    T4C8Oall.initBindsDefinition (T4CTTIoac []) line: 1516
    T4C8Oall.Marshal (boolean, boolean, boolean, boolean, byte, int, byte [], int, accessor [], int, accessor [], int, byte [], char [], short [], int, DBConversion, InputStream [] [], byte [] [] [], byte [], OracleTypeADT [] [], OracleStatement, byte [], char [], short [], T4CTTIoac [], int [], int [], int [], NTFDCNRegistration) line: 494
    Line T4CPreparedStatement.doOall8 (boolean, boolean, boolean, boolean): 180
    T4CPreparedStatement.executeForRows (Boolean) line: 953
    T4CPreparedStatement (OracleStatement) .doExecuteWithTimeout () line: 1222
    T4CPreparedStatement (OraclePreparedStatement) .sendBatch () line: 3691
    OraclePreparedStatementWrapper.sendBatch (line): 1140

    Hello

    ojdbc5.jar is the JDBC driver, not ODBC. You'll probably have more luck posting in the JDBC forum.

    Greg
