Exclude null columns in a query

Hi gurus,
I need your help here. I have a table that contains more than 120 columns. Now I want to produce a report that includes only the columns that are not null. How can I accomplish this?

Is it possible to display the results as follows?

Table - structure
col1  col2  col3  col4  col5  col6  col7  col8
A     B     A     C           D     D

col5 and col8 have no value. I want to display the result like this:
col1 A
col2 B
col3 A
col4 C
col6 D
col7 D

Thanks in advance,

Kind regards
Pascal M
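
One way to get that column-name/value layout for only the non-null columns is UNPIVOT, which excludes NULL cells by default. A minimal sketch against the example above, assuming Oracle 11g or later, a hypothetical table name my_table, and columns that share a common datatype:

    select col_name, col_value
      from my_table           -- hypothetical table name
    unpivot (col_value for col_name in (col1, col2, col3, col4, col5, col6, col7, col8));

For 120+ columns, the IN list can be generated from user_tab_columns rather than typed by hand.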


Tags: Database

Similar Questions

  • Interactive report with Null columns

I have a user who wants to export an interactive report, and some columns need to appear in the export so that he can fill in the data later (there is an intermediate step while we work on getting all the data they need into the database). I created the report query, and there are a handful of null/empty columns (to keep the correct order for the export). When I try to add this query to APEX as an interactive report, I get an "ORA-00001: unique constraint (APEX_040000.WWV_FLOW_WORKSHEET_COLUMNS_UK2) violated" error.

Googling around shows me that this happens because of columns with the same name, and that the solution is to provide aliases for all columns. My report has aliases on all columns, but I still can't create an interactive report. I tried changing the null columns to empty strings, as well as wrapping the aliases in double quotes, but nothing works. However, I can use my original query to create a standard report, but because of the export requirement this is not ideal.

I was able to create the interactive report with one null column and then edited the report source to add the others. This had to be done one at a time, since trying to add several null columns at once gives the same error. Unfortunately, when I try to run the page, I get an "ORA-20001: get_dbms_sql_cursor error ORA-00918: column ambiguously defined" error.

    My original query:
    select customer.customer_name as customer,
           project.name as project_name,        
           trunc(project.estimated_end_dt) as due_date, 
           project_status.project_status_desc as status, 
           null as revenue, 
           project.baseline_effort as baseline_hours,
           null as projected_cost,
           null as est_gain_loss, 
           project.actual_hours as actual_hours,
           project.estimated_hours as projected_hours,
           null as projected_cost, 
           null as projected_gain_loss, 
           null as roi       
        from project, 
             customer
      where customer.customer_id = project.customer_id
      and project.inactive_ind = 0
      and project.customer_id is not null
      and project.estimated_end_dt >= :DTP_P50_STARTDT
      and project.estimated_end_dt <= :DTP_P50_ENDDT
    order by customer.customer_name, 
             project.estimated_end_dt;             
    Can someone tell me a way to create an interactive report with multiple null columns?

    Hi shimmoril,

    The problem is probably that you have two columns aliased as 'projected_cost' (columns 7 and 11).
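
    A minimal sketch of the fix — give the second occurrence its own alias (projected_cost_2 below is just a placeholder; use whatever name that column was actually meant to have):

    select null as projected_cost,
           null as projected_cost_2,   -- was a second "projected_cost"; any unique alias works
           null as projected_gain_loss
      from dual;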

    Hope this helps,
    John

If you find this information useful, please mark the post as 'helpful' or 'correct' so that others benefit.

  • How to find the NULL columns in the table?

    Hello

Please provide a query to find the null columns in a table. Here, every row in the table has the same column as null, and that will not change.

    Table
    ---------------
C1   C2   C3   C4
X         C    10
T         D    20

I want the query to tell me that C2 is the all-null column.
    Thanks in advance!

    Kind regards
    Florian...

One more variant:

create or replace function is_nulled_column(tab varchar2, col varchar2) return varchar2 is
  cnt number := 1;
begin
  -- returns 'Y' if the column holds no non-null value at all, else 'N'
  execute immediate 'select count(1) from '||tab||' where '||col||' is not null and rownum=1' into cnt;
  return case when cnt = 0 then 'Y' else 'N' end;
end;
/
    

    and use:

    select
     c.OWNER,c.TABLE_NAME,c.COLUMN_NAME,c.NULLABLE,is_nulled_column(c.TABLE_NAME,c.COLUMN_NAME) all_nulls
    from all_tab_columns c
    where c.OWNER=user
    and c.TABLE_NAME like '%'
    order by c.OWNER,c.TABLE_NAME,c.COLUMN_ID
    
  • Replace NULL values for PIVOT query

    Hello

I'm working with a sales table. When I pivot it into columns, some of the cells end up with a NULL value. How can I turn those values into 0 (zeros)?

    WITH PIVOT_DATA AS (
      SELECT S.ZONE_CODE, Z.ZONE_NAME, S.YEAR, S.PERIOD, S.SALES
        FROM STAT_TABLE_SALES S
        JOIN ZONES Z ON S.COMP = Z.COMP AND S.ZONE_CODE = Z.ZONE_CODE
       WHERE S.COMP = '001'
         AND S.BRAND_CODE = '001'
    )
    SELECT *
      FROM PIVOT_DATA
     PIVOT (
       SUM(SALES) FOR (YEAR, PERIOD) IN ((2009,1),  (2009,2),  (2009,3),
                                         (2009,4),  (2009,5),  (2009,6),
                                         (2009,7),  (2009,8),  (2009,9),
                                         (2009,10), (2009,11), (2009,12),
                                         (2010,1),  (2010,2),  (2010,3),
                                         (2010,4),  (2010,5),  (2010,6),
                                         (2010,7))
     )
     ORDER BY 14 DESC NULLS LAST;

    This query returns the following:

    COD_ZONA NOM_ZONA        2009_3       2009_2       2009_1
    -------- ----------- -----------  -----------  -----------
    01       YEDUSIJH     1382367.75   1559943.27   1441279.64
    02       C,ASKAK       711897.82    865854.05    1583232.3
    03       ASDFG         130443.03    205409.84    178633.69
    04       OSOIDSD       320118.32    439008.83    409251.18
    05       ODFSDF        300908.21    276301.59    260188.53
    06       CH            242749.65    325196.71     464938.9
    07       SOA           610312.31    606312.93    754569.82
    08       SAN             89426.8     81360.04     61649.27
    09       YP            284487.79    328281.31    267210.85
    10       TC             87043.28    158594.43      85195.8
    11       BAGNN          76778.78     68180.76    118530.04
    12       CRT            122023.7    143442.21    134744.85
    13       ABC           209992.79    196477.03    185222.14
    14       IDLIB
    15       ARE            23870.41      4137.33     16660.53

* These are not all the columns and rows, but a sample of what the query returns.

How can I replace the NULL values with 0 (zeros)?

Still not sure why NVL wouldn't meet your needs:

    SQL> select ename, nvl (clerk_20, 0) clerk_20, nvl (sal_30, 0) sal_30
      from emp pivot (sum (sal)
               for (job, deptno)
               in ( ('CLERK', 20) clerk_20, ('SALESMAN', 30) sal_30))
    /
    ENAME             CLERK_20          SAL_30
    ---------- --------------- ---------------
    WARD                     0            1250
    JONES                    0               0
    TURNER                   0            1500
    ADAMS                 1100               0
    ALLEN                    0            1600
    SMITH                  800               0
    CLARK                    0               0
    KING                     0               0
    BLAKE                    0               0
    JAMES                    0               0
    FORD                     0               0
    SCOTT                    0               0
    MARTIN                   0            1250
    MILLER                   0               0
    
    14 rows selected.
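
    Applied to the zone query above, the same idea just wraps each pivoted column in NVL. A sketch that reuses the reconstructed table and column names (the ZONES table name is assumed), trimmed to three periods:

    WITH pivot_data AS (
      SELECT s.zone_code, z.zone_name, s.year, s.period, s.sales
        FROM stat_table_sales s
        JOIN zones z ON s.comp = z.comp AND s.zone_code = z.zone_code
       WHERE s.comp = '001' AND s.brand_code = '001'
    )
    SELECT zone_code,
           zone_name,
           NVL(p2009_01, 0) AS p2009_01,   -- NVL turns the missing (NULL) combinations into 0
           NVL(p2009_02, 0) AS p2009_02,
           NVL(p2009_03, 0) AS p2009_03    -- ...repeat for the remaining periods
      FROM pivot_data
     PIVOT (SUM(sales) FOR (year, period) IN ((2009, 1) AS p2009_01,
                                              (2009, 2) AS p2009_02,
                                              (2009, 3) AS p2009_03))
     ORDER BY p2009_01 DESC NULLS LAST;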
    
  • How to get the value of a column in sql query?

    Hi, anyone knows how to get the value of a column in sql query?

Here is my code. The value should be 1350079224397 on my PB, but I get 0:

    QString query("SELECT version FROM db_version");

    QVariant result = sda.execute(query);
    // take the first row of the result and read it as a map of column name -> value
    QVariantMap versionMap = result.toList().first().toMap();
    if (!versionMap.isEmpty())
    {
        qDebug() << "Version: " << versionMap["version"];
    }

    OK, I have the solution

    QString query ("SELECT version as version FROM db_version");

  • SQL Loader - ignore the lines with "rejected - all null columns."

    Hello

Please see the attached log file. Also attached are the table creation script, the data file, and the bad and discard files after execution.

sqlldr client version on Windows:

    SQL * Loader: release 11.2.0.1.0 - Production

The CTL file has two INTO TABLE clauses due to the nature of the data. The data presented are a subset of the real-world file. We are only interested in the lines with the word "Index" in the first column.

The problem we are facing is that whichever INTO TABLE clause appears first in the CTL file, only the lines matching its WHEN clause get inserted and the rest get discarded.

1. Create table statement: create table dummy_load (name varchar2(30), rate number, effdate date);

2. The data file to simulate this issue contains the 10 lines below. Save this as name.dat. The intention is to load all of these rows with one CTL file. The actual file has additional lines before and after these that can be discarded.

    H15T1Y Index|2|01/19/2016|
    H15T2Y Index|2|01/19/2016|
    H15T3Y Index|2|01/19/2016|
    H15T5Y Index|2|01/19/2016|
    H15T7Y Index|2|01/19/2016|
    H15T10Y Index|2|01/19/2016|
    CPDR9AAC Index|2|01/15/2016|
    MOODCAVG Index|2|01/15/2016|
    H15TXXX Index|2|01/15/2016|
    H15TXXX Index|2|01/15/2016|

    3. the CTL file - name.ctl

    LOAD DATA
    APPEND
    INTO TABLE dummy_load
    WHEN (09:13) = 'Index'
    TRAILING NULLCOLS
    (
      name    TERMINATED BY '|',
      rate    TERMINATED BY '|',
      effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
    )
    INTO TABLE dummy_load
    WHEN (08:12) = 'Index'
    TRAILING NULLCOLS
    (
      name    TERMINATED BY '|',
      rate    TERMINATED BY '|',
      effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
    )

Invoke SQL*Loader from a batch (.bat) file:

    C:\Oracle\product\11.2.0\client\bin\sqlldr USERID = myid/[email protected] CONTROL=C:\temp\t\name.ctl BAD=C:\temp\t\name_bad.dat LOG=C:\temp\t\name_log.dat DISCARD=C:\temp\t\name_disc.dat DATA=C:\temp\t\name.dat

    Once this is run, the following text appears in the log file (excerpt):

    Table DUMMY_LOAD, loaded when 09:13 = 0X496e646578 (character 'Index')

    Insert option in effect for this table: APPEND

    TRAILING NULLCOLS option in effect

    Column Name                    Position   Len  Term Encl Datatype
    ------------------------------ ---------- ----- ---- ---- ---------------------
    NAME                                FIRST     *   |       CHARACTER
    RATE                                 NEXT     *   |       CHARACTER
    EFFDATE                              NEXT     *   |       CHARACTER
        SQL string for column : "TO_DATE(:effdate, 'MM/DD/YYYY')"

    Table DUMMY_LOAD, loaded when 08:12 = 0X496e646578 (character 'Index')

    Insert option in effect for this table: APPEND

    TRAILING NULLCOLS option in effect

    Column Name                    Position   Len  Term Encl Datatype
    ------------------------------ ---------- ----- ---- ---- ---------------------
    NAME                                 NEXT     *   |       CHARACTER
    RATE                                 NEXT     *   |       CHARACTER
    EFFDATE                              NEXT     *   |       CHARACTER
        SQL string for column : "TO_DATE(:effdate, 'MM/DD/YYYY')"

    Record 1: Discarded - all null columns.
    Record 2: Discarded - all null columns.
    Record 3: Discarded - all null columns.
    Record 4: Discarded - all null columns.
    Record 5: Discarded - all null columns.
    Record 7: Discarded - failed all WHEN clauses.
    Record 8: Discarded - failed all WHEN clauses.
    Record 9: Discarded - failed all WHEN clauses.
    Record 10: Discarded - failed all WHEN clauses.

    Table DUMMY_LOAD:
      1 Row successfully loaded.
      0 Rows not loaded due to data errors.
      9 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.

    Table DUMMY_LOAD:
      0 Rows successfully loaded.
      0 Rows not loaded due to data errors.
      5 Rows not loaded because all WHEN clauses were failed.
      5 Rows not loaded because all fields were null.


The bad file is empty. The discard file contains the following:

    H15T1Y Index|2|01/19/2016|
    H15T2Y Index|2|01/19/2016|
    H15T3Y Index|2|01/19/2016|
    H15T5Y Index|2|01/19/2016|
    H15T7Y Index|2|01/19/2016|
    CPDR9AAC Index|2|01/15/2016|
    MOODCAVG Index|2|01/15/2016|
    H15TXXX Index|2|01/15/2016|
    H15TXXX Index|2|01/15/2016|


Based on my understanding of the instructions in the CTL file, ideally the first 6 rows would have been inserted into the table. Instead the table contains only the 6th line:

    NAME             RATE  EFFDATE
    H15T10Y Index    2     January 19, 2016



If the INTO TABLE clauses are swapped in the CTL file, then the first 5 rows are inserted and the rest end up in the discard file, and the 6th line gets a "Discarded - all null columns" message in the log file.


    Could someone please take a look and advise? My apologies that the files cannot be attached.

Unless you tell it otherwise, SQL*Loader assumes that each INTO TABLE clause after the first one picks up the scan position where the previous one left off.  If you want to start at the beginning of the line every time, then you need to reset the position using POSITION(1) with the first column, as shown below.  Using POSITION(1) in the first INTO TABLE clause is optional.

    LOAD DATA
    APPEND
    INTO TABLE dummy_load
    WHEN (09:13) = 'Index'
    TRAILING NULLCOLS
    (
      name    POSITION(1) TERMINATED BY '|',
      rate    TERMINATED BY '|',
      effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
    )
    INTO TABLE dummy_load
    WHEN (08:12) = 'Index'
    TRAILING NULLCOLS
    (
      name    POSITION(1) TERMINATED BY '|',
      rate    TERMINATED BY '|',
      effdate TERMINATED BY '|' "TO_DATE(:effdate, 'MM/DD/YYYY')"
    )

  • Group by excluding null values

I want to run a select using GROUP BY. However, I would like the results to exclude all null values in my field. For example, something like the following code, but excluding the NULL values.

    select FIELD, count(FIELD)
      from MYDATA
     group by FIELD;

    How can I do this?

    Hello

Maybe I don't understand your complete problem, but what about this:

    select FIELD, count(FIELD)
      from MYDATA
     where field is not null
     group by FIELD;
    

Regards,
Kay

  • ORA-01723 null columns are not allowed

We create a table using CTAS (create table as select) from a remote database. We use CTAS due to performance issues with INSERT ... SELECT.
    EXECUTE IMMEDIATE 'CREATE TABLE ods_idl_vst NOLOGGING AS SELECT * FROM remote_tab@remote_db vst';
    "How can I ' ora-01723 null columns are not allowed" error? Our data base is 11g.

    Hello

Yes, it is possible; you will need to use the CAST trick, or remove those columns from your SELECT.
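
    A minimal sketch of the CAST workaround (the column names and types below are hypothetical; the point is to give each offending column an explicit type in the SELECT list):

    EXECUTE IMMEDIATE
      'CREATE TABLE ods_idl_vst NOLOGGING AS
         SELECT vst.id,
                CAST(vst.some_col AS VARCHAR2(100)) AS some_col   -- hypothetical column raising ORA-01723
           FROM remote_tab@remote_db vst';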

    Herald tiomela
    http://htendam.WordPress.com

  • How to set a non-Null column in CF9 ORM?

    How to set a non-Null column in CF9 ORM?  Thank you.

    Just to recycle my answer to your other question:

You should find what you need in the docs here: http://help.adobe.com/en_US/ColdFusion/9.0/Developing/WSB7BEC0B4-8096-498d-8F9B-77C88878AC6C.html

Scroll down to where it describes the property attributes.

    --

    Adam

  • You can add not null column to a table that already contains data?

    Hello
    You can add not null column to a table that already contains data?
    Database 9i / 10g on RHEL
    Concerning

Did that work in 9i?

Looks like it does:

    SQL> select * from v$version where rownum = 1
    /
    BANNER
    ----------------------------------------------------------------
    Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit Production
    1 row selected.
    
    SQL> create table emp2 as select * from emp
    /
    Table created.
    
    SQL> alter table emp2 add new_col integer default 0 not null
    /
    Table altered.
    
  • How to exclude null values?

    Hello
    How to exclude null values when populating a temporary table?

    Thank you
    Sollier

    Hi, Sollier,

It depends.
Maybe:

WHERE   column_a  IS NOT NULL
    

If you need more help, post a small example of the data (CREATE TABLE and INSERT statements) and the results you want from that data (in this case, the contents of the temporary table after it is populated).
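
    For example, if the temporary table is filled with INSERT ... SELECT, the filter simply goes into the WHERE clause of that SELECT (a sketch with made-up table and column names):

    INSERT INTO my_gtt (column_a, column_b)
    SELECT column_a, column_b
      FROM source_table
     WHERE column_a IS NOT NULL;   -- rows with a NULL column_a never reach the temporary table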

  • Query for a list of null columns in the table

    Hello experts!

It may be a silly question, but I'm stuck on this simple query:

Question: I have a table with 10 columns and 10000 records. I want a query that shows me only the names of the columns that contain null values. I just want to know how many columns in my table contain null values.

    Thank you!

    Hello
    You can query user_tab_columns:

select column_name from user_tab_columns where table_name = 'MY_TABLE' and num_nulls > 0;   -- MY_TABLE is a placeholder
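
    Note that num_nulls comes from optimizer statistics, so it is only as current as the last gather; refreshing them first makes the query above reliable. A sketch (MY_TABLE is a placeholder):

    BEGIN
      -- refresh statistics so num_nulls in user_tab_columns is up to date
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'MY_TABLE');
    END;
    /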
    

    HTH

  • A query to extract only the non-null columns.

    Hello

    I have a table with:

    COLA    COLB    COLC    COLD    COLE

    AA      BB      <null>  <null>  JJ

    <null>  <null>  CC      <null>  EE


I need a query that returns only 1 row, like this:

    COLA    COLB    COLC    COLD    COLE

    AA      BB      CC      DD      EE


    My version of the database is 11.2


Thank you

Odd design, but in this case you could go with MAX (or MIN):
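
    A sketch against the example above, assuming the table is called T and a single collapsed row is wanted (MAX and MIN ignore NULLs, so each column returns its only non-null value):

    SELECT MAX(cola) AS cola,
           MAX(colb) AS colb,
           MAX(colc) AS colc,
           MAX(cold) AS cold,
           MAX(cole) AS cole
      FROM t;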

    HTH

How to exclude results from the count query?

I am stuck on how to update my query.

    Can anyone help please?

    CREATE TABLE "LOGON"."LOGON_DATA" 
       ( "CLIENT" VARCHAR2(20 BYTE) NOT NULL ENABLE,
      "ROW_ID" NUMBER NOT NULL ENABLE, 
      "OUR_ACCOUNT" VARCHAR2(1 BYTE), 
      "DATE_OF_LOGON" VARCHAR2(20 BYTE) NOT NULL ENABLE, 
      "LOGON_ID" NUMBER NOT NULL ENABLE, 
        );
    

    SAMPLE of DATA (for a "client" for a day):

    PETER123,021,N,10-01-2015,121514
    PETER123,022,,10-01-2015,121514
    PETER123,023,N,10-01-2015,221514
    PETER123,024,Y,10-01-2015,221514
    PETER123,025,Y,10-01-2015,221514
    PETER123,026,Y,10-01-2015,221514
    PETER123,027,Y,10-01-2015,221514
    PETER123,028,N,10-01-2015,221514
    PETER123,029,,10-01-2015,221559
    PETER123,030,Y,10-01-2015,221600
    PETER123,189,N,10-01-2015,225601
    PETER123,201,Y,10-01-2015,233539
    

    ...

    10 million lines

12c - 12.1.0.1.1 - I used INSERT statements in SQL Developer to load the data into Oracle.

The query below counts everything after the first our_account = 'Y' row is found, and excludes duplicate logon_id values from the count, but...

    WITH    got_first_row_id    AS
    (
        SELECT  client, logon_id
        ,       row_id
        ,       MIN ( CASE
                          WHEN  our_account  = 'Y'
                          THEN  row_id
                      END
                    ) OVER (PARTITION BY  client)   AS first_row_id
        FROM    LOGON_DATA
        where date_of_logon = '10-01-2015'
    )
    SELECT    client, COUNT(DISTINCT logon_id) AS cnt
    FROM      got_first_row_id
    WHERE     row_id  > first_row_id 
    GROUP BY  client
    order by COUNT(DISTINCT logon_id) desc;
    

...but how can I also exclude a logon_id that is exactly +1 of the previously counted one?

    i.e.

If my result includes logon_id = 225500, I don't want to count 225501, but I do count 225502.

The query should count only certain rows of the sample (they were highlighted in pink in the original post):

    PETER123,021,N,10-01-2015,121514
    PETER123,022,,10-01-2015,121514
    PETER123,023,N,10-01-2015,221514
    PETER123,024,Y,10-01-2015,221514
    PETER123,025,Y,10-01-2015,221514
    PETER123,026,Y,10-01-2015,221514
    PETER123,027,Y,10-01-2015,221514
    PETER123,028,N,10-01-2015,221514
    PETER123,029,,10-01-2015,221559
    PETER123,030,Y,10-01-2015,221600
    PETER123,189,N,10-01-2015,225601
    PETER123,201,Y,10-01-2015,233539

    the expected result would be

client, cnt

    PETER123, 4

Any help is very much appreciated.


    If I understand your problem, the implementation in your case might be like this:

    WITH evaluated_logons AS
    (
        SELECT  client,
                logon_id,
                row_id,
                MIN (DECODE (our_account, 'Y', row_id))
                    OVER (PARTITION BY client)                   AS first_row_id,
                LAG (logon_id, 1, -1)
                    OVER (PARTITION BY client ORDER BY logon_id) AS previous_logon_id
        FROM    logon_data
        WHERE   date_of_logon = '10-01-2015'
    )
    SELECT    client,
              COUNT (*) AS cnt
    FROM      evaluated_logons
    WHERE     row_id > first_row_id              -- do not count the first 'Y' row or anything before it
    AND       previous_logon_id + 1 <> logon_id  -- count only non-sequential logon_ids
    GROUP BY  client
    ORDER BY  cnt DESC;

  • Report with null column values template

    Hi all

I'm trying to hide the columns that have no value or a null value. I tried to do it in the template, but couldn't get it to work the way I tried.

    Here is my requirement and the request which I use:

    select
      uu.aaa,
      uu.bbb,
      nvl(xxx.ttt, 0) nnn,
      nvl(yyy.ttt, 0) ppp
    from zzz.uu,
         (select gg.hh, count(*) ttt
            from zzz.gg,
                 zzz.iii
           where iii.kkk = gg.mmm
             and trunc(gg.ccc) = iii.ddd
           group by gg.hh) xxx,
         (select gg.hh, count(*) ttt
            from zzz.gg,
                 zzz.iii
           where iii.kkk = gg.mmm
             and trunc(gg.ccc) = iii.ddd
             and trunc(gg.ccc) > iii.eee
           group by gg.hh) yyy
    where uu.aaa = xxx.hh (+)
      and uu.aaa = yyy.hh (+)

    I want to build the template in such a way that the columns nnn and ppp that have a null value or 0 are not included in the report, and only rows that have a value are shown.

    But my problem is that I still get the aaa and bbb columns in the report, which have values, while ppp and nnn have no values. In my query aaa is the employer code, bbb is the employer name, and nnn and ppp are the values that can be null.

    Finally, my requirement is that the report should not display the employer ids and names whose nnn and ppp values are null or 0; it should display only the employer ids and names that have a value other than null.
    I want to know whether there is a way to do this in the template, or whether I need to make changes to my query.
    Please let me know.


    Thanks in advance!

    Hello

    You can check the approach below.
    If you want to display the employee ID and employer name only where the "nnn" value is not null and not equal to zero:

    [template snippet not preserved]

    Note: in the calculation, first check for the null condition, to avoid a "cannot be converted to number" error.



    465224   TOM          0
    985462   JACK    0    5

    Thank you
    Sandeep
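
    If filtering in the query itself is an option, another approach is to wrap the query and keep only the rows where at least one of the two counts is non-zero. A sketch built on the obfuscated names above (the inner FROM list is a stand-in for the original joins):

    SELECT aaa, bbb, nnn, ppp
      FROM (SELECT uu.aaa,
                   uu.bbb,
                   NVL(xxx.ttt, 0) nnn,
                   NVL(yyy.ttt, 0) ppp
              FROM zzz.uu, zzz.xxx, zzz.yyy     -- stand-ins for the original tables/inline views
             WHERE uu.aaa = xxx.hh (+)
               AND uu.aaa = yyy.hh (+))
     WHERE nnn <> 0
        OR ppp <> 0;                            -- drop rows where both counts are null/zero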
