Doubt about the Index

Hi all

Oracle Database 11 g Enterprise Edition Release 11.2.0.2.0 - 64 bit Production
PL/SQL Release 11.2.0.2.0 - Production
"CORE 11.2.0.2.0 Production."
AMT for Linux: Version 11.2.0.2.0 - Production
NLSRTL Version 11.2.0.2.0 - Production

I have a question about indexes. Is it required that an index will only be useful if we have a WHERE clause? I tried to find the answer myself but could not.
In this example I haven't used a WHERE clause, only GROUP BY, but it still does a full scan. Is it possible to get a range scan or something else using GROUP BY?
SELECT tag_id FROM taggen.tag_master GROUP by tag_id 

Explain Plan:
Plan hash value: 1688408656
 
---------------------------------------------------------------------------------------
| Id  | Operation             | Name          | Rows  | Bytes | Cost (%CPU)| Time     |
---------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT      |               |  4045 | 20225 |     6  (17)| 00:00:01 |
|   1 |  HASH GROUP BY        |               |  4045 | 20225 |     6  (17)| 00:00:01 |
|   2 |   INDEX FAST FULL SCAN| TAG_MASTER_PK |  4045 | 20225 |     5   (0)| 00:00:01 |
---------------------------------------------------------------------------------------

Hello

SamFisher wrote:
Since it is doing a full scan, is it possible to restrict the full scan without using a WHERE clause?
I am guessing something like a limiting clause, but I don't quite know.

Why?
If this query is producing the right results, then you need a full scan.
If you somehow fool the optimizer into doing a range scan, it will be slower.
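
For what it's worth, a range scan only shows up when there is a predicate that lets the optimizer visit part of the index. A minimal sketch of that idea, reusing the table from the post (the BETWEEN values are made up purely for illustration):

EXPLAIN PLAN FOR
SELECT tag_id
FROM   taggen.tag_master
WHERE  tag_id BETWEEN 100 AND 200
GROUP  BY tag_id;

SELECT * FROM TABLE(dbms_xplan.display);

With a selective predicate like that you would typically see an INDEX RANGE SCAN on TAG_MASTER_PK instead of the fast full scan; but for the original query, which needs every tag_id, the fast full scan is the cheaper plan.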

Tags: Database

Similar Questions

  • doubt about the Index Skip Scan

    Hi all

    I was reading the Oracle Performance Tuning Guide (version 11.2, chapter 11). I just want to see an index skip scan with an example. I created a table called t and inserted test data. When I queried the table, the optimizer did not use the index skip scan path.

    Can you please let me know what mistake I am doing here.

    Thanks a lot for your help in advance.

    SQL> create table t (empno number
      2                , ename varchar2(2000)
      3                , gender varchar2(1)
      4                , email_id varchar2(2000));

    Table created

    SQL>
    SQL> -- test data
    SQL> insert into t
      2  select level, 'suri'||level, 'M', 'suri.king'||level||'@gmail.com'
      3  from dual
      4  connect by level <= 20000
      5  /

    20000 rows inserted.

    SQL>
    SQL> insert into t
      2  select level + 20000, 'surya'||(level + 20000), 'F', 'surya.princess'||(level + 20000)||'@gmail.com'
      3  from dual
      4  connect by level <= 20000
      5  /

    20000 rows inserted.

    SQL> create index t_gender_email_idx on t (gender, email_id);

    Index created.

    SQL> explain plan for
      2  select *
      3  from t
      4  where email_id = '[email protected]';

    Explained.

    SQL> select *
      2  from table(dbms_xplan.display);

    PLAN_TABLE_OUTPUT
    ----------------------------------------------------------------------------------------------------------------
    Plan hash value: 1601196873

    --------------------------------------------------------------------------
    | Id  | Operation         | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    --------------------------------------------------------------------------
    |   0 | SELECT STATEMENT  |      |     4 |  8076 |   103   (1)| 00:00:02 |
    |*  1 |  TABLE ACCESS FULL| T    |     4 |  8076 |   103   (1)| 00:00:02 |
    --------------------------------------------------------------------------

    Predicate Information (identified by operation id):
    ---------------------------------------------------

       1 - filter("EMAIL_ID"='[email protected]')

    Note
    -----
       - dynamic sampling used for this statement (level=2)

    17 rows selected.

    Cheers,

    Suri

    You have just demonstrated how your execution plan gets screwed up when you do not have statistics in place:

    SQL> create table t
      2  (
      3    empno number
      4  , ename varchar2(2000)
      5  , gender varchar2(1)
      6  , email_id varchar2(2000)
      7  );

    Table created.

    SQL> insert into t
      2  select level, 'suri'||level, 'M', 'suri.king'||level||'@gmail.com'
      3  from dual
      4  connect by level <= 20000
      5  /

    20000 rows created.

    SQL> insert into t
      2  select level + 20000, 'surya'||(level + 20000), 'F', 'surya.princess'||(level + 20000)||'@gmail.com'
      3  from dual
      4  connect by level <= 20000
      5  /

    20000 rows created.

    SQL> create index t_gender_email_idx on t (gender, email_id);

    Index created.

    SQL> set autotrace traceonly explain
    SQL>
    SQL> select *
      2  from t
      3  where email_id = '[email protected]';

    Execution Plan
    ----------------------------------------------------------
    Plan hash value: 2153619298

    --------------------------------------------------------------------------
    | Id  | Operation         | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    --------------------------------------------------------------------------
    |   0 | SELECT STATEMENT  |      |     3 |  6057 |    79   (4)| 00:00:01 |
    |*  1 |  TABLE ACCESS FULL| T    |     3 |  6057 |    79   (4)| 00:00:01 |
    --------------------------------------------------------------------------

    Predicate Information (identified by operation id):
    ---------------------------------------------------

       1 - filter("EMAIL_ID"='[email protected]')

    Note
    -----
       - dynamic sampling used for this statement

    SQL> exec dbms_stats.gather_table_stats(user, 't', cascade => true)

    PL/SQL procedure successfully completed.

    SQL> select *
      2  from t
      3  where email_id = '[email protected]';

    Execution Plan
    ----------------------------------------------------------
    Plan hash value: 2655860347

    --------------------------------------------------------------------------------------------------
    | Id  | Operation                   | Name               | Rows  | Bytes | Cost (%CPU)| Time     |
    --------------------------------------------------------------------------------------------------
    |   0 | SELECT STATEMENT            |                    |     1 |    44 |     1   (0)| 00:00:01 |
    |   1 |  TABLE ACCESS BY INDEX ROWID| T                  |     1 |    44 |     1   (0)| 00:00:01 |
    |*  2 |   INDEX SKIP SCAN           | T_GENDER_EMAIL_IDX |     1 |       |     1   (0)| 00:00:01 |
    --------------------------------------------------------------------------------------------------

    Predicate Information (identified by operation id):
    ---------------------------------------------------

       2 - access("EMAIL_ID"='[email protected]')
           filter("EMAIL_ID"='[email protected]')

    SQL>

  • I have a doubt about .folio files and publications

    Hello, I'm new here.

    I want to start working with DPS, but I have a doubt about which version to buy.

    At the moment I have one customer who just wants to publish a magazine, but my intention is to have more customers and publish more magazines.

    If I buy the Single Edition of DPS, I read that I can publish a single .folio file. What does that mean? Does each .folio file represent one publication?

    Please, I need help understanding this before I purchase the software.

    Thank you very much

    Paul

    Here's a quick blog post I wrote comparing Single Edition and

    multi-folio apps:

    http://boblevine.us/Digital-Publishing-Suite-101-single-Edition-vs-multi-Folio-apps/

    Bob

  • Some doubts about the topology, interfaces and security modules

    Hello

    Below, some questions about the ODI:


    1. To use an LKM, does ODI always require two different DATASERVERS (one for the SOURCE and another for the TARGET)?

    2. What would be the best way to create a new IKM with GROUP BY clauses?

    3. What is the minimum PROFILE required so that developer users can import projects created in other ODI environments?

    4. If a particular WORK_REP is lost, is it possible to retrieve projects from the version control information stored in the MASTER_REP?

    1.) Yes. An LKM always loads data from one data server to another.
    More than once I have seen that, even though there is a single physical server, several servers are configured in Topology Manager. This leads to the use of an LKM because ODI considers them 2 different servers.
    If the physical server is defined only once, an LKM won't be necessary.

    2.) The IKM automatically adds a GROUP BY clause if it detects an aggregate function in the interface mappings.

    3.) Try using the NG DESIGNER profile.

    4.) This is not an easy task. But all the versioned objects are compressed and stored in a BLOB field in the master repository.
    You will need to know the names and versions of the objects you need to recover.
    SNP_VERSION and SNP_DATA have this information. Retrieve the BLOB field from SNP_DATA and unpack it using a zip utility. This will give you the XML representation of the object that was versioned.
    Now you can import that XML file and recover the object.

    You will need to loop through all the relevant records, extract each I_DATA value to an .xml file, and then import them to rebuild the work repository.
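
    A rough PL/SQL sketch of the extraction step described above (the I_DATA column on SNP_DATA, the directory object EXPORT_DIR and the file name are assumptions; verify them against your repository release before relying on this):

    DECLARE
      l_blob   BLOB;
      l_file   UTL_FILE.FILE_TYPE;
      l_buffer RAW(32767);
      l_amount BINARY_INTEGER := 32767;
      l_pos    INTEGER := 1;
      l_length INTEGER;
    BEGIN
      -- pick the row for the object/version you identified in SNP_VERSION
      SELECT i_data INTO l_blob
      FROM   snp_data
      WHERE  rownum = 1;

      l_length := DBMS_LOB.getlength(l_blob);
      l_file   := UTL_FILE.fopen('EXPORT_DIR', 'object_version.zip', 'wb', 32767);

      -- write the compressed XML out in 32K chunks, then unzip it outside the database
      WHILE l_pos <= l_length LOOP
        DBMS_LOB.read(l_blob, l_amount, l_pos, l_buffer);
        UTL_FILE.put_raw(l_file, l_buffer, TRUE);
        l_pos := l_pos + l_amount;
      END LOOP;

      UTL_FILE.fclose(l_file);
    END;
    /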

  • Some doubts about navigation in Unifier

    Hi all

    I have a few questions about Unifier navigation.

    Is it possible to move something from admin mode to user mode at the access level?

    I mean, if there is a particular feature, such as Shell Manager, that I can only access from admin mode, is it possible to provide access to it in user mode as well?

    If so, how?

    My 2nd doubt is: currently we can access company-level BPs such as "Company Journal" or "Resource Manager" under the "Company Workspace" shell.

    Is it possible to move the "Company Journal" or "Resource Manager" into that folder? If yes, how?

    I tried, in "User Mode Navigation", to move the company-level BPs to the home shell, but I can't do it.

    To answer your questions:

    (1) The User Mode navigator can have user-level features included. You cannot change the Admin mode view or move admin functions to user mode.

    (2) You cannot move these to the Home tab.

  • Doubts about the speed

    Hello gentlemen;

    I have a few questions I would like to ask the more experienced people here. I have a program running on a computer with an i7 processor; that is the computer I programmed it on in LabVIEW. Meanwhile, in another lab, we have another PC, a little older, a 2.3 GHz dual core; on that PC we run a test platform for a couple of modems, but let's not get into the details.

    My problem is that I recently discovered that the program I wrote on the i7 computer runs much more slowly on the other machine, the dual core, so the timings are all wrong and the program does not run correctly. For example, there is a table with 166 values which, on the i7 machine, is filled quickly, almost without delay; on the dual-core machine, however, it takes a few milliseconds to fill about 20 values in the table, and because of the timing it cannot fill any more values, so the waveform that I use is all wrong. This, of course, affects the whole program and I can't use it for the test I need to integrate it into.

    I created a .exe of the program in LabVIEW and tried it on the different PCs; that's how I got to this question.

    Now, I want to know whether the characteristics of the computer can really be such a big problem that the program is slow on one machine. I know that, to make the program efficient, I should use state machines, sub-VIs, the producer-consumer pattern and other things. However, I believe this is not just a problem of the speed of the program itself, because if that were the case the table would eventually fill up completely; on the slow computer, though, it never fills more than 20 values.

    Also, does it help to hide unnecessary indicators on the front panel? For the time being I keep track of lots of variables in the program, so when I create the .exe I still watch them while it runs to keep that follow-up. In the final version I won't need them, so I'll delete some and hide some from the front panel. Does that help it require fewer resources?

    I would like to read your comments on this topic, if you have any ideas about state machines, sub-VIs, etc., or whether there is a way to force the computer to give more resources to the LabVIEW program, etc.
    I'm not attaching any VI because, in its current state, I know you will say "state machines, sub-VIs" and so on, and I think the main problem is the difference between the computers; I'm still working on the state machine/sub-VI/etc. side of things.

    Thank you once again.

    Kind regards

    IRAN.

    To start with, by using a suitable dataflow structure such as a state machine you can ensure that your large table is always filled completely before moving on, regardless of how long it takes. And believe it or not, adding a delay to your loops will make the whole program run faster and smoother, because while loops are greedy and can consume 100% of the CPU time just spinning while waiting for a button press, while all the other processes are fighting for CPU time.

  • Doubt about the persistent object

    Hi friends,

    I have stored data in a persistent object. After some time, my simulator was taking a lot of time to load the application, so I ran clean.bat to make the simulator fast. But after I ran clean.bat, the values I had stored in the persistent object had disappeared. Can someone tell me whether the persistent object data was lost because of running clean.bat or for some other reason? Please clarify my doubt, friends...

    Kind regards

    s.Kumaran.

    It is because of clean.bat. clean.bat will remove all applications and unnecessary files, etc.

  • Doubts about parallel migration from Lync 2013 -> Skype4B 2015 on VCS-C (not clustered)

    Hello everyone!

    As I saw in the Cisco documents, the "B2BUA/Microsoft Interoperability" application on the VCS can "communicate" with just one Microsoft Lync pool/server instance, but we need to migrate from the Lync servers to the Skype servers in parallel, and we need a few "maintenance windows" to migrate all the users!

    Can we keep communication "UP" on the VCS with both server pools (Lync and Skype) until the end of the migration? Can the legacy Lync 2013 servers (shared resources) that the VCS communicates with today reach the users already migrated to Skype 2015 over the existing Lync TLS trunk?

    I think we would generate another certificate for TLS and add the Skype server to the "trusted hosts" option; is that okay, or have I forgotten something? Or are there other ways to have two Microsoft server pools communicate with one VCS-C running the "B2BUA/Microsoft Interoperability" application?

    Thanks for helping me!

    To see some possible examples of deployment options, refer to Appendix 3 of the Microsoft Infrastructure (X8.8) Deployment Guide; additionally, I suggest you also look over the guide in full, as it might answer some of your questions about what is supported.

  • doubts about the css class...

    I tried to load a background image into the APEX 5 Universal Theme on the login page.

    I used the code found in the following link and got it working:

    Apex 5.0: Theme Roller and background image

    But I have a doubt that may be very simple for the CSS professionals.

    .t-PageBody--login .t-Body
    {
        background: url("Sports.jpg") repeat top center white scroll;
        color: #000000;
        font-family: Arial, Helvetica, sans-serif;
        font-size: 12px;
        line-height: 17px;
    }

    .t-PageBody--login .t-Body

    How do you know .t-PageBody--login .t-Body was the main class to change...

    Let me know if my interpretation is correct:

    .t-PageBody--login is the main class

    and .t-Body is the upper class?


    pauljohny100 wrote:

    How do you know .t-PageBody--login .t-Body was the main class to change... Let me know if my interpretation is correct: .t-PageBody--login is the main class and .t-Body is the upper class?

    .t-PageBody--login .t-Body is a descendant selector. It matches any element whose class attribute contains the value t-Body and that has an ancestor element whose class attribute contains the value t-PageBody--login. There is no concept of a "main" class or an "upper" class in CSS. The required selector was likely determined by examining the login page with a web inspector.

    It is advisable to take some tutorials and get at least a basic understanding of web technologies when working with APEX.

  • Doubt about the LDAP synchronization

    Hi all

    I have LDAP sync enabled on my OIM server. I have also installed the OID connector. I installed it because I want a user to be able to see the OID user resource provisioned to him on the "Resources" tab. Now, whenever I create a new user, the user is created successfully in OID. I also have an access policy that grants the user the OID user resource based on his role. So, once the user is created, I see him in OID. Of course, he is provisioned into the default cn=Users container, but I read here that this is configurable from the LDAP container rules XML file. Now, this provisioning into OID happens through LDAP sync, and so I do not see any resource under the "Resources" tab. Then I grant the user the OID resource by attaching the role to him, and he gets provisioned to OID that way as well. I can see that, based on the pre-populate adapters I put in place, this user gets provisioned to the correct container in OID. But the problem is that I now end up with two users with the same name and details in the OID directory. I don't want that to happen. Is there some way I can somehow cut OID LDAP sync out of the create user operation, so that provisioning happens only when I apply the role, and therefore into the correct container?

    Thank you
    $id

    This is where a solid knowledge of OIM is required. The connector should be re-evaluated. For example, if the user already exists, you know that you cannot use the default Create User task. You will need to make it just an auto-completed task, since you know that each user will already exist. You must also remove from your process form all the variables that are managed from the OIM user profile.

    I suggest the following:

    Change your form to include only the user ID, the common name, the orclGUID and the organization name. You can use a pre-populate adapter for all of the ones that come from the user profile, because they already exist. If you need to move them to a different OU, then after execution of the auto-completed task that sets the status to Provisioned, you could trigger an update task on the organization name field, which would then move the user to the appropriate organizational unit.

    You really need to think about all the tasks, what is involved in each, and change the connector accordingly. When you implement two methods that accomplish the same thing, you need to remove a few pieces from one of them. You need to look at all the tasks that will be required and the actions they carry out. Some of them will have to be auto-completed so you always see the correct resource status.

    -Kevin

  • Doubt about passing a collection (PL/SQL table) to a procedure.

    Hi all

    I have developed a sample package with 1 procedure. In it, I bulk collect the output into a collection and pass the collection table as an OUT parameter.
    When I run the proc, it worked for 1 scenario but did not work for the other. I have posted both scenarios after the code.
    pkg spec:
    
    create or replace
    PACKAGE IMP_EXP_BKUP_PKG
    AS
      TYPE test10_tbl2 IS TABLE OF test10.t1%type INDEX BY BINARY_INTEGER;
      v2_test10 test10_tbl2;
      PROCEDURE manpower_list(v1 number, v2 out test10_tbl2);
    END IMP_EXP_BKUP_PKG;
    
    Pkg Body:
    
    create or replace
    PACKAGE BODY IMP_EXP_BKUP_PKG 
    AS 
    PROCEDURE manpower_list(v1 number, v2 out test10_tbl2)  AS
    BEGIN
      SELECT t1 BULK COLLECT INTO v2 FROM test10 WHERE t4 = v1; 
    END;
    END IMP_EXP_BKUP_PKG;
    Scenario 1:
    DECLARE
      v2 imp_exp_bkup_pkg.test10_tbl2;
    BEGIN
      imp_exp_bkup_pkg.manpower_list('10', v2);
      FOR i IN v2.FIRST..v2.LAST
      LOOP
        DBMS_OUTPUT.PUT_LINE(v2(i));
      END LOOP;
    END;
    Worked well.

    Scenario 2:
    DECLARE
      --v2 imp_exp_bkup_pkg.test10_tbl2;
      TYPE typ_tbl2 IS TABLE OF test10.t1%type INDEX BY BINARY_INTEGER;
      v2 typ_tbl2;
    BEGIN
      imp_exp_bkup_pkg.manpower_list('10', v2);
      FOR i IN v2.FIRST..v2.LAST
      LOOP
        DBMS_OUTPUT.PUT_LINE(v2(i));
      END LOOP;
    END;
    
    Error:
    ORA-06550: line 6, column 3:
    PLS-00306: wrong number or types of arguments in call to 'MANPOWER_LIST'
    It does not work here.

    I just want to make sure: are we supposed to use the same type that we have declared in the package when declaring the variables?

    SamFisher wrote:

    I just want to make sure: are we supposed to use the same type that we have declared in the package when declaring the variables?

    Yes, you MUST use the same type definition. Even though typ_tbl2 has an identical structure, PL/SQL treats it as a different type from imp_exp_bkup_pkg.test10_tbl2, so the call does not match the procedure's parameter.

    SY.

  • Doubt about restore

    Hi all

    I have a doubt.

    I took a backup of a database.

    While restoring it, I found that it is restoring the control file to the $ORACLE_HOME/dbs folder instead of the actual location.

    Can someone explain to me why this happens?

    The actual controlfile location is '/orasoft/test'.

    If 'restore spfile' already fails, you have a "dummy" spfile which does not have a CONTROL_FILES entry. I guess that you did not specify the DBID, which is mandatory when a recovery catalog is not used. Here is an example from the documentation of how to restore the spfile:

    http://download.Oracle.com/docs/CD/B19306_01/backup.102/b14192/recov004.htm#sthref582

    After a successful spfile restore, issue 'startup force nomount' so that the instance is restarted with the correct spfile. To restore the controlfiles and the 'rest' of the database, again follow the docs:

    http://download.Oracle.com/docs/CD/B19306_01/backup.102/b14192/recov004.htm#sthref564
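
    In outline, the sequence looks something like the following (a sketch only; the DBID is a placeholder and the autobackup location depends on your configuration):

    RMAN> connect target /
    RMAN> set dbid 1234567890;            -- placeholder: use your database's real DBID
    RMAN> startup force nomount;
    RMAN> restore spfile from autobackup;
    RMAN> startup force nomount;          -- restart with the restored spfile
    RMAN> restore controlfile from autobackup;
    RMAN> alter database mount;
    RMAN> restore database;
    RMAN> recover database;
    RMAN> alter database open resetlogs;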

    Werner

  • Doubt about the postfix expression

    Here is the question:

    public class Twisty {
        { index = 1; }
        int index;

        public static void main(String[] args) {
            new Twisty().go();
        }

        void go() {
            int[][] dd = {{9,8,7}, {6,5,4}, {3,2,1,0}};
            System.out.println(dd[index++][index++]);
        }
    }

    What is the result?
    How does the postfix ++ operator work here?

    >
    What is the result?
    How does the postfix ++ operator work here?

    Result = 4

    index is 1 when the first dd[index++] is evaluated, so the first subscript is 1 and index becomes 2.

    index is 2 when the second [index++] is evaluated, so the second subscript is 2 and index becomes 3.

    So the statement prints dd[1][2], which is 4. The postfix operator uses the current value and increments it afterwards; the subscripts are evaluated left to right.

  • doubts about the constraints

    Hi all

    I have the table structure as below. I need to create a unique constraint on the below 3 columns (combined). Will I be able to create it? I am having a doubt because, since one of the columns is nullable, can I create a unique constraint on all 3 columns?

    If yes, then how can I create it?
    column_name           Null
    col A                   N
    col B                   N
    col C                   Y

    user12133456 wrote:
    Hi all

    I have the table structure as below. I need to create a unique constraint on the below 3 columns (combined). Will I be able to create it? I am having a doubt because, since one of the columns is nullable, can I create a unique constraint on all 3 columns?

    If yes, then how can I create it?

    column_name           Null
    col A                   N
    col B                   N
    col C                   Y
    

    Yes, you can: Oracle allows nullable columns in a unique constraint (rows where every constrained column is NULL are simply not checked). Try this,

    SQL> ALTER TABLE employee
      2  ADD CONSTRAINT emp_unique UNIQUE (
      3      first_name,
      4      last_name,
      5      start_date
      6      );
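
    A quick sketch against a throwaway table (T3 is a made-up name) showing how the nullable column behaves under such a constraint:

    SQL> create table t3 (col_a number not null, col_b number not null, col_c number);
    SQL> alter table t3 add constraint t3_uk unique (col_a, col_b, col_c);

    SQL> insert into t3 values (1, 2, null);   -- succeeds
    SQL> insert into t3 values (1, 2, null);   -- fails with ORA-00001: the non-null part is a duplicate
    SQL> insert into t3 values (1, 3, null);   -- succeeds: different combination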
    
  • doubt about a collection of object types

    Hi all

    I'm a newbie!

    I tried an index-by table of an object type, as follows.

    declare
        type ty_tab is table of ty_test index by binary_integer;
        v_tab ty_tab;
    begin
        v_tab(1) := ty_test('ashok', 1000);
        v_tab(2) := ty_test('rashmi', 2000);
        v_tab(3) := ty_test('baby', 3000);

        for i in v_tab.first .. v_tab.last loop
            dbms_output.put_line(v_tab(i));
        end loop;
    end;
    /

    While running the above script, I got the following error message:
    ORA-06550: line 14, column 8:
    PLS-00306: wrong number or types of arguments in call to 'PUT_LINE'
    ORA-06550: line 14, column 8:
    PL/SQL: Statement ignored

    Can anyone suggest where I'm going wrong?


    Thanx

    DBMS_OUTPUT.PUT_LINE takes a scalar argument. It does not accept composite types. You should pass the individual attributes. Assuming that ty_test is:

    SQL> create or replace
      2    type ty_test
      3      as object(
      4                name varchar2(10),
      5                val  number
      6               )
      7  /
    
    Type created.
    
    SQL> 
    

    Use:

    declare
        type ty_tab is table of ty_test index by binary_integer;
        v_tab ty_tab;
    begin
        v_tab(1):=ty_test('ashok',1000);
        v_tab(2):=ty_test('rashmi',2000);
        v_tab(3):=ty_test('baby',3000);
        for i in v_tab.first .. v_tab.last loop
          dbms_output.put_line(rpad(v_tab(i).name,11) || v_tab(i).val);
        end loop;
    end;
    /
    ashok      1000
    rashmi     2000
    baby       3000
    
    PL/SQL procedure successfully completed.
    
    SQL> 
    

    SY.

Maybe you are looking for

  • Can I return the messages archived in their original form?

    I've archived my emails. They are now useless, because the header is more said who was the recipient, and they are all mixed. Can I restore them in the folder sent in their original form?

  • Mobile Home with Yosemite/EL Capitan record

    Hello everyone I have a small business with 30 macs odd configuration. The server runs 10.7.5 and 10.9.5 customers.  I have long and hard battle thanks to a non-existent documentation on records of mobile home that I managed to create a stable enviro

  • QuickLook 3 - error (not found user information)

    Hello! When I try to open QuickLook 3 on my laptop HP ProBook 4510 s, I get the following error: Error: No user information found. Cannot continue. Do you know why is that? Thx for your help

  • What is the name of the distribution of windows without code optimization?

    Hello. I heard there is a version of windows including the compilers of code optimization in that has been disabled and if the kernel debugging is easier. Do you know which distribution is it? And you know where to get it? Sincerely, L0TExp

  • Connection for external VGA monitor?

    Model: HP ENVY TouchSmart 15-j052nr My laptop is not VGA port and I didn't know when I bought this laptop. Can suggest how to connect my laptop to Dell VGA external display? Some forums are saying you can buy USB cable VGA or HDMI VGA cable but the s