Doubt about the Certification

Hello,

I have completed my CCNA (Routing & Switching) certification, but I still have not found a job.

Could you please guide me on what further courses I can take to support my certification and my job search?

Thank you

Best regards,

Varun

There is a lot more to getting a job than just qualifications: how you present yourself, how you come across, your story, etc.

But which further qualifications to pursue depends entirely on what you want to do. Do you want to work in a pure networking role? Do you want to diversify and specialize in VoIP, security, data centres, etc.? Or maybe even admin/database analyst? I think it's something you have to decide for yourself, my friend.

Tags: Cisco Support

Similar Questions

  • I have a doubt about the .folio file and publications

    Hello, I'm new here.

    I want to start working with DPS, but I have a doubt about which version to buy.

    At the moment I have one customer who just wants to publish a magazine, but my intention is to have more customers and publish more magazines.

    I read that if I buy the Single Edition of DPS, I can only publish a single .folio file. What does that mean? Does each .folio file represent a publication?

    Please, I need help to understand this before I purchase the software.

    Thank you very much

    Paul

    Here's a quick blog post I wrote comparing Single Edition and multi-folio apps:

    http://boblevine.us/Digital-Publishing-Suite-101-single-Edition-vs-multi-Folio-apps/

    Bob

  • Doubt about the Index

    Hi all

    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    PL/SQL Release 11.2.0.2.0 - Production
    CORE    11.2.0.2.0      Production
    TNS for Linux: Version 11.2.0.2.0 - Production
    NLSRTL Version 11.2.0.2.0 - Production

    I have a question about indexes. Is it true that an index will be useful only if we have a WHERE clause? I tried to work this out myself but could not.
    In this example I have not used a WHERE clause, only GROUP BY, but it still does a full scan. Is it possible to get a range scan or something else using GROUP BY?
    SELECT tag_id FROM taggen.tag_master GROUP by tag_id 
    
    Explain Plan:
    Plan hash value: 1688408656
     
    ---------------------------------------------------------------------------------------
    | Id  | Operation             | Name          | Rows  | Bytes | Cost (%CPU)| Time     |
    ---------------------------------------------------------------------------------------
    |   0 | SELECT STATEMENT      |               |  4045 | 20225 |     6  (17)| 00:00:01 |
    |   1 |  HASH GROUP BY        |               |  4045 | 20225 |     6  (17)| 00:00:01 |
    |   2 |   INDEX FAST FULL SCAN| TAG_MASTER_PK |  4045 | 20225 |     5   (0)| 00:00:01 |
    ---------------------------------------------------------------------------------------

    Hello

    SamFisher wrote:
    Since it is doing a full scan, is it possible to avoid the full scan without using a WHERE clause?
    I guess something like a LIMIT clause would do it, but I am not quite sure.

    Why?
    If this query is producing correct results, then you want a full scan.
    If you somehow fool the optimizer into doing a range scan, it will be slower.
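
    For illustration only (a minimal sketch assuming the same taggen.tag_master table, with tag_id as the leading column of TAG_MASTER_PK and the value range made up), it is a selective WHERE predicate that typically turns the fast full scan into a range scan:

    -- Hypothetical example: a selective predicate on the leading index column
    -- usually produces an INDEX RANGE SCAN instead of an INDEX FAST FULL SCAN.
    EXPLAIN PLAN FOR
      SELECT tag_id
      FROM   taggen.tag_master
      WHERE  tag_id BETWEEN 100 AND 200;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    Without such a predicate there is nothing for the index to narrow down, so scanning the whole index and hashing the groups is the cheapest plan the optimizer can choose.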

  • Some doubts about the topology, interfaces and Knowledge Modules

    Hello

    Below are some questions about ODI:


    1. To use an LKM, does ODI always require two different DATASERVERS (one for the SOURCE and another for the TARGET)?

    2. What would be the best way to create a new IKM with GROUP BY clauses?

    3. What is the minimum PROFILE required so that developer users can import projects created in other ODI environments?

    4. If a particular WORK_REP is lost, is it possible to retrieve projects from the version control information stored in the MASTER_REP?

    1.) Yes. An LKM always loads data from one data server to another.
    More than once I have seen that, even when there is a single physical server, several servers are configured in the Topology Manager. This leads to the use of an LKM because ODI considers them to be 2 different servers.
    If the physical server is defined only once, an LKM won't be necessary.

    2.) The IKM automatically adds a GROUP BY clause if it detects an aggregation function in the interface implementation.

    3.) Try using the NG DESIGNER profile.

    4.) This is not an easy task, but all the versioned objects are compressed and stored in a BLOB field in the master repository.
    You will need to know the names and versions of the objects you need to recover.
    SNP_VERSION and SNP_DATA hold this information. Retrieve the BLOB field from SNP_DATA and unpack it using a zip utility. This will give you the XML export of the object that was versioned.
    Now, you can import this XML file and recover the object.

    You will need to loop through all the records in order of I_DATA, extract each one to an .xml file, and then import them to rebuild the work repository.
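
    As a very rough sketch only (the BLOB column name below is a placeholder, not the documented SNP_DATA schema; check the actual column names in your repository version), the extraction step could start from a query like this:

    -- Hypothetical sketch: list the versioned payloads in I_DATA order.
    -- "blob_data" stands in for whatever column holds the zipped XML export
    -- in your master repository; adjust the name before running this.
    SELECT d.i_data,
           d.blob_data
    FROM   snp_data d
    ORDER  BY d.i_data;

    Each payload would then be written to disk, unzipped, and imported as described above.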

  • Anyone else notice how ridiculous the questions on the certification exams are?

    I just took and failed the 11g upgrade test. I was trying to upgrade my certification from 10g to 11g. I have not taken a certification test since 2005, so maybe I'm a bit out of touch, but a lot of the questions were absurd. I studied for about two or three months for this test, which, even as an upgrade test, still had to cover a very wide range of knowledge. Now, I understand that they do not want to make it easy on us, but some questions were right out of left field. There were maybe 10 or 15 questions framed as "If you have this parameter set to true, then what is the advantage?" I can handle questions like that completely, and that's what I studied for, but when they ask questions where you not only need to know all the parameters but also every possible outcome, and then they pick the more obscure ones, it just drives me crazy. It also seemed that some answers were incorrect on a few. I know I'm venting here because I didn't pass, and I'm going to hit the books again and retake it (another $245, ouch!), but be warned: if you have not taken a test in a while, you may need not only more time to study, but also to study what all the outcomes might be. I had already purchased the study books, but I also bought the Oracle practice test. The Oracle Press review book I bought had some practice questions on a CD, but they were not the same as what was on the actual exam. Now I have to wait 14 days before I can take it again. Well, back to studying.

    John:

    He did not state it specifically, but based on his post my guess (with all the dangers that entails) is that he is a 10g OCP. If so, the topics on 1Z0-034 are much broader than those he will encounter on 1Z0-050. That exam would require the OP to basically start his preparation process over and focus on a broader set of information. In many ways this isn't a bad thing from the point of view of becoming a more experienced DBA. However, after he has failed an exam, I wouldn't feel comfortable pointing him to a second exam that would have the same net effect as the first, but which is bigger, longer and probably more difficult to pass. He also wouldn't know his results until sometime in March at best.

    The Exam Guide questions - they are the only aspect of the series that I cannot enthusiastically agree with. Certainly the questions and answers may not be identical to the real thing, but a similar feel is not prohibited. The Q&A in each certification Exam Guide I have used don't have anything like the same "feel" as those of the real exam. That's something the Self Test Software/Transcender exams do much better. Their questions and answers are formatted to look a lot like the real thing and have a similar difficulty level. You said before (I think on OraFAQ) that you do not write the self-test questions for your books. I suspect that the person who does has never taken an Oracle exam.

  • Some doubts about navigation in Unifier

    Hi all

    I have a few questions about Unifier navigation.

    Is it possible to move a feature from admin mode to the user mode access level?

    I mean, if a particular feature such as Shell Manager can only be accessed from admin mode, is it possible to provide access to it in user mode as well?

    If so, how?

    My 2nd question is: currently we can access the company-level BPs "Company Journal" or "Resource Manager" under the 'Company Workspace' shell.

    Is it possible to move the "Company Journal" or "Resource Manager" into a shell? If yes, how?

    I tried in "User mode navigation" to move the company-level BPs to the home shell, but I can't do it.

    To answer your questions:

    (1) The user mode navigator can have user features included. You cannot change the admin mode view or move admin functions to user mode.

    (2) You cannot move these to the Home tab.

  • Doubts about the speed

    Hello gentlemen;

    I have a few questions I would like to ask the more experienced people here. I have a program running on a computer with an i7 processor; that is the computer I wrote the LabVIEW program on. Meanwhile, in another lab, we have another PC, a little older, a 2.3 GHz dual core, and on that PC we run a test platform for a couple of modems; let's not get into the details.

    My problem is that I recently discovered that the program I wrote on the i7 computer runs much slower on the other machine, the dual core, so the timings are all wrong and the program does not run correctly. For example, there is an array with 166 values which, on the i7 machine, is filled quickly, almost without delay; on the dual core machine, however, it takes a few milliseconds to fill only about 20 values in the array, and because of the timing it cannot fill any more, so the waveform that I use is all wrong. This, of course, throws off the whole program, and I can't use it for the test I need to integrate.

    I created a .exe of the program in LabVIEW and tried it on the different PCs; that's how I arrived at this question.

    Now, I want to know if there really is such a big problem due to the specs of the computer that the program is slow on one machine. I know that, to make the program efficient, I need to use state machines, subVIs, the producer-consumer pattern and other things. However, I believe this is not a problem of the program's own speed, because if that were the case the array would eventually fill up completely; on the slow computer, however, it never fills more than 20 values.

    Also, does it help to hide unnecessary indicators on the front panel? For the time being I keep track of lots of variables in the program, so when I create the .exe I still see them running so I can monitor them. In the final version I won't need them, so I'll delete some and hide some from the front panel. Does that make the program need fewer resources?

    I would like to read your comments on this topic: if you have any ideas about state machines, subVIs, etc., or if there is a way to force the computer to give more resources to the LabVIEW program, and so on.
    I'm not attaching any VI because, in its current state, I know you will say "state machines, subVIs" and so on, and I think the main problem is the difference between the computers; I'm still working on the state machine/subVI side of things.

    Thank you once again.

    Kind regards

    IRAN.

    To start with: by using a suitable state machine for the data flow, you can ensure that your large array is always filled completely before moving on, regardless of how long it takes. Believe it or not, adding a delay to your loops will make the whole program run faster and smoother, because while loops are greedy and can consume 100% of the CPU time just looping while waiting for a button press, while all other processes are fighting for CPU time.

  • Doubt about the persistent object

    Hi friends,

    I have stored data in a persistent object. After some time, my simulator was taking a long time to load the application, so I ran clear.bat to make the simulator fast again. But after I ran clear.bat, the values I had stored in the persistent object had disappeared. Can someone tell me whether the persistent object data was lost because of the simulator, because of clear.bat, or for some other reason? Please clarify my doubt, friends...

    Kind regards

    s.Kumaran.

    It is because of clean.bat. Clean.bat will remove all applications and unnecessary files, etc.

  • Doubts about parallel migration from Lync 2013 -> Skype4B 2015 on VCS-C (not clustered)

    Hello everyone!

    From what I saw in the Cisco documents, the "B2BUA/Microsoft Interoperability" application on the VCS can "communicate" with just one instance of a Microsoft Lync pool/server, but we need to migrate from the Lync servers to the Skype servers in parallel, and we need a few "maintenance windows" to migrate all the users!

    Can we keep communication 'UP' from the VCS to both server pools (Lync and Skype) until the end of the migration? Can the legacy Lync 2013 servers that work with the VCS today communicate with users already migrated to Skype 2015 over the existing Lync TLS trunk?

    I think we generate another certificate for TLS and add the Skype server to the "trusted hosts" option; is that okay, or have I forgotten something? Or are there other ways to let two Microsoft server pools communicate with one VCS-C using the "B2BUA/Microsoft Interoperability" application?

    Thanks for helping me!

    To see some possible examples of deployment options, refer to Appendix 3 of the Microsoft Infrastructure (X8.8) Deployment Guide; I would also suggest that you look over the guide in full, as it might answer some of your questions about what is supported.

  • doubts about the css class...

    I tried to load a background image into the Universal Theme on the login page in APEX 5,

    and I used the code found in the following link and got it working:

    Apex 5.0: Theme Roller and background image

    But I have a doubt that may be very simple for the CSS professionals.

    .t-PageBody--login .t-Body {
        background: url("Sports.jpg") repeat top center white scroll;
        color: #000000;
        font-family: Arial, Helvetica, sans-serif;
        font-size: 12px;
        line-height: 17px;
    }

    .t-PageBody--login .t-Body

    How do you know .t-PageBody--login .t-Body was the main class to change?

    Let me know if my interpretation is correct:

    .t-PageBody--login is the main class

    and .t-Body is the upper class?


    pauljohny100 wrote:

    I tried to load a background image into the Universal Theme on the login page in APEX 5,

    and I used the code found in the following link and got it working:

    Apex 5.0: Theme Roller and background image

    But I have a doubt that may be very simple for the CSS professionals.

    .t-PageBody--login .t-Body {
        background: url("Sports.jpg") repeat top center white scroll;
        color: #000000;
        font-family: Arial, Helvetica, sans-serif;
        font-size: 12px;
        line-height: 17px;
    }

    .t-PageBody--login .t-Body

    How do you know .t-PageBody--login .t-Body was the main class to change?

    Let me know if my interpretation is correct:

    .t-PageBody--login is the main class

    and .t-Body is the upper class?

    .t-PageBody--login .t-Body is a descendant selector. It matches any element with a class attribute that contains the value t-Body and that has an ancestor element with a class attribute that contains the value t-PageBody--login. There is no concept of a 'main' class or an 'upper' class in CSS. The required selector was likely determined by examining the login page with a web inspector.

    It is advisable to take some tutorials to get at least a basic understanding of web technologies when you work with APEX.

  • Doubt about the LDAP synchronization

    Hi all

    I have LDAP sync enabled on my OIM server. I have also installed the OID connector. I installed it because I want a user to be able to see the OID User resource provisioned to him in the "Resources" tab. Now, whenever I create a new user, the user is created successfully. I also have an access policy that grants the user the OID User resource based on his role. Now, once the user is created, I can see him in OID. Of course, he is provisioned into the default cn=Users container, but I read here that this is configurable from the LDAP container rules XML file. This provisioning into OID happens through LDAP sync, and so I do not see any resource under the "Resources" tab. Then I grant the user the OID resource by attaching the role to him, and he gets provisioned to OID that way as well. Now I see that, based on the prepopulate mappings I have set up, this user gets provisioned to the correct container in OID. But the problem is that I end up with two users with the same name and details in the OID directory. I don't want that to happen. Is there some way I can somehow cut off the OID LDAP synchronization for the create user operation, so that provisioning happens only when I apply the role and therefore goes into the correct container?

    Thank you
    $id

    This is where a solid knowledge of OIM is required. The connector needs to be re-evaluated. For example, if the user already exists, you know that you cannot use the default Create User task. You will just need an auto-complete task, since you know that each user will already exist. You must also remove all of your form variables that are managed from the OIM user profile.

    I suggest the following:

    Change your form to include only the user ID, the common name, the OrclGUID and the organization name. You can use a prepopulate adapter on all the fields that come from the user profile, because they already exist. If you need to move users to a different OU, then after the auto-complete task that sets the status to Provisioned, you could trigger an update task on the organization name field, which would then move the user to the appropriate organizational unit.

    You really need to think through all the tasks and what is involved, and change the connector accordingly. When you implement two methods that accomplish the same thing, you need to remove some pieces from one of them. You need to look at all the tasks that will be required and the actions they carry out. Some of them will have to be auto-completed so that you always see the correct resource status.

    -Kevin

  • Doubt about the restoration

    Hi all

    I have a doubt.

    I took a backup of a database.

    While restoring it, I found that it is storing the control file in the $ORACLE_HOME/dbs folder instead of the actual location.

    Can someone explain to me why this happens?

    The actual controlfile location is '/orasoft/test'.

    If 'restore spfile' already fails, you have a "dummy" spfile which does not have a CONTROL_FILES entry. I guess that you did not specify the DBID, which is mandatory when a catalog is not used. Here is an example from the documentation of how to restore the spfile:

    http://download.Oracle.com/docs/CD/B19306_01/backup.102/b14192/recov004.htm#sthref582

    After a successful spfile restore, issue 'startup force nomount' so that the instance is restarted with the correct spfile. To restore the controlfiles and the 'rest' of the database, again follow the docs:

    http://download.Oracle.com/docs/CD/B19306_01/backup.102/b14192/recov004.htm#sthref564
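
    As a quick sanity check (a minimal SQL sketch, assuming you can connect to the instance once it is started NOMOUNT), you can confirm which controlfile locations the spfile currently in use actually points to; with the "dummy" spfile you will typically see the default $ORACLE_HOME/dbs path instead of '/orasoft/test':

    -- Show the controlfile locations the running instance will use.
    SELECT value
    FROM   v$parameter
    WHERE  name = 'control_files';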

    Werner

  • doubts about the constraints

    Hi all

    I have the table structure below. I need to create a unique constraint on the 3 columns below (combined). Will I be able to create it? I have a doubt because, since one of the columns is nullable, can I create a unique constraint on all 3 columns?

    If yes, then how can I create it?
    column_name           Null
    col A                   N
    col B                   N
    col C                   Y

    user12133456 wrote:
    Hi all

    I have the table structure below. I need to create a unique constraint on the 3 columns below (combined). Will I be able to create it? I have a doubt because, since one of the columns is nullable, can I create a unique constraint on all 3 columns?

    If yes, then how can I create it?

    column_name           Null
    col A                   N
    col B                   N
    col C                   Y
    

    Try this,

    SQL> ALTER TABLE employee
      2  ADD CONSTRAINT emp_unique UNIQUE (
      3      first_name,
      4      last_name,
      5      start_date
      6      );
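
    To answer the nullable-column concern directly: yes, Oracle allows a unique constraint on a column list that includes a nullable column. Rows where all of the constrained columns are NULL are not checked against each other, but rows with the same non-NULL values and NULL in the same column count as duplicates. A minimal sketch (the table and column names below are made up to mirror the structure in the question):

    -- Hypothetical demo table: col_c is nullable, like "col C" above.
    CREATE TABLE t_demo (
      col_a NUMBER NOT NULL,
      col_b NUMBER NOT NULL,
      col_c NUMBER,
      CONSTRAINT t_demo_uk UNIQUE (col_a, col_b, col_c)
    );

    INSERT INTO t_demo VALUES (1, 1, NULL);  -- succeeds
    INSERT INTO t_demo VALUES (1, 1, 2);     -- succeeds: differs in col_c
    INSERT INTO t_demo VALUES (1, 1, NULL);  -- fails with ORA-00001: same non-NULL values, NULL in the same column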
    
  • doubts about the metadata

    Hello

    (1) Is metadata stored only in the SYSTEM tablespace? If metadata is also stored in tablespaces other than SYSTEM, then what is the difference between the metadata in the SYSTEM tablespace and the metadata in a non-SYSTEM tablespace?

    (2) How do we export the metadata for the whole database?

    (3) In what scenarios do we export only the metadata structure? And how can we copy or upgrade the structure of database A using the metadata from database B?

    Regards

    hungry_dba wrote:
    Thanks Laura,

    So this means transportable tablespace will bring the table data, while exp/imp with rows=no will bring only the metadata, I mean the structure, and when we plug in the data files and import the metadata, the target database will be the same as the source database?

    Transportable tablespace will import the tablespace and you copy the data files as well, so all of the information in the tables will be imported too.
    exp/imp with rows=no will import only the metadata (only structure, no data).
    As for "plugging in the datafiles", I did not understand what you meant.
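
    As a small side note (this is not the exp/imp syntax itself, just a SQL-only illustration of pulling structure without data, with the object name made up for the example), DBMS_METADATA can generate the DDL for a single object:

    -- Returns the CREATE TABLE statement (structure only, no rows).
    -- In SQL*Plus, run SET LONG 20000 first so the CLOB is not truncated.
    SELECT DBMS_METADATA.GET_DDL('TABLE', 'TAG_MASTER', 'TAGGEN')
    FROM   dual;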

    >

    One more doubt, please:

    Database users' objects are stored in the default tablespace that we assign when creating the user. With the above procedure, will that data also come over to the target database when we import the metadata and plug in the data files using the transportable tablespace option? What happens if a user's default tablespace is SYSTEM, since with the above procedure we cannot transport the SYSTEM tablespace?

    First of all, a user's default tablespace must never be SYSTEM. Never; it will cause you problems sooner or later.

    >

    Thanks Laura for the suggestion, I really appreciate it. There may be other procedures to clone the database, and what you say about RMAN is really cool, but I wanted to clarify my doubts about this procedure... because I'm looking to upgrade the database from 9i to 10g with a maximum of 20 minutes of downtime.

    OK, but why are you not simply migrating the database? Why do you need exp/imp? Are you moving to a different server/platform as well?
    What exact 9i version do you have? And to which 10g version are you planning to upgrade?

    I have one more doubt: a "shell" 10g database means the 10g software plus a database created in it, right?

    Hmm, where did you read this "shell of 10g"?

  • doubt about the archive logs

    I have a doubt, as follows:

    Suppose that a RAC database is in archivelog mode and archive logs are generated in destination 1.

    At the same time, if I configure RMAN backup scripts, they will take an incremental backup plus the archive logs, but here the destination used to keep the archive log backups is destination 2 (different from destination 1).

    Is this type of configuration good, or just a waste of space?

    I'll be grateful if anyone can clear my doubts.

    "it depends". really. I think he's smart to copy newspapers archived to some other destination (other bands data center/room (off site)) regularly because I want to be able to recover to a specific point, even if I lose the entire site that hosts my database. In this case, Yes, you must copy these backups in a different location. It might be useless if 2 'is just an another partition/disk/volume on the same table as 1'.
