Data Pump and grants - Frustration

Hi all

I've been working on this for days, and I think I'm finally at the point where I can ask a meaningful question:

My task is to copy the metadata for the 6 important schemas in a PROD db to equivalent schemas in a DEV database.

Now, these schemas all interact through foreign keys, views, triggers and packages, and various grants must be
imported so that these objects compile.

In addition, there are about 10 read-only PROD users who do not need to be represented in the DEV db. Thus,
I don't want to import grants for these users.

How can I import only the grants that I need and leave out the others?
At present, my choices are either:
* use exclude=grant, in which case a bunch of grants must be applied afterwards and everything recompiled,
or
* do not exclude grants, in which case I get several thousand errors, masking actual errors that may occur.

Does anyone know a way to solve my problem?

Thanks in advance,
Chris

Operating system: Solaris 10
DB: 10.2.0.2.0

P.S. I could create all the read-only users in DEV, but eventually I'll do this for about 20 DBs, and I don't
want to clutter user space with 10 * 20 unnecessary accounts alongside the 6 * 20 useful ones.

I doubt it.

You will have to go with an automated script that creates all the accounts, handles the import and drops the unnecessary accounts. OR write a parser script that reads the import log and ignores errors on grants to the unnecessary accounts.
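For example, the grant-filtering idea could be sketched like this (the dump file, user names and passwords below are placeholders): do the main import with exclude=grant, have impdp write the grant DDL to a script instead of executing it, strip out the grants to the unwanted accounts, and run what is left:

    # metadata import without any grants
    impdp system/<pwd> directory=DATA_PUMP_DIR dumpfile=prod_meta.dmp exclude=grant
    # write the grant DDL to a SQL script instead of executing it
    impdp system/<pwd> directory=DATA_PUMP_DIR dumpfile=prod_meta.dmp include=grant sqlfile=all_grants.sql
    # drop grants to the 10 PROD-only accounts, then apply the rest
    egrep -vi 'TO "(PRODUSER1|PRODUSER2)"' all_grants.sql > wanted_grants.sql
    sqlplus system/<pwd> @wanted_grants.sql

You would still need to recompile invalid objects afterwards (e.g. @?/rdbms/admin/utlrp.sql).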

Hemant K Collette

Tags: Database

Similar Questions

  • Differences between Data Pump and legacy import and export

    Hi all

    I work as a junior DBA in my organization, and I have seen that my organization still uses the legacy import and export utilities instead of Oracle Data Pump.

    I want to convince my manager to replace the existing setup with Oracle Data Pump; I have a meeting with them to present my points and convince them to adopt the Oracle Data Pump utility.

    My powers of persuasion are very weak and I don't want to come up short, so I really need a strong list of differences versus import and export. It would be really appreciated if someone could give strong points in favour of Oracle Data Pump over legacy import and export.

    Thank you

    Cabbage

    Hello

    as other people have already said, the main advantage of datapump is performance - it is not just a little faster than exp/imp, it is massively faster (especially when combined with parallel).

    It is also much more flexible - it will even create users from schema-level exports, which imp can't do for you (and it was always very annoying that it couldn't).

    It is restartable

    It has a PL/SQL API

    It supports all object types and new features (exp does not - and that alone is probably reason enough to switch)

    There is even a 'legacy' mode at 11.2 where most of your old exp parameter files will still work with it - just change exp to expdp and imp to impdp.

    The main obstacle to moving to datapump seems to be all the "what do you mean I have to create a directory for it to work", and "where is my dumpfile, why can't it be on my local machine". These are minor things well worth getting past.

    I suggest you do some sort of demo with real data from one of your large databases - do a full exp and a full expdp with parallel and show them the runtimes to compare...
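    For instance, the demo could be as simple as timing these two commands (a sketch; credentials and file names are placeholders):

    exp system/<pwd> full=y file=full.dmp log=full_exp.log
    expdp system/<pwd> full=y directory=DATA_PUMP_DIR dumpfile=full_%U.dmp parallel=4 logfile=full_expdp.log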

    Cheers,

    Rich

  • Data Pump and the users and developers in Apex_Admin

    Hello

    I use Data Pump to back up the schema that I use. It works very well. Now I would like to use Data Pump to export and import the users listed in the Apex_Admin application under Manage Workspaces / Manage Developers and Users.
    Where are the users stored? How do I build the EXPDP / IMPDP statements?

    Thanks for your help.

    Hello

    APEX users are stored in the APEX_xxxxxx or FLOWS_xxxxxx schema, which is where all your application and workspace metadata lives. The schema name depends on your version of APEX.
    Maybe you should use APEXExport. Check out this blog post of John's.
    http://Jes.blogs.shellprompt.NET/2006/12/12/backing-up-your-applications/
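    If you only need to see the workspace users, recent APEX versions also expose them through the APEX dictionary views; as a sketch (the view name varies by APEX version):

    SELECT * FROM apex_workspace_apex_users;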

    Kind regards
    Jari

    http://dbswh.webhop.NET/dbswh/f?p=blog:Home:0

  • I want to learn more about Data Pump and transportable tablespaces

    Please point me to some easy tutorials, as I want to know how to import and export between Oracle 10 and 11.
    Thank you

    Hello
    Please check this Oracle tutorial:
    http://www.exforsys.com/tutorials/Oracle-10G/Oracle-10G-using-data-pump-import.html
    About transportable tablespaces, you may consult:
    http://www.rampant-books.com/art_otn_transportable_tablespace_tricks.htm
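    In outline, a transportable tablespace move looks like this (a sketch; the tablespace, datafile and password names are placeholders):

    ALTER TABLESPACE users READ ONLY;
    expdp system/<pwd> directory=DATA_PUMP_DIR dumpfile=tts_users.dmp transport_tablespaces=users
    -- copy users01.dbf and tts_users.dmp to the target server, then:
    impdp system/<pwd> directory=DATA_PUMP_DIR dumpfile=tts_users.dmp transport_datafiles='/u01/oradata/users01.dbf'
    ALTER TABLESPACE users READ WRITE;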
    Kind regards
    Mohamed
    Oracle DBA

  • Question about Data Pump and directories/subdirectories

    I have a process that makes these calls to Data Pump:
    dbms_datapump.add_file (dph, file_name || '.log', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file (dph, file_name || '.dmp', 'DATA_PUMP_DIR/test', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    It fails with the subdirectory "/test". If I remove it, as in these calls, it works:
    dbms_datapump.add_file (dph, file_name || '.log', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    dbms_datapump.add_file (dph, file_name || '.dmp', 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

    I even did a test with 'chmod 777'.

    Any idea what I am doing wrong?

    I'm on 11.0.1.6

    Here is the error stack I get:
    ORA-39002: invalid operation
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3292
    ORA-06512: at "ORA_ADMIN.DATAPUMP_UTIL", line 46
    ORA-06512: at line 1

    Invalid syntax.

    DATA_PUMP_DIR

    The above is an Oracle DIRECTORY object; you cannot append a subdirectory to its name.
    If you want a different location, then CREATE DIRECTORY DATA_PUMP_DIR_TEST pointing at that location.
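    For example (the path here is hypothetical):

    CREATE OR REPLACE DIRECTORY data_pump_dir_test AS '/u01/app/oracle/dpdump/test';
    GRANT READ, WRITE ON DIRECTORY data_pump_dir_test TO ora_admin;

    Then pass 'DATA_PUMP_DIR_TEST' (with no subdirectory appended) as the directory argument to dbms_datapump.add_file.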

  • Data pump access methods

    Hi, I had a glance at the 11gR2 Utilities manual regarding the Data Pump access methods, and I need a bit of clarification -

    My understanding is that Data Pump (DBMS_DATAPUMP) accesses table metadata (i.e. what is needed to create the DDL of the table) via DBMS_METADATA. However, when it accesses the data itself, it can do so through:

    Data file copy (essentially a non-SQL method using transportable tablespaces), or
    Direct Path (which bypasses SQL), or
    External tables (mapping a dump file to a table).

    Q1. What does Direct Path use, if it does not go through SQL?

    Q2. Do any of the methods use SQL? (I am trying to determine what part SQL plays in all this.)

    Q3. I understand that in certain circumstances the same method need not be used on both the export and import sides - for example, export could use Direct Path and import could use External Tables - but is that a choice you make as a user, or a choice Data Pump makes automatically?

    Thank you
    Jim

    Withdrawn.

  • Request for grants to view all objects in the database and see the DDL for each

    Dear administrators,

    I created the user, but now I have a requirement to grant privileges to view all objects in the database and to see the DDL for each.

    Any help please

    Ritz

    Thanks for any advice, all.

  • Differences between using Data Pump to back up the database and using RMAN

    What are the differences between using Data Pump to back up the database and using RMAN? What are the disadvantages and benefits of each?

    Thank you

    Search for database backup in

    http://docs.Oracle.com/CD/B28359_01/server.111/b28318/backrec.htm#i1007289

    In brief

    RMAN -> physical backup (copies of the physical database files)

    DataPump -> logical backup (logical data such as tables and procedures)
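    As a minimal illustration of the difference (credentials are placeholders):

    -- physical backup of the database files
    rman target /
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;

    -- logical export of data and metadata
    expdp system/<pwd> full=y directory=DATA_PUMP_DIR dumpfile=full.dmp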

    Docs for RMAN-

    http://docs.Oracle.com/CD/B28359_01/backup.111/b28270/rcmcncpt.htm#

    Datapump docs

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_overview.htm

    Published by: Sunny kichloo on July 5, 2012 06:55

  • Expert Secrets for Using RMAN and Data Pump by Kamran Agayev and Aman Sharma

    Hello

    Is this book available in India?
    Expert Secrets for Using RMAN and Data Pump by Kamran Agayev and Aman Sharma

    How can I get this book in India?

    Thank you
    Alok

    Published by: user12141893 on April 6, 2012 11:25

    I tried to purchase this book online from the official site, but the shipping cost was more than the cost of the book.
    The book is not available on Flipkart or Amazon.

    Helios,

    OP can send a mail to me too :-).

    OP,

    We are in discussions with the publisher. The book will be available in India through another partner publisher; I'll update the details when they are available. Thanks for your interest in our book.

    Aman...

    PS: This is a technical forum and is not meant for such topics. Please contact Kamran and me personally by email to discuss this. Let's keep the forum for technical discussions.

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump imports and exports; perhaps something I should already know.

    I need to purge a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table, together with its indexes and constraints etc.

    The table has no relationship to any other table; it is composed of approximately 8 columns with NOT NULL constraints.

    My plan is

    1. truncate the table

    2. disable or remove the indexes

    3. leave the constraints in place?

    4. use Data Pump to import only the rows to keep.

    My question

    will my indexes and constraints be imported too, given that I want to import only a subset of my exported table?

    or

    if I drop the table after truncating it, will I be able to import my table and indexes, even if I use the subquery (QUERY) functionality as part of my import statement?

    When using the Data Pump QUERY functionality, must my table exist in the database before doing the import,

    or will Data Pump import as usual, i.e. create the table, indexes, grants, statistics etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where <predicate selecting the rows to keep>;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.
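    For completeness, if you did want to do the subset with Data Pump as originally planned, it would hinge on the QUERY parameter, along the lines of this hypothetical sketch (when the table does not already exist, impdp recreates the table, indexes, grants and statistics from the dump):

    expdp scott/<pwd> tables=bar query='"WHERE keep_flag = 1"' directory=DATA_PUMP_DIR dumpfile=bar_subset.dmp
    impdp scott/<pwd> directory=DATA_PUMP_DIR dumpfile=bar_subset.dmp table_exists_action=replace

    (Quoting of QUERY varies by shell; a parameter file avoids the escaping issues.)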

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Export the whole database (10 GB) using the Data Pump export utility

    Hello

    I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a CD/DVD to the vendor of our application (who needs it to analyze a few problems that we are having).

    Now, when I checked online, I saw that a full export is available, but I was not able to understand how it works, as we have never used the Data Pump utility; we use the normal export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use the parallel option of a full DB export to split the files so they fit on DVDs? Is that possible?

    Please correct me if I am wrong and kindly help.

    Thank you for your help in advance.

    Pravin,

    The server writes files to the directory object that you specify on the command line. So what you want to do is (the steps are put together in the sketch after this list):

    1. On your operating system, find an existing directory or create a new one. In your case, C:/Dump is as good a place as any.

    2. Connect with sqlplus and create the directory object. Just use the path. I use Linux, so my directory looks like /scratch/xxx/yyy.
    If you use Windows, the path to your directory would look like C:/Dump.

    3. Don't forget to grant access to this directory. You can grant access to a single user, a group of users, or PUBLIC - just like
    any other object.
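    Putting those steps together (a sketch; the user, password and path are placeholders - FILESIZE splits the dump into DVD-sized pieces, which also addresses the original question):

    CREATE DIRECTORY dump_dir AS 'C:/Dump';
    GRANT READ, WRITE ON DIRECTORY dump_dir TO pravin;

    expdp system/<pwd> full=y directory=dump_dir dumpfile=full_%U.dmp filesize=4G logfile=full.log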

    If this helps, or if it has answered your question, please mark the posts with the appropriate tag.

    Thank you

    Dean

  • Data Pump full export/full import - missing SYS grants

    I completed a full export of the database and then a full import into the new database, using Data Pump. Everything worked, except that some users are missing grants on SYS objects in the new database.

    For example, in the original database where I took the full export, UserX had this grant:
    GRANT SELECT ON SYS.DBA_DATA_FILES TO UserX; but the same user in the new database does not have the grant.

    Is this expected, or is it a bug? I would be grateful if someone who has had the same problem could share their experience.

    Oracle Version: 10.2.4

    Thank you.

    Please read the following restriction:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_export.htm#sthref126
    "Grants on objects owned by the SYS schema are never exported."

    Nicolas.

  • How to choose the access method (direct path or external tables) for Data Pump export?

    I have a slow Data Pump export, and I have a few suggestions for settings that might improve its speed. But I can't seem to pass them through the DBMS_DATAPUMP package. Is this possible?

    DECLARE
      PUMP_HANDLE NUMBER := DBMS_DATAPUMP.OPEN (OPERATION => 'EXPORT', JOB_MODE => 'TABLE', JOB_NAME => 'EXP_DATABASE_370');
    BEGIN
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A1.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A2.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE (PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A5.TXT', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      DBMS_DATAPUMP.METADATA_FILTER (PUMP_HANDLE, NAME => 'NAME_EXPR', VALUE => 'IN (''MY_DATABASE_370'')');
      DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'INCLUDE_METADATA', VALUE => 1);
      DBMS_DATAPUMP.SET_PARALLEL (PUMP_HANDLE, DEGREE => 4);
      <<THIS_LINE_FAILS>> DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'DIRECT_PATH');
      DBMS_DATAPUMP.START_JOB (PUMP_HANDLE);
      DBMS_DATAPUMP.DETACH (PUMP_HANDLE);
    END;

    The <<THIS_LINE_FAILS>> line throws an exception:

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39049: invalid parameter name ACCESS_METHOD;

    ORA-06512: at line 10

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'EXTERNAL_TABLES');

    Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 1); /* INTEGER does not seem to work either */

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a similar message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'PARALLEL_FORCE_LOCAL', VALUE => 1);

    Replacing <<THIS_LINE_FAILS>> with this call also fails, with a quite different message:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'SETTINGS', VALUE => 'DISABLE_APPEND_HINT');

    ORA-20020: Error: ORA-39001: invalid argument value. ORA-39207: NULL value is not valid for parameter SETTINGS;

    Hello

    you have ACCESS_METHOD; use DATA_ACCESS_METHOD instead. Just give it a try.
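    That is, keeping the rest of the block unchanged, something like:

    DBMS_DATAPUMP.SET_PARAMETER (PUMP_HANDLE, NAME => 'DATA_ACCESS_METHOD', VALUE => 'DIRECT_PATH');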

    cheers,

    rich

  • Data pump export

    Hello

    I am using

    expdp system/ar8mswin1256@nt11g schemas=dbo_mobile_webresults_test dumpfile=31082015.dmp

    and I am facing this error:

    UDE-00018: Data Pump client is incompatible with database version 11.01.00.07.00

    I think it's a version problem.

    I found that the database on the server I am connected to is 11.2.0.1.0 - 64-bit

    and my client is 11.1.0.7.0
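    For reference, you can confirm the server version from any client with a standard query (the client's own version is printed in the banner that expdp displays when it starts):

    SELECT banner FROM v$version;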

    I tried it on another PC and it worked.

    Thank you very much

  • Using Data Pump for migration

    Hi all

    Database version - 11.2.0.3

    RHEL 6

    DB size - 150 GB

    I have to run a database migration from one server to another (AIX to Linux), and we will use the Data Pump option. We will migrate from source to target using the expdp SCHEMAS option (5 schemas will be exported and imported on the target machine). But the target won't go live immediately: after the migration, the development team will do some work on the target machine, which will take 2 days to complete, and during those 2 days the source database will keep running as production.

    Now, I have a requirement that, after the development team completes their work, I have to apply the 2 days of source changes to the target, after which the target will act as production.

    I want to know what options are available in Data Pump that I can use to do this.

    Kind regards

    No business will want to go live on something whose data is no longer representative of live.

    Sounds like a normal migration - you just test it on a copy of live first: make sure the process works, and then, once you are comfortable, replay it against your latest set of up-to-date production data.

    Dean's suggestion is fine, but rather than dropping pieces and re-importing, personally I tend to keep things simple and do it all in one go (a full schema datapump if possible). That way you know exactly what you have moved, inclusive of all sequences and objects (sequences could have been incremented in the meantime, so you must drop/re-create them). Otherwise you are splitting an upgrade into stages: more steps to trace & more potential conflicts to examine. Even if they are simple, a full datapump would be preferable. Simple is always best with production data.
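    A repeat run along those lines could look like this (schema names and credentials are placeholders; FLASHBACK_TIME keeps the export consistent while the source is still active):

    expdp system/<pwd> schemas=s1,s2,s3,s4,s5 directory=DATA_PUMP_DIR dumpfile=mig_%U.dmp parallel=4 flashback_time=systimestamp
    impdp system/<pwd> schemas=s1,s2,s3,s4,s5 directory=DATA_PUMP_DIR dumpfile=mig_%U.dmp table_exists_action=replace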

    Also - do you know what changes have been made to the new environment for the upgrade... can you roll those back etc.? Worth looking at. Most such migrations would be done via an RMAN copy of the db / cross-endian transport, and you must also make sure that you bring across all the system grants for the schemas, not only the schema-level ones.
