daily export backup (via datapump) of a 600 GB production database

Hi guys,
I have a 600 GB database, running 10g.
Currently I take daily RMAN backups to tape.

In your experience, is it worth also performing a daily export of a production database of this size?
Do you think it is useful?

Thank you

If you can take export dump backups during off-peak hours without putting too much load on the server and the database, my personal suggestion is to go ahead, for the following reasons:

(1) There is always a chance that data in some table gets accidentally deleted, in which case you can use the export dump to restore just that table, provided it is a fairly static table that is not updated too frequently.

(2) You get the full structure of all tables and indexes, so if something is accidentally changed you can see what the original structure looked like.
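
If it helps, below is a minimal sketch of what such a nightly Data Pump full export could look like; the directory object, credentials, dump location and retention period are placeholders, not taken from your environment.

    #!/bin/sh
    # nightly full export with a date-stamped dump file name (placeholder names throughout)
    STAMP=$(date +%Y%m%d)
    expdp system/password full=y directory=DMP_DIR \
          dumpfile=full_${STAMP}_%U.dmp logfile=full_${STAMP}.log \
          parallel=4 filesize=20G
    # purge dump sets older than a week so the dump area does not fill up
    find /u01/app/oracle/dmp -name 'full_*.dmp' -mtime +7 -delete

The date stamp keeps each night's dump set from colliding with the previous one.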

Tags: Database

Similar Questions

  • DataPump backup error

    Hello
    I have created a DataPump backup task which is supposed to repeat every day using the script below, but after one run it fails with the error:
    ORA-39000: bad dump file specification
    ORA-31641: unable to create dump file 'e:\oracledmp\EXPDAT01.DMP'
    ORA-27038: created file already exists
    OSD-04010: <create> option specified, file already exists

    How can I make it overwrite the file, or how can I set a parameter so that each dump file gets a different name? Thanks in advance.
    ----------------------------------------------------
    declare
      h1 NUMBER;
    begin
      begin
        h1 := dbms_datapump.open (operation => 'EXPORT', job_mode => 'FULL', job_name => 'EXPORTFullDB', version => 'COMPATIBLE');
      end;
      begin
        dbms_datapump.set_parallel (handle => h1, degree => 1);
      end;
      begin
        dbms_datapump.add_file (handle => h1, filename => 'EXPDAT.LOG', directory => 'Dmp', filetype => 3);
      end;
      begin
        dbms_datapump.set_parameter (handle => h1, name => 'KEEP_MASTER', value => 0);
      end;
      begin
        dbms_datapump.set_parameter (handle => h1, name => 'ESTIMATE', value => 'BLOCKS');
      end;
      begin
        dbms_datapump.add_file (handle => h1, filename => 'EXPDAT%U.DMP', directory => 'Dmp', filetype => 1);
      end;
      begin
        dbms_datapump.set_parameter (handle => h1, name => 'INCLUDE_METADATA', value => 1);
      end;
      begin
        dbms_datapump.set_parameter (handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
      end;
      begin
        dbms_datapump.start_job (handle => h1, skip_current => 0, abort_step => 0);
      end;
      begin
        dbms_datapump.detach (handle => h1);
      end;
    end;
    /

    dbms_datapump.add_file(handle => h1, filename => 'EXPDAT%U.DMP', directory => 'Dmp', filetype => 1);

    You could do something like:

    declare
      -- build a timestamp so each run writes to a new dump file name
      mydate varchar2(14) := to_char(sysdate,'yyyymmddhh24miss');
    begin
      dbms_datapump.add_file(handle => h1, filename => 'EXPDAT_'||mydate||'_%U.DMP', directory => 'Dmp', filetype => 1);
    end;
    

    Nicolas.

  • How do I export/back up the BIOS on my HP DV6T laptop

    I have an HP Pavilion DV6T laptop and I want to export/back up the BIOS and keep it on my local NAS server. I don't have a problem at the moment. This is 'just in case'.

    The problem is, I can't find a way to accomplish the BIOS backup. If someone has done this, could you please give some advice I can follow to back it up?

    HP does not provide a manual method or software to do this, but when you upgrade the BIOS and the HP_TOOLS partition is intact, it saves the old BIOS in a folder on the tools partition; it also stores a copy of the current BIOS file in a different folder.

    They can be found in the BIOS folder on the HP_TOOLS partition.

    In the BIOS folder there are three other folders: 'Current' holds the currently installed BIOS, and 'Previous' holds the version that was replaced the last time the BIOS was updated. Inside these folders will be a .bin and a .sig file; save both and do not mix them up with other BIOS .bin or .sig files, as a BIOS cannot be flashed without the .sig file that matches the .bin file.

    You may temporarily need to assign a drive letter to the tools partition if it is hidden; use Disk Management to do this, and be sure to remove the drive letter when you are finished.

  • Export backup

    Hello
    We are facing a very strange problem. We imported a Level0 export backup of our Essbase database into our dev system. We do not see some of our Level0 data in dev, but we do see it in production. The export backup said that it completed successfully. Is there any problem in our export backup script? Has anyone faced this situation? We run Essbase 9.3.1.3.0.
    Any help will be appreciated; this is kind of an emergency.

    Thank you
    V

    Is your export more than two gigabytes? If I remember correctly, even for 64-bit Essbase, Essbase will roll over to a second file and paste a .1 or _1 or something like that onto the end of the overflow file name.

    In addition, if you use parallel exports, you have to pull in all of the files, each of which is also subject to rolling over to a new file when it exceeds two gigabytes.

    Failing all of the above, can you dive into your export file(s) and see if the missing data are there? This can be a bit messy, as the native export format is a little difficult to read; it takes some interpretation to work out which sparse combinations the dense data are written against. Sample.Basic's CALCDAT.txt file is an easy place to try this.

    Kind regards

    Cameron Lackpour

  • Urgent question on an export backup

    Hi all,

    I have

    Oracle db: 10.2.0.3
    backup: exp (9i) full export

    I have a 9i full export backup and now I need to import it into the database.
    Do I have to create the respective tablespaces first, or will the import create them directly? If the import creates them, where do I make the changes to the data file names?

    Urgent help is needed, friends!

    Thank you.

    Jin

    You must create all the tablespaces first, or the import will fail with errors... Also check for the users...
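
    For what it's worth, a minimal sketch of that approach; the tablespace name, datafile path, dump file name and credentials are placeholders, not taken from your system:

    # pre-create each tablespace with its new datafile location, then run the full import
    echo "CREATE TABLESPACE users DATAFILE '/u02/oradata/NEWDB/users01.dbf' SIZE 500M AUTOEXTEND ON;" | sqlplus -s system/password
    imp system/password file=full_9i.dmp log=imp_full.log full=y ignore=y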

    Kind regards
    Deepak

  • I started the OS from a backup hard drive. Now my Adobe products do not work: "licensing does not work for this product", error code 150:30. Help me please!

    I started the OS from a backup hard drive. Now my Adobe products do not work: "licensing does not work for this product", error code 150:30. Help me please!

    Reinstall the software correctly. Migration / backups do not work due to the specific requirements of the activation system.

    Mylenium

  • Cannot export TEXT Index preferences via datapump

    Hello

    Source db: 10.2.0.3

    Source o/s: Win 2003

    Target db: 12.2.0.1

    Target o/s: Win 2012

    Please bear with me, since I am new to Oracle TEXT and I do not have an Oracle developer-side background.

    I took a Data Pump export of my user's schema on the source db and did a Data Pump import into the target db, and it failed to create an index on one of the tables (I received error DRG-10700). After doing some research on Google and searching through Oracle Metalink, I found that the table has a text search index for which certain CTX preferences that exist on the source never made it to the target db. To solve the problem, I had to create these preferences manually on the target db. So my question is: does anyone know why Data Pump did not export the CTX preferences?

    Here is a post that I found useful, which recommends scripting the preferences out of the source db and recreating them on the target db.

    https://community.Oracle.com/thread/2152091?start=0

    Is it reasonable to assume that datapump doesn't handle/export ctx preferences?

    -Learner

    This post may be useful - https://asktom.oracle.com/pls/asktom/f?p=100:11:0:P11_QUESTION_ID:7469700400346057467
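
    A related workaround, sketched below under the assumption that the index is called MY_TEXT_IDX, is to have Oracle Text itself generate the recreation script (preferences included) on the source db with ctx_report.create_index_script, and then run the spooled script on the target db; the user name, password and file names are placeholders.

    # generate the index/preference recreation script on the source db
    printf "set long 1000000 pagesize 0 heading off feedback off\nselect ctx_report.create_index_script('MY_TEXT_IDX') from dual;\nexit\n" > gen_idx.sql
    sqlplus -s myuser/password @gen_idx.sql > recreate_text_index.sql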

  • Backup via Export OVF

    Folks, I am using the Export OVF Template option, but I can only back up a virtual machine if I shut it down first. When I shut the machine down it works just fine, but here comes my big problem: several machines cannot be shut down.

    Does anyone know a way to take this backup while the virtual machine is still online?

    If your host licensing is higher than Essentials, you can download VMware Data Recovery 1.2.1:

    http://downloads.VMware.com/d/info/datacenter_downloads/vmware_vsphere_4/4_0

  • BlackBerry Link software will not back up via cable, wifi sync still works

    Hello. A few months ago I noticed my BlackBerry Classic stopped connecting via cable to BlackBerry Link. Yes, I tried other ports and other cables. Yes, it still charges, and yes, I can see the Classic in Windows Explorer, so the cable and port really are working fine.

    But the cable is required for BlackBerry backups. A shame, as I am quite careful about backing up all my other PC data across several internal & external drives.

    I don't remember if it started after a device OS update, as I only back up once a month or so.

    I read on the forum about reinstalling BB Link; that did not work. I went to re-download Link, but I see that my version is 1.2.3.56 and the download available today is 1.2.0.52, which seems backwards.

    Any thoughts? Should I try that version first?

    (Why are we forced to update the BB OS with bl**dy daily messages practically forcing us to update?)

    In general, 7 times out of 10, OS updates cause more problems than they solve; if it ain't broke, don't fix it.

    SOLUTION - BOOM. (The problem was that when connected via cable, BB Link still said the device was disconnected.)

    I tried installing on a different PC with no luck, so I figured it had to be something on the phone.

    Settings --> Storage and Access --> USB Access --> TURN OFF mass storage mode

    (I had thought that enabling that option would allow mass storage at the same time as a connected backup, because my Bold 9900 did not allow this simultaneously.)

    PLUGGED IN THE DEVICE AGAIN [even though when I installed the latest Link it also installed drivers]. WHEN I PLUGGED IN, THE DEVICE GAVE ME THE OPTION TO INSTALL THE DRIVERS: MASS STORAGE; SD ACCESS; IGNORE.

    Selected DRIVER INSTALL.

    The drivers were installed from the device to the PC, a new icon appeared in the systray, and clicking it showed the drivers had been installed, but the first item said "DEVICE DISCONNECTED". So once the drivers finished, I UNPLUGGED AND re-PLUGGED the DEVICE, and BOOM, BB Link now says the device is plugged in.

    Resolved; backups are back up and running.

  • Error backing up an HFM application via LCM

    Hello

    We have 2 HFM applications on version 11.1.2.3.500. I get the following error message when trying to take an export of the HFM applications via LCM. I don't know where to start fixing this problem. I tried reconfiguring HFM Web but that does not solve it.

    com.hyperion.lcm.common.LCMLogger] [SRC_METHOD: logMessages:939] an error/exception occurred during the operation. Nested exception is []

    java.util.MissingResourceException: can't find resource for bundle java.util.PropertyResourceBundle, key EPMLCM-37006

    at java.util.ResourceBundle.getObject(ResourceBundle.java:374)

    at java.util.ResourceBundle.getString(ResourceBundle.java:334)

    at com.hyperion.lcm.common.manager.ManagerException.getCloudErrorCode(ManagerException.java:460)

    at com.hyperion.lcm.common.manager.ManagerException.getStatusCode(ManagerException.java:971)

    at com.hyperion.lcm.handler.util.GroupingConfiguration.addTaskMessage(GroupingConfiguration.java:302)

    at com.hyperion.lcm.handler.util.GroupingConfiguration.addTaskMessage(GroupingConfiguration.java:295)

    at com.hyperion.lcm.handler.util.ArtifactListingParser.complete(ArtifactListingParser.java:147)

    at com.hyperion.lcm.handler.util.ArtifactListingParser.<init>(ArtifactListingParser.java:137)

    at com.hyperion.lcm.handler.util.ArtifactListingParser.migrate(ArtifactListingParser.java:469)

    at com.hyperion.lcm.handler.ArtifactHandler.execute(ArtifactHandler.java:384)

    at com.hyperion.lcm.handler.TaskHandler.runTasks(TaskHandler.java:403)

    at com.hyperion.lcm.handler.TaskHandler.execute(TaskHandler.java:86)

    at com.hyperion.lcm.clu.async.AsyncMigrator.run(AsyncMigrator.java:56)

    at java.lang.Thread.run(Thread.java:662)

    On the front end, or in Shared Services, I get:

    Service not available error

    Steps taken

    =============

    1. Tuned the servers using the HFM Tuning guide.

    2. Performed reboots.

    3. Moved to the .505 version.

    This issue has been partially addressed by

    1. Modifying the logging.xml file located in Oracle_Home/Middleware/user_projects/domains/EPMSystem/config/fmwconfig/servers/FoundationServices0

    2. Changing the relevant sections to set the logging level to TRACE:32

    We found the error:

    Could not unlock the Journal ID by the cluster controller while loading journals

    This is due to the enormous data and task audit tables:

    select count(*) from app_data_audit;

    266152

    select count(*) from app_task_audit;

    362237

    select count(*) from app_errorlog;

    380179

    We archived and truncated these tables, and were then able to take a full backup of the second application, including journals.
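
    In case it helps anyone else, here is a rough sketch of the archive-and-trim step; the archive table names, schema owner and password are placeholders, and the table list simply follows the counts above:

    # copy each audit/error table aside, then empty the original
    for t in app_data_audit app_task_audit app_errorlog; do
      echo "create table ${t}_arch as select * from ${t};" | sqlplus -s hfm_app_owner/password
      echo "truncate table ${t};" | sqlplus -s hfm_app_owner/password
    done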

    Thank you

    Anjum

  • Sequence behavior after import via DataPump

    Hi friends,

    I'm running Oracle DB 11.2.0.3 on Windows 2008 R2 SP1 servers and I ran into strange sequence behavior after importing a schema via Data Pump.

    The export is done in this way:

    expdp userid/password dumpfile= logfile= directory= reuse_dumpfiles=y (nothing unusual)

    The import is done this way:

    impdp userid/password dumpfile= logfile= directory= remap_tablespace=(old_one:new_one) remap_schema=(old_ones:new_ones, and so on...)

    The import works fine. There are no errors, and the sequences are imported without warnings.

    The strange behavior is that the sequences seem to be "reset". When we call a sequence, NEXTVAL comes back lower than values already stored in the database, and we get a lot of ORA-00001 errors. The sequence should know its last value. I don't have this problem when using exp/imp, only with Data Pump.

    So when we create an order that should receive the value 100, for example, because we already have 99 orders in the system, Oracle supplies a value lower than 99 or even the value one (1).

    So we wrote a script to check the CURRVAL of each sequence in the original schema and recreate the sequences with that value as the starting value in the newly imported schema.

    Did anyone face this problem before?

    Any suggestions?

    TKS a lot

    Hello

    You should definitely make the export consistent; that is not the default in Data Pump (although in previous versions you might have thought it was, because of the misleading informational messages it used to write).

    You can either use flashback_time=systimestamp or flashback_scn=xxxxx (where you work out which SCN to use), or, since you are on 11.2, you can even use consistent=y, as Oracle reintroduced it to make the move from exp easier for people.
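
    For example, a minimal sketch along those lines (the schema name, directory object and file names are placeholders):

    # schema-level export made consistent to a single point in time
    expdp userid/password schemas=MYSCHEMA directory=DMP dumpfile=myschema_%U.dmp \
          logfile=myschema_exp.log flashback_time=systimestamp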

    That might solve the problem, but if the number is reset to 1 in some cases, it may be a different problem.

    Cheers,

    Harry

  • Export backup error when using a where clause

    I'm running Oracle9i on the Solaris platform. When I'm taking an export backup of a table, it gives the error below:


    "tables swtiob exp = NDC_ATMPROOF_HIST file = NDC_ATMPROOF_HIST.dmp log = NDC_ATMPROOF_HIST.log query =" where PROOF_DATE > = July 1, 2010 "" statistics = none
    LRM-00112: multiple values not allowed for the parameter "query".

    EXP-00019: failure of the treatment of parameters, type 'HELP EXP = Y' help
    EXP-00000: export completed unsuccessfully

    You need to escape things on the command line, like this:

    $ exp scott/tiger tables=emp file=emp.dmp log=emp.log query=\"where HIREDATE\>\'09-JUN-1981\'\"
    
    Export: Release 9.2.0.8.0 - Production on Wed Jul 7 12:54:48 2010
    
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.8.0 - Production
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    
    About to export specified tables via Conventional Path ...
    . . exporting table                            EMP          6 rows exported
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.
    $
    
  • Automatic backup via Time Capsule even though WiFi is disabled?

    How does this work? I have the Time Capsule connected via cable and wireless. Most of the time the wireless is not enabled, but Time Machine still backs up successfully. Does the Time Capsule activate the wireless connection automatically?

    Thank you

    René

    Time Capsule cannot control the wifi on your Mac.

    For clarity, you have your desktop Mac Pro, right?

    I ask because many users mistakenly post in this forum (Mac Pro desktop computer) when they really have a MacBook Pro. With Mac laptops, the Mac will still make hourly local backups when not connected to the TM volume, and when it is later connected to the TM volume, those backups are written to the volume and deleted from the Mac.

    About Time Machine local snapshots - Apple Support

  • SGE2010: how to trigger backups via SNMP?

    Hello

    I've tried, so far unsuccessfully, to trigger backups of our SGE2010 switches to a TFTP server. I have tested TFTP backups through the web interface, and that works. I need SNMP because I need a scriptable method to trigger backups on a regular basis. I run the SNMP query from a Red Hat Linux server. So far I have worked out the following query, but it fails:

    snmpset -v 1 -c COMMUNITY SWITCH.MGMT.IP.ADDRESS 1.3.6.1.4.1.9.6.1.101.87.2.1.7 i 2 1.3.6.1.4.1.9.6.1.101.87.2.1.8 i 3 1.3.6.1.4.1.9.6.1.101.87.2.1.9 a TFTP.SERVER.IP.ADDRESS 1.3.6.1.4.1.9.6.1.101.87.2.1.11 s FILENAME 1.3.6.1.4.1.9.6.1.101.87.2.1.17 i 4

    The error I get is generic, and the same query fails on several switches running the 3.0.0.18 software. The switch is configured with the community having full access from the IP address of the SNMP admin server.

    If someone here has been able to trigger backups via SNMP and would be willing to post their query and/or software, that would be greatly appreciated. Any other suggestions, comments or tips are also welcome. Thank you for your time.

    Jeff,

    The procedure to download or update the config via SNMP is as follows:

    (1) Download the MIB files that we officially released on cisco.com:

    http://www.Cisco.com/Cisco/software/release.html?mdfid=282414069&flowid=3650&softwareid=283415684&release=3.0.0&relind=available&rellifecycle=&RelType=latest

    (2) Compile the MIBs in a MIB browser, for example MG-SOFT. Make sure that there are no errors during compilation.

    (3) Configure SNMPv2 or SNMPv3 accordingly on the SGE2010 switch.

    (4) Look for the CISCOSBCopy.mib file.

    The OID is SNMPv2-SMI::enterprises (1.3.6.1.4.1) .cisco (9) .otherEnterprise (6) .ciscosb (1) .odm1 (101) .CISCOSBCopy (87)

    Using rlCopyTable (2), create a new entry in this table:

    a. rlCopyRowStatus: 4 (createAndGo)

    b. rlCopySourceLocation: 1 (local)
    c. rlCopySourceIpAddress: 0.0.0.0

    d. rlCopySourceUnitNumber: 1
    e. rlCopySourceFileName: (empty)
    f. rlCopySourceFileType: 3 (startup config); 2 is for the running config

    g. rlCopyDestinationLocation: 3 (tftp)
    h. rlCopyDestinationIpAddress: 192.168.10.22 (ip address of the tftp server)

    i. rlCopyDestinationUnitNumber: 1
    j. rlCopyDestinationFileName: 0x61:62:63 (hexadecimal for 'abc')

    By doing this, you should be able to back up the unit's startup config to the remote TFTP server.
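
    Putting the values above together, a rough sketch of the snmpset call is below. The object names are taken from the list above, but the row index (.1), community string, switch and TFTP addresses and file name are placeholders; you will still need to load the compiled CISCOSBCopy.mib (or replace the names with the numeric OIDs it resolves to), and rlCopySourceFileName is left out because it is empty.

    # createAndGo a row in rlCopyTable that copies the startup config to a TFTP server
    snmpset -v 2c -c COMMUNITY SWITCH.IP.ADDRESS \
      rlCopyRowStatus.1 i 4 \
      rlCopySourceLocation.1 i 1 \
      rlCopySourceIpAddress.1 a 0.0.0.0 \
      rlCopySourceUnitNumber.1 i 1 \
      rlCopySourceFileType.1 i 3 \
      rlCopyDestinationLocation.1 i 3 \
      rlCopyDestinationIpAddress.1 a 192.168.10.22 \
      rlCopyDestinationUnitNumber.1 i 1 \
      rlCopyDestinationFileName.1 s abc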

  • Avoiding lost or missing photos / properly exporting and backing up the catalog

    I am an amateur photographer who mainly takes pictures of family and friends. Our PC crashed about 3 weeks ago, so we bought a Mac and loaded Lightroom from the backup on our external hard drive as well as the internal hard drive from the old PC. I have had Lightroom 4 for almost 2 years and have been exporting photos into different folders on both drives (the external and the old PC's internal) so as not to lose them, to print them, or simply to file them elsewhere. For some reason, Lightroom has shown hundreds of random photos as "missing or lost" THREE times since we moved to the Mac, and this last time, instead of completely erasing and reloading Lightroom, I spent 4 hours hunting down and gathering a TON of different files. I know I missed the part about exporting correctly to avoid this problem, but now that I have this huge mess, I have a few questions.

    1. How/why does Lightroom randomly pick some previously exported files to lose, without losing others from the same exact import in the same exact folder? How can I set it up so this doesn't happen and I don't end up with 400+ "missing or lost" photos for the 4th time in a month?

    2. How should I export? Obviously, putting things on the desktop or a hard drive makes them difficult for Lightroom to find later, which is why the program loses track of them and I pull my hair out.

    3. Can I back up my catalog each time I import new photos, so I can go back if Lightroom shows a large majority of my photos as "missing or lost" again?

    Thanks in advance for all your help. I'm so lost, just at the moment when I thought I had it figured out.

    Molly

    1. Lightroom does not "lose" the files; this happens because the user has done something wrong. You cannot move, rename, or delete these photos (or the folders that contain them) outside of Lightroom. If you stop managing your photos outside of Lightroom and do all photo management inside Lightroom, the problem disappears. Better yet, develop a workflow that doesn't require you to move pictures from here to there. To resolve the problem with the missing photos, use these instructions: Adobe Lightroom - find folders and files moved or missing
    2. I would not use the export strategy that you have set up. I would export photos only when you need them outside of Lightroom (for example: e-mail, web, printing), and once the exported photos are no longer needed outside of Lightroom, delete the exported copies. The original is always managed in Lightroom, so you can find it whenever you need it, and you can repeat the export if necessary.
    3. See my response to #1. Making backups is a good thing to do; in fact, I'd say it's mandatory, but it is not the way to solve the missing-photo question, and it will not solve the missing-photo problem.
