Large file support function

I have a stored procedure (code below) that creates a new data file, or grows an existing one, when a threshold is reached.

So as you can see, I can end up with more than one data file.

A few examples:

maximum size = 30G, the data file is 28G, I add 1G: the data file should become 29G

maximum size = 30G, the data file is 28G, I add 2G: the data file should become 30G

maximum size = 30G, the data file is 28G, I add 3G: I should now have 2 data files, one of 30G and the other of 1G

Basically, the code goes through each data file and grows it up to 30G (file_max_m) until all the requested space has been allocated.

I could use some help creating and integrating a function that tells me whether the tablespace supports bigfiles. If it does not support

bigfiles, the behavior described above should be left alone (the tablespace may have multiple data files); if it does support them, I just want to keep

resizing the single file.

Any help with this function and the code changes would be greatly appreciated.
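For the check itself, DBA_TABLESPACES exposes a BIGFILE column, so a small helper could look like this. This is an untested sketch, and the function name is_bigfile is my own invention:

```sql
-- Hypothetical helper: returns 'YES' when the tablespace is a bigfile
-- tablespace (a single, very large data file), 'NO' otherwise.
create or replace function is_bigfile(p_tablespace varchar2) return varchar2 is
  l_bigfile dba_tablespaces.bigfile%type;
begin
  select bigfile
    into l_bigfile
    from dba_tablespaces
   where tablespace_name = upper(p_tablespace);
  return l_bigfile;
end;
/
```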

create or replace
procedure add_space(
  p_tablespace varchar2,
  p_gibabytes number
) is
  add_m       number := 1024 * p_gibabytes;
  required_m  number := 0;
  created_m   number := 0;
  file_max_m  number := 1024 * 30; -- 30G
  ts_size     number := 0;
begin
  for ts in (
    select
      tablespace_name,
      round(
        sum(
          case when autoextensible = 'YES' and bytes < maxbytes then
            maxbytes / 1024 / 1024
          else
            bytes / 1024 / 1024
          end
        ) over (partition by tablespace_name)
      ) tablespace_m,
      file_name,
      round(
        case when autoextensible = 'YES' and bytes < maxbytes then
          maxbytes / 1024 / 1024
        else
          bytes / 1024 / 1024
        end
      ) file_m
    from dba_data_files
    where tablespace_name = upper(p_tablespace) and online_status = 'ONLINE' and status = 'AVAILABLE'
    and p_gibabytes between 1 and 999
    order by file_id desc -- last file first
  ) loop
    ts_size := ts.tablespace_m; -- for report
    required_m := ts.tablespace_m + add_m;
    --
    -- resize datafile(s)
    --
    created_m := created_m + ts.file_m;
    if (created_m < required_m and ts.file_m < file_max_m) then
      declare
        size_m number := ts.file_m + required_m - created_m; -- current size + difference is new size
      begin
        if (size_m > file_max_m) then
          size_m := file_max_m;
        end if;
        dbms_output.put_line('alter database datafile ''' || ts.file_name || ''' resize ' || size_m || 'M; -- from ' || ts.file_m || 'M to ' || round(size_m / 1024, 3) || 'G');
        created_m := created_m - ts.file_m + size_m; -- replace the old size with the new size in the running total
      end;
    end if;
  end loop;
  --
  -- add datafiles
  --
  while (required_m > 0 and created_m < required_m)
  loop
    declare
      size_m number := required_m - created_m;
    begin
      if (size_m > file_max_m) then
        size_m := file_max_m;
      end if;
      dbms_output.put_line('alter tablespace ' || p_tablespace || q'" add datafile '+DATA01' size "' || size_m || 'M; -- ' || round(size_m / 1024, 3) || 'G');
      created_m := created_m + size_m;
    end;
  end loop;
  -- report
  dbms_output.put_line('-- tablespace ' || upper(p_tablespace) || ' resized ' || p_gibabytes || 'G from ' || ts_size || 'M to ' || required_m || 'M');
end;
/
sho err
PROCEDURE ADD_SPACE compiled


-- bug fixed and tested here
set serveroutput on size unlimited
create tablespace testts datafile 'C:\ORACLEXE\APP\ORACLE\ORADATA\XE\TESTTS01.DBF' size 9G online;
exec add_space('testts', 1);
drop tablespace testts including contents and datafiles;


tablespace TESTTS created.
anonymous block completed
alter database datafile 'C:\ORACLEXE\APP\ORACLE\ORADATA\XE\TESTTS01.DBF' resize 10240M; -- from 9216M to 10G
-- tablespace TESTTS resized 1G from 9216M to 10240M
tablespace TESTTS dropped.


-- another test
-- add 1G to users
exec add_space('users', 1);


-- script is created
anonymous block completed
alter database datafile 'C:\ORACLEXE\APP\ORACLE\ORADATA\XE\USERS03.DBF' resize 2499M; -- from 1255M to 2.44G
-- tablespace USERS resized 1G from 1475M to 2499M


-- I ran this script and the file was resized
database datafile 'C:\ORACLEXE\APP\ORACLE\ORADATA\XE\USERS03.DBF' altered.


-- add another 1G to users
exec add_space('users', 1);


-- script is created
anonymous block completed
alter database datafile 'C:\ORACLEXE\APP\ORACLE\ORADATA\XE\USERS03.DBF' resize 3743M; -- from 2499M to 3.655G
-- tablespace USERS resized 1G from 2719M to 3743M
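Taking the third example above through this procedure: file_max_m = 30720M, a single 28672M (28G) data file, add 3G, so required_m = 31744M. The generated script should then look like this (the tablespace name, data file path, and '+DATA01' disk group are placeholders, not real names):

```sql
-- required_m = 28672 + 3072 = 31744M; the resize is capped at file_max_m
alter database datafile '/oradata/mydb/users01.dbf' resize 30720M; -- from 28672M to 30G
alter tablespace USERS add datafile '+DATA01' size 1024M; -- the remaining 1G
```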


Hello

Try this. I didn't test it. Just remember that the command can be a bit different for bigfiles; I'll check that. It should now be able to handle resizing a bigfile tablespace.

create or replace
procedure add_space(
  p_tablespace varchar2,
  p_gibabytes number,
  p_diskgroup varchar2 default '+DATA01'
) is
  add_m       number := 1024 * p_gibabytes;
  required_m  number := 0;
  created_m   number := 0;
  file_max_m  number := 1024 * 30; -- 30G
  ts_size     number := 0;
begin
  for ts in (
    select
      tablespace_name,
      (select bigfile from dba_tablespaces where tablespace_name = dba_data_files.tablespace_name) bigfile,
      round(
        sum(
          case when autoextensible = 'YES' and bytes < maxbytes then
            maxbytes / 1024 / 1024
          else
            bytes / 1024 / 1024
          end
        ) over (partition by tablespace_name)
      ) tablespace_m,
      file_name,
      round(
        case when autoextensible = 'YES' and bytes < maxbytes then
          maxbytes / 1024 / 1024
        else
          bytes / 1024 / 1024
        end
      ) file_m
    from dba_data_files
    where tablespace_name = upper(p_tablespace) and online_status = 'ONLINE' and status = 'AVAILABLE'
    and p_gibabytes between 1 and 999
    order by file_id desc -- last file first
  ) loop
    ts_size := ts.tablespace_m; -- for report
    required_m := ts.tablespace_m + add_m;
    if (ts.bigfile = 'YES') then
      file_max_m := 1024 * 1024 * 32; -- 32TB
    end if;
    --
    -- resize datafile(s)
    --
    created_m := created_m + ts.file_m;
    if (created_m < required_m and ts.file_m < file_max_m) then
      declare
        size_m    number := ts.file_m + required_m - created_m; -- current size + difference is new size
        sql_text  varchar2(2000) := q'"ALTER DATABASE DATAFILE '{file_name}' RESIZE {size}M;"';
      begin
        if (ts.bigfile = 'YES') then
          sql_text := q'"ALTER TABLESPACE {tablespace_name} RESIZE {size}M;"'; -- tablespace names are not quoted
        end if;
        if (size_m > file_max_m) then
          size_m := file_max_m;
        end if;
        sql_text := replace(replace(replace(sql_text,
          '{tablespace_name}', ts.tablespace_name),
          '{file_name}', ts.file_name),
          '{size}',size_m
        );
        dbms_output.put_line(sql_text || ' -- from ' || ts.file_m || 'M to ' || round(size_m / 1024, 3) || 'G');
        created_m := created_m - ts.file_m + size_m; -- replace the old size with the new size in the running total
      end;
    end if;
  end loop;
  --
  -- add datafiles
  --
  while (required_m > 0 and created_m < required_m)
  loop
    declare
      size_m    number := required_m - created_m;
      sql_text  varchar2(2000) := q'"ALTER TABLESPACE {tablespace_name} ADD DATAFILE '{diskgroup}' SIZE {size}M;"';
    begin
      if (size_m > file_max_m) then
        size_m := file_max_m;
      end if;
      sql_text := replace(replace(replace(sql_text,
        '{tablespace_name}', p_tablespace),
        '{diskgroup}', p_diskgroup),
        '{size}',size_m
      );
      dbms_output.put_line(sql_text || ' -- ' || round(size_m / 1024, 3) || 'G');
      created_m := created_m + size_m;
    end;
  end loop;
  -- report
  dbms_output.put_line('-- tablespace ' || upper(p_tablespace) || ' resized ' || p_gibabytes || 'G from ' ||  ts_size  || 'M to ' || required_m || 'M');
end;
/
sho err
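A quick way to see which branch the procedure will take for a given tablespace is to look at the BIGFILE flag directly (the tablespace name here is just an example):

```sql
-- 'YES': bigfile tablespace, one data file, resized via ALTER TABLESPACE ... RESIZE
-- 'NO' : smallfile tablespace, may be resized and extended with more data files
select tablespace_name, bigfile
  from dba_tablespaces
 where tablespace_name = upper('users');
```

With that in hand, `exec add_space('users', 1);` should print an ALTER TABLESPACE RESIZE script for a bigfile tablespace and the usual ALTER DATABASE DATAFILE / ADD DATAFILE script otherwise.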

Tags: Database

Similar Questions

  • Adobe premium suite 10 windows... downloaded 2 files, an .exe file and a larger support file... said to put them both in the same folder

    Windows... confusion... Adobe premium suite 10 windows... downloaded 2 files, an .exe file and a larger support file... told to put them both in the same folder... and

    then what? Which folder? For most people just finding a file is hard enough, so do tell where to put the files... and which to

    open first...

    OK... you have Win10... what version of the Adobe software did you download?

    In general, when the download page says to download two "associated" files, you RUN the exe file and it does the rest... that's why both files must be in the same folder

    Download & install instructions https://forums.adobe.com/thread/2003339 can help

    Also go to https://forums.adobe.com/community/creative_cloud/creative_cloud_faq

  • Why not have larger file transfers and a function to keep our logo?

    I tend to like most Adobe products and I actively get others to use them.  However, this product is not worth the headache - do you want us to use it or not?  There is no legitimate reason for Adobe to have so many restrictions on this type of product, especially when none of your competitors has them.  The logo was the main reason I chose Adobe SendNow over the others, although I have never been happy about not being able to send files greater than 2 GB when competitors allow up to 20 GB.  In addition, why are there so many restricted file types? This type of service should automatically include all current, well-known file types used on PC and Mac.  Because of all this, I am obliged to pay for and use another service to do my work today.

    I do not understand the logic of not matching what your competitors provide, which end-users notice the most.  After reviewing that information, create a service that integrates and improves on all of these features, including new original Adobe features.  Then Adobe would have a service that could easily lead this category.  It seems that whoever is responsible for this product/service is making no sincere effort to create a leading product.  This product has gone from being an average product with some 'cool' features to the worst product in its class.

    I really hope someone at Adobe pays attention to these comments and upgrades Adobe Send into an actual product leader.

    Support for much larger files and branded sending was added during the summer.

  • 4.2.3/.4 Data Load Wizard - slow when loading large files

    Hello

    I use the Data Load Wizard to load CSV files into an existing table. It works very well with small files of up to a few thousand rows. When loading 20k rows or more, the loading process becomes very slow. The table has a unique numeric column as the primary key.

    The primary key is declared under "Shared Components" -> Logic -> "Data Loading Tables" and is recognized as 'pk (number)' with 'case sensitive' set to 'No'.

    When loading the data, this setup executes the following query for each row:

    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)

    which can be found in the v$sql view during loading.

    It makes the loading process slow because, due to the UPPER function, no index can be used.

    It seems that the "case sensitive" setting is not honored.

    Removing the numeric index on the primary key and using a function-based index instead does not help.

    The explain plan shows an implicit "to_char" conversion:

    UPPER(TO_CHAR("PK")) = UPPER(:UK_1)

    It is missing in the query, but perhaps it is necessary for the function-based index to work.

    Please provide a solution or workaround so the Data Load Wizard can handle large files in a reasonable time.

    Best regards

    Klaus

    You must set the PK matching to "Exact - case-sensitive" (or something like that).

    In addition, I have already been in touch with Oracle Support about the slowness when dealing with large content.

    They are aware of the problem.  The only known workaround is to break up the data into smaller segments.

    (I'll have to track down the correct ID for reference...)

    My observation:

    The APEX data loader does slow-by-slow processing of the data instead of bulk processing.

    You will need to break up the data, or build your own data loader.

    Thank you

    MK

  • hard drive format for Windows and TVs that supports large files

    Hello

    Please suggest a solution to the following problem.

    I need to know which hard drive format works with Mac, Windows, and TVs too, even for large files and Blu-ray quality movies.

    FAT32 only supports files up to a maximum size of 4 GB, which would certainly be much too small for Blu-ray quality media.

    exFAT supports files of effectively unlimited size. Mac and Windows can both read and write external drives formatted as exFAT, but neither can boot from such a drive. Whether a TV or other system supports exFAT drives is something you need to check yourself.

    NTFS also supports files of effectively unlimited size. A Mac can only read NTFS; it cannot format or write to NTFS unless you get an add-on. Of course Windows supports NTFS. Again, you will need to check whether your TV etc. supports NTFS. If you want to add full NTFS support to a Mac, I recommend this - https://www.paragon-software.com/home/ntfs-mac/

    HFS+ is Apple's current format and again supports files of effectively unlimited size. A Mac can of course read, write, and format such disks. Windows does not support HFS+ as standard, but once more you can buy an add-on for Windows to add this feature. It is extremely unlikely that a TV would support HFS+. If you want to add full HFS+ support to Windows, take a look at this - http://www.mediafour.com/software/macdrive/

    There are also other disk formats, for example ZFS, Btrfs, Ext3, Ext4 and soon SATF, but in this case they are not relevant.

    Note: as indicated above, in many cases I said "effectively unlimited"; there is of course a limit, but one so large that you will not hit it for individual files, including Blu-ray or even UltraHD (4K) files.

  • Error message: "an internal support function returned an error" when trying to attach files to emails.

    Original title: Error message - not invited

    Trying to send email. Error message: "an internal support function returned an error."

    Hello

    Which email client are you using?

    If you are facing the issue in Microsoft Office, you can post your question in the Microsoft Office community and get help from support professionals.

    Hope this information helps.

    Please post back and let us know.

  • Send large files

    What is the name of the program for sending large files (or rather for retrieving them)?

    Use Mail Drop; see this support document: Mail Drop limits - Apple Support

    https://support.Apple.com/en-us/HT203093

    iCloud: Add an attachment to an e-mail https://support.Apple.com/kb/PH2629?locale=en_US 

  • Stor.E TV + Samba large files question

    Hello

    Using Samba I can copy small files onto my Stor.E TV+, but I cannot copy files larger than ~500 MB. Is there a software/BIOS update that fixes this? I can't see any downloads available on the support page for this device. Can we configure the Samba share ourselves at all?

    I need to be able to copy large files over Samba, to avoid having to attach the unit locally to the PC via a USB port whenever I want to copy my big digital files onto the media player.

    Thank you

    Hello

    I think this could be because the energy saving settings of the computer or the NAS server have interrupted the connection.
    If the energy saving settings of your laptop or your NAS are active, they can go into an idle sleep state even while being accessed over the network.
    I recommend you adjust the power settings of the media source to prevent it from going to sleep during file transfers over the network.

    You can also try different connection modes:
    - LAN configuration
    - WLAN setup (ad-hoc connection)
    - PPPoE setup (if your device is connected to the network via DSL)

    I think it would be interesting to know which connection is affected.

    Good luck, mate

  • IX4-300d / general NAS: How can I move large files between different folders?

    Hi @all,

    My new IX4-300d arrived yesterday.

    After it did its RAID things all night, I now want to move my files from my old NAS to this one.

    I do this using the copy jobs supported by the IX4-300d, which work fine.

    One big problem for me: I want to reorganize some large files / directories (> 200 GB), so I have to cut and paste the files from one folder to another (all on the IX4-300d, but with different user groups).

    When I move files between subfolders within one folder of the NAS, it works fast enough - I assume the transfer is handled directly by the NAS here.

    When I do the same thing between subfolders in different folders (which appear as separate network drives in Windows Explorer), it seems that the transfer is managed by my PC or laptop - and it is very slow...

    I tried moving the files via my network environment (all folders are subfolders of one server: IX4-300d) because I thought the problem was in my network mapping, but it does not work any faster.

    How do I tell my IX4-300d to move the files directly (without the "help" of my PC)?

    Another idea of mine was to do it via FTP, but my IX4-300d just refused my connection (I have to try a bit more).

    Please help me

    I "solved" the problem by creating a temporary copy job and manually removing the duplicates once the task completed. The copy job ran all night.

    For future problems, I will have to think of a different way of managing files.

    Thanks for your response!

  • Splitting large UFF58b files

    I tried to open some UFF58b files in SignalExpress.  Unfortunately, with larger files (300-500 MB) the program crashes whenever I try to view all the waveforms.  The affected functions are short bursts of vibration data from 22 channels.  Is it possible to extract a set of 22 functions with the same timestamp with SignalExpress?

    Otherwise, what is the best way to retrieve a dataset using LabVIEW?  I've been looking at the data storage VIs and can successfully open files with Open Data Storage (UFF58b).  Is it possible to define a query to select a particular timestamp?

    Hi Paul,

    Thank you for that. Please see this example I created to do just what you described. At this point, you can specify the timestamp of the record you are interested in and it will return those wave functions.

    I hope this helps.

    Kind regards

  • Is the file compress/decompress utility supported by all versions of Windows OS? Or do some users have to install it?

    Hi all:

    Is the file compress/decompress utility supported by all versions of Windows OS (XP, Vista, Windows 7)? Or do some users have to install it themselves?

    Any link from the Microsoft website?

    I managed to get the guides from the Microsoft web site, but they are just guides, with no statement about whether it is installed by default during installation of the operating system.

    Window XP:

    http://support.Microsoft.com/kb/306531

    Windows Vista:

    http://Windows.Microsoft.com/en-us/Windows-Vista/compress-and-uncompress-files-ZIP-files

    Windows 7:

    http://Windows.Microsoft.com/en-us/Windows7/compress-and-uncompress-files-ZIP-files

    Thank you.
    Coulombe

    Hello

    Compress and decompress are not a separate tool/utility to download and install on the computer. They are functions that come with these 3 operating systems. These features are available across Windows XP/Vista/7.

    However, you can also use your favorite search engine to find non-Microsoft applications that compress and extract files, such as WinZip and WinRAR.

    Note: The use of third-party software, including hardware drivers, can cause serious problems that may prevent your computer from starting properly. Microsoft cannot guarantee that problems resulting from the use of third-party software can be solved. Use of third-party software is at your own risk.

    Hope this helps.

  • scanning - files too large!

    I have an HP OfficeJet J6480 All-In-One - when I scan a text document to PDF format, the file size is huge - a single page is at least 600 KB at 150 dpi - which makes the files far too large to send if you scan 10 or more pages. Is there a setting to change in order to reduce the file size while leaving the scan easily readable? Thank you

    If you use version 10 of the HP Solution Center, download version 12 of the Photosmart C6380 software. In the advanced installation, select custom settings. Check the boxes for the device imaging functions and the HP Solution Center, then uncheck the rest. Select the USB connection type. In the lower left corner you will see a box that says "If you cannot connect the device... Plug in the device later...", check this box.

    MOD edit: also, here is a link to another post of Nickyton81 with installation instructions:

    http://h30434.www3.HP.com/PSG/board/message?board.ID=scan&message.ID=2307#M2307

    Message edited by Wendy on 27/04/2009 12:12
  • How to copy a large file from "My Documents" onto a CD?

    How do I copy a large file onto a CD?

    Hi Richard Shield.

    Please see this article with some information:

    http://support.Microsoft.com/kb/306524

    I hope this helps!

  • Transfer of large files on home network freezes and stop the transfer

    I have an HP desktop running Windows 7 and an HP laptop on Vista 64-bit. The two computers are networked on a home network with a 3Com 10/100 16-port switch.  When I try to transfer files greater than approximately 2.5 gigabytes between the two PCs, the file transfer will hang and stop after about 2.5 gigabytes. Below that file size there is generally no problem and the transfer finishes. The two PCs have file sharing turned on and, as mentioned, any file less than 2.5 gigabytes transfers just fine.  Is there some sort of file transfer size limit on the Windows Vista and 7 OS?  Are there settings that I can change to allow transferring large files? I had an old Sony PC running Windows XP and I could transfer files of more than 2.5 gigabytes to my networked shared home storage device without any problem.  But under the Vista and 7 OS, once again, I cannot transfer any file over 2.5 gigabytes to my home network shared storage device.  Anyone have any ideas how to solve this problem? Thank you. NEOhio123

    I fixed this problem on my laptop running Windows Vista 64-bit. I remembered that this first started occurring after I used the IOLO System Mechanic 9 network optimization tool to optimize the network settings on my laptop.  So I ran the undo on my Windows Vista laptop, and now the laptop with Vista will once more transfer large files. So a note to all of you who use System Mechanic... do not run the network optimization tool on a Vista PC... or if you did and start experiencing problems transferring large files... run the undo of that function and reboot.  You should be able to transfer large files again. I just sent IOLO an email to warn them of this problem with the network optimization tool within their System Mechanic 9 product.

    Unfortunately, my new desktop computer running Windows 7 still has this problem: it stops file transfers after about 2.5 gigabytes are sent to my network-attached storage device or the laptop. I have learned that this seems to be a one-way problem. I can pull large files from my network storage device or laptop; I just cannot transfer large files to those devices with the Windows 7 desktop. So if anyone has any ideas of how to resolve this problem of transferring large files with Windows 7, I would appreciate your insight.
    NEOhio123

  • Large file upload problem. Please help. Thank you very much

    Hello.

    My application has a file upload function. The app works when I upload a small file (like a 4 MB video).

    But there is a problem with large file uploads.

    1. When I try to open a video file (size: 30 MB) using the code below, it takes far too much time to open. How can I optimize this?

               FileConnection fc = (FileConnection) Connector.open("file:///" + filePath);
                InputStream is = fc.openInputStream();
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                int j = 0;
                while((j=is.read()) != -1) {
                    baos.write(j);
                }
                postdata = baos.toByteArray();
    

    2. What is the file size limitation of HttpConnection? Can HttpConnection transfer a large file (like a 30 MB video)?

    Any sample you can share with me?

    Thank you very much.

    In addition to what maghue gave you, I would recommend reworking your logic so that you don't send 30 MB of data in a single shipment.  That's a lot to have to resend if it fails, and the chances of failure, especially on a non-WiFi connection, are high enough, I would have thought, given the time it will take.

    In addition, the approach you've taken means that you need a 30 MB buffer just as the transport mechanism to send the file.  You could reduce the memory required by reading and writing a small segment at a time into the HTTP connection's output stream, but that will still force the HTTP connection to hold a 30 MB buffer, which may have to be contiguous (depending on how it is implemented).  Normally transmission does not take place until you ask for the response code, so it will be buffered on the device.
