Virtualization of very large file servers

Hello everyone,

As the threads I found on this topic are quite old and the capabilities of ESX have advanced since, I am starting a new one.

We are now virtualizing most of our IS infrastructure on ESXi servers connected to SAN storage, with Enterprise Plus licenses.

In the list of servers, we have 5 major physical Windows file servers, with millions of files and 2 to 5 terabytes of data depending on the server.

Technically, we could create several large 2 TB vmdk files and use the Windows Disk Manager to aggregate the volumes.

Backup of all the servers is done with Veeam Backup & Replication, to quickly perform a full backup and restore of the servers (the current solution is TSM, and a complete restore of the largest server is estimated at one week, which is why we only use redundant file-level backups).

However, even if it is technically possible, we have no idea about stability and performance.

Has anyone virtualized this kind of file server?

Even if VMware supports this virtualization, is it a recommended solution?

Did you encounter any limitations?

We have other alternatives to replace the physical servers (for example, NAS boxes connected to the SAN offering functionality equivalent to Windows, such as shares and VSS integration for users), but if possible we would prefer to keep VMware solutions to virtualize the servers.

Thanks for any information on your experience.

David Druard

Andrew M says:

If you really need a volume greater than 2 TB, dynamic disks are, in your case, the only solution.

Don't forget to place the vmdk files on LUNs in the same RAID group (just to avoid adding a physical dependency).

André

I agree; however, an alternative is to use mount points to "extend" the file system, by mounting a folder that points to another LUN inside the existing file system within the OS.

Tags: VMware

Similar Questions

  • Very large file upload > 2 GB with Adobe Flash

    Hello, does anyone know how I can upload very large files with Adobe Flash, or how to use SWFUpload?

    Thanks in advance, any help will be much appreciated.

    If the error comes from PHP, it is a server limit, not a Flash limit.

    You will need to edit your php.ini or .htaccess file to raise the upload limit.
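    For reference, the php.ini directives that usually cap uploads look like this (the values are examples only, not recommendations):

```ini
; php.ini - both limits apply, since POST size caps uploads too
upload_max_filesize = 2048M
post_max_size = 2048M
; long uploads may also need more input time
max_input_time = 300
```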

  • Create a very large file for printing

    Hello everyone,

    I need to create a VERY BIG file for printing.  The final print will be 148" high x 124" wide. I have tried to find a way to build it. I created all the graphics as vectors and enlarged the photos to a size that can be included, but whenever I try to create this file it fills my scratch disk and I can't perform any action.  I also tried to create it in Illustrator, but I'm not as good with Illustrator as I am with Photoshop and honestly have NO idea how to proceed.

    Can anyone please guide me through this? It is due this evening and I'm really at a loss.

    Thank you in advance!

    OK, this isn't a huge size - but if you are struggling with it on your system, you could do like Mellissa suggested and build it in stages, thereby minimizing the layers in the final assembly.  You could also work in 8 bits rather than 16 bits to assemble the print image - it will halve the memory required.

    Dave

  • After Effects CS5 - QuickTime H.264 very large file size

    Hello

    I wonder if someone can recommend a good QuickTime codec for very small file sizes, for preview renders.

    Project in progress - my WMV renders out at approximately 10 MB and the quality is reasonable to excellent.

    Some of my clients have Flip for Mac, so I have to deliver a QuickTime.

    The QuickTime H.264 comes out much bigger than expected. Same timeline, pixel dimensions and frame rate = approximately 150 MB at similar quality.

    I thought that in the past my QT H.264 renders were perhaps double the size of my WMV renders, not a jump of 15x.

    am I missing something really obvious?

    pixel size = 1000 x 250,

    frame rate = 30 frames/s,

    Timeline = 4 m 30 s

    H264 quality = 50%

    Audio = aac stereo

    using after effects cs5, win7 pro

    Thanks in advance

    s

    The file size is directly related to the bit rate at which you are encoding, not the image size or frames per second. So, if you want a 10 s clip to be 15 MB (excluding audio), divide 15 MB by 10 s and you get 1.5 MB/s, or 12 Mbps. You can get better quality if you encode 2-pass VBR with the maximum bitrate set to what you calculated previously.
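    The arithmetic above can be sketched in a couple of lines (a hypothetical helper, just to make the unit conversion explicit):

```python
# Hypothetical helper illustrating the size/bit-rate arithmetic above.
def target_bitrate_mbps(size_mb: float, duration_s: float) -> float:
    """Average bit rate in Mbps needed to hit size_mb over duration_s."""
    mb_per_second = size_mb / duration_s  # megabytes per second
    return mb_per_second * 8              # 8 bits per byte -> Mbps

# 15 MB over 10 s -> 1.5 MB/s -> 12 Mbps
print(target_bitrate_mbps(15, 10))  # 12.0
```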

    In any case, it is usually best to render your comp to a lossless format (QT None, or a TGA sequence, etc.) and then compress the video with Media Encoder. That way you can play with the H.264 settings until you get an acceptable result, without having to re-render in AE every time.

  • How can I store a very large file in Oracle XML DB?

    Hello
    I'm looking for a quick way to store an XML file in an Oracle 10g XE database. I was trying to store a 500 KB file in the database as an XMLType or a CLOB, but I keep getting the same error: "ORA-01704: string literal too long". I have been looking for a way to store this file, and another one (113 MB), in the database for a long time. I searched Google to see if there is a solution, and the only one I found is to split the document in a loop (due to the 32 KB literal limit). But that solution doesn't allow storage with an XML schema and must be slow.
    Here is an example of how I did it (but it didn't work):
    create table mondial (nr int, xmldata xmltype);
    INSERT INTO mondial VALUES (1, 'big xml file');
    I also tried the alternative with a bind variable, as follows:
    create or replace PROCEDURE ProcMondial IS
      poXML CLOB;
    BEGIN
      poXML := 'large xml file';
      INSERT INTO mondial VALUES (1, XMLTYPE(poXML));
    EXCEPTION
      WHEN OTHERS THEN
        raise_application_error(-20101, 'Exception occurred in ProcMondial procedure: ' || SQLERRM);
    END ProcMondial;
    I'm still getting the same error: string literal too long!
    I am using SQL Developer for the query.
    Please help me, I'm desperate.
    Thank you!

    Michael

    This should help:
    {thread:id=887934}

  • Very large file size for a single slide - is this normal?

    I work in Captivate 8 with a .cptx file of ~500 MB. I wanted to send an example of a click box (including placement, size and hint text) to a colleague. To make it easy for him, I want to send a single slide with this interaction, so he can copy and paste it.

    I saved the file under a new name and then deleted all the other slides. The size of the new file was still almost 400 MB! Then I deleted all unused library items and the link to a .ppt presentation. That reduced the size to about 300 MB.

    Am I missing something? Is this normal?

    It's normal... unfortunately.

    Copy the slides into a new empty project of the same size.  It should be small then.

  • Processing a very large file with UTL_FILE

    Hello
    MY GOAL IS TO LOAD THIS FILE INTO AN ORACLE TABLE.
    I have a file with inconsistent delimiters, no specific tab, and with carriage returns inside the data; some data columns also contain tabs (\t) as part of the data. When I try to load the file through SQL*Loader it errors out because the lines are junk data. If I use Excel or .csv format, the fields are scattered and mixed across the columns.
    I thought of using UTL_FILE to break the lines at the CARRIAGE RETURN and then concatenate them back into records that make sense.


    What do you think??? Thanks in advance.
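    That concatenation idea can be sketched in Python (purely hypothetical: it assumes each real record starts with a numeric key followed by a tab, which may not match your data):

```python
import re

def rejoin_records(lines):
    """Fold continuation lines (caused by stray carriage returns inside
    the data) back into the record they belong to."""
    records = []
    for line in lines:
        # Hypothetical rule: a real record starts with digits and a tab.
        if re.match(r"^\d+\t", line) or not records:
            records.append(line.rstrip("\n"))
        else:
            records[-1] += " " + line.strip()  # continuation of previous record
    return records

raw = ["1\tfirst part\n", "of a note\n", "2\tsecond record\n"]
print(rejoin_records(raw))  # ['1\tfirst part of a note', '2\tsecond record']
```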

    It works for me on Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    (on company time ;) but I needed it too)

    CREATE TABLE EXTERNAL_CLOB
    (
      THE_ROW  clob
    )
    ORGANIZATION EXTERNAL
      (  TYPE ORACLE_LOADER
         DEFAULT DIRECTORY DIRECTORY_NAME
         ACCESS PARAMETERS
         (
          records delimited by 'YYYYYYY'
          badfile 'EXTERNAL_CLOB'
          logfile 'EXTERNAL_CLOB'
          discardfile 'EXTERNAL_CLOB'
          fields missing field values are null
          reject rows with all null fields
          (the_row char(32000))
         )
         LOCATION ('clob_test.txt')
      )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    

    This produced two simple rows in the CLOB column (the second row, as expected, begins with a line-break character).
    Adding some 50 rows of about 300 characters each to the second record did not throw an error - it seems you're safe as long as the record lengths stay under 32000.

    Regards,

    Etbin

    Edited by: Etbin on 23.7.2012 10:44

    CREATE TABLE SKLADI_DEV.EXTERNAL_CLOB_1
    (
      AN_ID  NUMBER,
      NOTES  CLOB
    )
    ORGANIZATION EXTERNAL
      (  TYPE ORACLE_LOADER
         DEFAULT DIRECTORY skrbnistvo
         ACCESS PARAMETERS
         (records delimited by 'YYYYYYY'
          badfile 'EXTERNAL_CLOB_1'
          logfile 'EXTERNAL_CLOB_1'
          discardfile 'EXTERNAL_CLOB_1'
          fields terminated by 'XXXXXXX' lrtrim
          missing field values are null
          reject rows with all null fields
          (an_id char(15),
           notes char(1000000)
          )
         )
         LOCATION ('clob_test.txt')
      )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    

    seems to work too

    select an_id,length(notes) clob_length
      from external_clob_1
    

    which returns

    AN_ID   ,CLOB_LENGTH
    74511453,        719
    74511454,      52035
    
  • How do you Zip a very large PowerPoint file to email it?

    How do you Zip a very large PowerPoint file in order to send it by email?

    Select the .ppt file. CTRL-click or right-click and select Compress... from the contextual menu. There is no assurance that zipping the file will reduce its size enough for your needs. In that case you will need a third-party utility that can split the archive into several parts, so that you can send each part in a separate email. Tell the recipient which utility you used, so they can reassemble the parts into the full-size file.
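    If no dedicated utility is at hand, splitting and rejoining can also be sketched in a few lines of Python (the part size and file names are assumptions, not recommendations):

```python
def split_file(path, chunk=10 * 1024 * 1024):
    """Split a file into numbered parts of at most `chunk` bytes each."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk)
            if not data:
                break
            part_path = f"{path}.part{index:03d}"
            with open(part_path, "wb") as dst:
                dst.write(data)
            parts.append(part_path)
            index += 1
    return parts

def join_parts(parts, out_path):
    """Reassemble the parts, in order, into the original file."""
    with open(out_path, "wb") as dst:
        for part_path in parts:
            with open(part_path, "rb") as src:
                dst.write(src.read())
```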

  • How to stop ghost backups in Windows Vista? The file is very large and cannot be defragmented.

    Hello.

    My friend has a computer with Windows Vista. Apparently, something automatically makes "ghost" backups for System Restore. Is this a Windows Vista feature, or could it be linked to software left behind when Norton Internet Security was uninstalled?

    This file is very large and cannot be defragmented. Right now the computer shows it is 54% fragmented. Part of the name of the file is "System Volume Information...".

    Please let me know how I may be able to stop these unnecessary backups. Only the regular System Restore is necessary.

    Thank you in advance,

    José M Romero

    This feature is called Shadow Copy and it is a useful one; take a look at:

    http://Windows.Microsoft.com/en-us/Windows-Vista/previous-versions-of-files-frequently-asked-questions

    If you want to get more space, you can use Disk Cleanup to free up space on your hard drive.

    You can run Disk Defragmenter to address your fragmentation problem.

  • Need to free more RAM to sort and link several very large excel files

    HP Pavilion dm1 laptop running Windows 7 64-bit with a Radeon HD graphics card. Recently upgraded to 8 GB, with 3.7 GB of usable memory.

    I have several very large Excel files with more than 150,000 rows each. I need to combine these 3-4 files and sort them. Looking to maximize the available memory.  The Task Manager shows 3578 MB total, 1591 MB available, 1598 MB cached, 55 MB free.  A single instance of Chrome and Excel 2014 with no worksheet open are running.
    Appreciate your help.

    That's it! It worked... I disabled the maximum memory setting and restarted.

    Available memory is now 7.6 GB out of 8 GB. Oh boy, I can't believe it...
    Thank you guys. Many thanks to hairyfool, Gerry and countless others who have given pointers...
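    If memory stays tight even after that, the combine-and-sort step can also be done outside Excel on CSV exports of the workbooks; a minimal sketch (the file names and sort column are hypothetical, and the sort is plain text order unless you convert the key):

```python
import csv

def combine_and_sort(csv_paths, out_path, sort_column):
    """Concatenate CSV files sharing one header row and sort by a column."""
    header, rows = None, []
    for path in csv_paths:
        with open(path, newline="") as f:
            reader = csv.reader(f)
            header = next(reader)  # every file is assumed to share this header
            rows.extend(reader)
    key_index = header.index(sort_column)
    rows.sort(key=lambda row: row[key_index])  # text order; cast for numerics
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
```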

  • How to move large backup files from one disk to another

    I have a very large backup file I want to move from one hot spare to another. When I followed Apple's instructions to copy it with the Finder, the operation fails after several hours. Is there a better way to move very large files?

    Try using Disk Utility's Restore function to copy the backup to the new location. Please note that this will reformat the destination partition.

  • Outlook Express has been trying to receive a large e-mail for 3 days

    Someone sent me an email with a very large file (about 20 MB), and OE has been trying to receive this message for 3 days with no progress.  I can't send or receive any other emails until it has been received, or better still, deleted.  Help!

    Just to add to Brian's good advice, there is another thing you can do.

    Create a Message rule:

    Where the message size is more than: size
    Delete it from the server
    Click on "size" and set it to about 10 MB, and then click: Apply Now.

    Return to the Inbox, and then click send/receive. This should get rid of the message.

    Don't forget to go back to the rules of Message and either delete the rule, or uncheck it if you use it in the future.

    Bruce Hagen ~ MS - MVP [Mail]

  • CMD Prompt to find large files

    I have a few computers in my workspace that are real workhorses.  I noticed one is a bit more than 2/3 full on the hard drive.  I suspect some users have very large files somewhere, and I want to identify who and what they are by size.  Can someone give me the DOS command prompt commands to help me identify these miscreants?

    There is a script PowerShell here: http://gallery.technet.microsoft.com/scriptcenter/36bf0988-867f-45be-92c0-f9b24bd766fb

    You may be able to modify it for your application. There's something similar here: http://blogs.technet.com/b/heyscriptingguy/archive/2009/08/19/hey-scripting-guy-can-i-determine-which-folders-are-using-the-most-space-on-my-computer.aspx. You could put your script on a flash drive and carry it from one computer to another to run it from PowerShell. You must change the default PowerShell execution options on each computer to allow scripts to run.

    From the command prompt, use dir /s /o:s /p for a list in ascending order of size.

    For descending order, use dir /s /o:-s /p
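    If PowerShell policy gets in the way, the same hunt can also be done with a short Python sketch (the folder path below is only an example):

```python
import os

def largest_files(root, top=10):
    """Walk `root` and return the `top` biggest files as (size, path) pairs."""
    sizes = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((os.path.getsize(path), path))
            except OSError:
                pass  # skip files we cannot stat (permissions, races)
    return sorted(sizes, reverse=True)[:top]

# Example: list the ten biggest files under a users folder.
for size, path in largest_files(r"C:\Users"):
    print(f"{size:>15,}  {path}")
```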

  • Help! Windows Mail is frozen by an incoming large file!

    I have not yet updated to Windows 7, so this is a Windows Vista Mail problem:

    I obviously have a VERY large file sent to one of my email accounts via Windows Mail.  It has been sitting there downloading for hours, days, without success, so when I finally hit the "stop" button to stop this hot mess, it froze at "finishing current message".

    I went to my webmail to delete the guilty email and rebooted the computer, but it always goes back to that original download attempt.  Is there a way to skip or remove it, so it can move on to downloading my other emails?  Right now it has been blocking everything else for 3 days.  Suggestions?

    Deleting it in your webmail should have worked. Here is another option.

    Create a Message rule:

    Where the message size is more than: size
    Delete it from the server
    Click on "size" and set it to about 1 MB.
    Then click: Apply Now.

    Return to the Inbox, and then click send/receive. This should get rid of the message.

    Don't forget to go back to the rules of Message and either delete the rule, or uncheck it if you use it in the future.

  • CANNOT DELETE a LARGE FILE - Windows Explorer hangs

    Windows 8

    I made a very large PDF by mistake (14 GB) and now I can't do anything with it. I can't open, delete, or move it...

    I tried to delete the entire folder where it is located, but Windows Explorer crashes on that too.
    I checked the disk for errors, and everything is OK.
    I tried to permanently delete it without moving it to the Recycle Bin, as I understand the Recycle Bin sometimes cannot accept very large files. I can highlight the file, but as soon as I click the Delete button on the Ribbon, or right-click, or try to do anything with it, Windows Explorer crashes again.
    I have a rather small SSD in this machine, and I need this space back...
    Can't think of anything else to try...
    Best regards, K

    Try to remove the file from a command prompt. You can Shift+right-click a folder to see an "Open command window here" option, which will automatically open the command prompt in the appropriate folder. There is also an "Open command prompt here" option on the File menu. If necessary, you can simply open a command prompt and use the cd command to go to the appropriate folder (cd .. to leave a folder, cd \ to go to the root). Note that the del and erase commands don't send files to the Recycle Bin either.

    Another thing you can try is overwriting the PDF file. Type something in a program that is able to save to PDF and try to save it over this PDF. Please tell me if you receive errors while overwriting, because they can help to find out what the problem is. If you don't have any program capable of PDF, simply type something into Notepad and overwrite the PDF file with it (in the "Save as type" drop-down menu, you can select "All files").

    The NTFS entry indicating the presence of this 14 GB file may be damaged. The disk may have a bad record of the file's location, or the 14 GB file may not even be there. You could plan to run the chkdsk utility. It runs best from an administrator command prompt. If you are prompted to do so, you should probably run chkdsk /f. I don't know what side effects might arise from running chkdsk on an SSD, so proceed at your own risk. (I tried to do a search, and all that I could find is that nothing other than /f should be run on an SSD.)
