gzip HTTP compression for vCOPs

Hello

vCOPs generates a lot of HTTP(S) traffic, and we want to enable gzip compression to save bandwidth.

The appliance runs Tomcat and Apache2 daemons. I assume Apache2 acts as a proxy for Tomcat.

The question is: what is the recommended way to enable gzip compression in a vCOPs environment? Should it be enabled on Tomcat or on Apache?

It is not recommended to enable gzip compression if it is disabled out of the box; doing so would be unsupported. The appliances run Linux + Tomcat, etc., but they should be treated as closed appliances and managed/administered within our support guidelines.

You said that there is a lot of HTTP traffic. Is that user traffic, or adapter collection traffic to vCenter and other sources?

Do your users use the vSphere UI or custom dashboards? HTTP traffic is unlikely to cause a problem, and I have actually never heard of it as a complaint. It is possible that you are using custom dashboards with short refresh intervals, which would result in somewhat higher HTTP traffic due to the content updates, but that would be a much smaller problem compared to the extra load exerted on the UI/Analytics VMs...
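For context only: on a generic Apache 2 front end (not the vCOPs appliance, where the answer above applies and any such change would be unsupported), gzip compression is typically enabled with mod_deflate. A minimal sketch; the module path and MIME type list are illustrative:

```apache
# Hypothetical mod_deflate snippet for a generic Apache 2 server.
# Do NOT apply this to a vCOPs appliance -- it would be unsupported.
LoadModule deflate_module modules/mod_deflate.so
AddOutputFilterByType DEFLATE text/html text/css text/xml application/javascript application/json
```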

Tags: VMware

Similar Questions

  • Using gzip compression in an ADF application

    Dear all,

    I have a massive ADF page with lots of af:table components, all inside the same task flow. I want to use gzip compression for my application. Can someone suggest a solution? I searched a lot on the net but did not find any solution. Hope you guys have a better one.


    Thank you and best regards,
    David
    JDeveloper 11.1.1.5.0

    Have you seen this post: ADF and HTTP compression?

    Kind regards

  • 7-Zip files vs. compressed (zipped) files?

    My question is this. What is the difference between 7-Zip files and using the compressed (zipped) folder option on a file? And what are RAR files?

    I rarely send zipped files. I used to have WinZip on my old computer. My generic desktop minitower PC has XP Pro, SP3, Office 2003. (In addition, awhile back I upgraded to Office 2007 but found it totally lame and went back to 2003.) A few recipients of PDF files of architectural plans say they cannot open my files when I send them with 7-Zip. Sometimes they are 30-60 page PDF construction plans.

    Please explain.

    It depends on what you mean by "better":

    Are you looking for a file compression utility that can work with various compressed formats?
    Are you willing to pay for a file compression utility, or do you want a free one?
    Are you looking for a file compression utility that makes smaller files?

    In the end, it basically boils down to personal preference. Here is a site that claims to have used "more than 500 comments" to select the most popular file compression utility: http://lifehacker.com/5065324/five-best-file-compression-tools

    And, allowing for some self-promotion, here is how 7-Zip compares its compression with some other utilities: http://www.7-zip.org/

    Personally, I'd go with free rather than pay $29 for WinRAR, WinZip or PKZip. For most people, the zip capability built into Windows is sufficient. If you are likely to receive files compressed in a variety of formats, 7-Zip is probably the way to go (packing/unpacking: 7z, XZ, BZIP2, GZIP, TAR, ZIP, and WIM; unpacking only: ARJ, CAB, CHM, CPIO, CramFS, DEB, DMG, FAT, HFS, ISO, LZH, LZMA, MBR, MSI, NSIS, NTFS, RAR, RPM, SquashFS, UDF, VHD, WIM, XAR and Z).

  • Parsing gzipped XML

    I've seen posts about issues with extracting XML and gzip. Separately, I have no problem doing either. But I can't feed a GZIPInputStream to my XML parser - it just throws an exception with no stack trace.

    I send a parameter in my HTTP request that determines whether the server should return plain XML or gzipped XML. When I don't ask for gzip, I can read and parse it just fine. If I ask for gzip and just read the stream like this...

                    byte[] data = new byte[256];
                    int len = 0;
                    int size = 0;
                    StringBuffer raw = new StringBuffer();
    
                    while ( -1 != (len = inputStream.read(data)) )
                    {
                        raw.append(new String(data, 0, len));
                        size += len;
                    }
    

    The output looks like my XML... so I guess that means it is receiving the gzipped data correctly.

    However, if I ask for gzip and try to parse it using the same code that works when I parse plain XML, it fails. Is there another step I have to do to convert the InputStream so the XML handler can read it?

    httpConnection = (HttpConnection)Connector.open(url);
    inputStream = httpConnection.openDataInputStream();
    
    if(httpConnection.getResponseCode() == HttpConnection.HTTP_OK)
    {
    
        inputStream = new GZIPInputStream(inputStream);
        InputSource is = new InputSource(inputStream);
    
        is.setEncoding(desiredEncoding);
    
        SAXParserFactory factory = SAXParserFactory.newInstance();
        SAXParser parser = factory.newSAXParser();
    
        DefaultHandler hcHandler = new XMLParse();
    
        parser.parse(is,hcHandler);
    
    } //end if(httpConnection.getResponseCode() == HttpConnection.HTTP_OK)
    

    You should check whether the content is gzipped:

    String encoding = m_httpConnection.getHeaderField("Content-Encoding");
    boolean compressed = false;
    if ((encoding != null) && (encoding.indexOf("gzip") != -1)) {
        compressed = true;
    }

    Then, unzip it:

    if (compressed) {
        GZIPInputStream gzip = new GZIPInputStream(dataInputStream);
        DataInputStream tmp = new DataInputStream(gzip);
        dataInputStream = tmp;
    }

    Now you can safely parse the XML code.
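    Outside J2ME, the same decompress-then-parse pattern can be shown as a self-contained Java SE sketch; the class name and helper methods are illustrative, and the gzipped "response" is simulated in memory:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class GzipXmlDemo {

    // Compress a string with gzip (simulates the server side of the exchange).
    static byte[] gzip(String text) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        GZIPOutputStream gz = new GZIPOutputStream(buf);
        gz.write(text.getBytes("UTF-8"));
        gz.close();
        return buf.toByteArray();
    }

    // Wrap the raw stream in a GZIPInputStream BEFORE handing it to SAX,
    // and return the element names the parser saw.
    static String parseGzippedXml(byte[] gzBytes) throws Exception {
        InputStream in = new GZIPInputStream(new ByteArrayInputStream(gzBytes));
        final StringBuilder seen = new StringBuilder();
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(new InputSource(in), new DefaultHandler() {
            @Override
            public void startElement(String uri, String local, String qName, Attributes a) {
                if (seen.length() > 0) seen.append(' ');
                seen.append(qName);
            }
        });
        return seen.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] wire = gzip("<root><item>hello</item></root>");
        System.out.println(parseGzippedXml(wire)); // root item
    }
}
```

    The key point is the same as in the answer: the parser must see the decompressed stream, never the raw gzip bytes.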

  • InputStreams for HTTP[S] response data: how to confirm minimal buffering / no early reading?

    I need to process data from a web service response. The format of the response data is under my control, and it is returned as a stream (net.rim.device.api.crypto.tls.TLSInputStream) to my BlackBerry application. The content of the stream is XML that contains simple header information and then one or more 'chunks' of data. The data is compressed (gzip) and encoded (Base64). The BB app must decode, decompress, and then process the data in the stream. For the purposes of my application, I never need all of the data at once; true stream processing is what I'm looking for. I've implemented a pipeline which, in pseudocode, looks like this:

    SecureConnection httpsConn; // already set up

    InputStream httpsStream = httpsConn.openInputStream();

    InputStream compressedStream = new Base64InputStream(httpsStream);

    InputStream is = new GZIPInputStream(compressedStream);

    int aByte = is.read();

    The goal is to buffer as little data as possible so that the operation on the BB side is not memory-intensive as the data grows. The actual implementation of this pseudocode works very well.

    The question I have is: can I confirm that the httpsStream I created from the httpsConn is not itself read completely by the RIM-specific code? In other words, if there are 20 MB of data in the stream, I don't want to find out that the HTTPS stream has read all the data - the whole 20 MB - and only then made it available. Instead, I want only as much data consumed as I request via is.read(), plus perhaps a small buffer for network efficiency. A third way to ask the question: I think this is supposed to be the definition of a well-implemented InputStream, but I'm having a hard time finding a definitive "yes, J2ME (or BB) InputStreams promise to read HTTPS data on demand and not all at once."

    I expect the tons of streaming audio and video apps are partial evidence that this works with real data on the fly over the net. Still, I left out details such as the XML processing with SAX - this question is about the behavior of the HTTP[S] InputStream. But that suggests a fourth way to phrase my question: if I use SAX instead of a DOM tool to process my XML because I want to control the memory pressure of large data flows, will that be cancelled out by buffering I can't control in the low-level HTTP[S] InputStream?

    Before you say "HTTP[S] is not where you should do streaming", this is not streaming per se. It is instead a single - possibly large - response to a POST. A highly 'typical' web interaction.

    If the answer changes based on the OS version, presumably 4.6 or better is the target platform.

    Thank you!

    -Del

    I don't remember any such documentation. All I remember is that I proposed this workaround to someone on this forum and they later confirmed that it solved their buffering problem (they were streaming audio as large HTTP responses - streaming started to work very well, without a lot of latency).
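    The decode pipeline described in the question can be sketched with standard Java SE stream classes (java.util.Base64 stands in for the RIM Base64InputStream; class and method names are illustrative). Each read() on the outer stream pulls only what it needs through the chain:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Base64;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class StreamingPipelineDemo {

    // Build the payload a server would send: gzip first, then Base64-encode.
    static byte[] encode(String text) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        GZIPOutputStream gz = new GZIPOutputStream(buf);
        gz.write(text.getBytes("UTF-8"));
        gz.close();
        return Base64.getEncoder().encode(buf.toByteArray());
    }

    // Chain the streams so data flows on demand:
    // network stream -> Base64 decoder -> gzip inflater.
    static String decode(InputStream networkStream) throws IOException {
        InputStream compressedStream = Base64.getDecoder().wrap(networkStream);
        InputStream is = new GZIPInputStream(compressedStream);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int aByte;
        while ((aByte = is.read()) != -1) { // one byte at a time, as in the pseudocode
            out.write(aByte);
        }
        return out.toString("UTF-8");
    }

    public static void main(String[] args) throws IOException {
        byte[] wire = encode("<chunk>data</chunk>");
        System.out.println(decode(new ByteArrayInputStream(wire)));
    }
}
```

    Note that this only demonstrates the chaining; whether the underlying HTTP[S] stream buffers the whole response is still up to the platform's connection implementation, which is exactly what the question is about.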

  • When compressing an entire disk in XP, are multiple files combined into single sectors?

    I'm curious how compressing an entire disk in XP Home Edition compares to how drive compression used to work in older versions of Windows like ME, 98, etc. I'm talking about how, in older versions, Windows would take a given amount of disk space and butt the beginning of one file right up against the end of another. It seems to me that compressing files with XP Home Edition does not pack them into shared sectors, because each file will still use a whole allocation unit - or is that incorrect? And if that is correct, does compressing an entire drive with XP Home Edition work just like the older versions did?

    Thank you

    Hello

    You can see the links below for more information on the compression of a drive in Windows XP.

    Overview of file compression

    http://www.Microsoft.com/resources/documentation/Windows/XP/all/proddocs/en-us/ff_file_compress_overview.mspx?mfr=true

    Using file compression in Windows XP

    http://support.Microsoft.com/kb/307987

  • Since I compressed my NTFS drive last week, all the text of my file names is in blue. Is there a reason for this?

    Original title: file names have turned blue

    File names have turned blue on my Dell Hybrid. I compressed my NTFS drive last week to create more space, and since then all the names of the txt files are blue. Is there a reason for this?

    Hello JordanMcGinn,

    Please see the link below regarding this issue.

    File font changed to blue: it's normal - blue means that the file is compressed

    http://answers.Microsoft.com/en-us/Windows/Forum/windows_vista-files/file-font-changed-to-blue/3efc8f58-edf8-4930-9e38-69feca448a92

    Thank you

  • vCOPs appears in the vSphere Client but not in the vSphere Web Client

    Question for you. I just installed vCOPs for a demo in our environment, and I'm having a problem seeing it in the Web Client. I am able to see it in the regular vSphere Client with all registered vCenters, but this isn't the best way to use the tool. I use Chrome version 34.0.1847.116. I also tried IE.

    vCOPs isn't built into the vSphere Web Client the way it is in the thick client. In the thick client you have a plug-in that you click, and it basically gives you the web console window in a tab. There is nothing like this in the Web Client. If you click on an object, you can see the health of that object and so forth. If you want the view you see in the thick client, you will need to connect to the vCOPs UI itself: https://vcops-ip/vcops-vsphere. Or, if you look on the homepage of the Web Client under Monitoring, you will see the vCOPs icon, which will bring you to the login page.

  • vCOPs 5.7: changing the label of a vCenter

    Hi all

    Is there a way to rename the label of a vCenter added through the Admin interface? Or do you have to unregister and re-register it? I am on version 5.7.1.

    Thank you very much.

    Matrix

    Hi Matrix,

    You can do it in the custom UI (https://<vcops-ip>/vcops-custom). Go to the Environment Overview, select the adapter instance and click Edit. There, you can change the resource name to whatever you want.

    If needed I can provide you with screenshots.

    Thank you

    Alex D.

  • Supermetrics & vCOPs custom report

    Hi all

    I have a custom report in which I want to show a supermetric.

    How can I do this?

    The supermetric does not appear in the describe.xml file, and its definition can only be found in the UI database.

    So, how can I put the supermetric into the user_ciq_metrics_collected.xml file?

    Thank you

    IG

    Unfortunately, this is not a very easy and simple process. But here is what you can do.

    Transferring a supermetric involves the following steps:

    • Create the supermetric package attribute and assign it to the resource type. For more information about supermetric development, please refer to the supermetrics documentation available in the custom UI.
    • Ensure that the created supermetric appears in the "All Metrics" section.
    • Find the supermetric's key in the alivevm DB.
    (You can use the following address to execute DB queries: https://<vcops-ip>/vcops-custom/dbAccessQuery.action)
    select distinct attr_key from attributekey ak, resourceglobalkey rk where ak.attrkey_id=rk.attrkeyid and rk.supermetricid IN (select id from supermetric where name LIKE '%XXX%');

    • Update the user_ciq_metrics_collected.xml file on the Analytics and UI VMs with the supermetric.
    • Create the XML file for the customizable view/report using the supermetric.

    Let us know if it works for you.

    Thank you

    Alex D.

  • Constantly getting script errors in VCOPS

    This issue is really bugging me and makes vCOPs practically useless.

    I am constantly getting vCOPs script errors now. They seem to be JS errors...

    ... Home...

    After clicking Yes, the right pane is unresponsive and doesn't show full details.

    Anyone else seeing this?

    Thank you

    - Typing just http://<vcops-ip>/ should redirect you to the vSphere UI;

    or

    - just type: http://<vcops-ip>/vcops-vsphere

    or

    - if you also have the license for the custom UI, use: http://<vcops-ip>/vcops-custom

  • vCOPs 5.0.3 Windows-based vs. OVA version comparison question

    Hi all

    I installed vCOPs 5.0.3 on Windows Server 2008 R2 and imported a vCOPs 5.0.1 OVA.

    But when I logged into the web console, I found that the Windows-based and OVA-based versions are different.

    The Windows one shows a "Custom UI".

    vcops ent.jpg

    The OVA one shows a "vSphere UI".

    vcops advance.jpg

    The content is totally different, and I can't find the "sync with vCenter" function in the Windows one.

    I think I might have hit a bad link.

    Can anyone tell me the correct way to install the Windows-based vCOPs and see the same content as the OVA one?

    Thank you

    Rocky

    In the Windows-based (also called standalone) version, there is no separate admin app, and the vSphere UI is not available. Only the custom UI is available. http://<vcops-ip>/ brings you to the custom UI. You can do the administrative tasks in the custom UI itself.

    With the OVA, you get a separate admin app, the vSphere UI app, and the custom UI.

    For the admin app - http://<vcops-ip>/admin

    For the vSphere UI - http://<vcops-ip>/vcops-vsphere

    For the custom UI - http://<vcops-ip>/vcops-custom

    Thank you

    Prasanna

  • Oracle RMAN compressed backupset on TSM disk pool

    Hello

    We have an Oracle 11.2.0.3 database, size *2.2 TB*, on AIX 6.1.

    There is a disk pool in TSM 6.3, size *4 TB*.
    When we try to write a compressed backupset to this pool with the command:
    run {
    allocate channel ch01 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch02 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch03 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch04 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch05 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch06 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch07 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch08 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch09 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch10 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch11 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch12 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch13 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch14 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch15 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    allocate channel ch16 device type SBT parms 'ENV=(TDPO_OPTFILE=/usr/tivoli/tsm/client/oracle/bin64/tdpo.opt)';
    
    BACKUP AS COMPRESSED BACKUPSET FULL DATABASE;
    
    release channel ch01;
    release channel ch02;
    release channel ch03;
    release channel ch04;
    release channel ch05;
    release channel ch06;
    release channel ch07;
    release channel ch08;
    release channel ch09;
    release channel ch10;
    release channel ch11;
    release channel ch12;
    release channel ch13;
    release channel ch14;
    release channel ch15;
    release channel ch16;
    }
    backup fails with the error:
    RMAN-03009: failure of backup command on ch16 channel at 11/28/2012 12:22:17
    ORA-19502: write error on file "CS_20121128_qlnrehrm_1_1.DBF", block number 257 (block size=8192)
    ORA-27030: skgfwrt: sbtwrite2 returned error
    ORA-19511: Error received from media manager layer, error text:
       ANS1311E (RC11)   Server out of data storage space
    channel ch16 disabled, job failed on it will be run on another channel
    Attempts to run the above command also failed on TSM disk pools of size *5 TB* and *7 TB*. But these sizes are 2-3 times bigger than the original DB size (2.2 TB).
    The last attempt to run our command succeeded on a TSM disk pool of *15 TB*! During its execution, we could see disk pool usage of *~7 TB*. Once the backup completed, the disk pool was released and only the *~100 GB* compressed backupset remained.

    -----
    So the question is: is this a problem with RMAN or TSM, or is it normal?

    We will be very grateful for your kind answers!


    Kind regards
    Iago

    Hello;

    I am not sure how the algorithm works or how the channels affect the compression. I would assume it is done in memory, or speed would suffer.

    These links show it all, and testing will give you an idea of what to expect:

    RMAN backup compression

    http://gavinsoorma.com/2009/12/11g-release-2-RMAN-backup-compression/

    And also this:

    http://taliphakanozturken.WordPress.com/2012/04/07/comparing-of-RMAN-backup-compression-levels/

    And finally:

    http://www.dbspecialists.com/blog/database-backups/RMAN-compression-algorithms-in-11gr2/

    Best regards

    mseberg

    Published by: mseberg on December 7, 2012 03:52

  • OVF Tool is unable to convert a compressed OVF file to vmx

    I have a compressed OVF file generated by OVF Tool with the 'compress' option. However, I can't convert the generated OVF file back to vmx. When I run OVF Tool to perform the conversion, it says "Disk transfer failed" and "Error: cannot open the source disk: disk1.vmdk.gz". Any ideas how I should proceed with this?

    You are missing an ovf:compression="gzip" attribute in the <File> section. Insert it in the OVF (you need to remove the .mf file, as the manifest will no longer match) and everything should be good. I see that this was made with ovftool 1.0.0; you are more than welcome to upgrade to ovftool 2.0.1 and try the export once more. If it still fails to set the ovf:compression="gzip" attribute, I will make sure that it is included in the next version.

     eske
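    For reference, the attribute mentioned in the answer goes on the <File> element of the OVF descriptor. A sketch; the id, href and size values are examples only:

```xml
<!-- Illustrative <File> entry; ovf:id, ovf:href and ovf:size are examples. -->
<File ovf:id="file1" ovf:href="disk1.vmdk.gz" ovf:size="123456"
      ovf:compression="gzip"/>
```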
    
  • Question about table compression

    I manage several 10g databases and have been asked to compress tables in several tablespaces. I am familiar with the use of compression, and I read the Oracle documentation on the ALTER TABLE ... MOVE COMPRESS statement, but I still have many questions. Can anyone recommend a good white paper or other documents on the best way to compress tables with existing data? Thanks in advance for your advice!
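    For orientation, the statement the question refers to usually takes this form; the table and index names are illustrative, and note that MOVE rebuilds the table, so its dependent indexes become unusable until they are rebuilt:

```sql
-- Illustrative only: compress the existing rows of a table (10g basic compression).
ALTER TABLE sales MOVE COMPRESS;
-- MOVE changes rowids, so dependent indexes must be rebuilt afterwards.
ALTER INDEX sales_pk REBUILD;
```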

    Hello

    Please find below a few whitepapers on compression:

    [http://www.oracle.com/technology/products/bi/db/10g/pdf/twp_data_compression_10gr2_0505.pdf]

    [http://www.trivadis.com/uploads/tx_cabagdownloadarea/table_compression2_0411EN_01.pdf]

    Of course, you can also check the Oracle documentation (the Administrator's Guide, the chapter on tables).

    Don't forget that there are some limitations on compression.

    Hope this can help.
    Best regards
    Jean Valentine
