FDM export issues: incomplete DAT file

Hello

We are having problems with the export process at one of our locations. The DAT file creation appears to fail about halfway through. The issue seems to be intermittent: on average, 2 or 3 loads out of every 5 succeed. That said, the loads that did complete successfully ran early in the morning rather than during the midday peak, although we are not sure whether that is a contributing factor. In addition, in the integration settings we have enabled Data Protection.

Thank you!

Sorry for the long delay in replying.

After some analysis, we found that no load that ran for more than 15 minutes would complete. This led us to the configuration of the FM11x-G5-C COM+ adapter. Under its "Pooling & Recycling" settings, the "Timeout (minutes)" value was set to 15. Increasing this value to a higher number finally solved the problem, and the loads completed successfully.
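For anyone who would rather script the change than click through Component Services, here is a minimal VBScript sketch using the COM+ administration catalog. The application name filter and the new timeout value are assumptions based on our environment; verify the exact COM+ application name and the RecycleExpirationTimeout property (the expiration timeout, in minutes, on the Pooling & Recycling tab) on your own server before running anything.

' Hypothetical sketch: raise the Pooling & Recycling expiration timeout
' for the FDM adapter's COM+ application. Run with cscript on the FDM server.
Option Explicit
Dim catalog, apps, app, i
Set catalog = CreateObject("COMAdmin.COMAdminCatalog")
Set apps = catalog.GetCollection("Applications")
apps.Populate

For i = 0 To apps.Count - 1
    Set app = apps.Item(i)
    ' The name filter is an assumption - check Component Services for yours.
    If InStr(1, app.Value("Name"), "FM11X-G5-C", vbTextCompare) > 0 Then
        app.Value("RecycleExpirationTimeout") = 120  ' was 15 minutes
        WScript.Echo "Updated: " & app.Value("Name")
    End If
Next

apps.SaveChanges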

Thank you for all your support and time.

Tags: Business Intelligence

Similar Questions

  • How can we export data into a CSV file from Oracle Forms

    Hello

    How can we export data into a CSV file from Oracle Forms?

    For example, I have a block called A. Whatever data is displayed in the block, when I click a button the displayed block data must be exported to a CSV file.

    My application runs on the Unix operating system.

    Please help on this.

    First of all, what is your Forms version (for example, 11.1.2.2.0, not just 11g)? Second, who will use the .csv file? If it is a user on their client computer, then TEXT_IO, CLIENT_TEXT_IO, or WebUtil are the standard packages used to export data to a file from Oracle Forms.

    How much data is to be exported? If you export only a couple hundred rows, exporting from Forms will be fine. If you export more rows than that (300+), the export will be extremely slow for your user. Keep in mind that Forms is not designed to perform data exports - there are better tools available for this...

    Craig...

  • Is it possible to generate the FDM export as an .xml file?

    Hello world

    We use Hyperion Financial Data Quality Management (FDM), version 11.1.1.3.00.
    Currently we generate the export as a .dat file; is it possible to generate the export as an .xml file instead?

    What export file formats are available in FDM?
    Any suggestion would be greatly appreciated.

    Thanks in advance :)

    Aly Hassan

    The output file and its formatting are defined/created in the Export Action script of each FDM adapter. You can modify this script to print the export file in any format you want. However, I can't see why you would want to do this for any of the target-specific adapters supplied with FDM; if we are talking about changing the default output of the PULL adapter, though, that is where you would make your changes. There is no magic switch that does it for you automatically :-)
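    As an illustration only, below is a stripped-down sketch of the kind of rewrite an XML-emitting Export Action script might perform. It simply rewrites a delimited export file as XML elements using FileSystemObject; the file paths, the comma delimiter, and the three-field record layout are all assumptions for the example, not the actual adapter code.

    ' Hypothetical sketch: convert a delimited FDM export file to simple XML.
    ' Paths and the assumed account,entity,amount layout are illustrative only.
    Option Explicit
    Dim fso, fIn, fOut, strLine, arrFields
    Set fso = CreateObject("Scripting.FileSystemObject")

    Set fIn = fso.OpenTextFile("D:\FDMData\Outbox\Export.dat", 1)   ' 1 = ForReading
    Set fOut = fso.CreateTextFile("D:\FDMData\Outbox\Export.xml", True)

    fOut.WriteLine "<?xml version=""1.0""?>"
    fOut.WriteLine "<records>"
    Do While Not fIn.AtEndOfStream
        strLine = fIn.ReadLine
        arrFields = Split(strLine, ",")            ' assumed comma-delimited
        If UBound(arrFields) >= 2 Then
            fOut.WriteLine "  <record account=""" & arrFields(0) & _
                """ entity=""" & arrFields(1) & """ amount=""" & arrFields(2) & """/>"
        End If
    Loop
    fOut.WriteLine "</records>"

    fIn.Close
    fOut.Close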

  • Incomplete data file recovery

    Hello

    I use Oracle Database 10g (10.2.0.10) on RHEL 5. I want to perform a point-in-time recovery of a data file from a backup. Through RMAN, I issued the following commands:

    RMAN> run {
    2> sql 'alter session set nls_date_format = "dd-mon-yyyy hh24:mi:ss"';
    3> set until time '21-aug-2011 13:04:00';
    4> restore datafile 4;
    5> recover datafile 4;
    6> alter database open resetlogs; }

    But RMAN performed a complete recovery. So once again I deleted the data file and restored it from the backup. Then I tried the following at the SQL prompt:

    SQL> alter session set nls_date_format = 'dd-mon-yyyy hh24:mi:ss';
    Session altered.
    SQL> recover datafile '/u01/app/oracle/oradata/ORATESTDB/datafile/o1_mf_users_751h7fmh_.dbf' until time '21-aug-2011 13:04:00';
    ORA-00274: illegal recovery option UNTIL

    It displays the above error. However, I am able to perform incomplete recovery of the entire database using RMAN in the same way as above.

    So is datafile point-in-time recovery not possible, or is there something wrong with my approach?

    Kind regards

    007

    A data file cannot be recovered to a point in time that is inconsistent with the rest of the database.
    (Why? Data integrity! A table can have several extents that span several data files. You can't have some extents holding data as of 12:05 and other extents, in another data file, recovered with data as of 10:05! Even if it is a single-datafile tablespace, you would violate referential integrity (whether or not constraints are enforced) if, say, the SALES table has entries up to 12:05 but the SALES_LINES table has entries up to only 10:05!)

    You can do a tablespace point-in-time recovery by using an auxiliary instance and then copying the tablespace back. You must maintain integrity.

    Hemant K Collette

  • issue of data file

    Hello

    If, in the data file, one column uses aliases instead of member names, but the rest of the columns are mapped by member name, can it be loaded correctly into the respective dimension members using an existing load rules file that maps by member name and value?

    Or do I need to replace the aliases with member names?

    Thank you

    Published by: user8091395 on June 28, 2010 15:34

    Essbase apprentice,
    I'm afraid you must have done something else wrong. During a data load, you can use either the member name or an alias (from any ONE of the alias tables). Because the aliases are associated with the outline, Essbase knows the relationship between an alias and its member and does not care which you use. You can load either. It is not a recommended practice to load by alias, as alias names change more often than member names, and people do a lot of interesting things to aliases to make them unique. But yes, they can be used.

  • Retrieve partitioned-table information with a missing offline data file

    I just restored a backup of a database. When I uncompressed the data files, I discovered that a single data file was damaged, so during database startup I had to take that data file offline. The tablespace that the data file belongs to holds the partition data of a table. Since I have no archive logs from when the data file was created, I know it is not possible to recover the data file; I have to recover as many rows as possible from the other data files in the tablespace. But when I export, or create a table as select from the affected partition, I get ORA-00376: file 1624 cannot be read at this time.

    Is there a way to tell Oracle to ignore the offline data file and export the information from the remaining data files?

    Just to clarify my situation a little more:
    I have a table called PPGA_ACTABOPRE that is partitioned by HASH and has 20 partitions, each partition in a different tablespace, and each tablespace has about 8 data files. A data file in partition 9 is missing, and what I want is to get the information from the remaining data files, but whenever I issue a SELECT statement or an exp or anything else, I get ORA-00376.

    I tried to set the flag to skip corrupt blocks with exec DBMS_REPAIR.SKIP_CORRUPT_BLOCKS('PPGA', 'PPGA_ACTABOPRE'); and then select the info, but it did not work.

    Any suggestion would be appreciated.
    Thanks in advance

    For the actabopre_p08 partition, some files are available and some are not. You cannot access the data at the segment level, because some extents are missing.

    You would need to access the data by ROWID. You could loop through DBA_EXTENTS and use DBMS_ROWID.ROWID_CREATE to generate the list of ROWIDs that may exist in the available data files.

    This may help:

    http://kerryosborne.Oracle-guy.com/2009/02/saving-rows-from-corrupt-blocks/
    http://kerryosborne.Oracle-guy.com/scripts/save_u.SQL

    Published by: Robert Geier on March 12, 2010 15:33

  • Error exporting FDM multiload to Essbase - "Stream failed, invalid file path provided!"

    Hi experts, I need your suggestions on my FDM multiload error "Stream failed, invalid file path provided!" while loading to Essbase.

    I can successfully Import, Validate, and Export, but I get an error on Load which reads "Stream failed, invalid file path provided!". All steps were working fine until yesterday and I was able to load to Essbase, but I don't know why it is not working today. Please suggest any probable cause or where I can look.

    The .dat files are being created in the correct directory on the server, so it works fine up to the Export step. Have you seen this error before? Any suggestions?

    Error file:

    ** Begin FDM Runtime Error Log Entry [2014-09-17 16:35:36] **

    -------------------------------------------------------------

    ERROR:

    Code............................................. 4117

    Description...................................... Stream failed, invalid file path provided!

    \\Staap1655d\appfdm\FDMDev\Outbox\ArchiveRestore\CORP_RPE0211.Err

    Process.......................................... clsAppServer.fFileGetStream

    Component........................................ upsAppSv

    Version.......................................... 1112

    Thread........................................... 5092

    IDENTIFICATION:

    User............................................. A019942

    Computer Name.................................... STAAP1655D

    App Name......................................... FDMDEV

    Client App....................................... WebClient

    CONNECTION:

    Provider......................................... SQLOLEDB

    Data Server...................................... sql-hyperion-topas-dev,1436

    Database Name.................................... PlanApp1

    Trusted Connect.................................. False

    Connect Status................................... Connection Open

    GLOBALS:

    Location......................................... CORP_RPE

    Location ID...................................... 751

    Location Seg..................................... 5

    Category......................................... Budget

    Category ID...................................... 13

    Period........................................... Feb - 2014

    Period ID........................................ 28/02/2014

    POV Local........................................ False

    Language......................................... 1033

    User Level....................................... 1

    All Partitions................................... True

    Is Auditor....................................... False

    Francisco, thanks for your quick response.

    This was resolved by simply unchecking the "Enable Load Chain" option in the Load configuration of the adapter, and it works now!

    Thank you

    Vivek

  • Question about exporting files to a data file

    Hello
    I built a project for exporting files to a data file using the link below:

    http://www.Oracle.com/WebFolder/technetwork/tutorials/OBE/FMW/ODI/odi_11g/odi_project_ff-to-FF/odi_project_flatfile-to-flatfile.htm

    My project works very well. Here, the source file is on the local machine (C:\Oracle\Middleware\Oracle_ODI1\oracledi\demo\file).
    (1) What should I do if the source file is located on a remote machine (say its IP is 172.22.18.90)?
    (2) Will it cause any problem if the remote machine's operating system is Unix?
    Thank you
    Papai

    NB: my machine's OS is Windows 7.

    If you can access the other machine as a shared folder, then provide that path in the physical schema, e.g. \\my_other_pc_on_shared\new_folder

    If you can't access it, then create an agent on that machine that can access the path, and run your project using that agent.

    Thank you
    Chantal
    http://bhabaniranjan.com/

  • incomplete recovery of a data file.

    I'm performing incomplete recovery of a data file using the following scenario.
    SQL> select * from v$log;
    shows the current log sequence is 10.

    I perform DML as follows:
    SQL> update hr.emp set salary = 1000;
    SQL> commit;

    Now I deleted the archived log with sequence #9, along with the users.dbf datafile.
    To perform an incomplete recovery of the users.dbf datafile, I followed these steps:
    RMAN> shutdown immediate;
    RMAN> startup mount;
    RMAN> run
    {
    set until sequence 9;
    restore datafile 4;
    recover datafile 4;
    }
    RMAN> alter database open;
    Now, when I query hr.emp, it shows the updated rows, i.e. the salary column has been updated to 1000.
    I want to know which other logs get applied after an incomplete recovery until a given sequence; my example shows that log sequence 10 was also applied. Please help me improve my understanding.
    I can't understand why the rows show as updated after an incomplete recovery.

    Hello.

    You cannot perform incomplete recovery of a single data file. After recovery, before opening the database, all SCNs must be synchronized across control files, redo logs, and data files. A single data file must be restored and recovered to the latest SCN for the database to open. If you want/need to do incomplete recovery, always restore ALL data files, recover them until the desired point in time, and then open the database with RESETLOGS to synchronize all data files with the control files and redo logs.

    In your case, you performed a complete recovery of the data file. Very probably, redo still available in the online redo logs was used to recover the data file to the latest SCN (since you said you removed the archived log - which, incidentally, makes your backup useless for a full recovery once the online redo log switches and overwrites the redo from sequence #9 and beyond).

    When experimenting with incomplete recovery, UNTIL SCN (or at least UNTIL TIME) allows you to specify the recovery point much more accurately, without guessing what happened in which log sequence. You do not have to manually remove data files or archivelog files to play with incomplete/complete recovery. In fact, it is good practice not to delete database-related files :o)

    Kind regards
    Martin

  • Accumulate in the FDM export file

    Hi all

    What is the purpose of the "Accumulate in File" checkbox among the FDM export options...
    There is also a separate option called "Accumulate" as a load method.

    The "Accumulate" method adds the values to the existing values in the target.

    What does "Accumulate in File" do?


    Thanks in advance


    J

    It does what it says on the tin: it accumulates the values in the file before the load. It is used in combination with the Replace or Merge load methods. If it is not used and the same intersection appears more than once in the load file, only the last value will be loaded.
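    For example (illustrative intersection and values only), suppose the load file contains the same intersection twice:

        Entity1,Account1,100
        Entity1,Account1,50

    With "Accumulate in File" checked, 150 is loaded for that intersection; without it, only the last value (50) is loaded.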

  • all export and import and the data files location

    Hi guys,

    can someone please confirm...

    If I export the full database, and my tablespace X uses data file loc1/test1.dbf, while in the database I am importing into the data file is at loc2/test1.dbf, does that make any difference, or does the data get imported correctly?

    If I export the full database, and my tablespace X uses data file loc1/test1.dbf, while in the database I am importing into the data file is at loc2/test1.dbf, does that make any difference, or does the data get imported correctly?

    No, it makes no difference, and the data gets imported correctly.

  • Issue with creating/editing a UCM contributor data file

    Hello

    We have the requirements below:

    (a) Create a contributor data file programmatically, not using Site Studio Contributor, for a region definition RD1, using a custom template rather than the default.xml template provided by Site Studio (SS).

    I created a contributor data file using RD1 and set it as the primary file, in order to create other data files for this RD1. I used the CHECKIN_NEW IdcService to create the data file. PFA the code snippet. I was able to successfully create the data file.

    (b) Modify the created data file correctly.

    However, when I try to modify the data file in UCM, I get an error message. PFA the error message.

    I would like to know whether there is a Site Studio IdcService to create web assets based on a custom template. I want to create the web assets from a WebCenter application, not from Site Studio Contributor.

    Hi Nikhil,

    Try this piece of code:

    DataBinder dataBinder = idcClient.createBinder();
    dataBinder.putLocal("IdcService", "SS_CHECKIN_NEW");
    dataBinder.putLocal("xRegionDefinition", prop.getProperty("regiondefinition"));
    dataBinder.putLocal("dDocTitle", prop.getProperty("title"));
    dataBinder.putLocal("dSecurityGroup", "Public");
    dataBinder.putLocal("ssDefaultDocumentToken", "SSContributorDataFile");
    dataBinder.addFile("primaryFile", new File(prop.getProperty("filename")));
    dataBinder.putLocal("xWebsiteObjectType", "Data File");
    dataBinder.putLocal("dDocType", "Document");
    // This parameter is passed when no particular site is selected in the drop-down menu
    dataBinder.putLocal("dataFileFieldValue", "Data File");

    Here I have created a companion properties file in which the UCM connection details and the region definition to which this CDF is to be attached are defined.

    filename - set to default.xml (to check normal behavior)

    regiondefinition - the content ID of the region definition to which the CDF is attached.

    title - the title of the content item, per the customer's requirement.

    Test with this code to see whether it meets the requirement you are looking for, and update the thread with your results.

    Thank you

    Srinath

  • RMAN issues - no backup or copy of the data file found

    Oracle 11gR2

    Linux RHEL 6.5

    I inherited a database backup-and-restore issue while the DBA is out of office.

    Here is the script used for the backup:

    configure default device type to disk;
    configure controlfile autobackup on;
    configure controlfile autobackup format for device type disk to '/u01/app/oracle/bkp/controlfile/%F.ctl';
    configure retention policy to recovery window of 30 days;
    show all;
    run {
    shutdown immediate;
    startup mount;
    allocate channel dup1 device type disk;
    allocate channel dup2 device type disk;
    sql "create pfile=''/u01/app/oracle/bkp/pfile/initpfile.ora'' from spfile";
    backup format '/u01/app/oracle/bkp/cold_db/cold_bkp_%U' database;
    release channel dup1;
    release channel dup2;
    alter database open;
    }

    When I try the following restore script:

    run
    {
    startup nomount pfile='/u01/app/oracle/bkp/pfile/initpfile.ora';
    restore controlfile from '/u01/app/oracle/bkp/controlfile/c-123131414-20140509-00.ctl';
    alter database mount;
    restore database;
    alter database open resetlogs;
    }

    I get the error RMAN-06023: no backup or copy of datafile found to restore.

    I'm trying to restore a database backup from 5 days ago, and I am using that backup's control file.

    I'll close this discussion and continue to pursue it with Oracle. Thank you all for your help.

  • Simple data export issue

    Hello

    I'm testing the process feature of LiveCycle Workbench.

    However, I am facing some difficulties...

    As a test, I wanted to create a simple process that would:

    • Pick up an email attachment, an interactive PDF form
    • Export the XML data from the form
    • Send the XML data by email

    I managed to do step 1, but as soon as I get to step 2, I have the following problem:

    I have the PDF form stored in a variable of type document, but I cannot assign this variable as the input for the "Export Data" component...

    The Export Data component seems to expect an asset...

    If I run my process with the PDF as input (in the form of a URI), I get an input exception saying that my PDF is not in a correct format...

    I thought this test would be a simple one...

    Could someone show me how to do this in a process? Thank you very much!

    Please see the various service examples available here: http://help.adobe.com/en_US/livecycle/9.0/samples/lc_sample_service.html

    Here is the link for all samples: http://www.adobe.com/devnet/livecycle/samples.html

    I hope this helps.

    Thank you

    Wasil

  • FDM event scripts fired twice during data loads

    Here's an interesting one. I added the following script to three different events (one at a time, making sure only one of them was in place at any given moment) to clear data before loading to Essbase:


    Event script content:
    ' Declare local variables
    Dim objShell
    Dim strCMD
    ' Call MaxL script to perform the data clear calculation.
    Set objShell = CreateObject("WScript.Shell")
    strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
    API.DataWindow.Utilities.mShellAndWait strCMD, 0


    MaxL script:
    login * identified by * on *;
    execute calculation 'FIX("Member1","Member2") CLEARDATA "Member3"; ENDFIX' on *.***;
    exit;




    However, it seems that the clear is performed twice, both before and after the data is loaded to Essbase. This was verified at every step by checking the Essbase application log:

    No event script:
    - No Essbase data clear in the application log.

    After adding the script to the "BefExportToDat" event:
    - The script is executed once when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in the Essbase application log.
    - The script is then run a second time when you click the OK button in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

    After adding the script to the "AftExportToDat" event:
    - The script is executed once when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in the Essbase application log.
    - The script is then run a second time when you click the OK button in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

    After adding the script to the "BefLoad" event:
    - The script does not run when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed).
    - The script is run AFTER the data is loaded to Essbase, when the OK button is clicked in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

    Some notes on the above:
    1. "BefExportToDat" and "AftExportToDat" are both executed twice, before and after the "Target System Load" modal popup. :-(
    2. "BefLoad" is executed AFTER the data is loaded to Essbase. :-( :-(

    Does anyone have any idea how we could run an Essbase data clear before the data is loaded, and not after we have loaded the up-to-date data? And perhaps why the event scripts above seem to be fired twice? There doesn't seem to be any logic to this!


    BefExportToDat - entries in the Essbase application log:
    [Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013091)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013162)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1012555)
    Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
    ...

    [Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003037)
    Data Load Updated [98] cells

    [Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003024)
    Data Load Elapsed Time : [0.52] seconds
    ...

    [Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013091)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013162)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1012555)
    Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]


    AftExportToDat - entries in the Essbase application log:
    [Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013091)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013162)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1012555)
    Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
    ...

    [Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003037)
    Data Load Updated [98] cells

    [Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003024)
    Data Load Elapsed Time : [0.52] seconds
    ...

    [Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013091)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013162)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1012555)
    Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]


    BefLoad - entries in the Essbase application log:
    [Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013091)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013162)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1012555)
    Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
    ...

    [Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003037)
    Data Load Updated [98] cells

    [Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003024)
    Data Load Elapsed Time : [0.52] seconds
    ...

    [Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013091)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013162)
    Received Command [Calculate] from user [admin@Native Directory]

    [Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1012555)
    Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]

    James, the Export and Load event scripts will fire four times, once for each type of file: the .DAT file (main TB file), the -A.DAT (log file), the -B.DAT, and the -C.DAT.

    To work around this problem so the logic only runs during the load of the main TB file, add the following (or something similar) at the beginning of your event scripts. This assumes strFile is in the subroutine's parameter list:

    ' Skip the auxiliary export files; only the main TB .dat file should proceed
    Select Case LCase(Right(strFile,6))
         Case "-a.dat", "-b.dat", "-c.dat" Exit Sub
    End Select
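    With this guard in place, the clear logic runs only once per load, when the main TB file is processed, instead of once for each of the four exported files.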
    
