AIR and performance issues

Hey,

I am new to mobile development with AIR. I have years of experience with Flash, but with AIR for mobile I'm a little lost, so I was hoping someone could answer these questions for me...

1. Is there currently no way to use the phone's vibrate function with AIR? I remember people requesting it on these forums last year, and it still looks as if this extremely simple thing can't be done. If it can't be done now, are there plans to add it in AIR 2.7?

2. When I select GPU acceleration, does it cache everything as bitmaps automatically, or do I still have to set "Cache as Bitmap" manually on everything I want cached? (I use Flash Pro CS5.5.) It seems to have no effect on the frame rate no matter what I do here, as long as GPU render mode is selected.

3. Would I get better performance by using vectors or bitmaps for the graphics?

4. What is the texture size limit on Android and iOS before a texture can no longer be cached?

5. Does it matter performance-wise whether JPG or PNG is used for graphics instead of uncompressed bitmaps (when I haven't chosen Cache as Bitmap)?

6. What are the differences between AIR for Android and AIR for iOS? I noticed that Android supports use of the camera and iOS doesn't. Is there anything else?

7. How can I integrate HD video into an AIR for Android app? I saw this thread: http://forums.adobe.com/thread/849395?tstart=0
but it is about AIR on iOS. I don't think it should be different for Android, but I get a white screen when I use the code from that thread (and I tested on an Android phone too, not just in Flash > Test Movie).

And if anyone has ANY advice on performance at all, I would love to hear it.

1. I don't think there's a way to trigger the vibrate function.

2. There are two types of caching, cacheAsBitmap and cacheAsBitmapMatrix. With cacheAsBitmap, whatever you cache is turned into a bitmap version that can be read more quickly when the final scene is composed. Bitmaps don't benefit, because they already are bitmaps. cacheAsBitmapMatrix is the part that gets the bitmap handled by the GPU. If you set both cacheAsBitmap and cacheAsBitmapMatrix on a display object, then in most cases it resides on the GPU, and translating, rotating or scaling that object should be fast.
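
A minimal ActionScript 3 sketch of that combination (myClip is a hypothetical display object already on the stage):

    import flash.geom.Matrix;

    // Cache the clip as a bitmap and give it a matrix so GPU render mode can keep it as a texture.
    myClip.cacheAsBitmap = true;
    myClip.cacheAsBitmapMatrix = new Matrix(); // an identity matrix is usually enough

    // Translating, rotating or scaling the cached clip should now be cheap.
    myClip.x += 10;
    myClip.rotation += 5;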

3. If you use GPU render mode and set cacheAsBitmap/cacheAsBitmapMatrix properly, it shouldn't matter whether the graphics start out as vectors or bitmaps. If you use CPU render mode, then bitmaps would no doubt be faster.

4. I don't know the limits for Android, but on the iPhone 4 and iPad it is 2048 pixels. Earlier devices are limited to 1024 pixels. If a texture is larger than this, it will not be cached and will constantly be transferred to GPU memory.
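
If you want to be defensive about it, here is a hedged ActionScript 3 sketch of checking the size before enabling caching (the 1024 limit and the helper name are assumptions, not anything from the SDK):

    import flash.display.DisplayObject;
    import flash.geom.Matrix;

    const MAX_TEXTURE_SIZE:int = 1024; // assumed limit for older devices; newer ones allow 2048

    function cacheIfItFits(obj:DisplayObject):void {
        // Only enable GPU caching when the rendered size fits in a single texture.
        if (obj.width <= MAX_TEXTURE_SIZE && obj.height <= MAX_TEXTURE_SIZE) {
            obj.cacheAsBitmap = true;
            obj.cacheAsBitmapMatrix = new Matrix();
        }
    }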

5. In the past, I believe, bitmaps were decompressed before use. If that's still the case, then it shouldn't matter whether they are JPEG or PNG. If a bitmap is only decompressed when needed, then there might be a difference in memory use or performance, even between different compression levels.

6. In CS5 there was a long list of differences between the capabilities of Android and iOS. With CS5.5 there are only a few differences, and it isn't Android that has more in every case. iOS lets you use either of the two cameras, for example.

7. On an Android tablet I get a white screen too. StageWebView works differently on Android, and I can't quite see how to make it work. I have a friend who wrote a book on AIR development for Android; I'll ask him!
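
In the meantime, here is a hedged sketch of the plain Video/NetStream route, which should also work in AIR for Android for a progressively downloaded H.264 file (the file path is a placeholder, and this assumes it runs somewhere with access to the stage, e.g. the main timeline):

    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;

    var video:Video = new Video(stage.stageWidth, stage.stageHeight);
    addChild(video);

    var nc:NetConnection = new NetConnection();
    nc.connect(null); // null connection = progressive playback, no media server

    var ns:NetStream = new NetStream(nc);
    ns.client = { onMetaData: function(info:Object):void {} }; // avoids async errors when metadata arrives
    ns.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        trace(e.info.code); // watch for NetStream.Play.Start or NetStream.Play.StreamNotFound
    });

    video.attachNetStream(ns);
    ns.play("app:/video.mp4"); // placeholder path to an H.264 file packaged with the app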

Tags: Adobe AIR

Similar Questions

  • Satellite P200-1K9 and performance issue

    I hope I'm posting this in the right area...

    Hello

    I have a Satellite P200-1K9 with the following specifications:

    Intel Core 2 Duo T7700 CPU @ 2.40 GHz
    3 GB RAM
    32-bit
    Windows Experience Index score 4.8

    ATI Mobility Radeon HD 2600 graphics card and an
    HD sound card...

    My problem is that performance has slowed down since I bought my laptop about 7-8 months ago. Of course, since then I had filled my hard drive (partitions) with media, and I understand that this comes at the expense of system performance. However, since realizing this I have freed up a lot of disk space. I have 40 GB free on my C: and 90 GB free on my D:. The total capacity, split between them, is 250 GB (125/125).

    Basically, while being largely ignorant of technical matters, laptops and technology altogether, the way I noticed the system's inability to "return" to its original factory performance (which, I might add, I was very impressed by) is in games. Football Manager 2009, for example, runs very slowly. I had a peek in Task Manager, because I play the game in a window, and CPU usage sits at 100%, although I'm sure the individual tasks and applications don't add up to that (maybe there is a hidden program hogging my CPU?). Call of Duty 4, which ran on medium-high graphics settings when the laptop was new, now struggles on the lowest graphics and sound settings, without there being any problem with internet connectivity. In addition (for reference), Battlefield 2, which is about 3-4 years old, also runs very slowly. It too ran extremely fast, on the highest graphics settings, when the laptop was brand new.

    Basically, what I want to do is return the system to its original performance. It wasn't even this slow when the two hard drive partitions were full of games and media. Now, despite removing many of them, the system runs like a sloth.

    The things I've done to try to solve the performance problems are: disk defrag, disk cleanup, error checking, a virus scan (with Kaspersky), a spyware scan (with PC Tools Spyware Doctor), deleting old files with CCleaner and wiping deleted files with a program called Eraser... as well as freeing up about 50% of the disk space.

    Add to that, I've noticed performance is much better when I first turn on the laptop, before its insides have had a chance to get hot. I understand that this is normal, and it happened before when the laptop was new and running at optimum performance, but perhaps I have an overheating problem? I would let you know what the temperature is, but I just don't know it.

    I'm hesitant to return the system to its factory settings, and if necessary I'll take it to a computer store to see if there is something they can do. But in the meantime I'd like to hear any advice on the general issues I face, and possible solutions to my problem.

    Thank you very much.

    The problem with Windows is that it becomes slower and slower the longer you use it.

    As devices and programs are installed, they increase the size of the registry, enable services, add resident DLLs and drivers, and slow down the file system by adding thousands of files. The operating system can also get damaged over time due to power loss and poorly written software.
    Even if you use CCleaner and other utilities and uninstall several programs, it is never as fast as a fresh installation of Windows. Uninstalling a program rarely removes it 100% completely.

    Antivirus software can have a huge effect on performance; that's why the system runs great when the factory image is freshly installed (it usually has no antivirus).

    So basically, if you want the system to be fast again, you must back up your data and perform the recovery. I do it myself once a year to restore performance.

  • ASM, raw devices, and performance issues

    Hello

    I have recently become familiar with ASM. Because I did that in "play" VM boxes, it was not possible to draw conclusions about performance improvements.

    I would like to know what performance improvements forum members may have experienced after migrating a production environment from a non-ASM installation to an ASM configuration.

    It would seem that, since ASM is a "form" of raw device, the performance increase should be noticeable. Was this the case for those who have migrated to it? Did it improve performance, or did it only make managing the database easier?

    Thank you for your contributions on the subject,

    John.

    440bx - 11gR2 wrote:

    I'm not sure what you mean by "hotspots", can you clarify a bit?

    John, to add to what Aman said about rebalancing.

    Imagine the following scenario. A new storage array must replace the old one. You must move the entire database across from the old to the new. The existing storage array may have 20 x 128 GB LUNs. The new array will provide you with 5 x 1 TB LUNs.

    How do you do that without a single second of downtime? You use ASM.

    You will have an existing diskgroup of 20 devices. Using ASM, you flag these to be dropped. Also using ASM, you add the 5 new devices to this same diskgroup. Then you issue an ASM rebalance command. Close the session. And have a good night's sleep while the database stays operational.

    The next morning, all the data on the 20 old LUNs should have been redistributed onto the 5 new LUNs, and the 20 LUNs released - ready to be physically removed and reused elsewhere. (You can set the "aggressiveness" of the rebalance operation - the more aggressive it is, the more I/O resources it will consume and the faster the rebalance will complete.)

    And even better - not a single second of downtime for maintenance. Compare this with using a storage vendor (EMC comes to mind) and having to do this exercise with their software and their approach rather than ASM... no comparison at all. ASM is vastly superior in this regard, because it is specifically designed for the storage management problems we have to solve with an Oracle database.

  • Newly installed SSD and performance issues

    Hi all!

    I just rebuilt my computer with a new 40 GB Intel X25-V SATA SSD on which I have installed only Win7 and CS5, and with my i7-980X processor and 24 GB of RAM I was expecting some crazy performance (my Windows performance score is 7.7 out of a possible 7.9), and... my Premiere Pro is now painfully slow... Everything else in the computer is fine...

    Watching video in real time, when it is not rendered, is impossible or extremely pixelated (it was fine before with my hard drive).

    Any help is welcome!

    Thank you

    Etienne

    SSDs make great OS/program drives, but they can be quite poor for CS5, especially at the price point/generation of Intel's X25-V series. As to why such SSDs are poor for CS5, the problem is slow write performance.

    Put everything except the media cache database on your hard drive (CS5 projects, media, media cache, output, etc.) and you should be fine. BTW, you have a powerful processor and plenty of RAM, so your hardware would be much better balanced if you were to use a total of 4 x 7200 RPM SATA disks: 2 in RAID 0 for projects and media; 2 in RAID 0 for the media cache and render output. Check out the Hardware forum for a lot of discussion on this.

    Kind regards

    Jim

  • Some questions and a performance issue

    Hi all

    I'm trying to build an application using Berkeley DB.
    But I'm getting really bad performance (439 inserts/sec), and when I try to switch to DB_QUEUE I get an
    "illegal record number size" error when putting the record into my db.

    # time ./prog 10000
    done in 22766.2 ms (439.248 updates/s)

    real 0m22.807s
    user 0m0.320s
    sys 0m0.580s


    And when I call db->set_re_len(1024) and replace DB_BTREE with DB_QUEUE (on the db->open line),
    I get this:

    illegal record number size
    terminate called after throwing an instance of 'DbException'
      what():  Db::put: invalid argument
    Aborted (core dumped)



    Could someone explain to me what I'm doing wrong here?
    Thanks a lot for any help

    Laurent


    Here is the program:


    ========================================================
    #include <db_cxx.h>
    #include <sys/time.h>

    #include <cstdlib>
    #include <cstring>
    #include <iostream>

    char buffer[1024];
    #define get_timeval(tv) gettimeofday(&(tv), 0)
    struct timeval start, end;

    void
    usage()
    {
        std::cerr << "you should give a number as arg1.\n";
        exit(1);
    }

    int
    main(int argc, char **argv)
    {
        if (argc != 2)
            usage();
        DbEnv *env;
        Db *db;
        DbSequence *seq;
        Db *dbSeq;
        Dbt seqKey;
        Dbt key;
        Dbt data;

        env = new DbEnv(0);
        env->set_flags(DB_AUTO_COMMIT, 1);
        env->set_flags(DB_TXN_WRITE_NOSYNC, 1);

        key.set_data(new db_seq_t);
        key.set_size(sizeof(db_seq_t));
        data.set_data(new char[1024]);
        data.set_size(1024);

        u_int32_t flags = DB_CREATE
            | DB_PRIVATE
            | DB_INIT_LOCK
            | DB_INIT_LOG
            | DB_INIT_TXN
            | DB_INIT_MPOOL;
        env->open("/files/lauma/test/PONG", flags, 0);

        // The sequence lives in its own database.
        dbSeq = new Db(env, 0);
        dbSeq->open(0, "sequences.db", 0, DB_BTREE, DB_CREATE, 0640);

        db = new Db(env, 0);
        db->set_pagesize(65536);
        db->set_re_len(1024);
        db->open(0, "mytables.db", NULL, DB_BTREE, DB_CREATE, 0640);

        seq = new DbSequence(dbSeq, 0);
        seqKey.set_data(const_cast<char *>("mytables.db"));
        seqKey.set_size(strlen("mytables.db") + 1);
        seq->open(0, &seqKey, DB_CREATE);

        int count = atoi(argv[1]);
        get_timeval(start);
        for (int i = 0; i < count; ++i)
        {
            if ((i % 1000) == 0) std::cout << i << " records...\n";
            DbTxn *txn;
            env->txn_begin(0, &txn, 0);
            db_seq_t dbid;
            seq->get(txn, 1, &dbid, 0); // next sequence value becomes the record key
            memmove(key.get_data(), &dbid, sizeof(db_seq_t));
            memmove(data.get_data(), buffer, 1024);
            db->put(txn, &key, &data, 0);
            txn->commit(0);
        }
        get_timeval(end);
        double d = ((end.tv_sec * 1000.0 + end.tv_usec / 1000.0) - (start.tv_sec * 1000.0 + start.tv_usec / 1000.0));
        std::cout << "done in " << d << " ms (" << (d ? (count * 1000.0 / d) : -1) << " updates/s)" << std::endl;

        seq->close(0);
        dbSeq->close(0);
        db->close(0);
        env->close(0);
        delete seq;
        delete dbSeq;
        delete db;
        delete env;
    }

    Hello.

    If you are on a little-endian machine, you will need to specify a custom key comparison function to ensure that the integer keys are sorted correctly. Without it, your inserts will effectively be random rather than sequential.

    Use DB->set_bt_compare to set the comparison function:
    http://www.Oracle.com/technology/documentation/Berkeley-DB/DB/api_c/db_set_bt_compare.html

    The Getting Started Guide goes into further detail on this subject:
    http://www.Oracle.com/technology/documentation/Berkeley-DB/DB/GSG/C/btree.html#comparators

    Ben Schmeckpeper

  • Adobe AIR performance issues?

    I have recently finished a huge project in Adobe Muse and I love the new responsive design tools. However, as the project continued to grow, Muse became increasingly slow to react, and interacting with the responsive scrubber was almost impossible. As someone who worked for Apple, I'd be the first person to point the finger at my Mac and offer to buy a new computer, but it is a new, maxed-out 15-inch MacBook Pro. All other Adobe CC applications work without problems. When Muse was first released, as I remember, it was written on top of the Adobe AIR platform instead of being written natively in C++. Could that have something to do with the performance issues I'm seeing? If so, is there something I can do to improve the performance I'm getting?

    AIR hasn't been the basis for Muse for a long time now. The natively rewritten Muse was released in early 2014.

    In my view, the Muse team is actively working on performance boosts, but I don't think we should expect miracles: no other application I know of has to calculate and display, in real time and interactively, the same objects changing dynamically in response across endless pages.

    In my opinion, the only way to significantly increase the speed would be to gray out all assets while the scrubber is being moved.

  • HP EliteBook 8570p - Windows XP very slow USB transfer rates and performance issues

    Hello. I was wondering if anyone else has seen performance issues with the HP EliteBook 8570p and Windows XP Pro SP3.

    I get a VERY slow boot and generally poor performance (long waits when opening files and programs).

    I have run burn-in tests, which show that everything is fine.

    USB transfer rates are VERY slow (I tried all the ports with a Seagate FreeAgent USB drive, which works fine on other laptops, etc.). For example, it takes 30 minutes to transfer a single 7 GB file from the USB HD to the desktop. It takes only 3 minutes to do the same on another laptop.

    I think there are driver problems, since Windows 7 works very well and boots fine from a bootable USB drive.

    USB transfer rates are also good when starting from a Windows boot environment.

    I installed HP SoftPaq Download Manager, and all the drivers are up to date.

    Hello:

    Thinking outside the box here... try selecting the Intel(R) 7 Series Chipset Family SATA AHCI Controller instead.

    If that does not work, I raise the white flag.

    Paul

  • BlackBerry Storm smartphone performance issues (and suggestions)

    I want to start by saying that I don't like my phone. When it works (which is most of the time), it works fine.

    But there are certainly software problems. Speaking as a software engineer, the OS has some serious memory management problems. This seems to cause most of the problems I'm having with the phone. I'm running the latest Verizon S/W (v4.7.0.148).

    Problem #1:

    I am a paranoid person. There is a lot of personal information on my phone, and I currently have international calling enabled, so I need this feature. So to protect myself, my phone locks when it is holstered or after a period of inactivity. That is good practice. Part of the draw of the BlackBerry is the security that you can configure on the device (I'm not paranoid enough to enable all the encryption protections - yet). The problem with the Storm is that it can take 1 to 5 minutes to get the phone unlocked when unholstering it. That is not an exaggeration. As the memory performance gets worse, it takes longer and longer. If I had to make an emergency call, it could be the difference between life and death. 5 minutes is a long time to wait to call 911. This must be corrected!

    Problem #2:

    The phone must be "reset" once a day to recover "lost" memory. That is just poor programming. I don't know if this is a problem with the BB OS or with the apps I'm running, but either way it isn't right. After a reboot, if I go to the memory display under Options, I might have about 3-5 MB of free memory. After a few hours of use this is almost always "0". I installed MemoryUp Pro, which seems to have extended the time between battery pulls, but I still don't think we should ask all BB Storm owners to pull their battery once a day. The other answer I get is "don't run so many apps". I exit all my applications when I'm finished with them. The only ones I keep running are BlackBerry Messenger, Yahoo Messenger, Google Talk and Google Maps (Latitude). The other running applications are the ones you can't exit (Phone, Browser). I have tried the various "optional" tricks and performance is not really much better.

    Problem #3:

    Missed calls. This is probably related to the memory problems above, but if the Storm can't be used as a phone, then I should buy something else. Although this problem occurs less often since I installed MemoryUp Pro and perform the reset once a day, it still happens, sometimes 1 to 5 times. It takes so long for the phone to "wake up" when a call comes in (especially if it is holstered) that by the time I hit "Answer" the call has gone to voicemail, or even worse, the phone doesn't ring at all. All I get is a missed call notification from an "unknown number".

    Problem #4:

    Battery performance could be improved. The battery that comes with the Storm lasts about 24 hours with the radio and Bluetooth on. Really, this isn't a problem once you know about it. I solved it by buying a 2700 mAh extended-life BoxWave battery that I use when traveling. It makes the Storm a bit bulky, but it gives me 2-3 days of life. I don't know how else this could be improved.

    I hope the developers at RIM are trying to solve some of these performance issues. Certain tasks should have a higher priority than they seem to (like Phone and Unlock).

    Sean

    Here is the link...

    http://www.BlackBerryForums.com/General-9500-series-discussion-storm/196496-latest-9530-OS-4-7-0-208...

  • Performance issues and managing a huge catalog

    I shoot architecture and real estate. I shoot hundreds of frames every day and my current catalog is becoming huge. I have also started to see a huge performance hit recently. Last night I was editing until 03:30 because LR would take as long as 20 seconds to perform a single task (like moving a slider or flagging a photo). Lots of spinning beach ball. I have an external hard drive that I back up to locally.

    What I was wondering is: if I copy everything except the most recent project files to the external disk and then keep it unplugged while I edit my works in progress, would that help the performance issues?

    I'm running a MacBook Pro (10.9.5) and LR CC. I have optimized my catalog, and I also run Cocktail for Mac on my system to clean up memory, etc.

    Other ideas for managing huge catalogs would be useful. I'd prefer not to make a new catalog for each client, since I shoot several each day and switching back and forth between catalogs every day would be a pain.

    Hi KristianWalker,

    Have you tried increasing or purging your cache? See How to improve and speed up the performance of Photoshop Lightroom for more ways to optimize Lightroom.

    Also what size is your catalog at the moment?

    Kind regards

    Assani

  • RAID drivers, controllers, and performance issues

    Hello world

    I'll begin by saying that I am new to VMware.

    I decided to build a standalone ESXi 4.1 server based on an Intel S5520HC board.

    http://www.Intel.com/products/server/motherboards/s5520hc/s5520hc-overview.htm

    I used 5 Seagate 1 TB SATA HDDs (7200 RPM, 32 MB cache) connected to the motherboard controller. Of course ESX couldn't see the RAID, so I used this:

    http://www.Intel.com/products/server/RAID-controllers/SROMBSASMR/SROMBSASMR-overview.htm

    I connected 4 of the hard drives in a RAID 10 configuration and used this driver during installation so that the controller and the R10 array would be detected:

    http://Downloadcenter.Intel.com/Detail_Desc.aspx?AGR=Y&DwnldID=18453&ProdId=3148&lang=eng&OSVersion=A%0&DownloadType=drivers

    I installed ESX on the 5th disk and assigned the spare capacity as a "scratch" datastore; I configured ESX to use this datastore as swap for the virtual machines. I also added an iSCSI datastore on a home-built Openfiler-based storage box using similar disks (RAID 10 setup). All network connections are Intel gigabit on 3Com gigabit switches. So we have 3 datastores: the 1st is on a single disk, the 2nd is on hardware RAID 10, and the 3rd is on gigabit iSCSI. Read and write performance on the 2nd datastore has been disastrous. It was much lower than the other two datastores (reads or writes would not exceed 20000 Kbps, as seen on the vSphere client performance tab).

    I decided to swap in a controller that is supported natively by ESX, so I used an Intel RS2BL080:

    http://www.Intel.com/products/server/RAID-controllers/RS2BL080/RS2BL080-overview.htm

    I used the same 4 disks and the same RAID 10 configuration, and performance was even worse.

    I noticed that ESX used the same driver as for the first controller, since they are compatible (megaraid_sas).

    I want to permanently unload/remove this driver and let ESX use a native one, without reinstalling of course.

    Any other suggestions to improve performance are welcome.

    Thank you.

    Welcome to the community,

    It is not the driver that matters; it's the write mode in which the RAID controller operates. RAID controllers usually work in write-through mode unless they have a cache with a battery backup module attached. With a battery-backed cache module, they are able to operate in write-back mode, which basically boosts performance. ESX(i) relies entirely on the RAID controller's caching and does not implement any software-based caching, for safety reasons.

    Generally, you will see transfer rates of between 5-20 MB/s in write-through mode and > 80 MB/s in write-back mode.

    André

  • MacBook Air or MacBook Pro for a country with moderate-speed internet

    Hello.

    Please, can anyone give me suggestions on what kind of MacBook to buy (preferably choosing between the MacBook Air and MacBook Pro)? I will be moving to Nigeria and would love a stylish laptop that I can use even when the network is poor.

    Thank you.

    The two handle networks in the same way... a slow feed makes for slow refreshes, a fast feed makes for fast refreshes.

    My concern would be support in Nigeria. The new Retina units are sealed and cannot be repaired easily, and if you perform self-service you void the warranty. Apple Stores and AASPs (Apple Authorized Service Providers) are your best source of support.

    Keep several good backups in case of failure; although Macs are reliable, the occasional bad part slips through. There are also thieves in all regions of the world who love bright shiny Macs.

  • I have a MacBook and an Air and would like to keep them in sync, how can I do that?

    I have a MacBook and a MacBook Air and want to keep them in sync; how do I do it?

    Mail can be synchronized by storing it on an e-mail server that uses the IMAP protocol, such as iCloud. The mail remains on the server, and it is automatically synchronized with all e-mail clients.

    Documents and several other types of data can be synchronized via iCloud, such as calendars, contacts, photos, Safari tabs and bookmarks, and passwords. iCloud is easy to use, for the most part. The disadvantages of using it are, first, that it doesn't synchronize everything; and second, that some of your personal information will potentially be accessible to strangers. Read the iCloud feature list and privacy notice carefully before deciding whether or how to use it. There should be no privacy problem with iCloud Keychain, because the data is encrypted end-to-end and is not accessible to anyone at Apple - but of course, you have to take Apple's word for that.

    A more complete solution, and one that does not raise confidentiality issues, is to set up OS X Server on your network and create mobile accounts on your other Macs. You will then be able to synchronize all files in the home folder automatically at logout, or manually at any other time. The main drawback here is that OS X Server would be more difficult for non-technical users to install and maintain. Another is that the synchronization only works with Macs, not mobile devices.

    Third-party software can also be used to sync files across a network. I don't have a specific recommendation. Such software may not be easy to set up in a way that doesn't lead to conflicts when files are changed on different devices between synchronizations. To synchronize more than two Macs this way, you'll want to designate one as the master and synchronize between it and the others.

  • Performance issues when coming out of sleep

    I recently built a gaming PC. On Christmas Day I finished putting the thing together. It has an Intel i7-4790 CPU running Intel HD Graphics, 8 GB of Corsair Vengeance RAM, an Asus Z97-A mobo and a 2-terabyte Seagate HDD. I installed Windows and set everything up. Then a week later my MSI GTX 970 arrived in the mail.

    So, when I wake the thing from sleep, the performance of the whole system drops. YouTube videos stutter, and simple games like World of Tanks struggle to run at their usual 60 fps. To get around this, I simply restart my comp and then it works fine; performance is back to normal.

    A few weeks ago, Windows gave me a message about one of the Nvidia drivers causing performance issues when waking from sleep. I didn't think much of it and dismissed it. I have tried reinstalling all my Nvidia drivers. I uninstalled the Intel HD Graphics driver a few days ago and it hasn't changed anything. I would appreciate an answer to this problem, because I would like to keep using sleep mode since it uses less energy and prolongs the life of my case LEDs. Thank you!

    In order to diagnose your problem, we need you to run the Windows Performance Toolkit; the instructions are in this wiki.

    If you have any questions do not hesitate to ask

    Please run the trace when you encounter the problem.

  • Windows 7 performance issue after installing the game Psychonauts

    Original thread: I defragmented my hard drive, my computer won't start?

    This happened while I was trying to run the game Psychonauts on Steam, which had worked fine on this computer back on the first installation of Windows 7 I had. In November my OS stopped working completely, and to this day I don't know why. I ran Ubuntu for a while after that, and I got Windows 7 back on here in July from the same person I bought the computer from. Psychonauts worked until I got to Waterloo World (a level in the game). Research online said to defragment my hard drive, so I jumped into that without knowing exactly what I was doing. Smart, I know. I don't know how long it ran, but all of a sudden my computer turned off, and it would shut right down before reaching the Toshiba screen. I left it for a few hours and came back, turned it on, and ran Startup Repair. It finished after about 20 minutes, said I had to check Microsoft for solutions or something, then prompted me to shut it down. I did, tried booting it up again, and it works. Sorry for the long intro. I just want to know: will my computer be alright? Is there something I can run (the irony) to check it out? I've had so many problems with it already, I don't know how much more it can survive.

    Hello
     

    1. Is the problem specific to the game Psychonauts?
    2. Do you receive any error message or code?
    3. Are you facing any other issues on the Windows 7 computer?
     
    Follow the steps.
     
    Method 1: Run the Performance troubleshooter to automatically find and fix problems. Please click the link mentioned below and follow the instructions.
    http://Windows.Microsoft.com/en-us/Windows7/open-the-performance-Troubleshooter
     

    Method 2: To help troubleshoot error messages and other issues, you can start Windows with a minimal set of drivers and startup programs. This type of boot is known as a "clean boot". A clean boot helps eliminate software conflicts.
    http://support.Microsoft.com/kb/929135
    Note: Follow step 7 to reset the computer to start as usual after the clean-boot troubleshooting.
     
     
    Method 3: Perform the steps mentioned in the help articles below.

    Game performance problems
    Diagnosing basic problems with DirectX
     
    Reply with more information so we can help you further.

  • Optimal catalog size / Performance issues

    Hello

    I was wondering what a good catalog size is - if reverse geocoding and face recognition are enabled.

    Problem: So far I have had a single catalog for about 30,000 photos. I recently reinstalled my PC and thought it would be beneficial (after various OS upgrades and Lightroom versions) to set everything up fresh. However, despite the powerful hardware, it seems that Lightroom has serious problems with the number of photos in this catalog and barely responds. Now I wonder if breaking the photos up into smaller catalogs would help.

    Hardware specifications:

    CPU: i7-6700K

    RAM: 64 GB

    GPU: GTX 1080, 8 GB

    Lightroom is installed on a Samsung 850 Pro SSD / 1 TB

    The pictures are on another Samsung SSD, an 850 EVO / 1 TB

    Software:

    Windows 10 Pro

    Lightroom 6.6

    Do you have any advice? Or can Lightroom generally handle this number of photos easily, and the mistake lies somewhere else?

    Thanks in advance

    Michael

    @trashner @dj_paige: thank you both.

    Lightroom 6.6 created what I would call "the perfect storm". 6.5.1 already had performance problems handling faces in larger catalogs (I'll explain later). Add the known performance and memory issues of 6.6, and in the People view this leads to weird Lightroom behavior that is sluggish and sometimes hard to get a response out of. Why doesn't everyone know this? I suppose because most users install Lightroom 6.6 as an update to an already existing and fully indexed catalog.

    My experience with 6.5.1 was as follows: generally faster than 6.6, but when I went to the People view the application started to lag - fortunately only to a degree where it was still somewhat responsive. So while 60% of the faces in the complete catalog had been recognized (30,000 pictures, which took more than 15 hours), I had the opportunity to "play around" a bit. And I discovered a few interesting things: again, when starting from the root directory of the catalog, the application is slow and accepting name suggestions is no fun at all. But I have the files/catalog organized in folders and subfolders (year and month, for the most part) and began to experiment with subfolders as the starting point. And all of a sudden Lightroom became much quicker in many ways. Face recognition from one subfolder to the next (on average 300 photos) sped up a lot, and working with name suggestions etc. also became easy. When I went one level up in the hierarchy (meaning a year), with sometimes 6,000 or as few as a few hundred photos, the speed went down or up roughly in proportion to the number of photos in that directory. While the indexing/recognition was still not complete, 6,000 pictures meant it was hard to work with, and 500 was very easy.

    Then I started activating subfolders of up to 1,000 photos and always waited until face recognition was completed, which took a few minutes each time. I repeated that for each subdirectory and was done within 2 hours for the remaining 40%. Done this way, face recognition worked 3-4 times faster than having Lightroom do the recognition for the entire catalog at once.

    Then I started to work on the name suggestions etc., also going subdirectory by subdirectory, which was still much faster than how the application behaved when the entire catalog was selected.

    Conclusion

    - Using 6.5.1 instead of 6.6 was an important step.

    - However, the answer to the question of the optimum catalog size would be:

    (a) a catalog can be very, very large

    (b) but to speed up face recognition and to work with it, break the process into chunks of about 500 photos.

    See you soon
