For best practices

Reading around on the web makes me wonder whether I'm doing things right on BlackBerry as I'm starting out. Can someone tell me which of these would be the best approach in this case?

A good link where I can read more about this kind of advice would also be appreciated.

    static class Foo {
        int mSplat;
    }

    Foo[] mArray = ...

    public void zero() {
        int sum = 0;
        for (int i = 0; i < mArray.length; ++i) {
            sum += mArray[i].mSplat;
        }
    }

    public void one() {
        int sum = 0;
        Foo[] localArray = mArray;
        int len = localArray.length;

        for (int i = 0; i < len; ++i) {
            sum += localArray[i].mSplat;
        }
    }

    public void two() {
        int sum = 0;
        for (Foo a : mArray) {
            sum += a.mSplat;
        }
    }

Kind regards!

two() does not compile, because the Java level used for BlackBerry development does not support the enhanced for-each loop.

In one(), there is no point in copying mArray into localArray. And performance-wise there really isn't much difference between calling .length and using the cached len, so .length is fine.

zero() is the most common style, and it is fine to use. Also, in a for loop you will usually see i++ instead of ++i, but in this case it makes no difference either way.
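If you want to check the difference for yourself, a rough timing harness like the one below (purely illustrative, not from this thread; the class and values are made up) can settle it quickly. Run it on the BlackBerry simulator or device rather than a desktop JVM, since desktop timings say little about the device.

    // Hypothetical timing harness - adapt the entry point for a BlackBerry app if needed.
    public class LoopTiming {
        static class Foo {
            int mSplat;
        }

        static Foo[] mArray = new Foo[1000];

        static int zero() {
            int sum = 0;
            for (int i = 0; i < mArray.length; ++i) {
                sum += mArray[i].mSplat;
            }
            return sum;
        }

        static int one() {
            int sum = 0;
            Foo[] localArray = mArray;
            int len = localArray.length;
            for (int i = 0; i < len; ++i) {
                sum += localArray[i].mSplat;
            }
            return sum;
        }

        public static void main(String[] args) {
            // Populate the array so there is something to sum.
            for (int i = 0; i < mArray.length; i++) {
                mArray[i] = new Foo();
                mArray[i].mSplat = i;
            }

            int reps = 10000;

            long t0 = System.currentTimeMillis();
            long check = 0;
            for (int r = 0; r < reps; r++) {
                check += zero();
            }
            System.out.println("zero(): " + (System.currentTimeMillis() - t0) + " ms (" + check + ")");

            t0 = System.currentTimeMillis();
            check = 0;
            for (int r = 0; r < reps; r++) {
                check += one();
            }
            System.out.println("one():  " + (System.currentTimeMillis() - t0) + " ms (" + check + ")");
        }
    }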

Tags: BlackBerry Developers

Similar Questions

  • Suggestions for best practices for ODI

    Hello

    We need some general pointers on ODI best-practice implementations, regarding the design, implementation, etc.

    Thanks for your suggestions. :)

    Mahesh

    Hello
    This document could be useful for you: http://www.oracle.com/technetwork/middleware/data-integrator/overview/odi-bestpractices-datawarehouse-whi-129686.pdf (ODI 10g)

  • I'm looking for 'best practices' tips for configuring a project (Premiere Pro CS5.5, Windows 7 Pro)

    I was instructed to update an old corporate video with narration, sounds, graphics and new music.

    The ONLY camera material I have is a corporate DVD (NTSC) - the original source footage is simply not available.

    I can rip the DVD to MPEG-2 and import it into Premiere without problems, but I would like to be able to use higher resolutions for the updated graphics.

    I intend to export it for YouTube (HD), but may export to DVD or Blu-ray as well.

    My question really comes down to sequence management. My thinking is that my sequence settings should be the highest resolution I expect to work with (YouTube HD or Blu-ray), which would allow easy export to lower qualities, but I don't really know if it's as simple as that. Is there a standard method in the industry? Am I close?

    Premiere can actually use the DVD .vob files directly: just copy the DVD to the hard disk and then import into Premiere.

    Is the existing DVD video in a 4:3 or 16:9 format? If 4:3, think about how you will handle it in a 16:9 program. Rather than having black pillarbox bars on the sides, many editors add some sort of background to fill that space. A popular solution is to duplicate the footage on a lower track, stretch the 4:3 video to fill the screen and add a Gaussian blur effect. Because the colours/content of the side bars then correspond to the main video, it somewhat disguises the fact that the video is 4:3.

    If you put SD footage in an HD program, it will look 'soft' compared to the HD graphics. As a compromise, you might consider making the new edit a 720p program instead of 1080p, so that the SD material does not have to be stretched as far. All computer/YouTube video is non-interlaced, meaning sequences and exports should be done as progressive.

    Potential workflow problem - if you take the existing SD footage from the DVD, upscale it in an HD sequence, and then export back to DVD (SD resolution), the video turns to mush. Being upscaled and then downscaled again really kills the quality. You may need to copy the HD sequence into an SD sequence, fix any size/scale problems with graphics and titles, and export the DVD from the SD sequence, to avoid upscaling and then downscaling the SD footage. Note that DVD video is highly compressed, so you are already working with weak source material; recompressing will hurt quality in any case, but definitely avoid the upscale/downscale round trip.

    I don't have the Adobe apps in front of me at the moment, but I think the Blu-ray 720p export options are limited, perhaps 720p at 59.94 only - not sure. That may affect your decision between 1080p and 720p.

    Hope these tips help you

    Thank you

    Jeff Pulera

    Safe Harbor computers

  • looking for best practices on the display of the data inside a TableView

    I have an app that can fetch 300,000 or more rows as the result of a query, and I am displaying them in a TableView... I use a mechanism to bring the data in on demand: when I run the query, an object with the IDs of the rows is kept in memory and a 'queryResultKey' is handed back to start requesting the data set.
    The controller sends the queryResultKey to the DAO saying it already has N results; the DAO returns the next X records (from N + 1 to N + J), or fewer if the total number of records has been reached. At that point the records are added to the table view's items using an addAll on the list bound to the table's itemsProperty.
    It works very well for the first batches of records, but all of a sudden a NullPointerException is raised:
    ene 15, 2013 12:56:40 PM mx.gob.scjn.iusjfx.presentacion.tesis.TablaResultadosController$5 call
    SEVERE: null
    java.lang.NullPointerException
         at com.sun.javafx.collections.ListListenerHelper$Generic.fireValueChangedEvent(ListListenerHelper.java:291)
         at com.sun.javafx.collections.ListListenerHelper.fireValueChangedEvent(ListListenerHelper.java:48)
         at com.sun.javafx.scene.control.ReadOnlyUnbackedObservableList.callObservers(ReadOnlyUnbackedObservableList.java:74)
         at javafx.scene.control.TableView$TableViewArrayListSelectionModel$3.onChanged(TableView.java:1725)
         at com.sun.javafx.collections.ListListenerHelper$SingleChange.fireValueChangedEvent(ListListenerHelper.java:134)
         at com.sun.javafx.collections.ListListenerHelper.fireValueChangedEvent(ListListenerHelper.java:48)
         at com.sun.javafx.collections.ObservableListWrapper.callObservers(ObservableListWrapper.java:97)
         at com.sun.javafx.collections.ObservableListWrapper.clear(ObservableListWrapper.java:184)
         at javafx.scene.control.TableView$TableViewArrayListSelectionModel.quietClearSelection(TableView.java:2154)
         at javafx.scene.control.TableView$TableViewArrayListSelectionModel.updateSelection(TableView.java:1902)
         at javafx.scene.control.TableView$TableViewArrayListSelectionModel.access$2600(TableView.java:1681)
         at javafx.scene.control.TableView$TableViewArrayListSelectionModel$8.onChanged(TableView.java:1802)
         at com.sun.javafx.scene.control.WeakListChangeListener.onChanged(WeakListChangeListener.java:71)
         at com.sun.javafx.collections.ListListenerHelper$Generic.fireValueChangedEvent(ListListenerHelper.java:291)
         at com.sun.javafx.collections.ListListenerHelper.fireValueChangedEvent(ListListenerHelper.java:48)
         at com.sun.javafx.collections.ObservableListWrapper.callObservers(ObservableListWrapper.java:97)
         at com.sun.javafx.collections.ObservableListWrapper.addAll(ObservableListWrapper.java:171)
         at com.sun.javafx.collections.ObservableListWrapper.addAll(ObservableListWrapper.java:160)
         at javafx.beans.binding.ListExpression.addAll(ListExpression.java:280)
         at mx.gob.scjn.iusjfx.presentacion.tesis.TablaResultadosController$5.call(TablaResultadosController.java:433)
         at mx.gob.scjn.iusjfx.presentacion.tesis.TablaResultadosController$5.call(TablaResultadosController.java:427)
         at javafx.concurrent.Task$TaskCallable.call(Task.java:1259)
         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
         at java.util.concurrent.FutureTask.run(FutureTask.java:166)
         at java.lang.Thread.run(Thread.java:722)
    This exception is raised inside the Thread that fills the table:
    task = new Task<Integer>() {
                @Override
                protected Integer call() throws Exception {
                    while (listaTesis.size() < pag.getLargo()) {
                        List<TesisTO> tesisParaincrustar = fac.getTesisParaLista(pag.getId(), listaTesis.size());
                        try {
                            listaTesis.addAll(tesisParaincrustar);
                        } catch (Exception exc) {
                            Logger.getLogger(TablaResultadosController.class.getName()).log(Level.SEVERE, null, exc);
                        }
                        tesisActuales.addAll(tesisParaincrustar);
                        updateProgress(listaTesis.size(), pag.getLargo());
                    }
    
                    return new Integer(100);
                }
    
                @Override
                protected void succeeded() {
                    status.getProgreso().setVisible(false);
                    preparaFiltros();
                    llenandoTabla = false;
                    tblResultados.getSelectionModel().select(0);
                }
            };
            status.getProgreso().progressProperty().bind(task.progressProperty());
            new Thread((Runnable) task).start();
    So what might be another strategy to simulate (or actually have) all the record values in memory and 'fetch only the ones needed' to display them in the TableView?
    Any ideas would be useful.

    The first thing I would do here is to check that

    fac.getTesisParaLista(pag.getId(), listaTesis.size())
    

    cannot return null under any circumstances. If it can return null, then that is the source of your exception. In that case, you need to add the appropriate guard to your code:

    if (tesisParaincrustar != null) {
                        try {
                            listaTesis.addAll(tesisParaincrustar);
                        } catch (Exception exc) {
                            Logger.getLogger(TablaResultadosController.class.getName()).log(Level.SEVERE, null, exc);
                        }
                        tesisActuales.addAll(tesisParaincrustar);
    }
    

    As for the threading issue: you should schedule everything that modifies the user interface to run on the FX Application Thread using Platform.runLater(...). Note that any data you pass into it must not be changed elsewhere. So you can do something like:

    Platform.runLater(new Runnable() {
      @Override
      public void run() {
                        try {
                            listaTesis.addAll(tesisParaincrustar);
                        } catch (Exception exc) {
                            Logger.getLogger(TablaResultadosController.class.getName()).log(Level.SEVERE, null, exc);
                        }
                        tesisActuales.addAll(tesisParaincrustar);
      }
    });
    

    You need to add the 'final' keyword to the declaration of tesisParaincrustar to get this to compile.

    That's fine, assuming that the call to fac.getTesisParaLista (...) returns a new instance of the list each time (i.e. it does not change an existing list).

    You will be left with one remaining problem: the condition in your while (...) loop refers to listaTesis.size(). Since that list is now updated on a different thread from the one running the while (...) loop, this can (and probably will) cause the loop to terminate at the wrong time. You can adjust the while (...) condition so it does not refer to that list (for example, calculate in advance how many items must be read, pass that number to your Task implementation and use it in the while condition).

    There are really two rules for multithreading in JavaFX:
    1. Do not access the user interface outside the FX Application Thread. This includes data structures bound to the user interface. I assume at some point you have called tableView.setItems(listaTesis), so listaTesis should be considered part of the user interface.
    2. Do not run long-running tasks on the FX Application Thread.

    In trying to comply with rule 2, you have violated rule 1.

    Incrementally updating a displayed list from a background thread is one of the trickier uses of Task. Read the API documentation for Task (http://docs.oracle.com/javafx/2/api/javafx/concurrent/Task.html): the 'PartialResultsTask' example under 'A Task That Returns Partial Results' is an example of what you are trying to do.
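    Putting those pieces together, here is a minimal sketch (illustrative only, reusing the names from the question and assuming fac.getTesisParaLista(...) returns a fresh list, or null, for each batch). The total is computed once up front so the background thread never reads the UI-bound list, each batch is handed to the FX Application Thread, and a null or empty batch ends the loop:

    // Illustrative sketch - not the original controller code.
    final int total = pag.getLargo();                  // expected row count, computed once
    task = new Task<Integer>() {
        @Override
        protected Integer call() throws Exception {
            int loaded = 0;                            // local counter owned by this thread
            while (loaded < total && !isCancelled()) {
                final List<TesisTO> chunk = fac.getTesisParaLista(pag.getId(), loaded);
                if (chunk == null || chunk.isEmpty()) {
                    break;                             // guard against a null or empty batch
                }
                loaded += chunk.size();
                Platform.runLater(new Runnable() {     // UI-bound lists: FX Application Thread only
                    @Override
                    public void run() {
                        listaTesis.addAll(chunk);
                        tesisActuales.addAll(chunk);
                    }
                });
                updateProgress(loaded, total);
            }
            return Integer.valueOf(100);
        }
    };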

  • Beginner looking for best practices to implement my first ESXi server.

    Hello, tomorrow I will set up my company's first ESXi server.  I ordered an HP ProLiant ML350 with an E200i RAID controller.  I bought 4 x 1 TB SATA disks and plan to put them all in RAID 5 for 3 TB of data storage.  I currently have a Win2k3 file server with approximately 1.5 TB of files spread across several physical disks right now.  My question is how that data should be moved to the new server.  Should I build a new 2003 or 2008 virtual server and then allocate all 1.5+ TB to that one VM?  How will the data look under ESXi?  Should I have several separate virtual disks or just one large one?  Or is there a better solution?  What should I do when I start running out of space and need to allocate more space down the road to a certain VM?  I apologize if this is a stupid question, but I'm just a little confused and trying to get everything pointed in the right direction as I start on this project.

    Thank you very much!

    Dave

    You can build the Windows Server as described above and then create a virtual disk to host your data. You can then use something like robocopy to synchronize the data from your original data source; you could kick it off over a weekend and then use robocopy again to just capture the changes before doing your actual cutover. Just be aware of the 2 TB limit the other poster mentioned. You can select the maximum block size when you create the VMFS datastore for your data drive; otherwise the maximum VMDK you can create will be a little smaller than you are looking for.

  • Beginner looking for best practices in configuration in AE CS4 (MacPro 8core)

    Hello

    I am a video editor familiar with FCP and Motion, moving to AE CS4 because it has more power in 3D space. I installed the program and started my basic Lynda training, but I want to make sure I have the setup right so as not to cause problems down the road.

    For example, in FCP I use a second drive for renders etc., and a RAID array for FCP files and various media (both imported and exported) organized by project and by type. I intend to do the same for AE, but looking at the AE prefs I see I can assign memory per CPU etc., and I just want to be sure I'm using the processors and RAM effectively. The machine is an 8-core MacPro with 12 GB of RAM. The program sees 16 processor cores, so should I split the 10 GB I'm giving AE across those cores (using the Multiprocessing prefs tab), or simply let AE figure it out?

    In addition, I am confused about the overflow Volumes option. What gets put there, when and why?

    Finally, if there is a newsletter for AE along the lines of Larry Jordan's FCP site (larryjordan.biz), I'd appreciate a reference.

    Thanks for your help.

    -C

    Concerning the use of RAM and processors, see the following page: 'Performance tip: don't starve your software of RAM'. Following the guidelines there for your computer's 12 GB, you would leave 2 GB for other applications and perhaps a couple of GB for the foreground process to hold RAM preview frames - leaving you with 2 GB for each of four background rendering processes (12 GB minus 2 GB minus a couple of GB leaves roughly 8 GB, or 2 GB per process).

    Also, don't be fooled: an 8-core computer is just an 8-core as far as After Effects is concerned. The apparent doubling of processor cores created by Hyper-Threading is not relevant to the simultaneous rendering of multiple frames by multiprocessing.

    Regarding overflow volumes, see the following page: 'Overflow volume and segment settings'.

    If you are new, please start here. Pretty please?

  • best practices for networking for esx / vsphere 6

    best practices for networking for esx / vsphere 6

    Refer to the VMware best practices documentation for in-depth information on networking.

    https://www.VMware.com/files/PDF/Techpaper/VMware-PerfBest-practices-vSphere6-0.PDF

    https://KB.VMware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalID=2107948

    https://www.VMware.com/files/PDF/Techpaper/VMW_Netioc_BestPractices.PDF

    See also the article below for best-practices documentation relating to the different versions of vSphere.

    http://vmwareinsight.com/articles/2016/5/5798853/best-practices-for-VMware-vSphere-architecture

  • Best practices for automation of ghettoVCBg2 starting from cron

    Hello world!

    I have set up a vMA instance for scheduling backups with ghettoVCBg2 to a SAN datastore. Everything works like a charm from the command line; I use vi-fastpass for authentication, and backups complete just fine.

    However, I would like to run the script from cron, and I got stuck. Since vifp is designed to run only from the command line and, as I read, is not supposed to work from a script, it seems the only possibility would be to create a dedicated backup user with administrator privileges and store the username and password in the shell script. I'm not happy doing that. I searched through the forums but couldn't find any simple solution.

    Any ideas for best practices?

    Thank you

    eliott100

    Actually, that's incorrect. The script relies on the ESX or ESXi hosts being managed by vi-fastpass, but when you run the script it does not use the vifpinit command to connect. It accesses the credentials via the vi-fastpass library modules rather than vifpinit, which, as you have noticed, you cannot run non-interactively. Therefore it can indeed be scheduled via cron: you do not need to run the script interactively, just set it up in your crontab. Please take a look at the documentation for more information.

    =========================================================================

    William Lam

    VMware vExpert 2009

    Scripts for VMware ESX/ESXi and resources at: http://engineering.ucsb.edu/~duonglt/vmware/

    Twitter: @lamw

    vGhetto script repository

    Introduction to the vMA (tips/tricks)

    Getting started with vSphere SDK for Perl

    VMware Code Central - Scripts/code samples for developers and administrators

    VMware developer community

    If you find this information useful, please give points to "correct" or "useful".

  • Best practices Upgrade Path - Server 3 to 5?

    Hello

    I am attempting a migration and upgrade of a Profile Manager server. I currently run an older Mac mini server on 10.9.5 and Server 3 with an extensive Profile Manager installation. I recently migrated the server itself off the old Mac mini onto a late-2009 Xserve by cloning the drive. I'm still double-checking everything, but it seems the transition from the mini to the Xserve was successful and everything works as it should (just with improved performance).

    My main question now is that I want to get this up to date software-wise and move to Server 5 and 10.11. I see a lot of documentation (even officially from Apple) on best practices for upgrading from Server 3 to 4 and Yosemite, but I can't find much on Server 5 and El Capitan, let alone going from 3 to 5. I understand that I'll probably have to buy the app again and that's fine... but should I stage this, going 10.9 to 10.10 and Server 4... make sure all is well... and then jump to 10.11 and Server 5? Or is it 'safe' (or OK) to jump from Server 3 to 5 (and 10.9.5 to 10.11.x)? Obviously the App Store is happy to make the jump from 10.9 to 10.11, but again, I'm looking for best practices here.

    I will of course ensure that all backups are up to date and make another clone just before whichever path I take... but I was wondering if someone has made the leap from 3 to 5... and had things (like Profile Manager) still work correctly on the other side?

    Thanks for any info and/or guidance.

    In your position I would keep the mini running Server 3, install El Capitan and Server 5 on the Xserve, and walk through setting up Server 5 by hand. Things that need to be 'migrated', such as Open Directory, should be handled by exporting from the mini and reimporting on the Xserve.

    In my experience, OS X Server installations that were 'migrated' always seem to end up with esoteric problems that are difficult to correct, and it's easier to follow the procedure above than to lose a day trying.

    YMMV

    C.

  • VCenter 6.0 upgrade vCenter Server 6.0 best practices?

    Can someone point me to best practices for upgrading to vCenter Server 6.0?

    These KBs should help you with best practices and the upgrade guide:

    https://pubs.VMware.com/vSphere-60/topic/com.VMware.ICbase/PDF/vSphere-ESXi-vCenter-Server-60-upgrade-guide.PDF

    Deployment Guide

    https://www.VMware.com/files/PDF/Techpaper/VMware-vCenter-Server6-deployment-guide.PDF

    Best practices

    http://KB.VMware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalID=2107948

    and for the new features there is the feature walkthrough:

    http://featurewalkthrough.VMware.com/#!/vsphere-6-0

  • Using PowerPoint and Captivate together - what are your best practices?

    HI - here are my two questions...

    (1) It seems that when you insert PowerPoint slides into CP7 and duplicate a slide, slide 2 is not a fresh new creation.  It is still linked to the original slide, and any change to either of them in PowerPoint directly impacts the other linked slides.  I realize now that I can go back into PPT and duplicate it there... but that just seems like extra steps.  Am I missing something here?

    (2) I would like to hear how other developers approach creating a new CP project using PowerPoint as a starting point.  Do you round-trip back to PPT, or do you create new themes based on the PPT designs and then copy and paste the content?  I guess what I'm looking for here is best practices.

    I appreciate your help!

    Denise

    Hello

    If you are starting from scratch, PowerPoint is a terrible thing to start with. Don't get me wrong - it's a great tool for what it does. But the reason Captivate even has the ability to pull in a PowerPoint presentation is that there are still masses of people out there with hundreds of thousands of PowerPoint presentations they want to repurpose as e-learning. So it exists to get that job done in a 'quick and dirty' way.

    The PowerPoint import process converts each PowerPoint slide into a Flash SWF file. The SWF is then set as the background object of the slide. That's why, when you duplicated the slide, you saw the behaviour you did.

    So here's the thing: if you are starting from scratch, start with a blank Captivate project. Use PPT if you wish, but only as a design tool. Copy the images out of PPT, then insert them into Captivate.

    Cheers... Rick

  • Roles, permissions - DataCenter, file, Cluster, host Layout - best Practices\How-to

    Have a little problem with permissions and roles. I'm sure it will be an easy one for those of you with more experience of working with roles. I hope that my layout organization made with quote boxes is readable.

    The organization has just spun up a new ESXi 4 host for developers and added it to vCenter. The developers want to use the vSphere Client/VIC to manage the ESX server. They need rights to create VMs, remove VMs, clone VMs and power VMs on and off. However, we don't want them to be able to touch production.

    According to the diagram below, the new development host, labeled 'HostC (standalone DEVELOPMENT host)', is located under 'DataCenter-City-2', which also contains the production ESX clusters. And obviously I don't want developers having rights on the production clusters.

    Let's say I create a role called 'HostC Dev Sandbox Rights', add the users and assign it directly to 'HostC' below. This role contains the 'Create VM' privilege, yet when I run the Create VM wizard on HostC as a member of that role, the vSphere Client tells me the task requires Create VM rights at the datacenter level! But giving these developers Create VM access on the datacenter would give them rights to create virtual machines in the Production clusters! Which is obviously a problem.

    I can't believe that our need to give these rights to ONLY one host in a datacenter is rare. I'm sure there is some misunderstanding on my part about how to configure VMware roles following best practices.

    Anyone with more experience on VMware roles willing to help me on this one? Thanks in advance!

    Representative organization schema using quote boxes:

    vSphere (vCenter Server)

    DataCenter-City-1

    Various folders, clusters, hosts

    DataCenter-City-2

    FolderA (Division A)

    ClusterA (Production Cluster A)

    HostA1 (Production host in Group A)

    HostA2 (Production host in Group A)

    FolderB (Division B)

    ClusterB (Production Cluster B)

    HostB1 (Production host in Group B)

    HostB2 (Production host in Group B)

    HostC (standalone DEVELOPMENT host) - under FolderB but not in the cluster

    DataCenter-City-3

    Various folders, clusters, hosts

    You can apply permissions directly to the datastore.  I didn't need to go further than the clusters in our environment, but what might really work for you is to place the datastores in storage folders.  Name the folders after your host groups and clusters, then place the datastores for each cluster in the corresponding folder.  Then just apply the permissions on the folder instead of on each individual datastore.  A little off topic, but one thing a folder of datastores turns out to be missing is the 'storage views' feature - I have put in a feature request for that.

    Yes, if you set permissions at the datastore-view level the user can browse and see them.  Extensive testing of your permissions framework is warranted before pushing it out to users - it sounds like you are already doing that.

  • Best practices collections

    Apex 3.2

    I searched this site and Google for best practices for collections in APEX, but came away empty-handed. What I really want to know is how to limit the loss of data due to a session interruption. I'm working on an application that uses collections in a 5-step wizard to collect data before committing it to the tables. What happens if the user has to step away from the computer in the middle of the wizard, or doesn't have time to finish it but wants to continue at a later date, or if the PC crashes? Is it possible to prevent or recover from the interruption?

    I was wondering what others do to mitigate the potential loss of collection data.

    Ray

    rgarza28 wrote:
    Apex 3.2

    I searched this site and Google for best practices for collections in APEX, but came away empty-handed. What I really want to know is how to limit the loss of data due to a session interruption. I'm working on an application that uses collections in a 5-step wizard to collect data before committing it to the tables. What happens if the user has to step away from the computer in the middle of the wizard, or doesn't have time to finish it but wants to continue at a later date, or if the PC crashes? Is it possible to prevent or recover from the interruption?

    As far as I remember, APEX cleans up the collections at the end of the session.

    So you can check the underlying tables for any sign of recoverable data:

    select c.collection_name, m.seq_id, m.c001, m.c002, m.c003, m.c004, m.c005, m.c006, m.c007,
               m.c008, m.c009, m.c010, m.c011, m.c012, m.c013, m.c014, m.c015, m.c016, m.c017,
               m.c018, m.c019, m.c020, m.c021, m.c022, m.c023, m.c024, m.c025, m.c026, m.c027,
               m.c028, m.c029, m.c030, m.c031, m.c032, m.c033, m.c034, m.c035, m.c036, m.c037,
               m.c038, m.c039, m.c040, m.c041, m.c042, m.c043, m.c044, m.c045, m.c046, m.c047,
               m.c048, m.c049, m.c050, m.clob001, m.blob001, m.xmltype001, m.n001, m.n002, m.n003,
               m.n004, m.n005, m.d001, m.d002, m.d003, m.d004, m.d005, m.md5_original
          from wwv_flow_collections$ c, wwv_flow_collection_members$ m
         where c.session_id = :session_id                 -- value missing in the original post; the APEX session id
           and c.security_group_id = :security_group_id   -- placeholder: workspace security group id
           and c.id = m.collection_id
           and c.flow_id = :app_id;                        -- placeholder: application id
    

    I was wondering what others do to mitigate the potential loss of collection data.

    But I prefer to create my own tables to store the data temporarily; recovery is easy because you only clean out the table once the user has actually finished the form.

  • Best practices: collaborate between Mac and PC

    The web development department of my not-for-profit organization is expanding - it used to be just me on a PC using Dreamweaver and ColdFusion; now it's me on the PC plus a co-worker using Dreamweaver on a Mac.

    I'm looking for some advice from someone who has done development across a Mac and a PC. Basically, I'm looking for 'best practices' on how we can work together without spoiling each other's work. He will focus on the design aspects of the projects: CSS, for example. I work primarily on the ColdFusion coding that pulls information from our databases into .cfm pages.

    I am just afraid of getting into a situation where he has made changes to one version of a file and I have made changes to another - disaster! I'm also worried (as he's a new guy... I've known him for about two weeks!) that his work could 'ruin' pages that worked perfectly well until he got his hands on them.

    Advice would be really appreciated. Because he and I have complementary skills, this could be a great collaboration. I just want to make sure we do it as cleanly as possible from the get-go. -diane

    > I'm looking for some advice from someone who did the development with a Mac
    > and
    > PC. Basically, I'm looking for "best practices" on how we should
    > collaborate, without spoiling each other's work. He will primarily focus
    > on
    > aspects of the design of the projects: CSS, for example. I work mainly on
    > the
    > ColdFusion coding that allows us to extract information in .cfm pages
    > of
    > our databases.

    Consider using the check-in/check-out option in DW.

    -Darrel

  • Best practices for the .ini file, reading

    Hello LabViewers

    I have a pretty big application that does a lot of hardware communication with various devices. I created an executable because the software runs at multiple sites. Some settings are currently hardcoded; others, such as the camera focus, I put in an .ini file. The thinking was that these kinds of parameters may vary from one site to another and can be set by a user in the .ini file.

    I would now like to extend the application with the possibility of using two different versions of the key hardware device (an Atomic Force Microscope). I think it makes sense to do this using two versions of the .ini file. I intend to create two different .ini files, and a trained user could still adjust settings there, such as the camera focus, if necessary. The other settings they cannot touch. I also intend to force the user to select an .ini file when starting the executable, using a file dialog, unlike now where the (only) .ini file is read in automatically. If no .ini file is specified, the application would stop. Does this use of the .ini file make sense?

    My real question now revolves around how to manage reading the .ini file. My estimate is that between 20 and 30 settings will be stored in the .ini file. I see two possibilities, but I don't know which is the better choice or whether I'm missing a third:

    (1) (current solution) I created a reader VI that writes all the .ini values into the project's global variables. All other VIs only read the global variables (no other writes), to avoid race conditions.

    (2) I pass the path to the .ini file into the subVIs and read the values from the .ini file as needed. I can open it read-only.

    What is the best practice? What is more scalable? Advantages/disadvantages?

    Thank you very much

    1. I recommend using just one configuration file.  Just have a key that says which type of device is actually in use.  That will make things easier on the user, because they won't have to keep selecting the right file.

    2. I would use the globals.  There is no need to constantly open a file, get values and close it when the data is the same everywhere.  And since it's just a one-time read at startup, globals are perfect for this.
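    The same idea outside LabVIEW, as a rough Java-style sketch (the file name, keys and class name here are hypothetical, purely for illustration): read the single settings file once at startup into a read-only holder, including a key that says which device variant is in use, and have everything else read only from that holder.

    // Hypothetical sketch - names are made up, not part of the original application.
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.Properties;

    public final class DeviceConfig {
        private final String deviceType;   // e.g. "AFM_V1" or "AFM_V2"
        private final double cameraFocus;  // user-adjustable per site

        private DeviceConfig(String deviceType, double cameraFocus) {
            this.deviceType = deviceType;
            this.cameraFocus = cameraFocus;
        }

        // Read the single settings file once at startup; callers only read the returned object.
        public static DeviceConfig load(String path) throws IOException {
            Properties p = new Properties();
            try (FileInputStream in = new FileInputStream(path)) {
                p.load(in);
            }
            return new DeviceConfig(
                    p.getProperty("device.type", "AFM_V1"),
                    Double.parseDouble(p.getProperty("camera.focus", "1.0")));
        }

        public String getDeviceType() { return deviceType; }
        public double getCameraFocus() { return cameraFocus; }
    }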
