Grouping data via GroupDataModel and DataSource

Hello

I'm loading the XML below via a DataSource. The data loads correctly, and the items are displayed in the ListView.


  
  
    
      
    <dataresponse>
        <search>
            <results>
                <count>3</count>
                <rating>4.28</rating>
                <cars>
                    <car>
                        <id>375802</id>
                        <model>Civic</model>
                        <manufacturer>
                            <id>589</id>
                            <name>Honda</name>
                        </manufacturer>
                    </car>
                    <car>
                        <id>375803</id>
                        <model>Accord</model>
                        <manufacturer>
                            <id>590</id>
                            <name>Honda</name>
                        </manufacturer>
                    </car>
                    <car>
                        <id>375804</id>
                        <model>Camry</model>
                        <manufacturer>
                            <id>591</id>
                            <name>Toyota</name>
                        </manufacturer>
                    </car>
                </cars>
            </results>
        </search>
    </dataresponse>

The GroupDataModel and DataSource are set up as shown below...

    attachedObjects: [

        GroupDataModel
        {
            id: dataModel

            // This works and the header is displayed properly by rating
            //sortingKeys: ["rating"] 

            // Sorting does not work; the ListView header is displayed at the top with empty data
            sortingKeys: ["manufacturer.name"]
            sortedAscending: true
            grouping: ItemGrouping.ByFullValue
        },

        DataSource
        {
            id: dataSource

            // Load the XML data from a remote data source
            source: "http://www.mydataservice.com/getdata.php"
            query: "/dataresponse/search/results/cars"
            type: DataSourceType.Xml

            onDataLoaded:
            {
                // After the data is loaded, clear any existing items in the data
                // model and populate it with the new data
                dataModel.clear();
                dataModel.insertList(data)
            }
        }
    ]

The ListView is defined as follows...

ListView
{
   id: myListView
   // Associate the list view with the data model that's defined in the
   // attachedObjects list
   dataModel: dataModel

   // Sticky header
   layout: StackListLayout { headerMode: ListHeaderMode.Sticky }

   listItemComponents: [

     ListItemComponent
     {
        type: "item"

        // Use a standard list item to display the data in the data
        // model
        StandardListItem
        {
            imageSpaceReserved: false;
            title: ListItemData.car.model
            description: ListItemData.manufacturer.name  + " ID: " + ListItemData.manufacturer.id
        }
     }
   ]
}

So, in order to group the data by automaker, I thought I could just specify the sorting key sortingKeys: ["manufacturer.name"], but it does not work.

Any suggestions?

Hi joelajean,

According to the documentation at this link, "In a GroupDataModel, there are only two levels of items". Looking at how your XML file is formatted, I can see that manufacturer.name is a level deeper than the second level, and this is probably why you cannot retrieve the item. I suggest you either use an XmlDataModel instead of a GroupDataModel, or restructure your XML into two levels of items.
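As a sketch of the restructuring idea in plain JavaScript (the flattenCars helper and the manufacturerName key are illustrative names I'm inventing here, not part of the Cascades API), you could promote the nested manufacturer fields to top-level keys before inserting the items, and then sort on the flat key:

```javascript
// Sketch: promote nested manufacturer fields to top-level keys so a
// GroupDataModel could group on a one-level sorting key such as
// "manufacturerName". Helper and key names are illustrative only.
function flattenCars(items) {
    return items.map(function (item) {
        var flat = {};
        // Keep the original top-level fields (id, model, ...)
        for (var key in item) {
            flat[key] = item[key];
        }
        // Promote the nested values the model needs to sort/group on
        flat.manufacturerName = item.manufacturer.name;
        flat.manufacturerId = item.manufacturer.id;
        return flat;
    });
}

// Example items shaped like the XML above
var data = [
    { id: 375802, model: "Civic", manufacturer: { id: 589, name: "Honda"  } },
    { id: 375804, model: "Camry", manufacturer: { id: 591, name: "Toyota" } }
];

var flat = flattenCars(data);
console.log(flat[0].manufacturerName); // "Honda"
console.log(flat[1].manufacturerName); // "Toyota"
```

In onDataLoaded you would then call something like dataModel.insertList(flattenCars(data)) and set sortingKeys: ["manufacturerName"] on the GroupDataModel.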

Tags: BlackBerry Developers

Similar Questions

  • SQL query to group data by Code and dates

    Hello

    I have the following table structure

    col1             col2   col3
    January 21, 2012 tested Code1
    January 20, 2012 tested Code1
    June 1, 2012     tested Code1
    June 1, 2012     tested Code2
    June 4, 2012     tested Code3

    so now

    The output should be something like

    code  week1 week2 week3 week4 week5 ... up to the last 14 weeks from the date we run the query
    Code1 1     0     0     0     0
    Code2 1     0     0     0     0
    Code3 0     1     0     0     0

    where the 1s and 0s are in fact counts, not sums; and since we are currently in the second week, the week headings in this case should perhaps be

    code ... week3-May week4-May week1-Jun week2-Jun


    I was looking for suggestions on how to achieve this.

    I guess this would require some kind of pivot query?

    Thank you
    Sun

    Hello

    Here's how you can make this pivot in Oracle 10.2. (In fact, it will work in Oracle 9.1 or higher.)

    WITH  got_week_num  AS
    (
         SELECT  error_code, date_logged
         ,     1 + FLOOR ( ( TO_DATE (:end_dt_txt, 'DD-Mon-YYYY') - date_logged)
                         / 7
                     )     AS week_num
         FROM    data_analysis
         WHERE     date_logged     >= TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
         AND     date_logged     <  TO_DATE (:end_dt_txt,   'DD-Mon-YYYY') + 1
    )
    ,     all_weeks     AS
    (
         SELECT     LEVEL               AS week_num
         ,     TO_CHAR ( 1 + TO_DATE (:end_dt_txt, 'DD-Mon-YYYY')
                       - (7 * LEVEL)
                   , 'fmDD-Mon-YYYY'
                   )          AS heading
         FROM    dual
         CONNECT BY     LEVEL <= 1 + FLOOR ( ( TO_DATE (:end_dt_txt,   'DD-Mon-YYYY')
                                             - TO_DATE (:start_dt_txt, 'DD-Mon-YYYY')
                                  )
                                / 7
                                )
    )
    SELECT       NULL                                   AS error_code
    ,       MIN (CASE WHEN week_num =  1 THEN heading END)     AS week_1
    ,       MIN (CASE WHEN week_num =  2 THEN heading END)     AS week_2
    --       ...
    ,       MIN (CASE WHEN week_num =  5 THEN heading END)     AS week_5
    FROM       all_weeks
           --
         UNION ALL
                --
    SELECT       error_code
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  1 THEN 1 END))     AS week_1
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  2 THEN 1 END))     AS week_2
    --       ...
    ,       TO_CHAR (COUNT (CASE WHEN week_num =  5 THEN 1 END))     AS week_5
    FROM       got_week_num
    GROUP BY  error_code
                 --
    ORDER BY  error_code     NULLS FIRST
    ;
    

    Output:

    ERROR_CODE WEEK_1      WEEK_2      WEEK_5
    ---------- ----------- ----------- -----------
               4-Jun-2012  28-May-2012 7-May-2012
    a          3           0           0
    b          0           2           1
    c          0           0           1
    

    Once again, the number of columns, as well as their aliases, is hard-coded into the query.
    If you want the number of columns or their aliases to depend on the data in the table, then you need dynamic SQL. See {message identifier: = 3527823}

    Did you ever decide what a "week" is in this query?
    The query above makes week_1 end on the given date (:end_dt_txt). The first week (in other words, the one including :start_dt_txt) may have fewer than 7 days.
    If you want all the weeks to start on Monday (in which case the first and last weeks may have fewer than 7 days), see Stew's solution, using TRUNC (date_logged, 'IW').

  • I need to transfer my text and picture history from my old iPhone to my new iPhone, but have already moved all other data to my new phone via iTunes and spent time organizing it. How do I redo this transfer without losing anything?

    I need to transfer my text and picture history from my old iPhone to my new iPhone, but have already moved all other data to my new phone via iTunes and spent time organizing it. How do I redo this transfer without losing anything? I am transferring from a 5s to an SE.

    For your photos, try importing them to your computer and then syncing them back to the SE.

    Import photos and videos from your iPhone, iPad or iPod touch - Apple Support

    For your texts, did they move over with the backup restore?

  • Problem grouping data with Max and Min dates

    Ladies and gentlemen,

    I have a problem that I have tried to tackle from different angles. It's probably very easy for some of you, or some of you may have come across it before, so any help with this is greatly appreciated. I will describe it below.

    I have the following data:

    Site     User      Station    Code          Dstamp                                        Ref     Qty
    -------- --------- ---------- ------------- --------------------------------------------- ------- -------
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.43.06.566193000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.49.31.364224000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.49.47.413252000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.51.48.906793000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.51.56.947312000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.54.29.396052000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.54.37.444307000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.57.00.237546000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.57.04.285148000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.59.24.745162000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.59.44.774318000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 14.01.22.434940000 ref_1 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 14.01.51.291059000 ref_1 1125
    Site_1 user_1 RPT104 Activity_2 16 May 11 14.05.23.572211000 ref_2 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 14.06.01.058978000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.41.341612000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.375972000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.388699000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.401287000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.413361000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.425675000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.437360000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.449079000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.460697000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.472606000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.484031000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.495551000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.513645000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.49.530405000 ref_1 1125


    and I'm looking for it in this format:


    Site user station code start end ref Qty
    --------     ---------     ----------     -------------     ---------------------------------------------     ---------------------------------------------          ----------     -------
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.43.06.566193000 16 May 11 14.05.23.572211000 ref_1 1125
    Site_1 user_1 RPT104 Activity_2 16 May 11 14.05.23.572211000 16 May 11 14.06.01.058978000 ref_2 1125
    Site_1 user_1 RPT104 Activity_1 16 May 11 14.06.01.058978000 16 May 11 14.06.41.341612000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 16 May 11 14.06.41.341612000 16 May 11 14.06.49.530405000 + 4 secs (i.e. 16 May 11 14.06.53.530405000) ref_1 1125


    I can get the start and end times without problem by using the initial data twice, offset by rownum; but using the MAX and MIN functions on that data, I get:

    Site user station code start end ref Qty
    --------     ---------     ----------     -------------     ---------------------------------------------     ---------------------------------------------          ----------     -------
    Site_1 user_1 RPT104 Activity_1 16 May 11 13.43.06.566193000 *16 May 11 14.06.41.341612000* ref_1 1125
    Site_1 user_1 RPT104 Activity_2 16 May 11 14.05.23.572211000 16 May 11 14.06.01.058978000 ref_2 1125
    Site_1 user_1 RPT104 Activity_3 *16 May 11 14.06.41.341612000* 16 May 11 14.06.49.530405000 + 4 secs (i.e. 16 May 11 14.06.53.530405000) ref_1 1125

    which is missing the 3rd line of the previous dataset (the incorrect values are marked with asterisks, in bold if it renders) and assigns the wrong end time.

    I think the solution may have something to do with the DENSE_RANK() function (with some ORDER BY code, start), but I'm not too familiar with it, and I think the fact that the Start column data is unique affects how it works.

    If anyone can offer any help or point me in the right direction, I'll offer eternal gratitude and owe you a drink should we ever meet!

    see you soon

    Published by: MickyMick on June 7, 2011 03:21

    BobLilly wrote:
    Aketi's Tabibitosan method can be applied here (see {thread identifier = 1005478})

    Site_1 user_1 RPT104 Activity_1 2011-05-16 13.43.06.566193000 2011-05-16 14.05.23.572211000 ref_1 1125
    Site_1 user_1 RPT104 Activity_2 2011-05-16 14.05.23.572211000 2011-05-16 14.06.01.058978000 ref_2 1125
    Site_1 user_1 RPT104 Activity_1 2011-05-16 14.06.01.058978000 2011-05-16 14.06.41.341612000 ref_1 1125
    Site_1 user_1 RPT104 Activity_3 2011-05-16 14.06.41.341612000 2011-05-16 14.06.45.341612000 ref_1 1125

    According to the OP, we need 16 May 11 14.06.49.530405000 + 4 secs. In any case, using the start_of_group method:

    With t as (
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.43.06.566193000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.49.31.364224000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.49.47.413252000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.51.48.906793000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.51.56.947312000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.54.29.396052000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.54.37.444307000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.57.00.237546000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.57.04.285148000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.59.24.745162000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 13.59.44.774318000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 14.01.22.434940000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 14.01.51.291059000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_2' as Code
    , to_timestamp('16-MAY-11 14.05.23.572211000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_2' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_1' as Code
    , to_timestamp('16-MAY-11 14.06.01.058978000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.41.341612000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.375972000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.388699000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.401287000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.413361000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.425675000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.437360000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.449079000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.460697000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.472606000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.484031000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.495551000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.513645000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual union all
    Select 'Site_1' as Site, 'user_1' as Usr, 'RPT104' as Station, 'Activity_3' as Code
    , to_timestamp('16-MAY-11 14.06.49.530405000', 'DD-MON-RR HH24:MI:SS.FF') as DTStamp, 'ref_1' as Ref, 1125 as Qty from dual
    ),
    t1 as (
           select  t.*,
                   lead(DTStamp,1,DTStamp + interval '4' second) over(order by DTStamp) ENDTS,
                   case
                     when     lag(Site) over(order by DTStamp)  = Site
                          and
                              lag(Usr) over(order by DTStamp)  = Usr
                          and
                              lag(Station) over(order by DTStamp)  = Station
                          and
                              lag(Code) over(order by DTStamp)  = Code
                          and
                              lag(Ref) over(order by DTStamp)  = Ref
                          and
                              lag(Qty) over(order by DTStamp)  = Qty
                       then 0
                     else 1
                   end start_of_group
             from  t
          ),
    t2 as (
           select  t1.*,
                   sum(start_of_group) over(order by DTStamp) grp
             from  t1
          )
    select  Site,
            Usr,
            Station,
            Code,
            min(DTStamp) STARTTS,
            max(ENDTS) ENDTS,
            Ref,
            Qty
      from  t2
      group by grp,
               Site,
               Usr,
               Station,
               Code,
               Ref,
               Qty
      order by STARTTS
    /
    
    SITE   USR    STATIO CODE       STARTTS                             ENDTS                               REF          QTY
    ------ ------ ------ ---------- ----------------------------------- ----------------------------------- ----- ----------
    Site_1 user_1 RPT104 Activity_1 16-MAY-11 01.43.06.566193000 PM     16-MAY-11 02.05.23.572211000 PM     ref_1       1125
    Site_1 user_1 RPT104 Activity_2 16-MAY-11 02.05.23.572211000 PM     16-MAY-11 02.06.01.058978000 PM     ref_2       1125
    Site_1 user_1 RPT104 Activity_1 16-MAY-11 02.06.01.058978000 PM     16-MAY-11 02.06.41.341612000 PM     ref_1       1125
    Site_1 user_1 RPT104 Activity_3 16-MAY-11 02.06.41.341612000 PM     16-MAY-11 02.06.53.530405000 PM     ref_1       1125
    
    SQL> 
    

    SY.

  • Storing clustered data via SQL Dev

    Hello
    I have clustered data and can store the cases and their cluster IDs in a new table using ODM. I am now clustering this data via SQL Developer, but I cannot find any field for storing the cluster ID. How do I do this? Can I put something in the Apply or Details node?

    Hi Nasiri,
    In your description, you have only the Cluster node as input into the Apply node. You should connect a data source node as well. Just to get this working, why not connect the input data source of your Cluster node to the Apply node? The Apply node should have 2 inputs: a data source node as well as a model node.
    Thank you, Mark

  • How do you export groups of tabs, bookmarks, and history from one computer to another

    I found myself replacing my computer for various reasons and need to move all my data from my old computer to the new one. However, when it came to Firefox and my tab groups, bookmarks, and history, I have not found any real way to export them to the other computer. Is it possible to save all this information and then move it to the new computer? It's exhausting to lose all my saved information as well as all of my tabs, because I need them, and it takes a long time to copy and paste all my addresses into a text file and save them like that.

    In the list of linked articles, I read about saving, exporting, and restoring everything except tab groups.

    Of course, it would be nice if they just didn't go away, but as long as they do (as with the last upgrade, FF 8.0) I would like to know where they are stored (so I can retrieve them from Time Machine) and how I can export them for use on other devices.

    It would also be useful for FF Sync to synchronize them - there is no mention of this - which is why I have stayed with Xmarks for the moment, as it also lets me sync with Chrome and Safari.

  • How do I make my computer send all audio data via an optical audio cable instead of the headphone jack?

    Howdy,

    To listen to online radio or CDs played from my computer, I used to connect the computer to my home entertainment system, from the computer's headphone jack to the input port on the entertainment system.

    Now I wanted better sound, so I bought an optical audio cable. However, I don't know how to tell my computer to send the audio signal to the home entertainment system via the optical digital cable rather than through the headphone jack. If I just disconnect the cable between the headphone jack and the input port and have only the PC and the system connected with the optical audio cable, I can't hear any sound. I suspect the computer is not sending the data through the optical audio port. I haven't been able to figure out how to adjust the settings on my computer so that, from now on, all sound is output through the optical audio port.

    (1) I want to do this especially for my HP desktop (configuration below), and any advice on how to do this would be appreciated.

    (2) I also have an HP laptop (configuration below), and I was wondering if this would be feasible there too (even though it does not have an optical audio output, it has an HDMI output - can that be converted to optical audio?)

    Thanks in advance!

    My desktop is:

    HP ENVY h8xt,
    • Windows 8 64
    • 3rd generation Intel(R) quad-core Core i7-3770 processor [3.4 GHz, 8 MB shared cache]
    • 12GB DDR3 1333 MHz SDRAM [3 DIMMS]
    • 1 TB 7200 RPM SATA hard drive
    • No secondary hard drive
    • 1 GB AMD Radeon HD 7570 [DVI, HDMI, DP, VGA adapter]
    • 300W power supply
    • DVD SuperMulti burner
    • LAN wireless-N (1 x 1) card
    • 15-in-1 card reader, 2 USB 2.0 (front), 2 USB 3.0 (top)
    • No additional desktop software
    • No additional security software
    • No TV Tuner
    • Beats Audio(TM) - built-in studio-quality sound
    • HP USB volume control keyboard and mouse with Win 8 keyboard
    • Adobe Premiere Elements and Photoshop Elements 10

    And the laptop:

    HP ENVY 15t Quad

    Hello @_goma,

    Welcome to the HP Forums, I hope you enjoy your experience!

    I read your post about how you want to send all the audio data via an optical audio cable instead of the headphone jack of your computer, and I'd be happy to help you in this case!

    To configure your desktop computer to activate the optical audio cable, I advise you to follow the steps below:

    Step 1. Press the Windows key on your desktop

    Step 2. Type "Control Panel"

    Step 3. Select "Control Panel" in the upper right corner

    Step 4. Select Sound

    Step 5. On the Playback tab, right-click in the white box under the available devices

    Step 6. Select "Show Disabled Devices" and "Show Disconnected Devices"

    Step 7. Connect your optical audio cable

    Step 8. Select your optical audio device as the default device, and click "Enable"

    Because it is not possible to convert your laptop's HDMI output to optical audio, the laptop unfortunately cannot be connected with an optical audio cable.

    Please re-post with the results of your troubleshooting, and I look forward to your response!

    Regards

  • Just got an email from "Windows Live Hotmail", _ @_. _ asking me to reply with my name, password, date of birth, and country.

    Received suspicious email

    Just got an email from "Windows Live Hotmail", * email address removed for privacy * asking me to reply with my name, password, date of birth, and country.  ????  "Permanent suspension" if I don't comply... links to all the Microsoft pages, including the privacy statement... I will not answer, but I worry that people will fall for this if it is a scam.  Any thoughts?

    Hello

    It's a SCAM!

    In the United States, you can contact the FBI, the Attorney General, the police authorities, and consumer
    watch groups. Arm yourself with knowledge.

    The Internet Crime Complaint Center (IC3) is a partnership between the Federal Bureau of Investigation
    (FBI) and the National White Collar Crime Center (NW3C), funded in part by the Bureau of Justice Assistance
    (BJA).
    http://www.ic3.gov/complaint/default.aspx

    No, Microsoft does not email or call you unsolicited. Nor would they know if there were any errors
    on your computer. Those are frauds or scams to get your money, or worse, to steal your
    identity.

    Avoid scams that fraudulently use the Microsoft name - Microsoft does not make unsolicited
    phone calls to help you fix your computer
    http://www.Microsoft.com/protect/fraud/phishing/msName.aspx

    Scams and hoaxes
    http://support.Microsoft.com/contactus/cu_sc_virsec_master?ws=support#tab3

    Microsoft Support Center consumer
    https://consumersecuritysupport.Microsoft.com/default.aspx?altbrand=true&SD=GN&ln=en-us&St=1&wfxredirect=1&gssnb=1

    Microsoft technical support
    http://support.Microsoft.com/contactus/?ws=support#TAB0

    Microsoft - contact technical support
    http://Windows.Microsoft.com/en-us/Windows/help/contact-support

    ==============================================================

    Windows Live Solution Center - HotMail - HotMail Forums Solutions
    http://answers.Microsoft.com/en-us/windowslive

    Windows Help and How-to
    http://Windows.Microsoft.com/en-us/Windows/help

    Hotmail - Forums
    http://answers.Microsoft.com/en-us/windowslive/Forum/Hotmail

    Hotmail - Solutions and additional how-to resources
    http://Windows.Microsoft.com/en-us/Hotmail/help-center

    Hotmail - how to - FAQ
    http://Windows.Microsoft.com/en-us/Hotmail/Hotmail-how-do-i-FAQ

    Why is my account temporarily blocked?
    http://Windows.Microsoft.com/en-us/Hotmail/account-temporarily-blocked-FAQ

    What should I do if my account has been hacked?
    http://windowslivehelp.com/solution.aspx?SolutionID=6ea0c7b3-1473-4176-b03f-145b951dcb41

    Hotmail hacked? Take these steps
    http://blogs.msdn.com/b/securitytipstalk/archive/2010/07/07/Hotmail-hacked-take-these-steps.aspx

    I hope this helps.

    Rob Brown - Microsoft MVP

  • Transmission of data via the Ad Hoc network in LabVIEW

    Hello

    I'm trying to transmit some data (position, speed, etc.) from a laptop computer to a host computer.

    The laptop and the desktop both have wireless cards installed, so I thought it would be a good

    idea to transmit the data via Wi-Fi using TCP.

    However, the problem is that there is no router/modem in the area where the laptop sits,

    so I created an ad-hoc (computer-to-computer) network between the laptop and the desktop computer.

    Then I used "ipconfig /all" in the Windows command prompt to find the IP addresses, as follows.

    IP address wireless laptop: 169.254.165.72

    IP address Wireless Desktop: 169.254.102.126

    Desktop Ethernet IP address: 129.94.229.230

    I then used the example code in the TCP library named "Data Server.vi" and "Data Client.vi".

    On the server side, I listen on port 6340 on both the desktop's wireless and ethernet adapters.

    On the client side, I open a connection to IP address 169.254.102.126 on port 6340.

    However, the connection cannot be established. Does anyone know why this is the case?

    Note: I have disabled all firewalls.

    Interestingly, when 2 computers are connected to the same modem, their IP addresses

    differ only in the last number (e.g. XXX.XXX.XXX.123 vs XXX.XXX.XXX.256), but when an ad

    hoc network is established, their IP addresses have 2 numbers that differ

    (XXX.XXX.102.126 vs XXX.XXX.165.72). I wonder if this has something to do with the failure.

    Thanks for all your help.

    Ron Liou

    I somehow got this working.

    What I did:

    - Set up the ad hoc network again between the laptop and the desktop

    - Verified the IP addresses were still the same

    > Laptop wireless IP address: 169.254.165.72

    > Laptop wireless subnet mask: 255.255.0.0

    > Desktop wireless IP address: 169.254.102.126

    > Desktop wireless subnet mask: 255.255.0.0

    - Switched on ICS (Internet Connection Sharing) on the desktop's ethernet card

    (This resulted in different IP addresses being assigned)

    > Laptop wireless IP address: 192.168.137.21

    > Laptop wireless subnet mask: 255.255.255.0

    > Desktop wireless IP address: 192.168.137.1

    > Desktop wireless subnet mask: 255.255.255.0

    Now I use the desktop's IP address (192.168.137.1) to open the connection, and it works!

    I am very pleased that it works, but I would like an explanation of why ICS is required.

    Thank you!
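
    On the subnet question above: a quick check shows that both address pairs are in fact on the same subnet under their respective masks, so the pre-ICS failure was probably not a routing/subnet problem (adapter or APIPA quirks are a more likely culprit, though the thread doesn't confirm this). A small sketch of the mask arithmetic:

    ```python
    import ipaddress

    def same_subnet(ip_a: str, ip_b: str, mask: str) -> bool:
        """Return True if both addresses fall in the same network under the mask."""
        net_a = ipaddress.ip_network(f"{ip_a}/{mask}", strict=False)
        net_b = ipaddress.ip_network(f"{ip_b}/{mask}", strict=False)
        return net_a == net_b

    # The APIPA addresses from the ad-hoc setup share a /16 network:
    print(same_subnet("169.254.165.72", "169.254.102.126", "255.255.0.0"))
    # The ICS-assigned addresses share a /24 network:
    print(same_subnet("192.168.137.21", "192.168.137.1", "255.255.255.0"))
    ```

    Both calls print True: with a 255.255.0.0 mask, only the first two octets have to match for two hosts to be on-subnet.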

  • When I try to use the Windows Update link for my XP computer I get a message indicating that the location where the Windows Update stores data has changed and it needs to be repaired. How can I solve this problem?

    When I try to use the Windows Update link for my XP computer, even after using the Microsoft Fix It tool, I get a message indicating that the location where Windows Update stores data has changed and must be repaired. How can I solve this problem?

    I'm not that computer literate and do not understand what needs to be fixed.

    This problem started a few weeks ago, when I noticed that I hadn't received any of the recent automatic updates that I regularly get. So I tried to update manually via the Control Panel.

    I use ESET NOD32 Antivirus software.

    Hello

    1. What is the exact error message or error code?

    2. Did you make any changes to the computer before this problem started?

    3. Have you tried checking for updates?

    I would suggest trying the following methods and check if it helps.

    Method 1:

    Reset Windows Update components and then try to download the updates.

    How to reset the Windows Update components?

    http://support.Microsoft.com/kb/971058

    Warning: This section, method, or task contains steps that tell you how to modify the registry. However, serious problems can occur if you modify the registry incorrectly. Therefore, make sure that you proceed with caution. For added protection, back up the registry before you edit it. Then you can restore the registry if a problem occurs. For more information about how to back up and restore the registry, see the following article in the Microsoft Knowledge Base: http://support.microsoft.com/kb/322756

     

    Method 2:

    Run the System File Checker tool (sfc /scannow), and then check whether the issue persists.

    Description of Windows XP and Windows Server 2003 System File Checker (Sfc.exe):

    http://support.Microsoft.com/kb/310747

    Please respond with more information so that we could help you more.

  • How to send data via TCP

    Hi all

    I am trying to write a very simple application that will transfer data via TCP to another computer running a TCP server. (About 3K of data)

    Although I followed the code example on using socket connections in the 4.6 Java Development Guide, page 101 (http://na.blackberry.com/eng/deliverables/3802/development.pdf), I've been running into issues where the data transfer hangs after an undetermined number of bytes.

    Does anyone have sample code for opening a TCP connection and sending data? This seems to be a very common thing to do, so I don't know what my problem is.

    Thank you
    Daniel

    I'm not going to answer your first question; I think we must resolve this problem before you look at anything else (and I suspect they are all related).

    I would almost guarantee that you are running your socket send and receive on the event thread. The event thread is what handles your menu interaction, trackball movement, etc. If you perform a long-running or blocking operation on this thread, your device will freeze.

    Look at the Socket demo and move your network processing to a separate thread, as the sample does.

    To confirm that your socket processing is executing on the event thread, you can add this code, run it in the simulator, watch the output window of the debugger, and set a breakpoint on the System.out line.

    if (Application.getApplication().isEventThread()) {
        System.out.println("Running on the event thread, and should not be");
    }
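
    The same fix — keep blocking network I/O off the UI/event thread — can be sketched in Python for illustration (the Java above is BlackBerry-specific; this only shows the pattern of handing the blocking connect/send to a worker thread):

    ```python
    import socket
    import threading

    def send_in_background(host: str, port: int, payload: bytes) -> threading.Thread:
        """Run the blocking connect/sendall on a worker thread, so the
        caller (e.g. a UI/event loop) never blocks on the network."""
        def worker():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.connect((host, port))
                s.sendall(payload)  # blocks only the worker, not the caller

        t = threading.Thread(target=worker, daemon=True)
        t.start()
        return t  # caller can join() later, or poll is_alive()
    ```

    The returned thread handle lets the UI check progress without ever waiting on a socket itself, which is exactly what freezes the device when done on the event thread.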

  • Getting at Eloqua 9 data via the API?

    Can anyone shed light on what is available via the Eloqua 9 API to get data out of Eloqua, and also to push data from an external system (such as SQL 2012) into Eloqua? We are trying to implement data cleanup efforts in our new Data Quality Services tool and we would like to integrate it with Eloqua. Can you give us some insight and detail on what we can and cannot do?

    Dennis,

    It depends on your business case here and the volume of data.

    Eloqua offers various SOAP and REST API options.

    There are also low-volume and bulk options.

    For E9, you can start by going through the SOAP API documentation at the following link:

    https://www.eloquatrainingcenter.com/Portal/documentation/API/default.htm

    For authentication and low-volume use, use the WSDL for the Service: https://secure.eloqua.com/API/1.2/Service.svc?wsdl

    For higher data volumes, use the WSDL for the Data Transfer Service: https://secure.eloqua.com/API/1.2/DataTransferService.svc?wsdl

    As a challenge, you can connect SQL 2012 to Eloqua by defining HTTP endpoints and using script components for field mappings, transferring the data using SSIS.

    For a more robust solution, I would develop a standalone .NET application, which you can call from SSIS for automation and business intelligence.
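
    The SOAP services referenced above are ultimately called by posting an XML envelope over HTTP. As a rough, library-agnostic illustration of that shape — the operation name `Retrieve` and the namespace `urn:example:eloqua` below are placeholders, not the real Eloqua contract; the actual element names come from the WSDL:

    ```python
    import xml.etree.ElementTree as ET

    SOAP_ENV = "http://www.w3.org/2003/05/soap-envelope"

    def build_envelope(operation: str, params: dict, ns: str) -> str:
        """Build a minimal SOAP 1.2 envelope for one operation.
        `operation` and `ns` are placeholders; real names come from the WSDL."""
        env = ET.Element(f"{{{SOAP_ENV}}}Envelope")
        body = ET.SubElement(env, f"{{{SOAP_ENV}}}Body")
        op = ET.SubElement(body, f"{{{ns}}}{operation}")
        for key, value in params.items():
            ET.SubElement(op, f"{{{ns}}}{key}").text = str(value)
        return ET.tostring(env, encoding="unicode")

    # Hypothetical call shape; element names are NOT the real Eloqua schema:
    envelope = build_envelope("Retrieve", {"entityType": "Contact", "id": 42},
                              "urn:example:eloqua")
    ```

    In practice you would let a SOAP client library (or the SSIS/.NET tooling mentioned above) generate these envelopes from the WSDL rather than building them by hand.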

    Good luck

    Daniel Nader Shaheen

  • Does vMotion route data via the vCenter Server?

    Hello

    I need help to solve a query I have.

    We currently have a vCenter server that manages several clusters, but also some standalone hosts with local (non-shared) storage. The standalone hosts are physically located at remote sites; some sites have more than one of them. We have a requirement to move a guest VM from one standalone host to another standalone host at the same remote site.

    As there is no shared storage, the VM must be powered off before attempting the migration.

    The virtual machine in question has a 4 TB thick-provisioned VMDK that is migrated between the physical hosts.

    I think that when the migration is launched, the data will be limited to the subnet that the two physical hosts are connected to, and will therefore use the bandwidth available via the local network equipment and never leave the physical site. My colleague suggested, however, that the data travels through the vCenter server. Of course, that would be a problem when moving this much data: it would effectively mean that the 4 TB would leave the remote site, travel via the vCenter server, and then return to the remote site. Not only would this be very slow, it could also have a negative impact on bandwidth to and from the site.

    So my question is who is correct? (Assuming that one of us is!)

    Thank you

    When you start a cold migration of a virtual machine by selecting migrate both (host and storage), and your hosts are located at the same site, the data moves directly between the source and target ESXi hosts; it does not go all the way to your vCenter server.

    If your environment is vSphere 5.1 or higher and you use the vSphere Web Client, you can move the VM while it is powered on and residing on the host's local storage. This feature is called cross-host migration, and it is available only via the Web Client.

  • Some servers were built via vCenter and do not appear in vRA. How can we solve this problem?

    Some servers were built via vCenter and do not appear in vRA. How can we solve this problem?

    Please perform data collection and create a dummy blueprint.

    Go to Infrastructure Organizer, select the cluster, fill in details such as the blueprint and other information in the next tab, and click Finish.

    You can manage these servers through vRA.

  • How to rename a datastore via the command line during automated installation

    Hi, during the automated installation of my ESX servers, in the post-install script I want to rename the local datastore on ESX 3.5. I tried to find a solution here in the communities and also tried the following commands:

    • vimsh -ne "hostsvc/datastore/rename oldname newname"

    • vmware-vim-cmd hostsvc/datastore/rename oldname newname

    but nothing happens when I use these commands and I get no error message.

    Does anyone know what I am doing wrong, and how I can rename local datastores via the command line?

    Thankx

    Frank

    We have a set of post-install scripts that are executed after the first reboot following the build; by default the local datastore should be named storage1.

    You can do the following:

    ln -sf $(readlink -f /vmfs/volumes/storage1)  /vmfs/volumes/$(hostname -s)-local-storage
    

    or use any naming convention you want

    =========================================================================

    William Lam

    VMware vExpert 2009

    Scripts for VMware ESX/ESXi and resources at: http://engineering.ucsb.edu/~duonglt/vmware/
