screen_window_t

I am trying to capture a screenshot of my app. I am calling:

screen_read_window(screen_window_t win, screen_buffer_t buf, int count, const int *save_rects, int flags)

For the screen_window_t argument I get: "win" is used uninitialized in this function [-Wuninitialized]

I believe I get this because my application is based largely on QML, so I never created any windows in C++. My question is: how can I either put everything I have into a window, or get the handle of the window that QML is already showing?

I don't know what GLOBALS_PTR() is, but it is something specific to that particular app. You need to get the mainWindow of the Cascades Application object; that is what its winId() gives you. You cannot just take an uninitialized variable called "win" and expect it to be useful here: it is a handle to an existing window, so it must be set correctly, whereas in your code it holds whatever stray value happened to be on your stack at the time.
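For illustration, here is a minimal sketch of what the call could look like once "win" actually refers to an existing window (the helper name capture_window and the pixmap/format choices are my own assumptions, not code from this thread):

    #include <screen/screen.h>

    /* Read the contents of an existing window into a pixmap buffer.
     * "win" must be a valid handle (for a Cascades/QML app, the handle behind
     * the Application's mainWindow winId()), not an uninitialized variable. */
    static int capture_window(screen_window_t win)
    {
        screen_context_t ctx = NULL;
        screen_get_window_property_pv(win, SCREEN_PROPERTY_CONTEXT, (void **)&ctx);

        int size[2] = {0, 0};
        screen_get_window_property_iv(win, SCREEN_PROPERTY_BUFFER_SIZE, size);

        /* Create a pixmap whose buffer the window contents can be read into. */
        screen_pixmap_t pixmap = NULL;
        screen_create_pixmap(&pixmap, ctx);
        int usage = SCREEN_USAGE_READ | SCREEN_USAGE_NATIVE;
        screen_set_pixmap_property_iv(pixmap, SCREEN_PROPERTY_USAGE, &usage);
        int format = SCREEN_FORMAT_RGBA8888;
        screen_set_pixmap_property_iv(pixmap, SCREEN_PROPERTY_FORMAT, &format);
        screen_set_pixmap_property_iv(pixmap, SCREEN_PROPERTY_BUFFER_SIZE, size);
        screen_create_pixmap_buffer(pixmap);

        screen_buffer_t buf = NULL;
        screen_get_pixmap_property_pv(pixmap, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&buf);

        /* One rectangle covering the whole window. */
        int rect[4] = {0, 0, size[0], size[1]};
        int rc = screen_read_window(win, buf, 1, rect, 0);

        /* ... the pixels are now available through the pixmap buffer's
         * SCREEN_PROPERTY_POINTER / SCREEN_PROPERTY_STRIDE properties ... */

        screen_destroy_pixmap(pixmap);
        return rc;
    }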

Tags: BlackBerry Developers

Similar Questions

  • img_load_file() Help!

I'm having a devil of a time trying to understand this function. Here's what I have so far:

    img_t           mindLogo;
    img_lib_t       ilib = NULL;
    const char      *LP  = "assets/logo.png";
    int             logo;

    mindLogo.flags  = 0;
    mindLogo.w      = 600;
    mindLogo.flags |= IMG_W;
    mindLogo.h      = 400;
    mindLogo.flags |= IMG_H;
    logo = img_load_file(ilib, LP, NULL, &mindLogo);
    

    It is part of a teach-myself-how-windows-and-files-interact exercise. I'm pretty sure that my problem is with the const char *path argument of img_load_file(). As for why I set the size: I thought it would be safer in case of future screen size or orientation changes, but no luck.

    Here is the actual working code so far, below. It just creates a black parent window; I am trying to create a child image to display in the center, with some text shortly after. But the point is to turn the file loading into a function for beginner practice. Excuse the heavy commenting, normally I wouldn't have this.

    #include <stdio.h>
    #include <stdlib.h>
    #include <screen/screen.h>
    
    //Start create window variables and define needed values........................
    screen_context_t    hscreen_c = 0;              //Define context data type.
    screen_window_t     hscreen_w = 0;          //Define window data type.
    static const char   *hmgroups = "Home Windows"; //Pointer to home group.
    screen_buffer_t     hbuffer;
    int hformat_w   = SCREEN_FORMAT_RGBA8888;       //
    int husage_w    = SCREEN_USAGE_NATIVE;          //
    int hvis        = 0;                            //
    int hcolor      = 0xff000000;                   //
    int hshap[4]    = {0, 0, 1, 1};                 //
    int hdims[2]    = {0, 0};                       //
    int hcount = 0;                         //
    //End create window variables and define needed values..........................
    
    //Start create child window variables and define needed values..................
    screen_window_t     hcscreen_w = 0;             //Define child window data type.
    //End create child window variables and define needed values....................
    
    int main()
    {
        //Start creating a parent window............................................
        if(screen_create_context(&hscreen_c, SCREEN_APPLICATION_CONTEXT) != 0)//Create context first.
        {
            return EXIT_FAILURE;
        }//end if
    
        if(screen_create_window(&hscreen_w, hscreen_c) != 0)//Create window.
        {
            screen_destroy_context(hscreen_c);
            return EXIT_FAILURE;
        }//end if
    
        if(screen_create_window_group(hscreen_w, hmgroups) != 0)//Create group for window.
        {
            return EXIT_FAILURE;
        }//end if
    
        if(screen_set_window_property_iv(hscreen_w, SCREEN_PROPERTY_FORMAT, &hformat_w) != 0)//Set window color display property.
        {
            return EXIT_FAILURE;
        }//end if
    
        if(screen_set_window_property_iv(hscreen_w, SCREEN_PROPERTY_USAGE, &husage_w) != 0)//Set window usage property.
        {
            return EXIT_FAILURE;
        }//end if
    
        if(screen_set_window_property_iv(hscreen_w, SCREEN_PROPERTY_VISIBLE, &hvis) != 0)//Set window visibility.
        {
            return EXIT_FAILURE;
        }
    
        if(screen_set_window_property_iv(hscreen_w, SCREEN_PROPERTY_COLOR, &hcolor) != 0)//Set window color.(Black)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_set_window_property_iv(hscreen_w, SCREEN_PROPERTY_BUFFER_SIZE, hshap+2) != 0)//Set window shape.
        {
            return EXIT_FAILURE;
        }
    
        screen_get_context_property_iv(hscreen_c, SCREEN_PROPERTY_DISPLAY_COUNT, &hcount);
    
        screen_display_t *hdisps = calloc(hcount, sizeof(screen_display_t));
        screen_get_context_property_pv(hscreen_c, SCREEN_PROPERTY_DISPLAYS, (void **)hdisps);
    
        screen_display_t hdisp = hdisps[0];
        free(hdisps);
    
        screen_get_display_property_iv(hdisp, SCREEN_PROPERTY_SIZE, hdims);
    
        if(screen_set_window_property_iv(hscreen_w, SCREEN_PROPERTY_SOURCE_SIZE, hdims) != 0)//Set window
        {
            return EXIT_FAILURE;
        }
        int hpos[2] = { -hdims[0], -hdims[1] };
        screen_set_window_property_iv(hscreen_w, SCREEN_PROPERTY_SOURCE_POSITION, hpos);
    
        if(screen_create_window_buffers(hscreen_w, 1) != 0)//Set number of screen buffers.
        {
            return EXIT_FAILURE;
        }//end if
    
        screen_get_window_property_pv(hscreen_w, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&hbuffer);
        screen_post_window(hscreen_w, hbuffer, 1, hshap, 0);
    
        //End creating a parent window..............................................
        //Note: There is now a black screen that is 1024 x 600.  Objects go here.
    
        //Start destroying window. (To be done on exit.)............................
        if(screen_destroy_window(hscreen_w) != 0)//Destroy the window first.
        {
            return EXIT_FAILURE;
        }
    
        if(screen_destroy_context(hscreen_c) != 0)//Destroy the context last.
        {
            return EXIT_FAILURE;
        }
        //End destroying window.....................................................
    
    return 0;
    }//end main
    

    To make sure that I don't confuse things, I just dropped the image loading code in right after the parent window is displayed. I also #include the img header. Can someone maybe lead me onto the right track?

    Solution is to link the img library. Also, my code above is correct with the change of:

    largeLogo = img_load_file(logolib, LP, NULL, &mindLogo);
    

    TO

    largeLogo   = img_load_file(ilib, LP, NULL, &mindLogo);
    

    See other post for more information:

    http://supportforums.BlackBerry.com/T5/native-SDK-for-BlackBerry-Tablet/IMG-lib-attach-how-to-questi...
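    For reference, a minimal sketch of the sequence the linked post describes, condensed into one helper (the function name load_logo is hypothetical; it assumes libimg is linked and the path is valid for your project layout):

    #include <img/img.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical helper: attach libimg, load a PNG scaled to w x h, detach. */
    static int load_logo(const char *path, int w, int h, img_t *out)
    {
        img_lib_t ilib = NULL;
        if (img_lib_attach(&ilib) != IMG_ERR_OK) {
            fprintf(stderr, "img_lib_attach failed (is libimg linked?)\n");
            return -1;
        }

        memset(out, 0, sizeof(*out));       /* start with no flags set */
        out->w = w;  out->flags |= IMG_W;   /* ask the decoder to scale */
        out->h = h;  out->flags |= IMG_H;

        int rc = img_load_file(ilib, path, NULL, out);
        img_lib_detach(ilib);
        return rc == IMG_ERR_OK ? 0 : -1;
    }

    Called, say, as load_logo("assets/logo.png", 600, 400, &mindLogo), matching the values used above.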

  • img_lib_attach(): how-to questions?

    I am very confused on how to load it.

    I understand the code from the documentation:

    img_lib_t ilib = NULL;
    int rc;
    rc = img_lib_attach(&ilib);
    

    How do I bring in img_codec_png.so? It gives me an undefined reference error. I'm a little confused about what the documentation says about the config file in relation to this function.

    I'm trying to insert it into the main function of a learning file that I have been working on. If I solve this problem, I think it will help me with the img_load_file() error I'm receiving. The working code is below; it gives you a black parent window and joins a child window to it. The child is not defined or posted to the screen yet. (Excuse the if-heavy functions, I'm trying to practice defensive coding.)

    #include <stdio.h>
    #include <stdlib.h>
    #include <screen/screen.h>
    #include <img/img.h>
    
    screen_context_t    homeContext = 0;
    screen_window_t     homeWindow = 0;//parent window
    static const char   *homeGroup = "Home Windows";
    screen_buffer_t     homeBuffer;
    int hformat_w   = SCREEN_FORMAT_RGBA8888;
    int husage_w    = SCREEN_USAGE_NATIVE;
    int hvis        = 0;
    int hcolor      = 0xff000000;
    int hshap[4]    = {0, 0, 1, 1};
    int hdims[2]    = {0, 0};
    int hcount = 0;     
    
    screen_window_t     hcscreen_w = 0;//child window
    
    int main()
    {
        if(screen_create_context(&homeContext, SCREEN_APPLICATION_CONTEXT) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_create_window(&homeWindow, homeContext) != 0)
        {
            screen_destroy_context(homeContext);
            return EXIT_FAILURE;
        }
    
        if(screen_create_window_group(homeWindow, homeGroup) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_set_window_property_iv(homeWindow, SCREEN_PROPERTY_FORMAT, &hformat_w) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_set_window_property_iv(homeWindow, SCREEN_PROPERTY_USAGE, &husage_w) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_set_window_property_iv(homeWindow, SCREEN_PROPERTY_VISIBLE, &hvis) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_set_window_property_iv(homeWindow, SCREEN_PROPERTY_BUFFER_SIZE, hshap+2) != 0)
        {
            return EXIT_FAILURE;
        }
    
        screen_get_context_property_iv(homeContext, SCREEN_PROPERTY_DISPLAY_COUNT, &hcount);
    
        screen_display_t *hdisps = calloc(hcount, sizeof(screen_display_t));
        screen_get_context_property_pv(homeContext, SCREEN_PROPERTY_DISPLAYS, (void **)hdisps);
    
        screen_display_t hdisp = hdisps[0];
        free(hdisps);
    
        screen_get_display_property_iv(hdisp, SCREEN_PROPERTY_SIZE, hdims);
    
        if(screen_set_window_property_iv(homeWindow, SCREEN_PROPERTY_COLOR, &hcolor) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_set_window_property_iv(homeWindow, SCREEN_PROPERTY_SOURCE_SIZE, hdims) != 0)
        {
            return EXIT_FAILURE;
        }
        int hpos[2] = { -hdims[0], -hdims[1] };
        screen_set_window_property_iv(homeWindow, SCREEN_PROPERTY_SOURCE_POSITION, hpos);
    
        if(screen_create_window_buffers(homeWindow, 1) != 0)
        {
            return EXIT_FAILURE;
        }
    
        screen_get_window_property_pv(homeWindow, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&homeBuffer);
        screen_post_window(homeWindow, homeBuffer, 1, hshap, 0);
    
        if(screen_create_context(&homeContext, SCREEN_APPLICATION_CONTEXT) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_create_window_type(&hcscreen_w, homeContext, SCREEN_CHILD_WINDOW) != 0)
        {
            screen_destroy_context(homeContext);
            return EXIT_FAILURE;
        }
    
        if(screen_join_window_group(hcscreen_w, homeGroup))
        {
            return EXIT_FAILURE;
        }
    
        if(screen_destroy_window(homeWindow) != 0)
        {
            return EXIT_FAILURE;
        }
    
        if(screen_destroy_context(homeContext) != 0)
        {
            return EXIT_FAILURE;
        }
    
    return 0;
    }
    

    You must link against libimg.  Add "img" to your list of libraries.
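    A hedged example of what that looks like in practice (menus and file names vary by project type, so treat these as illustrations):

        # qmake (.pro) project
        LIBS += -limg

        # QNX recursive-make project (common.mk)
        LIBS += img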

  • How to display data on the screen and save the data in a file at the same time?

    The code can display the acceleration on the screen of the PlayBook.

    But when fprintf(f, "Accel X (m/s2), Accel Y (m/s2), Accel Z (m/s2)\n") runs, the debugger shows "No source available".

    ??

    Does anyone know how to solve the problem, which writes data to a file?

    The code is below.

    /*
     * Copyright (c) 2011 Research In Motion Limited.
     *
     * Licensed under the Apache License, Version 2.0 (the "License");
     * you may not use this file except in compliance with the License.
     * You may obtain a copy of the License at
     *
     * http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */

    #include <bps/bps.h>
    #include <bps/navigator.h>
    #include <bps/dialog.h>
    #include <bps/sensor.h>
    #include <bps/accelerometer.h>
    #include <screen/screen.h>
    #include <errno.h>
    #include <stdbool.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>

    /**
     * The number of milliseconds to wait between accelerometer readings. This
     * matches the rate at which the accelerometer hardware will update its
     * data, which is set below with
     * accelerometer_set_update_frequency(FREQ_40_HZ).
     */
    static const int ACCELEROMETER_MAX_POLL_INTERVAL = 25;

    static screen_context_t screen_ctx;
    static screen_window_t screen_win;
    dialog_instance_t main_dialog = 0;

    int paused = 0;

    /* The accelerometer forces */
    float force_x, force_y, force_z;

    /* File-related state */
    int _logcounter = 0;
    char fullname[256];
    FILE *f;

    /**
     * Use the PID to set the window group id.
     */
    static char *
    get_window_group_id()
    {
        static char s_window_group_id[16] = "";
        if (s_window_group_id[0] == '\0') {
            snprintf(s_window_group_id, sizeof(s_window_group_id), "%d", getpid());
        }
        return s_window_group_id;
    }

    /**
     * Set up a basic screen, so that the navigator will
     * send window state events when the window state changes.
     *
     * @return @c EXIT_SUCCESS or @c EXIT_FAILURE
     */
    static int
    setup_screen()
    {
        if (screen_create_context(&screen_ctx, SCREEN_APPLICATION_CONTEXT) != 0) {
            return EXIT_FAILURE;
        }
        if (screen_create_window(&screen_win, screen_ctx) != 0) {
            screen_destroy_context(screen_ctx);
            return EXIT_FAILURE;
        }
        int usage = SCREEN_USAGE_NATIVE;
        if (screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_USAGE, &usage) != 0) goto fail;
        if (screen_create_window_buffers(screen_win, 1) != 0) goto fail;
        if (screen_create_window_group(screen_win, get_window_group_id()) != 0) goto fail;
        screen_buffer_t buff;
        if (screen_get_window_property_pv(screen_win, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&buff) != 0) goto fail;
        int buffer_size[2];
        if (screen_get_buffer_property_iv(buff, SCREEN_PROPERTY_BUFFER_SIZE, buffer_size) != 0) goto fail;
        int attributes[] = {SCREEN_BLIT_END};
        if (screen_fill(screen_ctx, buff, attributes) != 0) goto fail;
        int dirty_rects[4] = {0, 0, buffer_size[0], buffer_size[1]};
        if (screen_post_window(screen_win, buff, 1, (const int *)dirty_rects, 0) != 0) goto fail;
        return EXIT_SUCCESS;
    fail:
        screen_destroy_window(screen_win);
        screen_destroy_context(screen_ctx);
        return EXIT_FAILURE;
    }

    /**
     * Rotates the screen to the specified angle.
     *
     * @param angle the angle to rotate the screen to. Must be 0, 90, 180 or 270.
     *
     * @return @c EXIT_SUCCESS upon success, otherwise @c EXIT_FAILURE
     */
    static int
    rotate_screen(int angle)
    {
        if ((angle != 0) && (angle != 90) && (angle != 180) && (angle != 270)) {
            fprintf(stderr, "Invalid angle\n");
            return EXIT_FAILURE;
        }
        int rc;
        int rotation;
        rc = screen_get_window_property_iv(screen_win, SCREEN_PROPERTY_ROTATION, &rotation);
        if (rc != 0) {
            fprintf(stderr, "Error getting screen window rotation: %d\n", rc);
            return EXIT_FAILURE;
        }
        int size[2];
        rc = screen_get_window_property_iv(screen_win, SCREEN_PROPERTY_BUFFER_SIZE, size);
        if (rc != 0) {
            fprintf(stderr, "Error getting screen window buffer size: %d\n", rc);
            return EXIT_FAILURE;
        }
        int temp;
        switch (angle - rotation) {
        case -270:
        case -90:
        case 90:
        case 270:
            temp = size[0];
            size[0] = size[1];
            size[1] = temp;
            break;
        default:
            break;
        }
        rc = screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_ROTATION, &angle);
        if (rc != 0) {
            fprintf(stderr, "Error setting screen window rotation: %d\n", rc);
            return EXIT_FAILURE;
        }
        rc = screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_SIZE, size);
        if (rc != 0) {
            fprintf(stderr, "Error setting screen window size: %d\n", rc);
            return EXIT_FAILURE;
        }
        rc = screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_SOURCE_SIZE, size);
        if (rc != 0) {
            fprintf(stderr, "Error setting screen window source size: %d\n", rc);
            return EXIT_FAILURE;
        }
        rc = screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_BUFFER_SIZE, size);
        if (rc != 0) {
            fprintf(stderr, "Error setting screen window buffer size: %d\n", rc);
            return EXIT_FAILURE;
        }
        return EXIT_SUCCESS;
    }

    /**
     * Handle a navigator event.
     *
     * @return @c true if the application should exit, otherwise false.
     */
    static bool
    handle_navigator_event(bps_event_t *event)
    {
        bool should_exit = false;
        switch (bps_event_get_code(event)) {
        case NAVIGATOR_EXIT:
            should_exit = true;
            break;
        case NAVIGATOR_ORIENTATION_CHECK:
            navigator_orientation_check_response(event, true);
            break;
        case NAVIGATOR_ORIENTATION:
        {
            int angle = navigator_event_get_orientation_angle(event);
            if (rotate_screen(angle) == EXIT_FAILURE) {
                should_exit = true;
            }
            navigator_done_orientation(event);
            break;
        }
        }
        return should_exit;
    }
    /**
     * Show an alert dialog that will contain the acceleration data.
     */
    static void
    show_main_dialog()
    {
        if (main_dialog) {
            return;
        }
        dialog_create_alert(&main_dialog);
        dialog_set_alert_message_text(main_dialog, "Acceleration to first fix");
        dialog_set_size(main_dialog, DIALOG_SIZE_FULL);
        dialog_set_group_id(main_dialog, get_window_group_id());
        dialog_set_cancel_required(main_dialog, true);
        dialog_show(main_dialog);
    }

    /**
     * Displays the acceleration data in the main dialog.
     */
    static void
    display_acceleration_data(float force_x, float force_y, float force_z)
    {
        char buf[1024];
        snprintf(buf, sizeof buf,
                 "\tX: %7.3f m/s^2\n"
                 "\tY: %7.3f m/s^2\n"
                 "\tZ: %7.3f m/s^2\n",
                 force_x, force_y, force_z);
        dialog_set_alert_message_text(main_dialog, buf);
        dialog_update(main_dialog);
    }

    void createafile() {
        sprintf(fullname, "shared/documents/Raw-%d.txt", _logcounter);
        if (f == NULL) {            /* no file open yet */
            f = fopen(fullname, "r");
            while (f != NULL) {     /* file exists, try the next number */
                fclose(f);
                ++_logcounter;
                sprintf(fullname, "shared/documents/Raw-%d.txt", _logcounter);
                f = fopen(fullname, "r");
            }
        }
    }

    /* Write data to the file */
    void writedataintofile() {
        f = fopen(fullname, "w");
        fprintf(f, "Accel X (m/s2), Accel Y (m/s2), Accel Z (m/s2)\n");
        fprintf(f, "%7.3f, %7.3f, %7.3f\n", force_x, force_y, force_z);
        fclose(f);
    }

    /**
     * A sample application that demonstrates the BlackBerry Native APIs for the
     * accelerometer. The sample initializes and then reads the accelerometer
     * periodically until a NAVIGATOR_EXIT event is received.
     * The application also listens for window state changes from the navigator
     * so that it can stop reading the accelerometer when the application is no
     * longer visible.
     */
    int main(int argc, char *argv[])
    {
        bool exit_application = false;
        /*
         * Before we can listen for events from the BlackBerry Tablet OS platform
         * services, we need to initialize the BPS infrastructure
         */
        bps_initialize();
        /*
         * Once the BPS infrastructure has been initialized we can register for
         * events from the various BlackBerry Tablet OS platform services. The
         * Navigator service manages and delivers application life cycle and
         * visibility events.
         * For this sample, we request navigator events so that we can track when
         * the system terminates the application (NAVIGATOR_EXIT event). This allows
         * us to clean up application resources.
         */
        navigator_request_events(0);
        dialog_request_events(0);
        /*
         * Initialize the screen so that the window group id is set properly, to
         * allow the dialogs to be displayed.
         */
        if (setup_screen() != EXIT_SUCCESS) {
            fprintf(stderr, "Unable to initialize screen.");
            exit(-1);
        }
        /*
         * Once the BPS infrastructure has been initialized we can register for
         * events from the various BlackBerry Tablet OS platform services. The
         * Navigator service manages and delivers application life cycle and
         * visibility events.
         *
         * For this sample, we request navigator events so that we can track when
         * the system terminates the application (NAVIGATOR_EXIT event).
         *
         * We request dialog events so that we can be notified when the dialog
         * service responds to our requests/queries.
         */
        if (BPS_SUCCESS != navigator_request_events(0)) {
            fprintf(stderr, "Error requesting navigator events: %s", strerror(errno));
            exit(-1);
        }
        if (BPS_SUCCESS != dialog_request_events(0)) {
            fprintf(stderr, "Error requesting dialog events: %s", strerror(errno));
            exit(-1);
        }
        /*
         * Create and display the dialog that will show the accelerometer data.
         */
        show_main_dialog();
        /*
         * Before initializing the accelerometer we must verify that the device
         * supports it.
         */
        if (sensor_is_supported(SENSOR_TYPE_ACCELEROMETER)) {
            /*
             * The device supports the accelerometer, so set the rate and
             * request sensor events.
             */
            static const int SENSOR_RATE = 40;
            sensor_set_rate(SENSOR_TYPE_ACCELEROMETER, SENSOR_RATE);
            sensor_request_events(SENSOR_TYPE_ACCELEROMETER);
        }
        /*
         * Initialize the accelerometer by setting the rate at which the
         * accelerometer values will be updated from hardware
         */
        accelerometer_set_update_frequency(FREQ_40_HZ);

        /*
         * Process navigator events and take accelerometer readings periodically
         * until we receive a NAVIGATOR_EXIT event.
         */
        createafile();
        f = fopen(fullname, "w");

        while (!exit_application) {
            /*
             * By setting the timeout of bps_get_event to ACCELEROMETER_MAX_POLL_INTERVAL,
             * we set the maximum time (in millis) that it will wait before
             * returning, so that we can take an accelerometer reading.
             */
            bps_event_t *event = NULL;
            bps_get_event(&event, ACCELEROMETER_MAX_POLL_INTERVAL);

            if (event) {
                if (bps_event_get_domain(event) == sensor_get_domain()) {
                    /*
                     * We woke up. See if we are in the paused state. If not,
                     * take an accelerometer reading.
                     */
                    if (!paused) {
                        sensor_event_get_xyz(event, &force_x, &force_y, &force_z);
                        display_acceleration_data(force_x, force_y, force_z);
                        fprintf(f, "Accel X (m/s2), Accel Y (m/s2), Accel Z (m/s2)\n");
                        fprintf(f, "%7.3f, %7.3f, %7.3f\n", force_x, force_y, force_z);
                    } //paused
                }
                /*
                 * If this is a dialog event, determine the response code and
                 * handle the event accordingly.
                 */
                else if (bps_event_get_domain(event) == dialog_get_domain()) {
                    ;
                }
                /*
                 * If it is a NAVIGATOR_EXIT event then set the exit_application
                 * flag so the application stops processing events, cleans up and
                 * exits.
                 */
                else if (bps_event_get_domain(event) == navigator_get_domain()) {
                    exit_application = handle_navigator_event(event);
                }
            } //if event
        } //while
        /*
         * Destroy the dialog, if it exists.
         */
        if (main_dialog) {
            dialog_destroy(main_dialog);
        }

        fclose(f);

        /*
         * Clean up the bps infrastructure and exit
         */
        sensor_stop_events(SENSOR_TYPE_ACCELEROMETER);
        bps_shutdown();
        screen_destroy_window(screen_win);
        screen_destroy_context(screen_ctx);
        return 0;
    } //main

    Hello

    As I said on your other thread:

    In order to write to the shared/documents directory, your application must request the "access_shared" permission and it must be granted by the user. Make sure you have

    access_shared

    in your bar-descriptor.xml.

    Also note that the shared documents folder may not be the best place to write application log data. There is a logs/ directory in the sandbox for that purpose, or the application's data/ directory if the information should be persisted. The shared/documents folder is intended for documents that the user creates or interacts with.
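    For illustration, a minimal sketch of logging to the app's own data/ directory instead (the helper name log_reading and the file name are placeholders; data/ is relative to the sandbox working directory and needs no extra permission):

    #include <stdio.h>

    /* Append one accelerometer reading to a private log file under data/. */
    static void log_reading(float x, float y, float z)
    {
        FILE *log_file = fopen("data/raw-accel.log", "a");
        if (log_file != NULL) {
            fprintf(log_file, "%7.3f, %7.3f, %7.3f\n", x, y, z);
            fclose(log_file);
        }
    }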

    HTH,

    Christian

  • prevent a screen lock

    Hello guys,

    I have run into the following problem. Let's say I have a flashlight app. I turn the flash on to look for my keys or whatever. But when the screen turns off, the flash no longer works.

    How can I avoid this? I checked the documentation, but I can only read the screen lock state, not set it.

    Any help would be appreciated.

    Thank you!

    Thanks a lot Peter,

    So, I found 3 ways.

    (1) QML

    Application.mainWindow.screenIdleMode = 1
    

    (2) C++

    WId winId = this->mainWindow->winId();
    if( winId != NULL )
    {
      int idleMode = SCREEN_IDLE_MODE_KEEP_AWAKE;
      screen_set_window_property_iv( screen_window_t(winId), SCREEN_PROPERTY_IDLE_MODE, &idleMode);
    }
    

    (3) C++

    1) Create a new thread
    2) In the thread create a screen context
    3) request events for screen and navigator
    4) when you get a message of screen idle, inject a touch event into the screen queue
    
  • [HELP] H264 video: trying to modify HelloVideoCamera

    I am very new to video recording and video encoding.

    On this line I get error 22:

     

    camera_error_t error = camera_start_encode(mCameraHandle, NULL, NULL, NULL, NULL, NULL);
    
    fprintf(stderr, "camera_start_encode() error %d", error); // It returns 22
    

    I checked here https://developer.blackberry.com/native/reference/core/com.qnx.doc.camera.lib_ref/topic/camera_error... to see what error 22 is, but I do not see an error with the value 22.

    Here's my complete code

    HelloVideoCameraApp::HelloVideoCameraApp(bb::cascades::Application *app) :
            QObject(app),
            mCameraHandle(CAMERA_HANDLE_INVALID),
            mVideoFileDescriptor(-1)
    {
        mViewfinderWindow = ForeignWindowControl::create().windowId(QString("cameraViewfinder"));
        mViewfinderWindow->setUpdatedProperties(WindowProperty::Position | WindowProperty::Size | WindowProperty::Visible);
    
        QObject::connect(mViewfinderWindow, SIGNAL(windowAttached(screen_window_t, const QString &, const QString &)), this, SLOT(onWindowAttached(screen_window_t, const QString &,const QString &)));
    
        mStartFrontButton = Button::create("Front Camera").onClicked(this, SLOT(onStartFront()));
        mStartRearButton = Button::create("Rear Camera").onClicked(this, SLOT(onStartRear()));
        mStopButton = Button::create("Stop Camera").onClicked(this, SLOT(onStopCamera()));
        mStopButton->setVisible(false);
        mStartStopButton = Button::create("Record Start").onClicked(this, SLOT(onStartStopRecording()));
        mStartStopButton->setVisible(false);
    
        mStatusLabel = Label::create("filename");
        mStatusLabel->setVisible(false);
    
        Container* container = Container::create()
            .layout(DockLayout::create())
            .add(Container::create()
                .horizontal(HorizontalAlignment::Center)
                .vertical(VerticalAlignment::Center)
                .add(mViewfinderWindow))
            .add(Container::create()
                .horizontal(HorizontalAlignment::Left)
                .vertical(VerticalAlignment::Top)
                .add(mStatusLabel))
            .add(Container::create()
                .horizontal(HorizontalAlignment::Center)
                .vertical(VerticalAlignment::Bottom)
                .layout(StackLayout::create()
                            .orientation(LayoutOrientation::LeftToRight))
                            .add(mStartFrontButton)
                            .add(mStartRearButton)
                            .add(mStartStopButton)
                            .add(mStopButton));
    
        app->setScene(Page::create().content(container));
    }
    
    HelloVideoCameraApp::~HelloVideoCameraApp()
    {
    }
    
    void HelloVideoCameraApp::onWindowAttached(screen_window_t win, const QString &group, const QString &id)
    {
        int i = (mCameraUnit == CAMERA_UNIT_FRONT);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_MIRROR, &i);
        i = -1;
        screen_set_window_property_iv(win, SCREEN_PROPERTY_ZORDER, &i);
        screen_context_t screen_ctx;
        screen_get_window_property_pv(win, SCREEN_PROPERTY_CONTEXT, (void **)&screen_ctx);
        screen_flush_context(screen_ctx, 0);
    }
    
    int HelloVideoCameraApp::createViewfinder(camera_unit_t cameraUnit, const QString &group, const QString &id)
    {
        if (mCameraHandle != CAMERA_HANDLE_INVALID)
        {
            qDebug() << "camera already running";
            return EBUSY;
        }
    
        mCameraUnit = cameraUnit;
    
        if (camera_open(mCameraUnit,CAMERA_MODE_RW | CAMERA_MODE_ROLL,&mCameraHandle) != CAMERA_EOK)
        {
            qDebug() << "could not open camera";
            return EIO;
        }
    
        qDebug() << "camera opened";
    
        //camera_set_video_property(mCameraHandle, CAMERA_IMGPROP_VIDEOCODEC, CAMERA_VIDEOCODEC_H264);
    
        camera_error_t error = camera_set_video_property(mCameraHandle,
                                                                    CAMERA_IMGPROP_VIDEOCODEC, CAMERA_VIDEOCODEC_H264,
                                                                    CAMERA_IMGPROP_WIDTH, 640,
                                                                    CAMERA_IMGPROP_HEIGHT, 352,
                                                                    CAMERA_IMGPROP_FRAMERATE, (double)30.0);
    
        if (error == CAMERA_EOK)
        {
            qDebug() << "VIDEO PROPERTY SUCCESS";
        }
        else
        {
            qDebug() << "VIDEO PROPERTY ERROR: " << error;
        }
    
        if (camera_set_videovf_property(mCameraHandle,CAMERA_IMGPROP_WIN_GROUPID, group.toStdString().c_str(),CAMERA_IMGPROP_WIN_ID, id.toStdString().c_str()) == CAMERA_EOK)
        {
            qDebug() << "viewfinder configured";
    
            if (camera_start_video_viewfinder(mCameraHandle, NULL, NULL, NULL) == CAMERA_EOK)
            {
                qDebug() << "viewfinder started";
                mStartFrontButton->setVisible(false);
                mStartRearButton->setVisible(false);
                mStopButton->setVisible(true);
                mStartStopButton->setText("Start Recording");
                mStartStopButton->setVisible(true);
                mStartStopButton->setEnabled(true);
                return EOK;
            }
        }
    
        qDebug() << "couldn't start viewfinder";
    
        camera_close(mCameraHandle);
        mCameraHandle = CAMERA_HANDLE_INVALID;
        return EIO;
    }
    
    void HelloVideoCameraApp::onStartFront()
    {
        qDebug() << "onStartFront";
    
        if (mViewfinderWindow)
        {
            // create a window and see if we can catch the join
            if (createViewfinder(CAMERA_UNIT_FRONT,mViewfinderWindow->windowGroup().toStdString().c_str(),mViewfinderWindow->windowId().toStdString().c_str()) == EOK)
            {
                qDebug() << "created viewfinder";
            }
        }
    }
    
    void HelloVideoCameraApp::onStartRear()
    {
        qDebug() << "onStartRear";
    
        if (mViewfinderWindow)
        {
            // create a window and see if we can catch the join
            if (createViewfinder(CAMERA_UNIT_REAR,mViewfinderWindow->windowGroup().toStdString().c_str(),mViewfinderWindow->windowId().toStdString().c_str()) == EOK)
            {
                qDebug() << "created viewfinder";
            }
        }
    }
    
    void HelloVideoCameraApp::onStopCamera()
    {
        qDebug() << "onStopCamera";
    
        if (mCameraHandle != CAMERA_HANDLE_INVALID)
        {
            // closing the camera handle causes the viewfinder to stop which will in turn
            // cause it to detach from the foreign window
            camera_close(mCameraHandle);
            mCameraHandle = CAMERA_HANDLE_INVALID;
            // reset button visibility
            mStartStopButton->setVisible(false);
            mStopButton->setVisible(false);
            mStartFrontButton->setVisible(true);
            mStartRearButton->setVisible(true);
        }
    }
    
    void HelloVideoCameraApp::onStartStopRecording()
    {
        qDebug() << "onStartStopRecording";
    
        if (mCameraHandle != CAMERA_HANDLE_INVALID)
        {
            if (mVideoFileDescriptor == -1)
            {
                soundplayer_play_sound_blocking("event_recording_start");
    
                char filename[CAMERA_ROLL_NAMELEN];
    
                if (camera_roll_open_video(mCameraHandle,&mVideoFileDescriptor,filename,sizeof(filename),CAMERA_ROLL_VIDEO_FMT_DEFAULT) == CAMERA_EOK)
                {
                    qDebug() << "opened " << filename;
    
    //                if (camera_start_video(mCameraHandle,filename,NULL,NULL,NULL) == CAMERA_EOK)
    //                {
    //                    qDebug() << "started recording";
    //                    mStartStopButton->setText("Stop Recording");
    //                    mStopButton->setEnabled(false);
    //                    mStatusLabel->setText(basename(filename));
    //                    mStatusLabel->setVisible(true);
    //                    return;
    //                }
    
                    camera_error_t error = camera_start_encode(mCameraHandle, NULL, NULL, NULL, NULL, NULL);
    
                    if (error == CAMERA_EOK)
                    {
                        qDebug() << "Encoding started\n";
                        mStartStopButton->setText("Stop Recording");
                        mStopButton->setEnabled(false);
                        mStatusLabel->setText(basename(filename));
                        mStatusLabel->setVisible(true);
                        return;
                    }
                    else
                    {
                        qDebug() << "Encoding Failed\n";
                    }
    
                                    fprintf(stderr, "camera_start_encode() error %d", error);
    
                    qDebug() << "failed to start recording: " << error;
                    camera_roll_close_video(mVideoFileDescriptor);
                    mVideoFileDescriptor = -1;
                }
    
                soundplayer_play_sound("event_recording_stop");
            }
            else
            {
                soundplayer_play_sound("event_recording_stop");
                camera_stop_encode(mCameraHandle);
                qDebug() << "stopped recording";
                camera_roll_close_video(mVideoFileDescriptor);
                mVideoFileDescriptor = -1;
                mStartStopButton->setText("Start Recording");
                mStopButton->setEnabled(true);
                mStatusLabel->setVisible(false);
            }
        }
    }
    

    Any help is appreciated. Please, I beg you. Thank you.

    Thank you... I eliminated the camera_start_encode function

    my main problem was that I really needed to encode the video recording as H264... and now I have the solution... Knobtviker taught me... I used the BestCam sample...

    I added camera_init_video_encoder(); to the constructor

    and added this line after opening the camera:

    camera_set_videoencoder_parameter(mHandle, CAMERA_H264AVC_BITRATE, 1000000, CAMERA_H264AVC_KEYFRAMEINTERVAL, 3, CAMERA_H264AVC_RATECONTROL, CAMERA_H264AVC_RATECONTROL_VBR, CAMERA_H264AVC_PROFILE, CAMERA_H264AVC_PROFILE_HIGH, CAMERA_H264AVC_LEVEL, CAMERA_H264AVC_LEVEL_4);

    These properties too:

    QByteArray groupBA = mFwc->windowGroup().toLocal8Bit();
    QByteArray winBA = mFwc->windowId().toLocal8Bit();
    err = camera_set_videovf_property(mHandle, CAMERA_IMGPROP_HWOVERLAY, 1, CAMERA_IMGPROP_FORMAT, CAMERA_FRAMETYPE_NV12, CAMERA_IMGPROP_WIN_GROUPID, groupBA.data(), CAMERA_IMGPROP_WIN_ID, winBA.data(), CAMERA_IMGPROP_WIDTH, 720, CAMERA_IMGPROP_HEIGHT, 720, CAMERA_IMGPROP_MAXFOV, 0);
    err = camera_set_video_property(mHandle, CAMERA_IMGPROP_WIDTH, 720, CAMERA_IMGPROP_HEIGHT, 720, CAMERA_IMGPROP_VIDEOCODEC, CAMERA_VIDEOCODEC_AVC1, CAMERA_IMGPROP_AUDIOCODEC, CAMERA_AUDIOCODEC_AAC, CAMERA_IMGPROP_STABILIZATION, 1);

    now I can record video in H264 format.

  • How do we prevent the screen from dimming?

    I am writing an application that needs to turn off screen dimming. How can I do that in C/C++?

    See if you can do this:

    WId winId = window.winId();
    if (winId)
    {
        int zorder = layer;
        screen_set_window_property_iv(screen_window_t(winId), SCREEN_PROPERTY_ZORDER, &zorder);
    }

    Make sure that 'window' is a top-level QWidget. From the code in the bb plugin, its winId should be the native window handle.

  • Turn off display dimming

    Hello

    I am creating a game that uses a different touch interaction model.

    But the screen dims and goes black if I don't touch it for a while.

    How can I avoid this display dimming?

    Thank you

    I found the solution,

    QScopedPointer<QDeclarativeView> view(new QDeclarativeView());

    int idle_mode = SCREEN_IDLE_MODE_KEEP_AWAKE;
    WId winId = view->winId();
    screen_set_window_property_iv(screen_window_t(winId), SCREEN_PROPERTY_IDLE_MODE, &idle_mode);

    It works for me in a pure Qt application.

  • Return the window group name

    Hello!

    I have created a screen_window_t, but how can I get the group name of that screen_window_t?

    SCREEN_PROPERTY_GROUP

    https://developer.BlackBerry.com/native/beta/reference/screen_libref/topic/screen_8h_1Screen_Propert...
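    A minimal sketch of querying it (assuming, per the reference above, that SCREEN_PROPERTY_GROUP can be read as a character-string property of the window; the buffer size is arbitrary):

    #include <screen/screen.h>
    #include <stdio.h>

    /* Print the group name of an existing window handle. */
    static void print_window_group(screen_window_t win)
    {
        char group_name[64] = "";
        if (screen_get_window_property_cv(win, SCREEN_PROPERTY_GROUP,
                                          sizeof(group_name), group_name) == 0) {
            printf("window group: %s\n", group_name);
        }
    }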

  • Rotation of the screen (displayDirection) in a Cascades (QtQuick) app

    Hi all

    I am new to BB, but have Qt / Qt Quick experience.

    I'm trying to port my Qt Quick application to BB10. The app is SmartCam: http://sourceforge.net/projects/smartcam/

    I use the camera APIs but without Cascades. I have problems with the camera viewfinder and the orientation of my user interface.

    I ask BB10 to lock my application to landscape orientation like this:

    (1) in bar-descriptor.xml:

        <initialWindow>
            <aspectRatio>landscape</aspectRatio>
            <autoOrients>false</autoOrients>
            <systemChrome>none</systemChrome>
            <transparent>false</transparent>
        </initialWindow>
    

    (2) in main.cpp:

    QmlApplicationViewer qmlViewer;
    qmlViewer.setOrientation(QmlApplicationViewer::ScreenOrientationLockLandscape); // ORIENTATION
    

    That part is OK: my application starts in landscape and does not rotate to portrait when the device orientation changes (actually it sometimes starts in portrait, but only about 0.5% of the time; that isn't my main problem).

    My problem is that even though it starts in landscape, the displayDirection (the name comes from the Cascades OrientationSupport class) is sometimes different: sometimes the text of the BlackBerry logo shown briefly when my application starts is right side up (80%) and sometimes it is upside down (20%).

    I need to set an angle on the camera viewfinder depending on the value of the displayDirection (90 or 270 degrees), but I know of no way to query this value, because the Cascades OrientationSupport class is not available to a Qt Quick application; it gives me linking errors:

    undefined reference to `bb::cascades::OrientationSupport::instance()'
    

    I added this in the .pro file:

    LIBS += -lcamapi -lscreen -lbtapi -ljpeg -lbps
    

    but nothing helped.

    The RotationCamera example also uses the OrientationSupport class, so it is not a big help for me.

    I also tried this in main.cpp:

        // Disable screen power down
        int idle_mode = SCREEN_IDLE_MODE_KEEP_AWAKE;
        WId winId = glWidget->winId();
        screen_set_window_property_iv(screen_window_t(winId), SCREEN_PROPERTY_IDLE_MODE, &idle_mode);
        int angle = 0;
        screen_set_window_property_iv(screen_window_t(winId), SCREEN_PROPERTY_ROTATION, &angle);
        winId = qmlViewer.winId();
        screen_set_window_property_iv(screen_window_t(winId), SCREEN_PROPERTY_ROTATION, &angle);
    

    But setting SCREEN_PROPERTY_ROTATION did not help either...

    Because of this problem my app was rejected from App World.

    Can anyone help?

    Thank you

    Ionut

    Fixed it by calling

    navigator_set_orientation(NAVIGATOR_LEFT_UP, NULL);
    

    in response to the BPS navigator event NAVIGATOR_WINDOW_ACTIVE.

    This way my app still always starts in landscape (as it did before), but the display direction is now constant at startup.
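    Putting that together, a sketch of where the call sits in the BPS event loop (my own arrangement of the pieces quoted above; error handling omitted):

    #include <bps/bps.h>
    #include <bps/navigator.h>

    /* Called for each event from the application's BPS event loop. */
    static void handle_bps_event(bps_event_t *event)
    {
        if (bps_event_get_domain(event) == navigator_get_domain()
                && bps_event_get_code(event) == NAVIGATOR_WINDOW_ACTIVE) {
            /* Lock the display direction as soon as the window becomes active;
             * NAVIGATOR_LEFT_UP mirrors the call above, pick whichever value
             * matches your viewfinder angle. */
            navigator_set_orientation(NAVIGATOR_LEFT_UP, NULL);
        }
    }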

  • Issue with gestures. Am I missing something?

    Hi all, I was experimenting with the Gestures sample from https://github.com/blackberry/NDK-Samples/tree/ndk2/Gesture ... I wanted to do something when a tap gesture was detected (some processing and logging to a file...).

    I noticed that after that processing, gestures would stop working...

    I was able to recreate the problem simply by putting a 2-second delay in the tap section of the standard Gestures sample application's gesture callback function... I am pasting main.c below...

    I searched the gesture API documentation for anything about delays or a configurable time-out option, but could not find anything... am I missing something? How can I recover when this happens? (Of course I could put the custom processing in another thread, but I shouldn't need to, and I don't see why this wouldn't work...) Is anyone else having this problem?

    This is the code:

    /*
    * Copyright (c) 2011-2012 Research In Motion Limited.
    *
    * Licensed under the Apache License, Version 2.0 (the "License");
    * you may not use this file except in compliance with the License.
    * You may obtain a copy of the License at
    *
    * http://www.apache.org/licenses/LICENSE-2.0
    *
    * Unless required by applicable law or agreed to in writing, software
    * distributed under the License is distributed on an "AS IS" BASIS,
    * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    * See the License for the specific language governing permissions and
    * limitations under the License.
    */
    
    #include <assert.h>
    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <bps/bps.h>
    #include <bps/navigator.h>
    #include <bps/screen.h>
    #include <img/img.h>
    #include <screen/screen.h>
    
    #include "input/screen_helpers.h"
    #include "gestures/double_tap.h"
    #include "gestures/pinch.h"
    #include "gestures/set.h"
    #include "gestures/swipe.h"
    #include "gestures/tap.h"
    #include "gestures/two_finger_pan.h"
    
    #define MIN_VIEWPORT_SIZE 128
    #define MAX_VIEWPORT_SIZE 4096
    screen_context_t screen_ctx;
    screen_window_t screen_win;
    static bool shutdown;
    struct gestures_set * set;
    const char* img_path = "app/native/wallpaper.jpg"; /* Relative path to image asset */
    int viewport_pos[2] = { 0, 0 };
    int viewport_size[2] = { 0, 0 };
    int last_touch[2] = { 0, 0 };
    
    /**
     * The callback invoked when a gesture is recognized or updated.
     */
    void
    gesture_callback(gesture_base_t* gesture, mtouch_event_t* event, void* param, int async)
    {
        if (async) {
            fprintf(stderr,"[async] ");
        }
        switch (gesture->type) {
            case GESTURE_TWO_FINGER_PAN: {
                gesture_tfpan_t* tfpan = (gesture_tfpan_t*)gesture;
                fprintf(stderr,"Two finger pan: %d, %d", (tfpan->last_centroid.x - tfpan->centroid.x), (tfpan->last_centroid.y - tfpan->centroid.y));
                if (tfpan->last_centroid.x && tfpan->last_centroid.y) {
                    viewport_pos[0] += (tfpan->last_centroid.x - tfpan->centroid.x) >> 1;
                    viewport_pos[1] += (tfpan->last_centroid.y - tfpan->centroid.y) >> 1;
                }
                break;
            }
            case GESTURE_PINCH: {
                gesture_pinch_t* pinch = (gesture_pinch_t*)gesture;
                fprintf(stderr,"Pinch %d, %d", (pinch->last_distance.x - pinch->distance.x), (pinch->last_distance.y - pinch->distance.y));
    
                int dist_x = pinch->distance.x;
                int dist_y = pinch->distance.y;
                int last_dist_x = pinch->last_distance.x;
                int last_dist_y = pinch->last_distance.y;
    
                int reldist = sqrt((dist_x)*(dist_x) + (dist_y)*(dist_y));
                int last_reldist = sqrt((last_dist_x)*(last_dist_x) + (last_dist_y)*(last_dist_y));
    
                if (reldist && last_reldist) {
                    viewport_size[0] += (last_reldist - reldist) >> 1;
                    viewport_size[1] += (last_reldist - reldist) >> 1;
    
                    /* Size restrictions */
                    if (viewport_size[0] < MIN_VIEWPORT_SIZE) {
                        viewport_size[0] = MIN_VIEWPORT_SIZE;
                    } else if (viewport_size[0] > MAX_VIEWPORT_SIZE) {
                        viewport_size[0] = MAX_VIEWPORT_SIZE;
                    }
                    if (viewport_size[1] < MIN_VIEWPORT_SIZE) {
                        viewport_size[1] = MIN_VIEWPORT_SIZE;
                    } else if (viewport_size[1] > MAX_VIEWPORT_SIZE) {
                        viewport_size[1] = MAX_VIEWPORT_SIZE;
                    }
    
                    /* Zoom into center of image */
                    if (viewport_size[0] > MIN_VIEWPORT_SIZE && viewport_size[1] > MIN_VIEWPORT_SIZE &&
                            viewport_size[0] < MAX_VIEWPORT_SIZE && viewport_size[1] < MAX_VIEWPORT_SIZE) {
                        viewport_pos[0] -= (last_reldist - reldist) >> 2;
                        viewport_pos[1] -= (last_reldist - reldist) >> 2;
                    }
                }
                break;
            }
            case GESTURE_TAP: {
                gesture_tap_t* tap = (gesture_tap_t*)gesture;
                fprintf(stderr,"Tap x:%d y:%d delay 2000",tap->touch_coords.x,tap->touch_coords.y);
                // *** 2 sec delay to reproduce issue ***
                delay(2000);
                break;
            }
            case GESTURE_DOUBLE_TAP: {
                gesture_tap_t* d_tap = (gesture_tap_t*)gesture;
                fprintf(stderr,"Double tap x:%d y:%d", d_tap->touch_coords.x, d_tap->touch_coords.y);
                break;
            }
            default: {
                fprintf(stderr,"Unknown Gesture");
                break;
            }
        }
        fprintf(stderr,"\n");
    }
    
    /**
     * Initialize the gestures sets
     */
    static void
    init_gestures()
    {
        gesture_tap_t* tap;
        gesture_double_tap_t* double_tap;
        set = gestures_set_alloc();
        if (NULL != set) {
            tap = tap_gesture_alloc(NULL, gesture_callback, set);
            double_tap = double_tap_gesture_alloc(NULL, gesture_callback, set);
            tfpan_gesture_alloc(NULL, gesture_callback, set);
            pinch_gesture_alloc(NULL, gesture_callback, set);
        } else {
            fprintf(stderr, "Failed to allocate gestures set\n");
        }
    }
    
    static void
    gestures_cleanup()
    {
        if (NULL != set) {
            gestures_set_free(set);
            set = NULL;
        }
    }
    
    static void
    handle_screen_event(bps_event_t *event)
    {
        int screen_val, rc;
    
        screen_event_t screen_event = screen_event_get_event(event);
        mtouch_event_t mtouch_event;
        rc = screen_get_event_property_iv(screen_event, SCREEN_PROPERTY_TYPE, &screen_val);
        if(screen_val == SCREEN_EVENT_MTOUCH_TOUCH || screen_val == SCREEN_EVENT_MTOUCH_MOVE || screen_val == SCREEN_EVENT_MTOUCH_RELEASE) {
            rc = screen_get_mtouch_event(screen_event, &mtouch_event, 0);
            if (rc) {
                fprintf(stderr, "Error: failed to get mtouch event\n");
            }
            rc = gestures_set_process_event(set, &mtouch_event, NULL);
    
            /* No gesture detected, treat as pan. */
            if (!rc) {
                if (mtouch_event.contact_id == 0) {
                    if(last_touch[0] && last_touch[1]) {
                        fprintf(stderr,"Pan %d %d\n",(last_touch[0] - mtouch_event.x),(last_touch[1] - mtouch_event.y));
                        viewport_pos[0] += (last_touch[0] - mtouch_event.x) >> 1;
                        viewport_pos[1] += (last_touch[1] - mtouch_event.y) >> 1;
                    }
                    last_touch[0] = mtouch_event.x;
                    last_touch[1] = mtouch_event.y;
                }
            }
            if (screen_val == SCREEN_EVENT_MTOUCH_RELEASE) {
                last_touch[0] = 0;
                last_touch[1] = 0;
            }
        }
    }
    
    static void
    handle_navigator_event(bps_event_t *event) {
        switch (bps_event_get_code(event)) {
        case NAVIGATOR_EXIT:
            shutdown = true;
            break;
        }
    }
    
    static void
    handle_events()
    {
        int rc, domain;
        bool has_events = true;
    
        while(has_events) {
            bps_event_t *event = NULL;
            rc = bps_get_event(&event, 50);
            assert(rc == BPS_SUCCESS);
            if (event) {
                domain = bps_event_get_domain(event);
                if (domain == navigator_get_domain()) {
                    handle_navigator_event(event);
                } else if (domain == screen_get_domain()) {
                    handle_screen_event(event);
                    /* Re-draw the screen after a screen event */
                    screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_SOURCE_POSITION , viewport_pos);
                    screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_SOURCE_SIZE , viewport_size);
                    screen_flush_context(screen_ctx,0);
                }
            } else {
                has_events = false;
            }
    
        }
    }
    
    static int decode_setup(uintptr_t data, img_t *img, unsigned flags)
    {
        screen_window_t screen_win = (screen_window_t)data;
        screen_buffer_t screen_buf;
        int size[2];
    
        size[0] = img->w;
        size[1] = img->h;
        screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_BUFFER_SIZE, size);
        screen_create_window_buffers(screen_win, 1);
    
        screen_get_window_property_pv(screen_win, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&screen_buf);
        screen_get_buffer_property_pv(screen_buf, SCREEN_PROPERTY_POINTER, (void **)&img->access.direct.data);
        screen_get_buffer_property_iv(screen_buf, SCREEN_PROPERTY_STRIDE, (int *)&img->access.direct.stride);
    
        img->flags |= IMG_DIRECT;
        return IMG_ERR_OK;
    }
    
    static void decode_abort(uintptr_t data, img_t *img)
    {
        screen_window_t screen_win = (screen_window_t)data;
        screen_destroy_window_buffers(screen_win);
    }
    
    int
    load_image(screen_window_t screen_win, const char *path)
    {
        img_decode_callouts_t callouts;
        img_lib_t ilib = NULL;
        img_t img;
        int rc;
    
        rc = img_lib_attach(&ilib);
        if (rc != IMG_ERR_OK) {
            return -1;
        }
    
        memset(&img, 0, sizeof(img));
        img.flags |= IMG_FORMAT;
        img.format = IMG_FMT_PKLE_XRGB8888;
    
        memset(&callouts, 0, sizeof(callouts));
        callouts.setup_f = decode_setup;
        callouts.abort_f = decode_abort;
        callouts.data = (uintptr_t)screen_win;
    
        rc = img_load_file(ilib, path, &callouts, &img);
        img_lib_detach(ilib);
    
        return rc == IMG_ERR_OK ? 0 : -1;
    }
    
    int
    main(int argc, char **argv)
    {
        const int usage = SCREEN_USAGE_WRITE;
    
        screen_buffer_t screen_buf = NULL;
        int rect[4] = { 0, 0, 0, 0 };
    
        /* Setup the window */
        screen_create_context(&screen_ctx, 0);
        screen_create_window(&screen_win, screen_ctx);
        screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_USAGE, &usage);
    
        load_image(screen_win, img_path);
    
        screen_get_window_property_pv(screen_win, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&screen_buf);
        screen_get_window_property_iv(screen_win, SCREEN_PROPERTY_BUFFER_SIZE, rect+2);
        viewport_size[0] = rect[2];
        viewport_size[1] = rect[3];
        screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_SOURCE_SIZE , viewport_size);
    
        screen_post_window(screen_win, screen_buf, 1, rect, 0);
    
        init_gestures();
    
        /* Signal bps library that navigator and screen events will be requested */
        bps_initialize();
        screen_request_events(screen_ctx);
        navigator_request_events(0);
    
        while (!shutdown) {
            /* Handle user input */
            handle_events();
        }
    
        /* Clean up */
        gestures_cleanup();
        screen_stop_events(screen_ctx);
        bps_shutdown();
        screen_destroy_window(screen_win);
        screen_destroy_context(screen_ctx);
        return 0;
    }
    

    Yes, when I run the code above I get the same problem. It seems that the two-finger pan and pinch gestures process events on the fly and never "fail", so it never falls through to the tap gesture. If I rearrange the gesture_allocs to this

    _tfpan = tfpan_gesture_alloc(&tfparams, cb, _gest_set);
    _pinch = pinch_gesture_alloc(&pinparams, cb, _gest_set);
    _double_tap = double_tap_gesture_alloc(&dtparams, cb, _gest_set);
    _tap = tap_gesture_alloc(&tparams, cb, _gest_set);

    This works. But then the double tap does not work, probably because it needs 2 taps, and to know that you only had 1 tap it must either wait or get an idle screen event. In that case, I guess you could buffer events and use gestures_set_process_event_list() with the functions in event_list.h

    https://developer.BlackBerry.com/native/reference/BB10/com.QNX.doc.gestures.lib_ref/topic/about_even...

    In addition, the event time is the time of the actual event, not the time you received the event from bps_get_event, so a delay should not affect gesture processing.
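    Applied to the init_gestures() function of the pasted sample, the reordering suggested above would look roughly like this (same allocation calls, just a different order):

    static void
    init_gestures()
    {
        set = gestures_set_alloc();
        if (NULL != set) {
            /* Register the two-finger pan and pinch recognizers first, then
             * double tap, then tap. */
            tfpan_gesture_alloc(NULL, gesture_callback, set);
            pinch_gesture_alloc(NULL, gesture_callback, set);
            double_tap_gesture_alloc(NULL, gesture_callback, set);
            tap_gesture_alloc(NULL, gesture_callback, set);
        } else {
            fprintf(stderr, "Failed to allocate gestures set\n");
        }
    }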

  • Background image for the Drawing Application

    Hi all

    I'm trying to figure out the following: how to add a background image to my drawing application. I have a drawing application that is based on the Gestures sample application.

    What I currently do is:

    case GESTURE_SWIPE: {
        gesture_swipe_t* swipe = (gesture_swipe_t*)gesture;

        int bg[] = { SCREEN_BLIT_DESTINATION_X, swipe->coords.x,
                     SCREEN_BLIT_DESTINATION_Y, swipe->coords.y,
                     SCREEN_BLIT_DESTINATION_WIDTH, 10,
                     SCREEN_BLIT_DESTINATION_HEIGHT, 10,
                     SCREEN_BLIT_COLOR, 0xffffff00,
                     SCREEN_BLIT_END };
        screen_fill(screen_ctx, screen_buf[0], bg);
        screen_post_window(screen_win, screen_buf[0], 1, rect, 0);

        fprintf(stderr, "Tap x:%d y:%d", swipe->coords.x, swipe->coords.y);
        break;
    }
    

    Whenever a swipe is recognized, I simply 'draw' a box at that position. Since the gesture is wide, many boxes are created one after another, which produces a line.

    My question is: since I based this on the Gestures sample application, where only one window is used (and I intend to leave it as one window), why does my 'drawing' not appear on top of the background image that is already in place?

    The Gestures sample application used this function to load the image:

    int
    load_image(screen_window_t screen_win, const char *path)
    {
        img_decode_callouts_t callouts;
        img_lib_t ilib = NULL;
        img_t img;
        int rc;
    
        rc = img_lib_attach(&ilib);
        if (rc != IMG_ERR_OK) {
            return -1;
        }
    
        memset(&img, 0, sizeof(img));
        img.flags |= IMG_FORMAT;
        img.format = IMG_FMT_PKLE_XRGB8888;
    
        memset(&callouts, 0, sizeof(callouts));
        callouts.setup_f = decode_setup;
        callouts.abort_f = decode_abort;
        callouts.data = (uintptr_t)screen_win;
    
        rc = img_load_file(ilib, path, &callouts, &img);
        img_lib_detach(ilib);
    
        return rc == IMG_ERR_OK ? 0 : -1;
    }
    

    Where decode_setup and decode_abort are:

    static int decode_setup(uintptr_t data, img_t *img, unsigned flags)
    {
        screen_window_t screen_win = (screen_window_t)data;
        screen_buffer_t screen_buf;
        int size[2];
    
        size[0] = img->w;
        size[1] = img->h;
        screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_BUFFER_SIZE, size);
        screen_create_window_buffers(screen_win, 1);
    
        screen_get_window_property_pv(screen_win, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&screen_buf);
        screen_get_buffer_property_pv(screen_buf, SCREEN_PROPERTY_POINTER, (void **)&img->access.direct.data);
        screen_get_buffer_property_iv(screen_buf, SCREEN_PROPERTY_STRIDE, (int *)&img->access.direct.stride);
    
        img->flags |= IMG_DIRECT;
        return IMG_ERR_OK;
    }
    
    static void decode_abort(uintptr_t data, img_t *img)
    {
        screen_window_t screen_win = (screen_window_t)data;
        screen_destroy_window_buffers(screen_win);
    }
    

    Using my function above (from the GESTURE_SWIPE case), shouldn't the boxes appear on top of the background image, since loading the image writes into the same buffer? What I do see is the image itself, plus the x and y values as debug messages.

    Previously, instead of loading an image, I would fill the screen with WHITE by hand (like a whiteboard, for example), and that used to work. However, with the image loaded I don't see anything. Can someone tell me what I'm doing wrong?

    Thank you

    A27med

    Apparently, all I had to do was fill the screen with a WHITE background and then load the image in main(). That solved the problem for me.
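
    For what it's worth, a rough sketch of that order as I read it (my own reconstruction, untested: it assumes the window buffer already exists before the fill, reuses load_image() and the screen_ctx / screen_win / screen_buf / rect names from the posted code, and the image path is made up):

    /* Sketch only: paint the whiteboard white first... */
    int white[] = { SCREEN_BLIT_COLOR, 0xffffffff, SCREEN_BLIT_END };
    screen_fill(screen_ctx, screen_buf[0], white);
    screen_post_window(screen_win, screen_buf[0], 1, rect, 0);

    /* ...then decode the background image into the same window. Note that
     * decode_setup() above also tries to create the window buffers, so the
     * buffer setup in main() may need to be reconciled with it. */
    if (load_image(screen_win, "app/native/background.png") != 0) {
        fprintf(stderr, "failed to load background image\n");
    }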

  • Signal connect problem

    Hi all

    I have a strange problem connecting signals to slots. I have the following code:

    Container *container = root->findChild<Container *>("timeline_container");
    m_pForeignWindow = (ForeignWindowControl *)(container->at(0));

    m_pForeignWindow is not null here

    bool success = connect(m_pForeignWindow,
        SIGNAL(windowAttached(screen_window_t, const QString &, const QString &)),
        this,
        SLOT(onWindowAttached(screen_window_t, const QString &, const QString &)));

    success is true here

    success = QObject::connect(m_pForeignWindow,
        SIGNAL(touch(TouchEvent *)),
        m_timeline,
        SLOT(onTouch(TouchEvent *)));

    success is false here

    Checking with the debugger, I got the following message:

    Object::connect: No such signal bb::cascades:ForeignWindowControl:touch(TouchEvent_*)

    but this signal exists for all VisualNode objects... What am I doing wrong?

    Kind regards.

    Check whether it actually compiles.  The IDE's code analysis in the current beta is infamous for reporting these errors wrongly, so I suggest turning that check off:

    Window -> Preferences -> C/C++ -> Code Analysis -> Qt syntax problem
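
    Not from this thread, but a minimal sketch of how I would sanity-check the connection at runtime rather than relying on the IDE's analysis; the fully qualified parameter type is my own guess and may or may not be what the signal's normalized signature expects here:

    // Sketch: verify the connect() result at runtime. Q_ASSERT fires in debug
    // builds if the signal/slot signatures do not match.
    bool ok = QObject::connect(m_pForeignWindow,
            SIGNAL(touch(bb::cascades::TouchEvent*)),
            m_timeline,
            SLOT(onTouch(bb::cascades::TouchEvent*)));
    Q_ASSERT(ok);
    Q_UNUSED(ok);  // silence the unused-variable warning in release builds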

  • Need assistance creating a LED color changer Application

    Hello

    I guess I can consider this day 1 of programming in C++ and for BlackBerry 10 devices, but fortunately a lot of the syntax seems vaguely similar to Java, which I have some experience with.

    I am trying to create an application that listens for whenever another application on the device requests the front LED, and lets the user define a static color for the LED when it is activated by any of a myriad of applications.

    So far, I have read the API reference and successfully used the led_request_color() function to change the color of the LED when an action such as a swipe down is performed on the touch screen...

    Here is the code:

    /*
    * Copyright (c) 2011-2012 Research In Motion Limited.
    *
    * Licensed under the Apache License, Version 2.0 (the "License");
    * you may not use this file except in compliance with the License.
    * You may obtain a copy of the License at
    *
    * http://www.apache.org/licenses/LICENSE-2.0
    *
    * Unless required by applicable law or agreed to in writing, software
    * distributed under the License is distributed on an "AS IS" BASIS,
    * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    * See the License for the specific language governing permissions and
    * limitations under the License.
    */
    
    #include <bps/bps.h>
    #include <bps/navigator.h>
    #include <bps/screen.h>
    #include <bps/led.h>
    #include <screen/screen.h>
    #include <stdbool.h>
    #include <stdio.h>
    
    static bool shutdown;
    
    static void
    handle_screen_event(bps_event_t *event)
    {
        int screen_val;
    
        screen_event_t screen_event = screen_event_get_event(event);
        screen_get_event_property_iv(screen_event, SCREEN_PROPERTY_TYPE, &screen_val);
    
        switch (screen_val) {
        case SCREEN_EVENT_MTOUCH_TOUCH:
            fprintf(stderr,"Touch event");
            led_request_color("1", LED_COLOR_BLUE, 1);
            break;
        case SCREEN_EVENT_MTOUCH_MOVE:
            fprintf(stderr,"Move event");
    
            break;
        case SCREEN_EVENT_MTOUCH_RELEASE:
            fprintf(stderr,"Release event");
            led_request_color("1", LED_COLOR_RED, 1);
            break;
        default:
            break;
        }
        fprintf(stderr,"\n");
    }
    
    static void
    handle_navigator_event(bps_event_t *event) {
        switch (bps_event_get_code(event)) {
        case NAVIGATOR_SWIPE_DOWN:
            fprintf(stderr,"Swipe down event");
            break;
        case NAVIGATOR_EXIT:
            fprintf(stderr,"Exit event");
            shutdown = true;
            break;
        default:
            break;
        }
        fprintf(stderr,"\n");
    }
    
    static void
    handle_event()
    {
        int domain;
    
        bps_event_t *event = NULL;
        if (BPS_SUCCESS != bps_get_event(&event, -1)) {
            fprintf(stderr, "bps_get_event() failed\n");
            return;
        }
        if (event) {
            domain = bps_event_get_domain(event);
            if (domain == navigator_get_domain()) {
                handle_navigator_event(event);
            } else if (domain == screen_get_domain()) {
                handle_screen_event(event);
            }
        }
    }
    
    int
    main(int argc, char **argv)
    {
        const int usage = SCREEN_USAGE_NATIVE;
    
        screen_context_t screen_ctx;
        screen_window_t screen_win;
        screen_buffer_t screen_buf = NULL;
        int rect[4] = { 0, 0, 0, 0 };
    
        /* Setup the window */
        screen_create_context(&screen_ctx, 0);
        screen_create_window(&screen_win, screen_ctx);
        screen_set_window_property_iv(screen_win, SCREEN_PROPERTY_USAGE, &usage);
        screen_create_window_buffers(screen_win, 1);
    
        screen_get_window_property_pv(screen_win, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&screen_buf);
        screen_get_window_property_iv(screen_win, SCREEN_PROPERTY_BUFFER_SIZE, rect+2);
    
        /* Fill the screen buffer with blue */
        int attribs[] = { SCREEN_BLIT_COLOR, 0xff0000ff, SCREEN_BLIT_END };
        screen_fill(screen_ctx, screen_buf, attribs);
        screen_post_window(screen_win, screen_buf, 1, rect, 0);
    
        char set = 222;
    
        /* Signal bps library that navigator and screen events will be requested */
        bps_initialize();
        screen_request_events(screen_ctx);
        navigator_request_events(0);
    
        /* Get Events - This doesn't work */
        if(bps_get_event() == led_get_domain()){
            led_request_color("1", LED_COLOR_BLUE, 0);
        }
    
        while (!shutdown) {
            /* Handle user input */
            handle_event();
        }
    
        /* Clean up */
        screen_stop_events(screen_ctx);
        bps_shutdown();
        screen_destroy_window(screen_win);
        screen_destroy_context(screen_ctx);
        return 0;
    }
    

    I don't know if I'm doing this right, but I'm trying to use the led_get_domain() function to pick up the IDs of LED events triggered by other applications. At least, that's what I think I tried to do, to no avail, in the code above:

     /* Get Events - This doesn't work */
        if(bps_get_event() == led_get_domain()){
            led_request_color("1", LED_COLOR_BLUE, 0);
        }
    

    So, I guess I'm a newbie at this, and I would appreciate your help and any possible improvements. Thank you!

    According to https://developer.blackberry.com/native/reference/bb10/com.qnx.doc.bps.lib_ref/com.qnx.doc.bps.lib_r... there is only one event for the LED, LED_INFO, and it just tells you the current state of the LED.

    To start receiving this event, you must first call led_request_events().
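
    As a rough sketch (untested), the posted handle_event() could be extended like this, provided led_request_events(0) is called once after bps_initialize():

    static void
    handle_event()
    {
        bps_event_t *event = NULL;
        if (BPS_SUCCESS != bps_get_event(&event, -1)) {
            fprintf(stderr, "bps_get_event() failed\n");
            return;
        }
        if (event) {
            int domain = bps_event_get_domain(event);
            if (domain == navigator_get_domain()) {
                handle_navigator_event(event);
            } else if (domain == screen_get_domain()) {
                handle_screen_event(event);
            } else if (domain == led_get_domain()) {
                /* LED_INFO is the only LED event; it reports the LED state */
                if (bps_event_get_code(event) == LED_INFO) {
                    fprintf(stderr, "LED_INFO received\n");
                }
            }
        }
    }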

  • Prevent the display from going to sleep

    Hi, is there a way to prevent display dimming?

    I came across a few threads, but these discussions were dated a few months ago.

    So I would like to know if there is any new API that provides a solution to this problem?

    0o0o

    PS: I used the following code, but it does not work, which is strange.

        ForeignWindowControl *foreignWindow=ForeignWindowControl::create();
        int idle_mode = SCREEN_IDLE_MODE_KEEP_AWAKE;
        screen_set_window_property_iv((screen_window_t)foreignWindow, SCREEN_PROPERTY_IDLE_MODE,&idle_mode);
    

    See my replies in this thread: http://supportforums.blackberry.com/t5/Cascades-Development/Display-Keep-Alive-in-Cascades/m-p/19280...
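
    For completeness, a minimal sketch (mine, assumptions noted in the comments) of setting the idle mode on the actual window handle rather than on the control pointer:

        // Sketch only: SCREEN_PROPERTY_IDLE_MODE must be set on a real
        // screen_window_t, not on a ForeignWindowControl pointer cast to one.
        // Assumes the control already has a window attached; windowHandle()
        // returns 0 until then.
        screen_window_t win = foreignWindow->windowHandle();
        if (win) {
            int idle_mode = SCREEN_IDLE_MODE_KEEP_AWAKE;
            screen_set_window_property_iv(win, SCREEN_PROPERTY_IDLE_MODE, &idle_mode);
        }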
