Limiting date/LOV selector choices

Hello

I need to limit the ability of the user to select a date (to be precise – month/year) based on some condition.

for example:

If the value of the item is "Jan 2015", the user can select only Jan 2015 and later dates.

I looked at the date picker, but it didn't seem like I could dynamically change its 'minimum' value.

Yes, I understand that I will still need to create a validation, because the user can manually enter the date. But I think 90% of the time people prefer to click and select dates, so it would be a nice feature to present them with the right starting month and take away the option of choosing an earlier one. With the date picker as it is, I can still choose previous months.

Hi AZZ.

AZZ says:

"I need to limit the ability of the user to select a date (to be precise, month/year) based on some condition. [...]"

Please always mention details of your environment with your question. This helps forum members understand the context of the problem.

At a minimum, your question should mention the APEX version, the theme, the page template and the type of date picker you use.

Anyway, if you use the jQuery Date Picker in Oracle APEX, you can set the 'minDate' option of the jQuery UI Datepicker with jQuery code executed by a Dynamic Action.

Here's the blog post: Oracle Apex Geekery: Dynamic Date Range in APEX Datepicker - no Plugin needed

Here's the implementation on apex.oracle.com: https://apex.oracle.com/pls/apex/f?p=52380:1:

NOTE: Refer to the section that sets minDate and maxDate on the Datepicker.

In addition, if it still doesn't work, reproduce the issue on apex.oracle.com and share the workspace credentials.
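For the dynamic part, the gist of the jQuery approach is: when the controlling item changes, recompute the earliest allowed date and push it into the picker's 'minDate' option. A minimal sketch (the item names P1_START/P1_DATE and the "Mon YYYY" value format are assumptions for illustration, not taken from this thread):

```javascript
// Parse a "Mon YYYY" string (e.g. "Jan 2015") into a Date for the
// first day of that month. The value format is a hypothetical example.
function firstOfMonth(monYear) {
  const months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
                  'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'];
  const [mon, year] = monYear.trim().split(/\s+/);
  return new Date(Number(year), months.indexOf(mon), 1);
}

// In an APEX Dynamic Action (event: Change on P1_START, action:
// Execute JavaScript Code), you would then re-set the picker's option:
//   $('#P1_DATE').datepicker('option', 'minDate', firstOfMonth($v('P1_START')));
```

A server-side validation is still needed, as noted above, since the user can type a date manually.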

Kind regards

Kiran

Tags: Database

Similar Questions

  • How to fill a user data picker automatically?

    I understand (for the most part) how user data pickers work.  We used them during the shell creation process to let the creator specify which users can access the shell and which user groups they will be placed in.  This works fine: when the shell is created, these users are added and assigned to their respective groups.


    According to the documentation, user data pickers are supposed to support auto-population, but in the system I cannot select a source to pull from.  For example, the shell attribute form contains a user data picker that holds the name of the PM assigned to the shell.  I have a BP in this shell on which I want the PM's name to appear.  I thought I could place a user data picker on the BP and auto-populate this field from the shell attribute form's user data picker, but uDesigner does not offer a source attribute for it as it does for other data types.  This matters because this field, the PM's name, must be on the BP if I want to use it in a LiveCycle report.

    Short of having the user select the name again on the BP, how can I pass the names of the shell members who were selected during shell creation using the user data pickers?


    Thank you.

    Please reach out to me by email to [email protected]. I want to send you screenshots.

  • Scheduling date: limited to a fixed date in SO scheduling

    Hi all

    I tried to schedule a customer order, but it is limited to a fixed date (31-dec-2008) and I can't change this date. Is there a setting for this? Why is the date limited to a specific date? Help, please.

    Thank you
    Shishir

    It seems you may have ATP enabled. Set the profile option
    'OM: Authorized to Override ATP' to Yes and try again.

  • Data Pump export and restricted mode

    Hi all
    Hi all
    Is a Data Pump export consistent by default? If I do an export of the production database to refresh dev, do I need to start the production database in restricted mode? I shouldn't need to, right?
    Please let me know.
    Please let me know.

    Hello

    [If I do an export of the production database to refresh dev, do I need to start the production database in restricted mode? I shouldn't need to, right?]

    Answer: no, there is no need to start the production database in restricted mode.

  • I have limited cell data on my Z320 and want to browse, email and update applications over wifi

    I have a Z320 smartphone with limited phone data, and I would like to use wifi for browsing, updating apps and other things that can eat up my data. Can I do this, and how do I switch from phone data to wifi?  Also, I have dual SIM; can I use one SIM slot for data and the other for calls and SMS?  I'm still a novice, please bear with my questions.  Thanks to all those who give their time to respond to my queries.

    SMS = text

    When wifi is connected, your phone uses wifi only, so you don't need to switch the SIM card. Besides, if you want to be 100% sure mobile data will not be used, you can just use the quick toggle setting to turn off the mobile data connection.

    On Acer phones there is normally a quick setting for mobile data; you just pull down the notification bar.

    You don't need to disable Google Play, but check its settings so that updates install automatically only when you are on wifi.

    https://support.google.com/googleplay/answer/113412?hl=en

  • Date LOV failing in the query search panel

    Hi all

    We are on JDev 12c.

    We have a use case where Hire Date is an LOV in an af:query search panel.

    On page load, we need to display the latest hire date from the database.

    I've been following the blog below and it works for an Integer (Department_Id). But when I do the same for Hire Date, the search results are populated, yet the Hire Date LOV is not defaulted to the latest hire date; it shows the first record of the LOV instead.

    Andrejus Baranovskis Blog: Dynamic value by default for the field in query ADF search

    When I run the view criteria from the BC tester, I see a null pointer exception in the logs, and the hire date is not in the LOV list.

    Check the data type of the EO/VO attribute.

    Does the default value get applied if you remove the LOV from HireDate?

    If so, then it is a date comparison problem. As a workaround you can probably add a new string attribute to your main VO and your LOV VO, something like to_char(hire_date, 'some format'), attach the LOV to it and search by that attribute (just remember to create a function-based index in the db).

    Dario

  • The most effective way to log and read data simultaneously (DAQmx, TDMS) at high data rates

    Hello
     
    I want to acquire data from several cDAQ modules across several chassis at
    high data rates (100 kS/s per channel if possible). Let's say the measurement time is 10 minutes and we have a large number of channels (40, for example). The measured data is written to a TDMS file. I guess memory or hard disk speed is the limit. For the user, there must be a way to view a selection of channels in a graph during the measurement.

    My question: what is the best and most effective way to save and read data at the same time?

    First of all, I would use a producer-consumer architecture, and I don't want to write and display the data in the same loop. I see two possibilities:

    [1] Use 'DAQmx Configure Logging.vi' with the 'Log and Read' operation to write the data to a TDMS file. To display the data in a second loop, I would create a DVR (data value reference) of the acquired samples and 'send' the DVR to the second loop, where the data would be displayed in a graph. This method has the disadvantage that the data of all channels is copied into memory. Correct me if I'm wrong.

    [2] Use 'DAQmx Configure Logging.vi' with only the 'Log' operation to write the data to a TDMS file. To view the selected data, I would read a number of samples from the TDMS file in the second loop (while the file is still being written). In this case, I copy only the data of the selected channels (not all of them), but there will be more hard drive accesses.

    What is the most effective and efficient solution in this case?

    Is there a recommended way to log and read data at such high sampling rates?

    Thank you for your help.

    You say that the measurement time is 10 minutes. If you have 40 channels and you sample all of them at 100 kHz, that is quite a number of values.

    In such cases, I always try to approach it from the usage requirements. If a measurement is only 10 minutes, I would just log all data to TDMS and create a graphing module, which could be in the same consumer loop where you log the data. You can always work on the big raw data files offline afterwards, extracting all the information you need (have a look at the product called NI DIAdem: http://www.ni.com/diadem/)

    The main question is what the user needs to see in the graph (or perhaps a chart can be useful too). Let's say the graph is 1024 pixels wide. It makes no sense to plot more data than 1024 points, yes? Every second you will produce 100 k data points per channel. What is the useful information your user should see? It depends on the application. In similar cases, I usually use some kind of data reduction method: a moving average (Mean PtByPt.vi, for example) with an interval size of 100. This way you get 1 k data points instead of 100 k per channel every second. If you feed your graph with these averaged values, it can store 1024 data points (the default) per channel (plot), so the user will see the whole of the measurement.

    So it depends on the frequency at which you send data to the consumer. For example, you collect 1024 values per producer iteration and send them to the consumer. There you can do a normal mean calculation or a rolling one (according to your needs) and plot it on a graph. This way your graph will display only the values of the last 10 seconds...

    Once I programmed a module where I used a chart rather than a graph, and the user could specify the absolute timestamp interval to be plotted. If the data size was larger than the chart size in pixels, the module performed an average calculation in order to reduce the number of data points. Of course, if you need to see the raw data, you can specify a small interval. It all depends on how you program the zoom functions, etc. In my case I had a rate of 1 Hz, so I just kept all data in RAM, limiting the arrays to hold 24 hours of data, so that technicians could monitor the system. In your case, given the enormous amount of data, only a file read/write approach can work if you really need access to all of the raw data on the fly. But I hope the moving average values will be enough?
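As a rough illustration of the reduction arithmetic described above (plain JavaScript rather than LabVIEW, purely as a sketch): averaging each block of N raw samples down to one display point turns a 100 kS/s stream into about 1 k points per second when the interval size is 100.

```javascript
// Reduce a raw sample array to one averaged point per block of
// `blockSize` samples. With blockSize = 100, 100 k samples/s per
// channel become ~1 k display points/s, which a 1024-pixel graph
// can actually show.
function blockAverage(samples, blockSize) {
  const out = [];
  for (let i = 0; i + blockSize <= samples.length; i += blockSize) {
    let sum = 0;
    for (let j = i; j < i + blockSize; j++) sum += samples[j];
    out.push(sum / blockSize);
  }
  return out;
}
```

This is the block-mean variant; a moving average like Mean PtByPt.vi slides the window point by point instead, but the memory/bandwidth saving for display is the same idea.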

  • Does vMotion route data via the vCenter Server?

    Hello

    I need help to solve a query I have.

    We currently have a vCenter server that manages some clusters, but also some stand-alone hosts with local (not shared) storage. The stand-alone hosts are physically located at remote sites. Some sites have more than one of these stand-alone hosts. We have a requirement to move a VM from one stand-alone host to another stand-alone host at the same remote site.

    As there is no shared storage, the VM has to be powered off before attempting the migration.

    The virtual machine in question has a 4 TB thick-provisioned VMDK that would be migrated between the physical hosts.

    I believe that when the migration is launched, the data will be confined to the subnet the two physical hosts are connected to, and will therefore use the bandwidth available through the local network equipment and will not leave the physical site. My colleague suggested, however, that the data travels through the vCenter server. Of course, this would be a problem when moving that much data: it would effectively mean this 4 TB would leave the remote site, travel via the vCenter server, and then return to the remote site. Not only would this be very slow, it could also have a negative impact on bandwidth to and from the site.

    So my question is who is correct? (Assuming that one of us is!)

    Thank you

    When you start a cold migration of a virtual machine by selecting migrate both (host and storage), where your hosts are located at the same site, the transfer happens between the source and target ESXi hosts; it does not go all the way up to your vCenter host.

    If your environment is vSphere 5.1 or higher and you use the web client, you can also move the VM while it is powered on and residing on host-local storage. This feature is called cross-host migration and is available only via the web client.

  • 12c CDB/PDB data dictionary view(s)

    I just started investigating Oracle Database 12c and decided to dig into the multitenant feature to learn more about this architecture. Part of that is understanding how the new data dictionary views organize databases for CDBs/PDBs, and how security and roles relate to this new system. One thing I found confusing is that if you query views such as V$CONTAINERS, V$PDBS or V$DATAFILE, the data is limited to the context of the current container (its con_id) unless you are connected AS SYSDBA. Even SYSTEM returns context-sensitive results. Is there a way to work around this, so that a user other than SYS can have this universal view of the database? I was hoping that common users (c##username) would be able to examine such information without connecting as SYS. Part of the reason is that I would like to log in as a 'read only' user for day-to-day investigations and not connect with a privileged account (much less SYS) unless I have a specific need. In addition, all actions performed by SYS are audited, and if I connect to the database just to browse the data dictionary, it generates unnecessary audit records and noise in the audit trail.

    Hello

    I found a feature (by accident) that can do what you want - it doesn't really seem to be documented anywhere, but...

    Oracle Database Blog 2.0: Query all Oracle 12c pluggable databases in a container database at the same time?

    Cheers,

    Rich

  • Accessing web service data in ADF pages (.jspx)

    I have developed a .jspx page in a BPM human task using ADF.
    Now I want to fetch LOV data from a web service for one of the fields. Can you please help?

    1. State your JDev version.
    2. Go to the page editor and select the selectOneChoice.
    3. Go to the Property Inspector and find the property "UnselectedLabel"; set its value to "nothing selected".
    4. Restart your application and see if it makes a difference. If it doesn't, post the relevant pieces of your code.

  • Returning data from JSFL to a CSXS extension

    I have a CSXS Flash extension that I load in Flash Pro CS5. From the CS SDK extension I invoke a JSFL function in my AS3 code with:

    var syncResult:SyncRequestResult = CSXSInterface.instance.evalScript("myJSFL");

    The JSFL script gets loaded and the function is called, as I can see trace instructions spitting text into the Flash output console.

    However, my JSFL function tries to return a simple string value and, unfortunately, no matter what I try, syncResult.data is always undefined. syncResult.status has the success value, so the call goes through. Only the data from JSFL is not making it back to my CS SDK extension.

    Is there a special way to get data across the JSFL-CSXS boundary? In my Photoshop extension I have JSX return an XML snippet and the CS SDK Photoshop extension works very well. So what's the trick for the JSFL Flash script?

    JSFL script:


    function myJSFL() {
        fl.trace("called myJSFL");
        return 'something';
    }

    You need to format the return data from your JSFL function as stated in the API reference for CSXSInterface.evalScript(). For example:

    Hello

    In general, I think you can format the return values using the syntax defined there. Here's a slightly more complex example involving an array; please excuse the bad formatting.

    var xml = "";
    xml += "12";
    xml += "27.5";
    xml += "Hello there!";
    return xml;
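The snippet above appears to have had its XML tags stripped when it was posted; only the values 12, 27.5 and "Hello there!" survive. Purely as a hypothetical reconstruction (the element names below are assumptions about the evalScript result convention, not taken from the thread, and should be checked against the CS SDK API reference), a function returning several typed values might look like:

```javascript
// Hypothetical reconstruction - wraps each returned value in typed XML
// so the CSXS side can parse syncResult.data. Element and property
// names here are illustrative assumptions.
function buildResultXml() {
  var xml = '<object>';
  xml += '<property id="count"><number>12</number></property>';
  xml += '<property id="ratio"><number>27.5</number></property>';
  xml += '<property id="greeting"><string>Hello there!</string></property>';
  xml += '</object>';
  return xml;
}
```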

  • Overlapping Dates, Table Denormalization

    Hi guys,

    I'm in a situation where we are denormalizing tables for better performance and to reduce joins. I join the tables using logic that checks the dates and adds fields for these dates, so that the joined data is correct at any point in time. A simple example:

    In the tables below: in Table1, case_number 0023A has status 'AC' during the period 15-Jan-08 to 31-Dec-9999 (an open-ended, i.e. current, record), and in Table2 the same case has center_code 'C12' from 22-Jun-07 to 31-Dec-9999. When the tables are joined, the overlapping date ranges should be merged, producing rows such as 0023A, 15-JAN-08, 31-DEC-9999, AC, C12.

    My [previous thread | http://forums.oracle.com/forums/thread.jspa?forumID=75&threadID=682324] on the forums, which Frank answered, worked perfectly for me to get the results for a different data set, as described in that thread, but with the current data set the query misses the non-overlapping leading date range.
    CREATE TABLE TABLE1 (
    CASE_NUMBER VARCHAR2(5),
    CHANGE_EFF_DATE DATE,
    END_EFF_DATE DATE,
    STATUS VARCHAR2(2) );
    
    INSERT INTO TABLE1 VALUES
    ( '0023A'
    ,TO_DATE('15-JAN-2008','DD-MON-YYYY')
    ,TO_DATE('31-DEC-9999','DD-MON-YYYY')
    ,'AC'
    );
    
    INSERT INTO TABLE1 VALUES
    ( '0023A'
    ,TO_DATE('07-OCT-2007','DD-MON-YYYY')
    ,TO_DATE('14-JAN-2008','DD-MON-YYYY')
    ,'CL'
    );
    
    INSERT INTO TABLE1 VALUES
    ( '0023A'
    ,TO_DATE('08-APR-2007','DD-MON-YYYY')
    ,TO_DATE('06-OCT-2007','DD-MON-YYYY')
    ,'AC'
    );
    
    INSERT INTO TABLE1 VALUES
    ( '0023A'
    ,TO_DATE('13-MAR-2007','DD-MON-YYYY')
    ,TO_DATE('07-APR-2007','DD-MON-YYYY')
    ,'RJ'
    );
    
    INSERT INTO TABLE1 VALUES
    ( '0023A'
    ,TO_DATE('31-MAY-2005','DD-MON-YYYY')
    ,TO_DATE('12-MAR-2007','DD-MON-YYYY')
    ,'AP'
    );
    
    CREATE TABLE TABLE2 (
    CASE_NUMBER VARCHAR2(5),
    CHANGE_EFF_DATE DATE,
    END_EFF_DATE DATE,
    CENTER_CODE VARCHAR2(3) );
    
    INSERT INTO TABLE2 VALUES
    ( '0023A'
    ,TO_DATE('22-JUN-2007','DD-MON-YYYY')
    ,TO_DATE('31-DEC-9999','DD-MON-YYYY')
    ,'C12'
    );
    
    INSERT INTO TABLE2 VALUES
    ( '0023A'
    ,TO_DATE('09-MAR-2007','DD-MON-YYYY')
    ,TO_DATE('21-JUN-2007','DD-MON-YYYY')
    ,'101'
    );
    
    SQL> SELECT * FROM TABLE1; 
    
    CASE_ CHANGE_EF END_EFF_D ST
    --------------------------------------------------------
    0023A 15-JAN-08 31-DEC-99 AC 
    0023A 07-OCT-07 14-JAN-08 CL 
    0023A 08-APR-07 06-OCT-07 AC 
    0023A 13-MAR-07 07-APR-07 RJ 
    0023A 31-MAY-05 12-MAR-07 AP 
    
    SQL> SELECT * FROM TABLE2; 
    
    CASE_ CHANGE_EF END_EFF_D CEN
    --------------------------------------------------
    0023A 22-JUN-07 31-DEC-99 C12
    0023A 09-MAR-07 21-JUN-07 101
    -----
    Here's the query I run to get the joined information from the two tables with respect to each point in time.
    SELECT T1.CASE_NUMBER
    ,GREATEST(T1.CHANGE_EFF_DATE,T2.CHANGE_EFF_DATE) CHANGE_EFF_DATE
    ,LEAST(T1.END_EFF_DATE,T2.END_EFF_DATE) END_EFF_DATE
    ,T1.STATUS
    ,T2.CENTER_CODE
    FROM 
    TABLE1 T1 
    LEFT OUTER JOIN
    TABLE2 T2 
    ON 
    T1.CASE_NUMBER=T2.CASE_NUMBER AND
    T1.CHANGE_EFF_DATE <= T2.END_EFF_DATE AND 
    T2.CHANGE_EFF_DATE <= T1.END_EFF_DATE
    ORDER BY 2;
    
    Here is the result-set I am getting :
    
    CASE_ CHANGE_EF END_EFF_D ST CEN
    ------------------------------------------------------
    0023A 09-MAR-07 12-MAR-07 AP 101
    0023A 13-MAR-07 07-APR-07 RJ 101
    0023A 08-APR-07 21-JUN-07 AC 101
    0023A 22-JUN-07 06-OCT-07 AC C12
    0023A 07-OCT-07 14-JAN-08 CL C12
    0023A 15-JAN-08 31-DEC-99 AC C12
    My result set should also include the non-overlapping dates, looking like the following, but I am missing the top (first) record in my output:
    CASE_ CHANGE_EF END_EFF_D ST CEN
    -----------------------------------------------------
    
    0023A 31-MAY-07 08-MAR-07 AP
    0023A 09-MAR-07 12-MAR-07 AP 101
    0023A 13-MAR-07 07-APR-07 RJ 101
    0023A 08-APR-07 21-JUN-07 AC 101
    0023A 22-JUN-07 06-OCT-07 AC C12
    0023A 07-OCT-07 14-JAN-08 CL C12
    0023A 15-JAN-08 31-DEC-99 AC C12
    I'll be really grateful if you guys can help me.

    Thank you.

    Vlaminck

    Published by: Oracle developer on December 11, 2008 21:30

    Hi, Vlaminck,

    If I understand the problem, you need a row of table2 that covers all the dates before the first change_eff_date, similar to the way you have a row that covers all dates after the last change_eff_date. You don't have to store such a row in the table: you can generate one at runtime and use UNION to add it to your actual data.

    WITH     t2     AS
    (
         SELECT     case_number
         ,     change_eff_date
         ,     end_eff_date
         ,     center_code
         FROM     table2
         UNION
         SELECT     case_number
         ,     TO_DATE (1, 'J')          AS change_eff_date     -- Earliest possible date
         ,     MIN (change_eff_date) - 1     AS end_eff_date
         ,     NULL                    AS center_code
         FROM     table2
         GROUP BY     case_number
    )
    SELECT     T1.CASE_NUMBER
    ,     GREATEST (T1.CHANGE_EFF_DATE, T2.CHANGE_EFF_DATE)     CHANGE_EFF_DATE
    ,     LEAST (T1.END_EFF_DATE ,T2.END_EFF_DATE)          END_EFF_DATE
    ,     T1.STATUS
    ,     T2.CENTER_CODE
    FROM          TABLE1     T1
    LEFT OUTER JOIN          T2
    ON     T1.CASE_NUMBER          =  T2.CASE_NUMBER
    AND     T1.CHANGE_EFF_DATE     <= T2.END_EFF_DATE
    AND     T2.CHANGE_EFF_DATE     <= T1.END_EFF_DATE
    ORDER BY 1, 2;
    

    Note that the main query is exactly what you had before, except for the definition of t2.
    Where t2 was simply table2 before, it is now the UNION of table2 with one generated row per case_number, whose change_eff_date is the earliest possible date (in 4712 BCE).
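To make the mechanics of the workaround concrete outside SQL, here is a small JavaScript model of the same idea (the function names and the numeric "dates" are hypothetical, for illustration only): a synthetic range covering everything before the first real change_eff_date is prepended, and the overlap join then clips each pair of ranges with max/min, just as GREATEST/LEAST do in the query.

```javascript
// Each row is { from, to, value }; dates are plain numbers for simplicity.
// padEarliest() plays the role of the UNION branch: it prepends a
// synthetic row covering everything before the first real range.
function padEarliest(rows, nullValue) {
  const minFrom = Math.min(...rows.map(r => r.from));
  return [{ from: -Infinity, to: minFrom - 1, value: nullValue }, ...rows];
}

// Overlap join: emit one row per overlapping pair of ranges, clipped
// with max/min exactly like GREATEST/LEAST in the SQL query.
function overlapJoin(t1, t2) {
  const out = [];
  for (const a of t1) {
    for (const b of t2) {
      if (a.from <= b.to && b.from <= a.to) {
        out.push({ from: Math.max(a.from, b.from),
                   to: Math.min(a.to, b.to),
                   status: a.value, center: b.value });
      }
    }
  }
  return out.sort((x, y) => x.from - y.from);
}
```

Without the padding row, any part of t1 that starts before t2's first range simply finds no join partner, which is exactly the missing first record in the output above.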

  • ASA5505 site-to-site VPN and limiting access - URGENT

    I'll admit limited knowledge up front, so forgive me if I sound like a fool.  The company I work for recently began hosting our application for some of our customers. To do this, we are renting rack space, connectivity and equipment in a data center.  We must send data from our application in the data center to an application in our customers' data center.  They have an ASA 5505.

    Our data center will support site-to-site VPN and nothing else.  Our client finds this unacceptable, citing security and the inability to restrict access to only the small number of servers our application needs to reach.  I have to be able to talk intelligently and with facts (and, preferably, configuration examples in hand) with their CIO and network staff in the next day or so.

    Can the ASA 5505 be configured for a site-to-site VPN with our data center that limits our application server to a small set of IP addresses within their network?  If so, is this reasonably easy?  Has anyone done this?

    Thank you

    Leighton Wingerd

    Leighton,

    Sounds like a complicated problem - but it's actually simple.  Remember that a VPN secures the transmission from site A to site B over an insecure environment - the internet.  Just as you DEFINE the traffic that goes through the VPN, you also DEFINE the traffic that brings up the VPN tunnel in the first place.  With that said - using your assumed addressing, you would create an ACL matching exactly the traffic you want to allow through the VPN:

    access-list datacentre_2_client extended permit tcp host 1.2.3.4 host 192.168.1.2 eq 1521

    And you would use the same ACL to define which traffic can cross the tunnel.  However, I know for a fact that an Oracle ODBC connection uses more than one TCP port!

    Data confidentiality is something else - your customer needs to define the requirements.  An SSL connection is fine and dandy - you would just be encrypting the traffic twice!

  • Upload limits

    Is there a maximum file size that you can upload to Eloqua?

    Does Eloqua have a maximum storage limit? If so, is it possible to see how much storage we have used in our instance?

    The maximum single file size (image or video) you can upload to Eloqua is 10 MB. Here is a link to a post about upload/import data file size limits: * 2639084 *

  • vCloud Director 5.5 appliance - limits

    Hi all

    Is there any VMware document that specifies the limits of the vCloud Director 5.5 appliance? I know it is for trial / PoC use and should not be used in a production environment, but I am interested in a more complex PoC and have been reading here: "The vCloud Director appliance. VMware vSphere Blog - VMware Blogs". Limitations:

    The vCloud Director appliance does have some limitations.  As it is intended only for vCloud Director evaluations, the scale of the deployment is limited to:

    • One vCloud Director cell
    • Two vCenter servers
    • 10 organization VDCs
    • 100 virtual machines
    • Up to 11 GB of information stored in the embedded database

    As this is an old link, I'm sure there is something more up-to-date and these limits have increased, but I would like to find something about it.

    I know it supports CentOS, but unfortunately in this PoC I do not have a Windows Server (I will also use the vCenter appliance) or any external database.

    Thanks in advance for any comments.

    We no longer produce the appliance.  One consequence of that is that you won't get updates for it.  The limits are generally the same as what you posted.

    It doesn't really support CentOS as such, since the appliance is delivered as a black box - it is SuSE or CentOS depending on the version.  The embedded DB has size limits.

    If you are doing a more complex PoC, I suggest using the BIN installer on CentOS and having at least 2 vCloud Director cells in a load-balanced configuration.  Multiple cells is something you just can't do with the appliance.
