Query on v$ views fails from a shell script
Hello. The following query runs fine in SQL*Plus (the database is mounted but not open):
SQL> SELECT member FROM v$logfile;
MEMBER
--------------------------------------------------------------------------------
+Data/testgfi/onlinelog/group_1.298.773871647
+Data/testgfi/onlinelog/group_2.294.773871647
+Data/testgfi/onlinelog/group_3.295.773871647
But it errors if I run it from a shell script:
$ sqlplus "/ as sysdba" << EOF
SELECT member FROM v$logfile;
EOF
*
ERROR at line 1:
ORA-01219: database not open: queries allowed on fixed tables/views only...
Help, please...
Thank you...
Aljaro
Aljaro wrote:
Hello. The following query runs fine in SQL*Plus (the database is mounted but not open):
SQL> SELECT member FROM v$logfile;
MEMBER
--------------------------------------------------------------------------------
+Data/testgfi/onlinelog/group_1.298.773871647
+Data/testgfi/onlinelog/group_2.294.773871647
+Data/testgfi/onlinelog/group_3.295.773871647
But it errors if I run it from a shell script:
$ sqlplus "/ as sysdba" << EOF
SELECT member FROM v$logfile;
Change it as below:
SELECT member FROM v\$logfile;
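As a quick illustration of why the backslash is needed (a minimal sketch using cat in place of sqlplus, since no database is involved): inside an unquoted heredoc the shell expands $logfile, which is unset, so the table name collapses to just "v" before SQL*Plus ever sees it.

```shell
# $logfile is not set in the shell, so inside an unquoted heredoc it
# expands to an empty string and "v$logfile" collapses to just "v".
unset logfile
mangled=$(cat <<EOF
SELECT member FROM v$logfile;
EOF
)

# Escaping the dollar sign keeps the text intact for sqlplus.
escaped=$(cat <<EOF
SELECT member FROM v\$logfile;
EOF
)

echo "$mangled"
echo "$escaped"
```

Running this shows the mangled query `SELECT member FROM v;`, which is why the database complains rather than returning the log file members.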
Tags: Database
Similar Questions
-
Rewrite the query without views
I remember seeing a function in an Oracle PL/SQL package that would rewrite a query so that it no longer contained any views (only the real base tables)... but I can't remember the name of the function.
Which function/package does this?
Hello
Are you looking for the new 12c feature DBMS_UTILITY.EXPAND_SQL_TEXT?
Regards
Marcus
-
How to get rid of repeating groups using a query? This is my SQL script.
Sir,
I am on Oracle 10g Express Edition.
I want to get rid of repeating groups and get the result I am after.
My table script is as below:
BEGIN -- drop tables
EXECUTE IMMEDIATE 'DROP TABLE CUSTOMER';
EXECUTE IMMEDIATE 'DROP TABLE PRODUCT';
EXECUTE IMMEDIATE 'DROP TABLE SUPPLIER';
EXECUTE IMMEDIATE 'DROP TABLE PURCHASE';
EXECUTE IMMEDIATE 'DROP TABLE PURCHASELINE';
EXECUTE IMMEDIATE 'DROP TABLE SALES';
EXECUTE IMMEDIATE 'DROP TABLE SALESLINE';
EXECUTE IMMEDIATE 'DROP TABLE STOCK';
EXCEPTION WHEN OTHERS THEN DBMS_OUTPUT.PUT_LINE('');
END;
/
CREATE TABLE CUSTOMER ( cust_id NUMBER NOT NULL, name VARCHAR2(50) NOT NULL, address VARCHAR2(100) DEFAULT NULL, contactno VARCHAR2(20) DEFAULT NULL, CONSTRAINT CUSTOMER_PK PRIMARY KEY ( cust_id ) ENABLE );
CREATE TABLE PRODUCT ( prod_id NUMBER NOT NULL, name VARCHAR2(50) NOT NULL, description VARCHAR2(50) DEFAULT NULL, CONSTRAINT PRODUCT_PK PRIMARY KEY ( prod_id ) ENABLE );
CREATE TABLE SUPPLIER ( suplr_id NUMBER NOT NULL, name VARCHAR2(50) NOT NULL, address VARCHAR2(100) DEFAULT NULL, contactno VARCHAR2(20) DEFAULT NULL, CONSTRAINT SUPPLIER_PK PRIMARY KEY ( suplr_id ) ENABLE );
CREATE TABLE PURCHASE ( pur_id NUMBER NOT NULL, pur_date DATE NOT NULL, suplr_id NUMBER DEFAULT '0', CONSTRAINT PUR_SUPLR_FK FOREIGN KEY ( suplr_id ) REFERENCES SUPPLIER ( suplr_id ), CONSTRAINT PURCHASE_PK PRIMARY KEY ( pur_id ) ENABLE );
CREATE TABLE PURCHASELINE ( pur_id NUMBER DEFAULT '0' NOT NULL, prod_id NUMBER DEFAULT '0' NOT NULL, pur_qty NUMBER DEFAULT '0' NOT NULL, unit_pur_price NUMBER DEFAULT '0' NOT NULL, CONSTRAINT PUR_LINE_PUR_FK FOREIGN KEY ( pur_id ) REFERENCES PURCHASE ( pur_id ), CONSTRAINT PUR_LINE_POD_FK FOREIGN KEY ( prod_id ) REFERENCES PRODUCT ( prod_id ), CONSTRAINT PUR_LINE_PK PRIMARY KEY ( pur_id, prod_id ) ENABLE );
CREATE TABLE SALES ( sal_id NUMBER NOT NULL, sal_date DATE DEFAULT NULL, cust_id NUMBER DEFAULT '0', CONSTRAINT PUR_CUSTR_FK FOREIGN KEY ( cust_id ) REFERENCES CUSTOMER ( cust_id ), CONSTRAINT SALES_PK PRIMARY KEY ( sal_id ) ENABLE );
CREATE TABLE SALESLINE ( sal_id NUMBER DEFAULT '0' NOT NULL, prod_id NUMBER DEFAULT '0' NOT NULL, sal_qty NUMBER DEFAULT '0' NOT NULL, unit_sal_price NUMBER DEFAULT '0' NOT NULL, CONSTRAINT SAL_LINE_SAL_FK FOREIGN KEY ( sal_id ) REFERENCES SALES ( sal_id ), CONSTRAINT SAL_LINE_POD_FK FOREIGN KEY ( prod_id ) REFERENCES PRODUCT ( prod_id ), CONSTRAINT SAL_LINE_PK PRIMARY KEY ( sal_id, prod_id ) ENABLE );
CREATE TABLE STOCK ( prod_id NUMBER NOT NULL, prod_qty NUMBER DEFAULT '0' NOT NULL, re_ord_level NUMBER DEFAULT '0' NOT NULL, CONSTRAINT STOCK_POD_FK FOREIGN KEY ( prod_id ) REFERENCES PRODUCT ( prod_id ), CONSTRAINT STOCK_PK PRIMARY KEY ( prod_id ) ENABLE );
SET DEFINE OFF;
-- ***** Populate Tables *****
--CUSTOMER table data
INSERT INTO CUSTOMER VALUES(1,'Kamrul Hasan','Moghbazar, Dhaka','0456789123');
INSERT INTO CUSTOMER VALUES(2,'Rabiul Alam','Motijheel, Dhaka','0567891234');
INSERT INTO CUSTOMER VALUES(3,'Shahed Hasan','2-G/1,2-2,Mirpur, Dhaka','0678912345');
--PRODUCT table data
INSERT INTO PRODUCT VALUES(1,'RAM',NULL);
INSERT INTO PRODUCT VALUES(2,'DVD Drive',NULL);
INSERT INTO PRODUCT VALUES(3,'HDD','160 GB Satta');
INSERT INTO PRODUCT VALUES(4,'Monitor','LCD 19"');
INSERT INTO PRODUCT VALUES(5,'Printer','HP Color');
INSERT INTO PRODUCT VALUES(6,'Keyboard','Multimedia Keyborad (Customised)');
INSERT INTO PRODUCT VALUES(7,'Mouse','Customised Mouse');
--SUPPLIER table data
INSERT INTO SUPPLIER VALUES(1,'Salam Enterprise','2-H/1-10, Mirpur, Dhaka, Bangladesh','0123456789');
INSERT INTO SUPPLIER VALUES(2,'ABC Supplies','Dhanmondi, Dhaka','0234567891');
INSERT INTO SUPPLIER VALUES(3,'XYZ Company','52 Gabtali, Dhaka','0345678912');
--PURCHASE table data
INSERT INTO PURCHASE VALUES(1,TO_DATE('12-12-2007','dd-mm-yyyy'),1);
INSERT INTO PURCHASE VALUES(2,TO_DATE('13-12-2007','dd-mm-yyyy'),2);
INSERT INTO PURCHASE VALUES(3,TO_DATE('13-12-2007','dd-mm-yyyy'),1);
INSERT INTO PURCHASE VALUES(4,TO_DATE('14-12-2007','dd-mm-yyyy'),1);
INSERT INTO PURCHASE VALUES(5,TO_DATE('15-12-2007','dd-mm-yyyy'),2);
INSERT INTO PURCHASE VALUES(6,TO_DATE('20-12-2007','dd-mm-yyyy'),3);
INSERT INTO PURCHASE VALUES(7,TO_DATE('05-01-2007','dd-mm-yyyy'),2);
INSERT INTO PURCHASE VALUES(8,TO_DATE('06-05-2007','dd-mm-yyyy'),3);
INSERT INTO PURCHASE VALUES(9,TO_DATE('15-07-2008','dd-mm-yyyy'),1);
--PURCHASELINE table data
INSERT INTO PURCHASELINE VALUES(1,1,25,900);
INSERT INTO PURCHASELINE VALUES(1,2,10,1700);
INSERT INTO PURCHASELINE VALUES(1,3,10,5000);
INSERT INTO PURCHASELINE VALUES(1,4,9,5500);
INSERT INTO PURCHASELINE VALUES(1,7,100,250);
INSERT INTO PURCHASELINE VALUES(2,3,15,5000);
INSERT INTO PURCHASELINE VALUES(2,6,20,500);
INSERT INTO PURCHASELINE VALUES(3,6,25,450);
INSERT INTO PURCHASELINE VALUES(4,7,100,200);
INSERT INTO PURCHASELINE VALUES(5,5,10,3450);
INSERT INTO PURCHASELINE VALUES(5,6,10,180);
INSERT INTO PURCHASELINE VALUES(6,1,15,900);
--SALES table data
INSERT INTO SALES VALUES(1,TO_DATE('12-12-2007','dd-mm-yyyy'),2);
INSERT INTO SALES VALUES(2,TO_DATE('15-12-2007','dd-mm-yyyy'),3);
INSERT INTO SALES VALUES(3,TO_DATE('20-12-2007','dd-mm-yyyy'),2);
INSERT INTO SALES VALUES(4,TO_DATE('28-12-2007','dd-mm-yyyy'),3);
INSERT INTO SALES VALUES(5,TO_DATE('05-01-2008','dd-mm-yyyy'),1);
INSERT INTO SALES VALUES(6,TO_DATE('12-01-2008','dd-mm-yyyy'),3);
INSERT INTO SALES VALUES(7,TO_DATE('12-02-2008','dd-mm-yyyy'),1);
INSERT INTO SALES VALUES(8,TO_DATE('12-02-2008','dd-mm-yyyy'),2);
INSERT INTO SALES VALUES(9,TO_DATE('12-02-2008','dd-mm-yyyy'),3);
--SALESLINE table data
INSERT INTO SALESLINE VALUES(1,1,3,1000);
INSERT INTO SALESLINE VALUES(1,3,1,5500);
INSERT INTO SALESLINE VALUES(1,4,1,6000);
INSERT INTO SALESLINE VALUES(1,6,2,500);
INSERT INTO SALESLINE VALUES(1,7,2,200);
INSERT INTO SALESLINE VALUES(2,2,2,1900);
INSERT INTO SALESLINE VALUES(2,7,2,200);
INSERT INTO SALESLINE VALUES(3,4,1,5500);
INSERT INTO SALESLINE VALUES(4,2,1,2200);
INSERT INTO SALESLINE VALUES(5,6,1,300);
INSERT INTO SALESLINE VALUES(5,7,2,250);
INSERT INTO SALESLINE VALUES(6,6,1,300);
INSERT INTO SALESLINE VALUES(6,7,3,180);
INSERT INTO SALESLINE VALUES(7,1,2,1000);
INSERT INTO SALESLINE VALUES(7,2,1,1900);
INSERT INTO SALESLINE VALUES(7,3,2,5500);
INSERT INTO SALESLINE VALUES(8,4,1,5500);
INSERT INTO SALESLINE VALUES(8,6,2,300);
INSERT INTO SALESLINE VALUES(8,7,1,200);
INSERT INTO SALESLINE VALUES(9,1,3,1000);
INSERT INTO SALESLINE VALUES(9,3,2,5500);
INSERT INTO SALESLINE VALUES(9,5,2,1000);
--STOCK table data
INSERT INTO STOCK VALUES(1,200,15);
INSERT INTO STOCK VALUES(2,25,10);
INSERT INTO STOCK VALUES(3,40,10);
INSERT INTO STOCK VALUES(4,20,10);
INSERT INTO STOCK VALUES(5,10,10);
INSERT INTO STOCK VALUES(6,50,20);
INSERT INTO STOCK VALUES(7,150,20);
COMMIT;
Regards
I could not figure out how you derived your expected results, especially the last column. I guess your problem is that you have several rows per prod_id in the purchaseline/salesline tables. Grouping by prod_id in those tables should solve your problem. Something like this:
SQL> select p.prod_id as product_prod_id
     , p.name as product_name
     , p.description as product_description
     , s.prod_qty as stock_prod_qty
     , s.re_ord_level as stock_re_ord_level
     , pl.pur_qty as purchaseline_pur_qty
     , pl.unit_pur_price as purchaseline_unit_pur_price
     , sl.sal_qty as salesline_sal_qty
     , sl.unit_sal_price as salesline_unit_sal_price
  from product p
  join stock s
    on p.prod_id = s.prod_id
  join ( select prod_id
              , sum(pur_qty) pur_qty
              , sum(unit_pur_price) unit_pur_price
           from purchaseline
          group by prod_id ) pl
    on p.prod_id = pl.prod_id
  join ( select prod_id
              , sum(sal_qty) sal_qty
              , sum(unit_sal_price) unit_sal_price
           from salesline
          group by prod_id ) sl
    on p.prod_id = sl.prod_id
 order by p.prod_id;

PRODUCT_PROD_ID PRODUCT_NAME PRODUCT_DESCRIPTION              STOCK_PROD_QTY STOCK_RE_ORD_LEVEL PURCHASELINE_PUR_QTY PURCHASELINE_UNIT_PUR_PRICE SALESLINE_SAL_QTY SALESLINE_UNIT_SAL_PRICE
--------------- ------------ -------------------------------- -------------- ------------------ -------------------- --------------------------- ----------------- ------------------------
              1 RAM                                                      200                 15                   40                        1800                 8                     3000
              2 DVD Drive                                                 25                 10                   10                        1700                 4                     6000
              3 HDD          160 GB Satta                                 40                 10                   25                       10000                 5                    16500
              4 Monitor      LCD 19"                                      20                 10                    9                        5500                 3                    17000
              5 Printer      HP Color                                     10                 10                   10                        3450                 2                     1000
              6 Keyboard     Multimedia Keyborad (Customised)             50                 20                   55                        1130                 6                     1400
              7 Mouse        Customised Mouse                            150                 20                  200                         450                10                     1030

7 rows selected.
-
Execution of shell script using OSLinetoken fetchlet
Hi,
I have a requirement: I need to invoke a shell script from an OSLineToken fetchlet. In the response metric I will check whether a directory already exists on the server. To check for the directory's existence I created a shell script. But how can I report its result back through the response metric? The shell script is as follows:
Shell script:
if test -d $1; then
  echo "DIR exists."
else
  echo "no"
fi
The QueryDescriptor for the response metric will be:
<QueryDescriptor FETCHLET_ID="OSLineToken">
  <Property NAME="command" SCOPE="GLOBAL">sh {directory where the shell script is uploaded}/{shell script file name} {dir_name_parameter}</Property>
  <Property NAME="startsWith" SCOPE="GLOBAL">em_result=</Property>
  <Property NAME="delimiter" SCOPE="GLOBAL">|</Property>
</QueryDescriptor>
Please suggest: what is the purpose of em_result here?
Once the directory's existence is checked, and if it exists, I then need to call another shell script that concatenates the contents of all files with a .log extension (the extension will be a parameter of the shell script), capture that script's output, and display it in the custom management plug-in. Since I use the command cat *.log >> consolidatefile to concatenate the data, I need to read the consolidatefile file from the server and return its concatenated contents to the plug-in. Once again, how can I read the contents of the consolidatefile file in EM? I will create another metric for this, say 'read_content'. Its QueryDescriptor will be as follows:
<QueryDescriptor FETCHLET_ID="OSLineToken">
  <Property NAME="command" SCOPE="GLOBAL">sh {directory where the shell script is uploaded}/{shell script file name} {dir_name_parameter} {extension of the files to concatenate}</Property>
  <Property NAME="startsWith" SCOPE="GLOBAL">em_result=</Property>
  <Property NAME="delimiter" SCOPE="GLOBAL">|</Property>
</QueryDescriptor>
I'm not sure which properties to use in this case... I saw that several example files use some of them (perlBin, scriptsDir), but some do not... The related PDF also says nothing about these kinds of properties. Please advise.
I hope the explanation of the problem is not too heavy going. Please let me know if you have any questions.
Thank you
AS
The Extensibility Guide has instructions on how to package your files as a Management Plug-in Archive (MPA). Once you have this archive file, you can import your Management Plug-in (MP) into the repository from EM Grid Control. Once imported, you can deploy it through the UI to any agents in your Grid Control. (You can get to the Plug-ins UI by clicking Setup in the upper right corner, then Management Plug-ins about halfway down on the left side of the page.)
When the MP is deployed to the agent, the emx/
directory structure is created and all the scripts you packaged with your MP are placed in this directory. So when you create your target type metadata file, you'll want to reference this directory. Properties with a SYSTEMGLOBAL scope get their values from the emd.properties file.
If you decide to put the scripts up on the agent yourself (which is probably easier than repackaging all your files every time you edit them, then re-importing and redeploying), you will need to create the directory structure under %scriptsDir%. That way, if you eventually package your plug-in, everything works the same.
-
How to validate a SQL*Plus connection in a Unix shell script
I wrote the following function in a Unix shell script to validate the SQL*Plus connection and print a user-defined message.
function check_db_conn
{
  echo "exit" | sqlplus -s -L $User/$Password@$SID > /dev/null
  if [[ $? -ne 0 ]]; then
    echo "Incorrect DB credentials."
  fi
}
However, I would like to change this function so that when the user has entered correct credentials but there is a problem with the TNS listener, an appropriate message is displayed.
Hello
Try adding:
lsnrctl status $listener_name > /dev/null
if [[ $? -ne 0 ]]; then
  echo "Issue with Listener"
fi
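The same exit-status pattern underlies both checks. Here is a minimal generic sketch, with true and false standing in for sqlplus and lsnrctl (which are not available outside a database host):

```shell
# Run a command, discard its output, and branch on its exit status ($?).
check_cmd() {
  "$@" > /dev/null 2>&1
  if [ $? -ne 0 ]; then
    echo "FAILED: $1"
  else
    echo "OK: $1"
  fi
}

check_cmd true    # exit status 0 -> OK: true
check_cmd false   # exit status 1 -> FAILED: false
```

Any command that reports failure through its exit status (sqlplus -L, lsnrctl status, ...) can be dropped in where true/false appear here.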
-
Using a SQL query result in a shell script variable
OS: Oracle Linux 5.10
DB Oracle 11.2.0
I need to query the database and assign the returned value to a shell variable, but it does not work.
create table imr_env (key varchar2(1000), value varchar2(1000));
insert into imr_env values('TblspcUsagePct','90');
commit;
Here is the shell script:
#!/bin/bash
echo "in script"
ORACLE_HOME=/u01/app/oracle/product/11.2.0/db_1 ; export ORACLE_HOME
ORACLE_SID=IMR1 ; export ORACLE_SID
export PATH=$PATH:$ORACLE_HOME/bin
pct=`sqlplus -S app/manager <<END
set serveroutput on
declare
  output_val number ;
BEGIN
  select value into output_val from imr_env where key = 'TblspcUsagePct' ;
  dbms_output.put_line('output_val: ' || to_char(output_val)) ;
END ;
/
exit;
END`
## another sqlplus connection, use $pct in the where clause
echo "var value is $pct"
Here is the result:
$ ./test.sh
in script
var value is output_val: 90
Why isn't the shell variable populated with just the value retrieved from the database?
I need to use $pct in another query that will list the tablespaces that are at least 90% full.
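The variable is in fact populated; it just holds the whole dbms_output line, label included. One way to keep only the number (a sketch, with a hard-coded string standing in for the sqlplus call above) is to take the last whitespace-separated field:

```shell
# Simulated output of the heredoc query above (an assumption: the real
# value comes from sqlplus, which is not available here).
raw="output_val: 90"

# awk prints only the last field, dropping the "output_val:" label.
pct=$(echo "$raw" | awk '{print $NF}')
echo "$pct"
```

After this, $pct contains just 90 and can be interpolated into the next query's WHERE clause.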
I had to change to the Korn shell - I couldn't get the "read" command to work in the Bourne shell or bash.
But it does not have to be the "read" command:
SQL> select * from IMR_ENV;
KEY                  VALUE
-------------------- -----
TblspcWarningLimit   90
TblspcUsagePct       50
SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
$ cat test.sh
VAL=$(sqlplus -S test/test <<EOF
set feedback off
set heading off
select to_number(value) from imr_env where key = 'TblspcUsagePct';
select to_number(value) from imr_env where key = 'TblspcWarningLimit';
exit;
EOF
)
set $VAL
echo $1
echo $2
$ ./test.sh
50
90
$ echo $SHELL
/ bin/bash
$
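The trick in the script above is the unquoted expansion in set $VAL: the shell word-splits the multi-line sqlplus output and assigns the pieces to the positional parameters. A standalone sketch, with a hard-coded string in place of the sqlplus output:

```shell
# Two values on separate lines, as sqlplus would print them
# (an assumption: the real script gets these from the database).
VAL="50
90"

# Unquoted $VAL is word-split on whitespace (including newlines);
# "--" guards against a first value that starts with "-".
set -- $VAL
first=$1
second=$2
echo "$first"
echo "$second"
```

This is why the original script can echo $1 and $2 after the sqlplus call, without any read loop.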
-
Query does not run in a shell script
Hi Experts,
I created a query to check temp tablespace usage on the production servers, and I wanted to run it from a shell script. But when I run the query in the script I get no error and no output, and I am unable to work out where exactly I am making the mistake.
This is the query
SELECT A.tablespace_name "space", D.gb_total,
       ROUND(SUM(A.used_blocks * D.block_size) / 1024 / 1024 / 1024, 2) "gb_used",
       ROUND(D.gb_total - SUM(A.used_blocks * D.block_size) / 1024 / 1024 / 1024, 2) "gb_free"
FROM   v\$sort_segment A,
       (SELECT B.name, C.block_size,
               ROUND(SUM(C.bytes) / 1024 / 1024 / 1024, 2) "gb_total"
        FROM   v\$tablespace B, v\$tempfile C
        WHERE  B.ts# = C.ts#
        GROUP  BY B.name, C.block_size) D
WHERE  A.tablespace_name = D.name
GROUP  BY A.tablespace_name, D.gb_total
Can you help with this?
Thank you...
Basic Linux/Unix shell script format:
#!/bin/sh
# set oracle environment
export ORACLE_HOME=...
export ORACLE_SID=...   or   export TWO_TASK=...
# ...etc.
# launch sqlplus
sqlplus -s /nolog <<EOF
connect scott/tiger
set ... sqlplus environment settings ...
-- run the sql script (or sql query)
@script.sql
exit
EOF
# end of script
The $ character is interpreted by the shell as the start of an environment variable, so you must escape the $ so that it is not treated as a shell variable.
For example, v$instance should be written as v\$instance.
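An alternative to escaping every $, assuming no shell variables need to be interpolated into the SQL: quote the heredoc delimiter, which disables all expansion inside the heredoc. A minimal sketch with cat standing in for sqlplus:

```shell
unset instance
# With <<'EOF' (quoted delimiter) the shell performs no $ expansion,
# so v$instance passes through untouched and needs no backslash.
quoted=$(cat <<'EOF'
SELECT instance_name FROM v$instance;
EOF
)
echo "$quoted"
```

Note that with a quoted delimiter you also lose the ability to interpolate shell variables such as $User into the script, so this only suits fully static SQL.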
-
Calling a Unix shell script from OBIEE
Hi all
Is it possible to call a Unix shell script from the OBIEE Action Framework?
Thanks in advance.
I don't think so, but you can approach it from the BI side instead. Try moving the same functionality into a shell script, so that the script identifies your report by tailing the nqquery log and then executes the rest of your shell script logic.
This can be done and works as expected. With this suggestion it helps if you identify your report by a short keyword rather than by the full long logical query.
Please mark as helpful if this helps.
-
Get the progress and result of a shell script
Hello
I would like to use a shell script to convert FLAC audio files to MP3 files with a bitrate of 192 kbps, and do other things in an Automator or AppleScript workflow.
So I installed lame and flac with brew, run an Automator workflow on a selected file, and perform a Run AppleScript action:
on run {input, parameters}
    if input is not {} then
        repeat with theFile in input
            set posix_file to POSIX path of theFile
            if input is not {} then
                tell application "Terminal"
                    set thescript to "/usr/local/bin/lame -b 192 \"" & posix_file & "\" "
                    do script thescript
                end tell
            end if
        end repeat
    end if
    return input
end run
The Terminal window appears because you tell Terminal to run your script - try using "do shell script" instead.
-
Custom import program does not run when called from a shell script
Hello
I have a package containing
PROC1, with no OUT parameters; in this procedure I call proc2 and the import program I created.
Now, when I call this proc1 from the shell script, my custom import program does not run.
Why? And when I try to run proc1 via CONCSUB with the two output parameters errbuf and retcode, I get "wrong number of arguments" in the log.
Please help me.
Thank you
Renon,
Proc1 is your concurrent program, right?
How many parameters are defined for it?
Can you paste a screenshot of the parameters window?
If there are no parameters, just use this:
$FND_TOP/bin/CONCSUB $LOGIN SQLAP 'Accounts Payable Manager' 'RENON' WAIT=N CONCURRENT SQLAP PROC1
Assuming that "RENON" is an FND username and the program is defined under the Accounts Payable application.
Cheers
AJ
-
Running EDQ jobs from a Linux shell script
Hello
We are on EDQ 11.1.1.7.3, with WebLogic as the web server and Linux as the EDQ app server.
I want to know how we can run EDQ jobs from Linux and script these commands in a shell script:
(1) submit the job (run it)
(2) check the job status to see whether the job is still running or complete
I know that on Windows there is a command-line utility, jmxtools.jar, in the dnDirector/Tools folder. But I am not able to find the jmxtools.jar file in either the edq.home or edq.local.home directories of the Linux install.
Can you guide me to the location where the jmx utility is available on Linux, and also to the command for checking the status of a job?
Thank you
Ravi
That's it!
Jobs called via the command-line interface return an exit status when finished.
Monitoring the detailed status of a long-running job requires using the EDQ Director console user interface on the server.
-
Hello,
I would like to know whether it is possible to change the view criteria or the query used by an LOV input field before the popup window opens (on clicking the search icon).
Thank you
Federico
You can change the implicit view criteria used by the LOV by overriding the launchPopupListener and applying a different view criteria, as in:
public void onLauchLov(LaunchPopupEvent launchPopupEvent) {
    String submittedValue = (String) launchPopupEvent.getSubmittedValue();
    // run the query only if a value was submitted
    if (submittedValue != null && submittedValue.length() > 0) {
        RichInputListOfValues lovComp =
            (RichInputListOfValues) launchPopupEvent.getComponent();
        BindingContext bindingCtx = BindingContext.getCurrent();
        BindingContainer bindings = bindingCtx.getCurrentBindingsEntry();
        JUCtrlListBinding lov = (JUCtrlListBinding) bindings.get("JobId");
        ViewCriteriaManager vcm =
            lov.getListIterBinding().getViewObject().getViewCriteriaManager();
        // ensure the implicit view criteria is removed
        vcm.removeViewCriteria(ViewCriteriaManager.IMPLICIT_VIEW_CRITERIA_NAME);
        // create a new view criteria
        ViewCriteria vc = new ViewCriteria(lov.getListIterBinding().getViewObject());
        // use the name of the default view criteria: '__DefaultViewCriteria__'
        vc.setName(ViewCriteriaManager.IMPLICIT_VIEW_CRITERIA_NAME);
        // create a view criteria row for all queryable attributes
        ViewCriteriaRow vcr = new ViewCriteriaRow(vc);
        // for this example I set the query filter to DepartmentId 60;
        // you can determine it at runtime from a managed bean
        // or the binding layer
        vcr.setAttribute("JobId", submittedValue + "%");
        // note that the view criteria row consists of all attributes
        // that belong to the LOV list view object, which means you can
        // filter on multiple attributes
        vc.addRow(vcr);
        lov.getListIterBinding().getViewObject().applyViewCriteria(vc);
    }
}
This is based on the sample of an LOV on the employee Job attribute from http://www.oracle.com/technetwork/developer-tools/adf/learnmore/29-adf-model-driven-llist-of-values-169171.pdf
Timo
-
Calling a batch file rather than a shell script using event actions
Can I use EVENTACTIONS in GoldenGate like this:
EVENTACTIONS (SHELL /path-to-batch-file/test1.bat)
Will this work? I am on a Windows machine, so I need to run a batch file rather than a shell script.
In GGSCI, you can shell out to the OS and call a command or executable.
For example, on Windows:
GGSCI (WIN2003) 2> shell dir
Volume in drive C has no label.
Volume Serial Number is 8CCC-9E58
Directory of C:\ggs
18/08/2012  07:23    <DIR>          .
18/08/2012  07:23    <DIR>          ..
2010-10-15  06:37               426 bcpfmt.tpl
2010-10-15  06:37             1,725 bcrypt.txt
22/04/2011  03:41             2,560 category.dll
05/11/2011  10:43    <DIR>          cfg
2010-10-15  07:15               739 chkpt_ora_create.sql
...
What were the results when you tried this in an EVENTACTIONS clause?
-
When I export a virtual machine to OVF from an ESXi 4.0 host and then try to import the OVF into VMware Player or Workstation 7.1, I get the following message:
"Could not open the virtual machine: failed to query the source for more information." I can successfully re-import this OVF file into an ESXi server without any problems. Does anyone know why this is happening and what can be done to remedy it?
Thank you
Christopher
Hello.
Try using ovftool to convert the OVF to a VMX.
Good luck!
-
Why was the SQL statement executed twice in the shell script?
I tried to test RAC load balancing using the shell script below, on SUSE 10 + Oracle 10g RAC.
After running the shell script, I got the following result.
oracle@SZDB:~> more load_balance.sh
#!/bin/bash
for i in {1..20}
do
  echo $i
  sqlplus -S system/oracle@ORA10G <<EOF
select instance_name from v\$instance;
/
EOF
  sleep 1
done
exit 0
It seems the SQL statement was run twice in each loop. Please take a look. Thanks in advance.
oracle@SZDB:~> ./load_balance.sh
1
INSTANCE_NAME
----------------
ora10g2
INSTANCE_NAME
----------------
ora10g2
2
INSTANCE_NAME
----------------
ora10g1
INSTANCE_NAME
----------------
ora10g1
3
INSTANCE_NAME
----------------
ora10g1
INSTANCE_NAME
----------------
ora10g1
Robinson
Because you have both a ";" and a "/": the semicolon executes the statement, and the "/" then executes the statement in the SQL*Plus buffer a second time.