cache of SQL query problem
I'm looking at the log file to see the SQL queries that the sessions generate for responses. I see that if we run the same report multiple times, the SQL query is logged only the first time; if I run it a second time it is not displayed. If I make a new report with different columns selected, it gives me the SQL code then. Where can I set an option to display the SQL query every time I run a report, even if the same report is executed several times? Is this a caching problem?

It should not be... Have you disabled "Cacheable" on the physical-layer table? If you go to the Advanced tab, is the option "Bypass Oracle BI Presentation Services Cache" checked?
Tags: Business Intelligence
Similar Questions
-
A difficult dynamic SQL query problem
Hi all
I have a very interesting problem to work:
We have this special table defined as follows:
CREATE TABLE sales_data (
sales_id NUMBER,
sales_m01 NUMBER,
sales_m02 NUMBER,
sales_m03 NUMBER,
sales_m04 NUMBER,
sales_m05 NUMBER,
sales_m06 NUMBER,
sales_m07 NUMBER,
sales_m08 NUMBER,
sales_m09 NUMBER,
sales_m10 NUMBER,
sales_m11 NUMBER,
sales_m12 NUMBER,
sales_prior_yr NUMBER);
/
The columns sales_m01 ... sales_m12 represent aggregated monthly sales, where "sales_m01" means "sales for the month of January" (January being the first month), "sales_m02" means sales for February, and so on.
The problem I have is a project requirement: a parameter representing the month number is passed to a stored procedure, which must then build a SQL query whose mandatory aggregations depend on the parameter passed:
Example 1: entry of parameter: 4
The SQL query should be built dynamically as:
SELECT
Sum(sales_m04) as CURRENT_SALES,
Sum(sales_m01 + sales_m02 + sales_m03 + sales_m04) SALES_YTD
FROM
sales_data
WHERE
sales_id = '0599768';
Example 2: input parameter: 8
The SQL query should be built dynamically as:
SELECT
Sum(sales_m08) as CURRENT_SALES,
Sum(sales_m01 + sales_m02 + sales_m03 + sales_m04 +
    sales_m05 + sales_m06 + sales_m07 + sales_m08) SALES_YTD
FROM
sales_data
WHERE
sales_id = '0599768';
So, in a sense, the contents of SUM(sales_m01...n) vary according to the parameter, which must be a number between 1 and 12 (a month), and which in turn corresponds to a range of real columns on the table itself. The resulting dynamic query should include only those columns of the table that fall within the range given by the input parameter, and ignore all the other columns.
Any solution is greatly appreciated.
Thank you.

SQL> declare
  cols long;
  param integer := 6;
  sales_id integer := 0599768;
begin
  for i in 1..param loop
    cols := cols || 'sales_m' || to_char(i, 'fm00') || '+';
  end loop;
  dbms_output.put_line('select sum(sales_m' || to_char(param, 'fm00')
    || ') current_sales, sum(' || rtrim(cols, '+')
    || ') sales_ytd from sales_data WHERE sales_id = :1');
end;
/
select sum(sales_m06) current_sales, sum(sales_m01+sales_m02+sales_m03+sales_m04+sales_m05+sales_m06) sales_ytd from sales_data WHERE sales_id = :1

PL/SQL procedure successfully completed.
Now it should be obvious:
Instead of dbms_output, OPEN a ref cursor:

SQL> declare
  cols long;
  param integer := 6;
  sales_id integer := 0599768;
  cur sys_refcursor;
begin
  for i in 1..param loop
    cols := cols || 'sales_m' || to_char(i, 'fm00') || '+';
  end loop;
  open cur for 'select sum(sales_m' || to_char(param, 'fm00')
    || ') current_sales, sum(' || rtrim(cols, '+')
    || ') sales_ytd from sales_data WHERE sales_id = :1'
    using sales_id;
  .....
end;
/
Published by: michaels2 on July 26, 2009 09:49
replaced ',' with ' + '.
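As an aside, here is a sketch of an alternative that avoids dynamic SQL entirely: a static query can sum only the months up to the parameter by gating each column with a CASE flag. This is an illustration of my own, not from the thread, and assumes the same sales_data table with a numeric bind :p_month (1..12):

```sql
-- Hypothetical static alternative; :p_month is the month parameter (1..12).
-- Each sales_mNN column contributes to SALES_YTD only when NN <= :p_month.
-- (Wrap the columns in NVL(..., 0) if any month can be NULL.)
SELECT
  SUM(CASE :p_month
        WHEN 1 THEN sales_m01 WHEN 2  THEN sales_m02 WHEN 3  THEN sales_m03
        WHEN 4 THEN sales_m04 WHEN 5  THEN sales_m05 WHEN 6  THEN sales_m06
        WHEN 7 THEN sales_m07 WHEN 8  THEN sales_m08 WHEN 9  THEN sales_m09
        WHEN 10 THEN sales_m10 WHEN 11 THEN sales_m11 ELSE sales_m12
      END) AS CURRENT_SALES,
  SUM(  sales_m01 * CASE WHEN :p_month >= 1  THEN 1 ELSE 0 END
      + sales_m02 * CASE WHEN :p_month >= 2  THEN 1 ELSE 0 END
      + sales_m03 * CASE WHEN :p_month >= 3  THEN 1 ELSE 0 END
      + sales_m04 * CASE WHEN :p_month >= 4  THEN 1 ELSE 0 END
      + sales_m05 * CASE WHEN :p_month >= 5  THEN 1 ELSE 0 END
      + sales_m06 * CASE WHEN :p_month >= 6  THEN 1 ELSE 0 END
      + sales_m07 * CASE WHEN :p_month >= 7  THEN 1 ELSE 0 END
      + sales_m08 * CASE WHEN :p_month >= 8  THEN 1 ELSE 0 END
      + sales_m09 * CASE WHEN :p_month >= 9  THEN 1 ELSE 0 END
      + sales_m10 * CASE WHEN :p_month >= 10 THEN 1 ELSE 0 END
      + sales_m11 * CASE WHEN :p_month >= 11 THEN 1 ELSE 0 END
      + sales_m12 * CASE WHEN :p_month >= 12 THEN 1 ELSE 0 END) AS SALES_YTD
FROM sales_data
WHERE sales_id = '0599768';
```

The dynamic version touches fewer columns per execution; the static version keeps a single shareable cursor for every parameter value.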
-
SQL query problem - (internal has not managed to the outer query)
Hi all:
Here is my SQL query:
Why am I getting the overall percentage across all sites instead of a percentage calculated for each site? I've grouped by RD.SITE, so I assumed it would calculate the percentages per site. What have I done wrong?

SELECT RD.SITE,
       ROUND( (SELECT COUNT(DISTINCT PB.EMP_ID)
               FROM BOOK PB
               WHERE PB.EMP_ID IN (SELECT PB.EMP_ID FROM BOOK PB))
            / (SELECT COUNT(DISTINCT PB.EMP_ID)
               FROM BOOK PB
               WHERE PB.EMP_ID IN (SELECT PB.EMP_ID FROM BOOK PB
                                   WHERE MO.QUALIFIER > 4))
            * 100, 2) AS PERCENTAGE
FROM BOOK PB
LEFT JOIN POSITION MO ON PB.EMP_ID = MO.EMP_ID
INNER JOIN PHYS_LOCATION RD ON MO.HOUSED = RD.SITE_ID
WHERE MO.ACTUAL_END IS NULL
GROUP BY RD.SITE;
Thank you for your help.

It's doing exactly what you're asking it to do. Your scalar subqueries that compute the percentage are not restricted by the current site. Add filter columns to the subqueries to restrict the values selected for the calculation.
Published by: riedelme on May 8, 2013 07:26
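For reference, here is a sketch of the kind of rewrite riedelme is suggesting. Conditional aggregation computes both counts per site in one pass, so no scalar subqueries are needed (table and column names are taken from the original post; treat this as an illustration, not a tested fix):

```sql
-- Both counts are evaluated per GROUP BY site, so each site gets its own
-- ratio; NULLIF guards against division by zero.
SELECT rd.site,
       ROUND(COUNT(DISTINCT pb.emp_id)
             / NULLIF(COUNT(DISTINCT CASE WHEN mo.qualifier > 4
                                          THEN pb.emp_id END), 0) * 100,
             2) AS percentage
FROM book pb
LEFT JOIN position mo ON pb.emp_id = mo.emp_id
INNER JOIN phys_location rd ON mo.housed = rd.site_id
WHERE mo.actual_end IS NULL
GROUP BY rd.site;
```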
-
Hi friends,
I have a view called 'risk_efforts' with the fields user_id, user_name, wknd_dt, etiquettemois, prod_efforts, unprod_efforts.
Name            Type
--------------  -------------
ROW_ID          NUMBER
USER_ID         VARCHAR2(14)
USER_NAME       VARCHAR2(50)
WKND_DT         VARCHAR2(8)
ETIQUETTEMOIS   VARCHAR2(250)
PROD_EFFORTS    NUMBER
UNPROD_EFFORTS  NUMBER
data are like this:
When there is some data in prod_efforts, unprod_efforts will be null
When there is some data in unprod_efforts, prod_efforts will be null
for example:
USER_ID  USER_NAME  WKND_DT   ETIQUETTEMOIS  PROD_EFFORTS  UNPROD_EFFORTS
G666999  GTest      20100403  Mar            null          3
G666999  GTest      20100403  Mar            14            null
Now I want to combine these 2 rows into 1 row, i.e. the output should be like this:
USER_ID  USER_NAME  WKND_DT   ETIQUETTEMOIS  PROD_EFFORTS  UNPROD_EFFORTS
G666999  GTest      20100403  Mar            14            3
I tried all combinations but couldn't make the query work. Please help me with the exact SQL select query.
Thank you
Girish

Welcome to the forum.
First read this:
Second, it is always helpful to provide the following information:
1. Oracle version (SELECT * FROM V$VERSION)
2. sample data in the form of CREATE / INSERT statements
3. expected results
4. explanation of the expected results (alias "business logic")
5. code tags around #2 and #3 -- see the FAQ (link at the top right) for details.
You have provided #3 and #4. However, with no usable form of sample data, forum members will often not respond as quickly as they could if you provided #2. I'm just wagering a guess here, but what about this:
SELECT ROW_ID,
       USER_ID,
       WKND_DT,
       ETIQUETTEMOIS,
       MAX(PROD_EFFORTS) AS PROD_EFFORTS,
       MAX(UNPROD_EFFORTS) AS UNPROD_EFFORTS
FROM RISK_EFFORTS
GROUP BY ROW_ID,
         USER_ID,
         WKND_DT,
         ETIQUETTEMOIS
-
Hi there, I am trying to execute the following recordset:
<%
Dim LiveProperties
Dim LiveProperties_cmd
Dim LiveProperties_numRows
Set LiveProperties_cmd = Server.CreateObject("ADODB.Command")
LiveProperties_cmd.ActiveConnection = MM_recruta2_STRING
LiveProperties_cmd.CommandText = "SELECT COUNT(PropertyID) As NumberofProperties, propertylive WHERE propertylocation = 'y' FROM dbo.easytoletproperty GROUP BY propertylocation"
LiveProperties_cmd.Prepared = true
Set LiveProperties = LiveProperties_cmd.Execute
LiveProperties_numRows = 0
%>
However, when I test this situation, I get the following error:
Microsoft OLE DB Provider for SQL Server error '80040e14'.
Incorrect syntax near the keyword 'FROM'.
/ PropertiesbyTown2.asp, line 358
Any ideas as to what I did wrong?
Thank you.

'Column 'dbo.easytoletproperty.propertylive' is invalid in the select list because it is not contained in either an aggregate function or the GROUP BY clause.'
That message is the fix: once you use an aggregate function, every other column in the select list must appear in the GROUP BY clause. (Also note that the WHERE clause belongs after FROM, not before it; that ordering is what raises the 'Incorrect syntax near FROM' error you are seeing.)
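Putting both corrections together, a statement along these lines should work, assuming the intent is to count live properties per location (the column roles are guessed from their names, so treat this as a sketch rather than the poster's confirmed schema):

```sql
SELECT propertylocation,
       COUNT(PropertyID) AS NumberofProperties
FROM dbo.easytoletproperty
WHERE propertylive = 'y'
GROUP BY propertylocation
```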
-
Hi all
I have table T_TEST with column Date_Time and email...
Select TO_CHAR(date_time, 'DD-MON-YY HH:MI:SS') Date_Time, email FROM T_TEST
The data:

DATE_TIME           EMAIL
1-JUN-13 12:06:29   [email protected]
1-JAN-14 01:06:31   [email protected]
1-FEB-14 03:00      [email protected]
1-FEB-14 01:09      [email protected]
2-MAR-11 07:00      [email protected]
9-JUN-10 11:00      [email protected]

I need the latest / most recent row by date-time. The final output should be only one row:

DATE_TIME           EMAIL
1-FEB-14 03:00      [email protected]

Please help me on this...
Hello
Try it below:
with t as
(
select to_date('1-JUN-13 12:06:29', 'DD-MON-YY HH12:MI:SS') Date_Time, '[email protected]' email from dual union all
select to_date('1-JAN-14 01:06:31', 'DD-MON-YY HH12:MI:SS'), '[email protected]' from dual union all
select to_date('1-FEB-14 03:00:00', 'DD-MON-YY HH12:MI:SS'), '[email protected]' from dual union all
select to_date('1-FEB-14 01:09:00', 'DD-MON-YY HH12:MI:SS'), '[email protected]' from dual
)
select to_char(date_time, 'DD-MON-YY HH12:MI:SS') Date_Time, email
from t
where to_char(date_time, 'DD-MON-YY HH12:MI:SS') in
      (select to_char(max(date_time), 'DD-MON-YY HH12:MI:SS') from t);
OUTPUT:
DATE_TIME EMAIL
--------------------------- --------------
1-FEB-14 03:00:00           [email protected]
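An equivalent single-pass alternative (my own sketch, not from the thread) uses MAX ... KEEP (DENSE_RANK LAST) to pick the e-mail of the latest row without a subquery:

```sql
-- One scan of the table: take the greatest date_time, and the email value
-- belonging to the row ranked last when ordered by date_time.
select to_char(max(date_time), 'DD-MON-YY HH12:MI:SS') date_time,
       max(email) keep (dense_rank last order by date_time) email
from   t_test;
```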
-
SQL query problem finding difference in counts
Hi all
I use Oracle 10g. I urgently need help finding the difference in counts based on the date.
I have a sales table as below:
SALESMAN SALES_COUNT SALE_DATE
JOHN 20 04/01/2012
DENNY 15 04/01/2012
JOHN 30 04/02/2012
DENNY 30 04/02/2012
JOHN 45 04/03/2012
DENNY 50 04/03/2012
SALES_COUNT is cumulative per salesman up to and including the date of sale, like a running total: John's cumulative sales from 04/01/2012 through 04/03/2012 are 45, and likewise Denny's are 50. This SALES_COUNT will keep increasing with the dates as sales continue to be added to the table for each salesperson.
But I want separate per-day counts for each salesman.
For example: JOHN's count for 04/02/2012 is 30 - 20 = 10
JOHN's count for 04/03/2012 is 45 - 30 = 15
DENNY's count for 04/02/2012 is 30 - 15 = 15
DENNY's count for 04/03/2012 is 50 - 30 = 20
Please help me with this scenario and let me know if you need clarification. I would much appreciate your help.
Thank you.

Does this give you what you want?
with t as (
  select 'JOHN' salesman, 20 sales_count, to_date('04/01/2012', 'mm/dd/yyyy') sale_date from dual union all
  select 'DENNY' salesman, 15 sales_count, to_date('04/01/2012', 'mm/dd/yyyy') sale_date from dual union all
  select 'JOHN' salesman, 30 sales_count, to_date('04/02/2012', 'mm/dd/yyyy') sale_date from dual union all
  select 'DENNY' salesman, 30 sales_count, to_date('04/02/2012', 'mm/dd/yyyy') sale_date from dual union all
  select 'JOHN' salesman, 45 sales_count, to_date('04/03/2012', 'mm/dd/yyyy') sale_date from dual union all
  select 'DENNY' salesman, 50 sales_count, to_date('04/03/2012', 'mm/dd/yyyy') sale_date from dual
)
select salesman,
       sales_count sales_todate,
       sale_date,
       sales_count - lag(sales_count, 1, 0) over (partition by salesman order by sale_date) daily_sales
from t

SALESMAN,SALES_TODATE,SALE_DATE,DAILY_SALES
DENNY,15,4/1/2012,15
DENNY,30,4/2/2012,15
DENNY,50,4/3/2012,20
JOHN,20,4/1/2012,20
JOHN,30,4/2/2012,10
JOHN,45,4/3/2012,15
-
Performance problem with a SQL query that does not use the primary key index
Hello!
I have a performance issue with a single SQL query (Oracle 10g).
I could work around the problem by using an INDEX hint, but I would like to know WHY this is happening.
* Tables *.
create table Job (
  id number(5) not null,
  name varchar2(100),
  constraint Job_PK primary key (id)
)
/
-- Record count: 298

create table Comp (
  id integer not null,
  name varchar2(100),
  constraint Comp_PK primary key (id)
)
/
-- Record count: 193

-- Relation m:n
create table JobComp (
  id integer not null,
  id_job integer not null,
  id_comp integer not null,
  constraint JobComp_PK primary key (id),
  constraint JobComp_UK unique (id_job, id_comp),
  constraint JobComp_FK_Job foreign key (id_job) references Job (id),
  constraint JobComp_FK_Comp foreign key (id_comp) references Comp (id)
)
/
create index JobComp_IX_Comp on JobComp (id_comp)
/
create index JobComp_IX_Job on JobComp (id_job)
/
-- Record count: 6431
* Query *
When I run this query, the execution plan shows index usage (JobComp_PK and JobComp_IX_Comp).
No problem.
Select JobComp.*
from JobComp
join Job
on Job.id = JobComp.id_job
where JobComp.id_comp = 134
/
-- runs in 0.20 sec
But when I add the 'name' column of the Job table, the plan uses a full table scan on the Job table:
Select JobComp.*, Job.name
from JobComp
join Job
on Job.id = JobComp.id_job
where JobComp.id_comp = 134
/
-- runs in 2.70 sec
Using the index hint:
Select /*+ INDEX(Job Job_PK) */
JobComp.*, Job.name
from JobComp
join Job
on Job.id = JobComp.id_job
where JobComp.id_comp = 134
/
-- runs in 0.20 sec
* Question *
Is this behavior correct?
PS: I tried recalculating the statistics, but nothing changed:
analyze table Job compute statistics;
/
alter index Job_PK rebuild compute statistics;
/
begin
  dbms_utility.analyze_schema(sys_context('userenv', 'current_schema'), 'COMPUTE');
end;
/
Gustavo Ehrhardt

Gus.EHR wrote:
Hello.
I'm sorry for the unformatted plan.
The execution times of the queries "without the name field" and "with the name field plus the hint" are equal.
It is not a caching problem: I fetched the plans in a different order and repeated the runs, and the result is always the same.

I don't think there is a problem with Oracle switching from a NESTED LOOP to a HASH JOIN when you include the name field; that should be the expected behavior. But it seems that your JOB table has a degree of parallelism set on it, which is causing the query to run in parallel (since the JOB table is now accessed with a full table scan instead of the earlier indexed access). It could be that the parallel execution is contributing the extra runtime.
(a) Do you know why the degree of parallelism was set on the JOB table? Do you need it?

Can you check whether the following query gives a better response time?
select /*+ NOPARALLEL(JOB) */ JobComp.*, Job.Name
from JobComp
join Job on Job.id = JobComp.id_job
where JobComp.id_comp = 134
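For completeness, here is a sketch of how one might confirm and then remove table-level parallelism (standard dictionary view and DDL; this is my addition, not part of the original thread):

```sql
-- Check the declared degree of parallelism on the table
select table_name, degree
from   user_tables
where  table_name = 'JOB';

-- If it is not needed, reset it so the optimizer stops costing parallel plans
alter table Job noparallel;
```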
-
Poll component problem with a long SQL query?
Hello
I use Jdeveloper 11.1.2.2.0
In the application I'm developing, there is a long sql query to call (a function from a package that may take a few minutes to run), and I want to display a progress bar to the user.
The progress bar component is a "progress indicator", and the completion percentage is refreshed by a "poll" component with a 1-second interval.
The two components are bound in a Java bean.
The function with the sql query is in the Java bean too.
To run the long sql query in the background, I call it from a thread, and the poll component gets the completion percentage from a pipe filled by the sql function.
If the function being run is just a long loop of Java operations, the progress bar works fine; but if I put my long sql query there instead, the poll listener has to wait for the long sql query to finish, so the progress is updated only at the end of the long sql query.
You have any ideas?
Thank you
Thanks for your replies. Unfortunately, that was not the solution in my case; sql procedures block the whole application while ADF waits for them to finish...
To solve my problem, I finally used a PL/SQL job to call my sql procedure (dbms_job.submit).
The application is released right after the call, and the poll component is no longer blocked!
Kind regards
Yann
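For readers, a minimal sketch of the dbms_job approach Yann describes (the procedure name is hypothetical):

```sql
declare
  l_job binary_integer;
begin
  -- Submit the long-running procedure as a background job;
  -- control returns to the caller immediately after the COMMIT,
  -- so the UI thread and the poll component are not blocked.
  dbms_job.submit(job  => l_job,
                  what => 'my_long_procedure;');  -- hypothetical procedure
  commit;
end;
/
```

The job's progress can then be reported through the same pipe mechanism the poll component already reads.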
-
Problem with bind variables in a view object's SQL query
Hi all
I use JDev 11.1.2.4.0.
I have a problem with bind variables in the SQL query view object.
This is my original SQL
SELECT sum(t.TIME) , t.legertype_id FROM LEDGER t WHERE t.nctuser_id = '20022' AND to_char(t.insertdate,'YYYYMMDD') in ('20130930','20130929') group by t.legertype_id
In my view .xml object query tab, I am writing this
SELECT sum(t.TIME) , t.legertype_id FROM LEDGER t WHERE t.nctuser_id = '20022' AND to_char(t.insertdate,'YYYYMMDD') in :dddd group by t.legertype_id
Here dddd is a bind variable of type String, updatable and required.
I try to set dddd to ('20130930', '20130929'), hoping the view object will run like my original SQL.
But it failed. The view object retrieves 0 rows when I run it.
Why?
Thank you!
A bind variable cannot be used like that for an IN list; you have to use an array instead. Check the blog post on using oracle.jbo.domain.Array with ViewCriteria to see a solution.
Timo
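One common SQL-level pattern for binding a multi-value list, sketched here under the assumption that a schema-level SQL collection type exists for the bind (this is a general illustration, not the specific ADF solution Timo references):

```sql
-- Assumes a collection type such as:
--   create or replace type varchar2_tab as table of varchar2(20);
-- :dddd is then bound to an instance of that collection, and the TABLE
-- operator expands it into rows for the IN clause.
SELECT sum(t.TIME), t.legertype_id
FROM   LEDGER t
WHERE  t.nctuser_id = '20022'
AND    to_char(t.insertdate, 'YYYYMMDD') IN
       (SELECT column_value FROM TABLE(:dddd))
GROUP  BY t.legertype_id
```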
-
Components catalog SQL query and caching
Hello
It seems that the components catalog caches SQL query result sets and goes to the DB only after the engine is restarted. Is it possible to disable this caching and force the SQL queries to run each time?
Thank you.
Nick.

Hi Nick,
Take a look at your engine's log file. It should execute the SQL queries when the logic runs. See if you get any 'Error', 'Severe' or 'Warning' messages in the engine log when you run your query.
Hope this helps,
Dan
-
PL/SQL function body return query sql, no problem found data
Hi all
We are trying to build a dynamic report based on the user's item selection. We use a report of type SQL query (PL/SQL function body returning a SQL query). However, when a user changes the item and submits the page, the following error is displayed:
ORA-01403: no data is found.
Our query is simple:

declare
  l_query varchar2(30000) default 'select id from chw';
begin
  if (:P11_PARA = 1) then
    l_query := 'select name from chw';
  end if;
  return l_query;
end;
Any quick help please?

Hello,
I managed to recreate the error.
To remove the error, edit the region and choose "Use Generic Column Names (parse query at runtime only)".
Concerning
Paul
-
After migrating my APEX 4.1 application to a new environment, one classic report displays a "failed to parse SQL query: ORA-00942: table or view does not exist" error when the page is displayed. Changing the report region's SQL source in some way (e.g., removing spaces, changing the order of the variables in the WHERE clause) immediately solves the problem, but reverting the region source causes the report to error again (the region source validates without error, however).
Throws the error:
select v.id,
       v.col1
from view_vw v
where (: P1_FILTER is null or
       v.col2 = :P1_FILTER)
Does not throw the error:
select v.id,
       v.col1
from view_vw v
where (:P1_FILTER is null or
       v.col2 = :P1_FILTER)
Changing the column order in the report has the same effect; i.e. it works, but returning to the original column order causes the error to reappear.
It's as if a cached result for the previous select statement used by the report is being returned. However, the application does not use page/region caching. Any ideas what could be the cause?
Solved it. Ultimately, all that was necessary was to flush the DB cache using:
alter system flush shared_pool;
-
SQL query works in MS SQL Server 2008, but not when using the database toolkit
I have this SQL query:
DECLARE @DataTypeTable TABLE (
  Name varchar(128),
  TypeID INT)
-- Add comma-delimited data type names into the temporary table
INSERT INTO @DataTypeTable (Name)
SELECT * FROM WhatWeShouldDoRead.func_Split(@DataTypeTrimmed, ',')
SELECT Name FROM @DataTypeTable
This takes a comma-delimited string and returns it as a table. It works correctly in Microsoft SQL Server Management Studio. When I run it as a stored procedure, I get nothing back. There are no errors, SQL or otherwise. I have checked that I am connected to the correct database and that the stored procedure loads without reporting any change in its error string (that code is not shown in the example above). Has anyone seen this problem before, or does anyone with experience of SQL/LabVIEW interfaces know what I am doing wrong?
Thanks in advance.
-
Hi... I'm a student doing a project related to LabVIEW. My task is to create a VI where you type a user name and password to continue to the whole VI.
As I am a newbie to the SQL query language, can anyone help me with this? It isn't like a VI with a password lock.
There is a user-login button on my main front panel... when you click on it, a pop-up window comes out asking for a user name and a password. If the user name and the password are correct, then you can proceed. The problem is that I'm stuck with the database part...
Help me pls!

With respect,
Ray
Hello
You have two options:
(1) Connect to the database with a connection string (bind a string of connection information), and then type something like this:
Driver={SQL Native Client}; Server=IP.Add.re.SS; UID=username; PWD=***; Database=MyDatabase (depends on your database)
(2) Use a UDL file (you can configure it to connect to your database, with a specific format). Check that the connection is successful with the Test button.
There is a UDL file that you can edit here: C:\Program Files\National Instruments\LabVIEW 2010\examples\database\Labview.udl
Edit: In the connection dropdown, you can set the path to an mdb file, and I think you can also give the path of your accdb file.
Kind regards