Data model - SQL query
Hello, I hope someone can point me in the right direction.
I have the following SQL in a data model of type SQL Query:
Select * from ap_invoices
The select statement does not return all the rows in the view, although it does in SQL*Plus.
I have set up the EBS security model and am using BIP 10.1.3.4.1.
What am I doing wrong? Is there some setting I'm missing?
Do I need to use a different BIP security model?
Help, please
Thank you.
Hello
AP_INVOICES is a view, as you know, and it is based on the ORG_ID that is set when you select from it. I'm thinking that when you just use select * from AP_INVOICES, BIP is not aware of your org id and is not setting it for you when you run the query, so no data is returned. Two options:
1. Use a data template and set the org context in a before-report trigger, as you might in an Oracle Reports report. Data templates are very powerful and offer everything Oracle Reports did in EBS, including triggers and flexfield extensions.
2. Use the AP_INVOICES_ALL table and set the org id in the query's WHERE clause.
Both approaches require that you know the org id you want to set. The security integration does not go that far, i.e. it does not pass the org id when the user opens a session. I assume you have a report folder for each responsibility; you could hard-code the org id in the reports in a given folder, or you could add it as a parameter for the user to fill out when running the report.
The integration just does not currently go far enough to manage the org context and, therefore, org-based views.
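The two options can be sketched in SQL. This is a hedged sketch: `mo_global.set_policy_context` is the R12 API for setting the org context (11i used `fnd_client_info.set_org_context`), and `:p_org_id` is a report parameter you would have to define yourself:

```sql
-- Option 1: set the org context in a before-report trigger (R12 API),
-- after which the AP_INVOICES view returns rows for that operating unit.
BEGIN
  mo_global.set_policy_context('S', :p_org_id);  -- 'S' = single-org mode
END;
/

-- Option 2: bypass the org-striped view and filter the _ALL table directly.
SELECT *
FROM   ap_invoices_all
WHERE  org_id = :p_org_id;
```

Either way the report needs to know the org id, e.g. hard-coded per report folder or supplied as a user parameter.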
Hope this is clear.
Regards
Tim
Tags: Business Intelligence
Similar Questions
-
Data model - dynamic query with parameters
Hi friends, is it possible to dynamically choose a query using data model parameters?
I am trying to have a BIP report with a data model something like this:
Data model
Param1
Q1 = Select * from table 1
Q2 = Select * from table 2
If Param1 = X then
FinalQuery = Q1
Else
FinalQuery = Q2
End
Any help is greatly appreciated.
Thank you
SAI.
Take a look at the following:
http://www.Oracle.com/global/de/community/BIP/tipps/dynamische_queries/index_en.html
http://blogs.Oracle.com/XmlPublisher/files/BIPublisher_dynamic_column%20Blog.PDF
Thank you!
-
Data Modeler SQL Developer 3.0 EA1: Auto Route off
I admire that Data Modeler tries to automatically position data flows and relationships. I'm sure it's very difficult to automate, and I don't think it works very well in this tool.
So I'm happy there is an option to turn off 'auto route', as I'm quite particular about how diagrams should look.
In EA1, it seems that even with 'auto route' off, all connected relationships and flows are repositioned automatically as soon as an entity or process box is resized or moved, even by the smallest amount.
I think the tool should not reposition relationships or flows as long as their current position can be maintained after the connected process/entity is resized or moved. When lines must be moved, the tool should try to keep them as close to their original position as possible.
Let me know if there is a way to do this.
Also, when relationships or flows must be repositioned by the tool, I think they should be placed as far from each other as possible (which gives more space for the labels). Currently, the default behavior seems to be to bunch relationships and flows together when they connect the same entities or processes.
/ Marc Oliveira
Hi Jon,
It's in the pop-up menu for the diagram; right-click on empty diagram space to get it.
Philippe
-
How can I make Data Modeler not generate 'NNC_' constraints on tables?
Hi all
Unfortunately I have not found the answer using Google ;-)
When I generate DDL scripts from my model, Data Modeler (version 4.0.0) automatically generates 'NNC_' constraints on the columns of some tables, for example:
CREATE TABLE STG_DURCHGANGSKNOTEN
(
ID NUMBER CONSTRAINT NNC_STG_DURCHGANGSKNOTENv1_ID NOT NULL,
Kilometrierung VARCHAR2 (20) CONSTRAINT NNC_STG_DURCHGANGSKNOTENv1_Kilometrierung NOT NULL,
Letzte_Aenderung DATE CONSTRAINT NNC_STG_DURCHGANGSKNOTENv1_Letzte_Aenderung NOT NULL,
Knotentyp VARCHAR2 (100) CONSTRAINT NNC_STG_DURCHGANGSKNOTENv1_Knotentyp NOT NULL,
Name VARCHAR2 (100),
BZ_Bezeichner VARCHAR2 (100),
GUI_Bezeichner VARCHAR2 (100),
Spurplanabschnitt_ID NUMBER CONSTRAINT NNC_STG_DURCHGANGSKNOTENv1_Spurplanabschnitt_ID NOT NULL,
XML_Document XMLTYPE
);
How can I avoid this? I would like to just get something like this:
CREATE TABLE STG_DURCHGANGSKNOTEN
(
ID NUMBER NOT NULL,
Kilometrierung VARCHAR2 (20) NOT NULL,
Letzte_Aenderung DATE NOT NULL,
Knotentyp VARCHAR2 (100) NOT NULL,
Name VARCHAR2 (100),
BZ_Bezeichner VARCHAR2 (100),
GUI_Bezeichner VARCHAR2 (100),
Spurplanabschnitt_ID NUMBER NOT NULL,
XML_Document XMLTYPE
);
Thank you
Matthias
Hi Matthias,
The NOT NULL constraint clause most likely appears because the 'Not Null Constraint Name' property is set for the column. (It is shown on the 'Default and Constraint' panel in the column properties dialog.)
To stop these being generated, you can go to the Data Modeler/DDL page of the preferences (on the Tools menu) and set the option 'Generate short form of NOT NULL constraint'.
Note that there is now a forum specifically for Data Modeler: SQL Developer Data Modeler
David
-
Re: Data model - relationship degree, Barker ERD notation
Oh, I see now! I was working in the relational model; these icons are only accessible from the logical model. I think I need to spend more time with Data Modeler. Thank you!
Sorry for opening the topic in the wrong forum (I had not noticed the appropriate one). I would be grateful if someone could move it where it belongs.
Hello
There is an icon in the buttons above the diagram to create a 1:1 relationship. (The icon looks like 1<-->1.)
You can also set the 'Source to Target cardinality' and 'Target to Source cardinality' properties to 1 in the Properties dialog for the relationship.
Just to add that there is a forum especially for Data Modeler questions: SQL Developer Data Modeler
David
-
Hello OTN.
I don't understand why my SQL query will not parse in a BI Publisher data model. I created a new data model, chose the data source, and set Type of SQL = Standard SQL. I tried several databases and got the same error in BI Publisher each time, but the query works fine in TOAD / SQL Developer. So I think it might be something with my syntax, and I'm reaching out to you to try it and let me know if you get the same result as me.
The query is:
SELECT to_char(to_date('15-' || to_char(:P_MONTH) || '-' || to_char(:P_YEAR), 'DD-MONTH-YYYY') - 90, 'YYYYMM') AS yrmth FROM DUAL
Values of the variables:
:P_MONTH = APRIL
:P_YEAR = 2015
I tried multiple variations and have not had much luck. Here are the other options I've tried:
WITH dates AS
(
SELECT TO_NUMBER(DECODE(:P_MONTH, 'JANUARY', '01',
                                  'FEBRUARY', '02',
                                  'MARCH', '03',
                                  'APRIL', '04',
                                  'MAY', '05',
                                  'JUNE', '06',
                                  'JULY', '07',
                                  'AUGUST', '08',
                                  'SEPTEMBER', '09',
                                  'OCTOBER', '10',
                                  'NOVEMBER', '11',
                                  'DECEMBER', '12',
                                  '01')) AS mth_nbr
FROM dual
)
SELECT to_char(to_date('15-' || mth_nbr || '-' || TO_CHAR(:P_YEAR), 'DD-MM-YYYY') - 90, 'YYYYMM')
FROM dates
SELECT to_char(to_date('15-' || :P_MONTH || '-' || :P_YEAR, 'DD-MONTH-YYYY') - 90, 'YYYYMM') AS yrmth FROM DUAL
I'm running out of ideas and I don't know why it does not work. If anyone has any suggestions or ideas, please let me know. I always mark answers correct and useful in my thread and I appreciate all your help.
Best regards
-Konrad
So I figured it out. It seems there is a bug/mismatch between the prompt screen that appears when you enter SQL in the data model and the parameter values in the data model itself.
Here's how I solved my problem.
I created a new data model and first created all the parameters required in the data model (including the default values without quotes, i.e. APRIL instead of 'APRIL'), then saved.
Then I pasted my SQL query into the data model, and when I clicked OK, I entered my string values in the prompt box with single quotes (i.e. 'APRIL' instead of APRIL).
After entering the string values with single quotes in the dialog box, I was able to retrieve the columns in the data model and save.
In the Data tab I no longer had to enter the values in single quotes; I entered the values normally instead, and the code worked.
It seems the box that prompts for bind variable values when you enter the SQL text in a data model expects strings to be wrapped in single quotes, but nowhere else does. It was a big headache for me, but I'm glad I solved it, and I hope this can be of help to others.
Cheers.
-
SQL query for retrieving data based on a certain pattern
Hi all
I want to retrieve the identifiers of all the people who were continuously sitting during the last hour.
Data are expressed as below:
-- Create the activity table
CREATE TABLE activity_log
(
Username NUMBER,
Activity VARCHAR2(30),
StartTime VARCHAR2(6),
EndTime VARCHAR2(6)
);
-- Fill with sample data
INSERT INTO activity_log VALUES('39','Walking','09:01','09:05');
INSERT INTO activity_log VALUES('39','Walking','09:06','09:10');
INSERT INTO activity_log VALUES('39','Sitting','09:11','09:15');
INSERT INTO activity_log VALUES('39','Sitting','09:16','09:20');
INSERT INTO activity_log VALUES('39','Sitting','09:21','09:25');
INSERT INTO activity_log VALUES('39','Standing','09:26','09:30');
INSERT INTO activity_log VALUES('39','Standing','09:31','09:35');
INSERT INTO activity_log VALUES('39','Sitting','09:36','09:40');
INSERT INTO activity_log VALUES('39','Sitting','09:41','09:45');
INSERT INTO activity_log VALUES('39','Sitting','09:46','09:50');
INSERT INTO activity_log VALUES('39','Sitting','09:51','09:55');
INSERT INTO activity_log VALUES('39','Sitting','09:56','10:00');
INSERT INTO activity_log VALUES('39','Sitting','10:01','10:05');
INSERT INTO activity_log VALUES('39','Sitting','10:06','10:10');
INSERT INTO activity_log VALUES('39','Sitting','10:11','10:15');
INSERT INTO activity_log VALUES('39','Sitting','10:16','10:20');
INSERT INTO activity_log VALUES('39','Sitting','10:21','10:25');
INSERT INTO activity_log VALUES('39','Sitting','10:26','10:30');
INSERT INTO activity_log VALUES('39','Sitting','10:31','10:35');
INSERT INTO activity_log VALUES('39','Standing','10:36','10:40');
INSERT INTO activity_log VALUES('39','Standing','10:41','10:45');
INSERT INTO activity_log VALUES('39','Walking','10:46','10:50');
INSERT INTO activity_log VALUES('39','Walking','10:51','10:55');
INSERT INTO activity_log VALUES('39','Walking','10:56','11:00');
INSERT INTO activity_log VALUES('40','Walking','09:01','09:05');
INSERT INTO activity_log VALUES('40','Walking','09:06','09:10');
INSERT INTO activity_log VALUES('40','Sitting','09:11','09:15');
INSERT INTO activity_log VALUES('40','Sitting','09:16','09:20');
INSERT INTO activity_log VALUES('40','Sitting','09:21','09:25');
INSERT INTO activity_log VALUES('40','Standing','09:26','09:30');
INSERT INTO activity_log VALUES('40','Standing','09:31','09:35');
INSERT INTO activity_log VALUES('40','Sitting','09:36','09:40');
INSERT INTO activity_log VALUES('40','Sitting','09:41','09:45');
INSERT INTO activity_log VALUES('40','Sitting','09:46','09:50');
INSERT INTO activity_log VALUES('40','Sitting','09:51','09:55');
INSERT INTO activity_log VALUES('40','Walking','09:56','10:00');
INSERT INTO activity_log VALUES('40','Sitting','10:01','10:05');
INSERT INTO activity_log VALUES('40','Standing','10:06','10:10');
INSERT INTO activity_log VALUES('40','Standing','10:11','10:15');
INSERT INTO activity_log VALUES('40','Walking','10:16','10:20');
INSERT INTO activity_log VALUES('40','Walking','10:21','10:25');
INSERT INTO activity_log VALUES('40','Walking','10:26','10:30');
INSERT INTO activity_log VALUES('40','Sitting','10:31','10:35');
INSERT INTO activity_log VALUES('40','Sitting','10:36','10:40');
INSERT INTO activity_log VALUES('40','Sitting','10:41','10:45');
INSERT INTO activity_log VALUES('40','Standing','10:46','10:50');
INSERT INTO activity_log VALUES('40','Walking','10:51','10:55');
INSERT INTO activity_log VALUES('40','Walking','10:56','11:00');
Based on the data, user ID 39 should be found, since that user was sitting from 09:36 to 10:35, which is a continuous hour.
Any guidance on how to do this with a SQL query would be a great help and much appreciated.
Thank you very much
Kind regards
Bilal
So what exactly is wrong with the query that I already gave you?
Just as a reminder, here is one untested (because of the earlier lack of insert statements) rewrite according to your new data:
with grp as (
select
  username,
  activity,
  starttime,
  endtime,
  row_number() over (partition by username order by starttime)
  - row_number() over (partition by username, activity order by starttime) rn
from activity_log
)
select
  username,
  min(starttime) starttime,
  max(endtime) endtime,
  max(activity) activity
from grp
group by username, rn
having round((to_date(max(endtime), 'HH24:MI') - to_date(min(starttime), 'HH24:MI')) * 24 * 60) >= 59
-
Can I use session variables in a BI Publisher data model SQL query?
Hi Experts,
We are implementing data-level security in BI Publisher 11g.
In OBIEE we do so using session variables, so I wanted to ask if we can use the same session variables in BI Publisher as well.
That is, can we include a where clause in the SQL of the data model such as
WHERE ORG_ID = @{biServer.variables['NQ_SESSION.INV_ORG']}
I would like to know your opinion on this.
PS: We implement security EBS r12 in BI Publisher.
Thank you
Read this -> OBIEE 11g: Error: "[nQSError: 23006] The session variable, NQ_SESSION.LAN_INT, has no value definition." when creating a SQL query using the session variable NQ_SESSION.LAN_INT in BI Publisher [ID 1511676.1]
Follow the ER - BUG: 13607750 - NEED TO BE ABLE TO SET UP A SESSION VARIABLE IN OBIEE AND USE IT IN BI PUBLISHER
HTH,
SVS -
Searching for SQL query syntax against a varchar2 data type while preserving valid data.
We have a data model that we are not allowed to change, and the column in question is a varchar2(20). At this stage the column has no foreign key to a list of valid values. So, until we can get those who control the data model to make the adjustments, we need a SQL query that roots out the bad data on a scheduled basis.
Below is what we expect to be good data:
- Whole numbers, no floating point
- Length of 5 or less (greater than zero but less than 99999)
- The text 'No_RP' can exist
The demo query below works most of the time, except that 'or Column1 is null' is not catching the null record. I tried changing the logical conditions around but have not yet figured out the correct layout. So help would be greatly appreciated if someone could set me straight on how to properly include a null value in the selected record set along with the other error types, so end users can correct their mistakes. Another thing: I suppose there could be a syntactically better approach to finding all offending characters such as *, &, ( and so on.
WITH sample_data AS (
SELECT '0' col FROM DUAL UNION ALL
SELECT '2' col FROM DUAL UNION ALL
SELECT '99999' col FROM DUAL UNION ALL
SELECT '100000' col FROM DUAL UNION ALL
SELECT '1 a' col FROM DUAL UNION ALL
SELECT 'ABCD' col FROM DUAL UNION ALL
SELECT 'A1' col FROM DUAL UNION ALL
SELECT '*' col FROM DUAL UNION ALL
SELECT '/' col FROM DUAL UNION ALL
SELECT '-' col FROM DUAL UNION ALL
SELECT ' ' col FROM DUAL UNION ALL
SELECT '' col FROM DUAL UNION ALL
SELECT '4 5 6' col FROM DUAL UNION ALL
SELECT '24.5' col FROM DUAL UNION ALL
SELECT '-3' col FROM DUAL UNION ALL
SELECT 'A' col FROM DUAL UNION ALL
SELECT 'F' col FROM DUAL UNION ALL
SELECT 'Z' col FROM DUAL UNION ALL
SELECT 'Bye' col FROM DUAL UNION ALL
SELECT 'Hello World' col FROM DUAL UNION ALL
SELECT '=' col FROM DUAL UNION ALL
SELECT '+' col FROM DUAL UNION ALL
SELECT '_' col FROM DUAL UNION ALL
SELECT '-' col FROM DUAL UNION ALL
SELECT '(' col FROM DUAL UNION ALL
SELECT ')' col FROM DUAL UNION ALL
SELECT '&' col FROM DUAL UNION ALL
SELECT '^' col FROM DUAL UNION ALL
SELECT '%' col FROM DUAL UNION ALL
SELECT '$' col FROM DUAL UNION ALL
SELECT '#' col FROM DUAL UNION ALL
SELECT '@' col FROM DUAL UNION ALL
SELECT '!' col FROM DUAL UNION ALL
SELECT '~' col FROM DUAL UNION ALL
SELECT '''' col FROM DUAL UNION ALL
SELECT '.' col FROM DUAL
)
SELECT col FROM sample_data
WHERE (TRANSLATE(col, '_0123456789', '_') IS NOT NULL
  OR LENGTH(col) > 5
  OR col = '0'
  OR col IS NULL)
  AND (UPPER(col) <> 'NO_RP');
One more thing: I also tried the regular expression approach, but could not figure it out. If anyone knows how to do it that way, I would appreciate learning that method as well. Below is as close as I got. I could not get a range to work, such as 'between 0 and 100000', I'm guessing because of the comparison of varchar2 and numbers; I even attempted using to_char and to_number.
SELECT to_number(column1) FROM testsql WHERE REGEXP_LIKE(column1, '^[[:digit:]]+$') ORDER BY to_number(column1) ASC;
Thanks in advance for anyone to help.
Nick
Hello
Thanks for posting the sample data in a useable form.
It would be useful if you also posted the exact results you want from this data. Do you want the same results as those produced by the query you posted, except that nulls should be included? If so:
SELECT col
FROM sample_data
WHERE CASE
        WHEN UPPER(col) = 'NO_RP' THEN 1
        WHEN col IS NULL THEN -1
        WHEN LTRIM(col, '0123456789') IS NOT NULL THEN -2
        WHEN LENGTH(col) > 5 THEN -3
        ELSE TO_NUMBER(col)
      END NOT BETWEEN 1 AND 99999;
The requirement that col != 0 makes it that much more difficult. You could easily test for an integer of 1 to 5 digits, but then you must have a separate condition to make sure the string was not '0', '00', '000', '0000' or '00000'.
(Unlike Solomon, I assume you do not want to pick non-0 numbers starting with 0, such as '007' or '02138'.) Using regular expressions, you may save a few keystrokes, but you also lose a lot of clarity:
SELECT col
FROM sample_data
WHERE REGEXP_LIKE(col, '^0{1,5}$')
   OR NOT REGEXP_LIKE(NVL(UPPER(col), 'BAD'), '^(([1-9][0-9]{0,4})|NO_RP)$');
Edited by: Frank Kulash, December 13, 2010 21:50
Edited by: Frank Kulash, December 13, 2010 22:11
Added regular expression solution
-
Is it possible to have condition-dependent SQL statements in data models
Hi all
Is it possible to include condition-based SQL statements in data models?
For example
Something like this?
if (some parameter is not null)
<sqlstatement name="STATEMENT_1"> ... </sqlstatement>
else
<sqlstatement name="STATEMENT_2"> ... </sqlstatement>
Also, is there good documentation available on how to take full advantage of data models in BI Publisher?
Thank you
-Sookie
Hello Sookie,
I couldn't find the time to get a demonstration data model working for you, but I'll try to explain. First, write a PL/SQL package. Make sure that you define all the data model parameters as global variables in the default PL/SQL package.
CREATE OR REPLACE
package employee as
function BeforeReportTrigger return boolean;
query_text varchar2(2000);
p_DEPTNO number;
END;
/
CREATE OR REPLACE
package body employee as
function BeforeReportTrigger return boolean is
begin
if (p_DEPTNO = 10) then
query_text := 'select col1, col2, col3 from hr';
elsif (p_DEPTNO = 20) then
query_text := 'select col1, col2, col3 from hr_history';
else
query_text := 'select col1, col2, col3 from hr_history1';
end if;
return true;
end;
END;
/
Use this package as the default package in your data model, i.e. check that defaultPackage="employee" appears in the data model header.
Sample data model
------------------------------
--
--
--
--
Before running the SQL query, the data engine fires the before-report trigger and sets the query_text value based on the p_DEPTNO value. When executing Q1, the SQL query engine parses the query &query_text and replaces it with the actual value. For example, if p_DEPTNO = 10, the query will be 'select col1, col2, col3 from hr'.
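The "Sample data model" above lost its XML in the forum formatting. A sketch of what such a BI Publisher data template might look like follows; the element names match the data template format, but treat the exact attributes as an assumption rather than a verbatim copy of the original post:

```xml
<dataTemplate name="EmployeeReport" defaultPackage="employee">
  <parameters>
    <parameter name="p_DEPTNO" dataType="number"/>
  </parameters>
  <!-- fires employee.BeforeReportTrigger, which sets query_text -->
  <dataTrigger name="beforeReport" source="employee.BeforeReportTrigger"/>
  <dataQuery>
    <!-- &query_text is replaced with the string set by the trigger -->
    <sqlStatement name="Q1">
      <![CDATA[ &query_text ]]>
    </sqlStatement>
  </dataQuery>
</dataTemplate>
```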
Try it...
-
Is conditional SQL processing available in data models
Hello
Is it possible to have a CONDITIONAL STATEMENT in the .xml (data model) deciding which SQL statement gets executed? Meaning, based on a PARAMETER value, only one of TWO queries should be run.
Ex: let's say a parameter decides how the report is run:
SUMMARY => the summary query must be executed
DETAIL => the detail query must be executed
<IF P_REPORT = SUMMARY>
<DO>
- THE SUMMARY QUERY
<END IF>
<IF P_REPORT = DETAIL>
<DO>
- THE DETAIL QUERY
<END IF>
Could you please let me know the feasibility of this in BI XML? Please also point me to the documentation, if any.
Thank you
Vijaya
OK, I take that back.
If this needs to be achieved, then:
1. You need to add a report parameter with valid values such as 'SUMMARY' and 'DETAIL'.
2. You have to add two separate queries; there is no need to join them.
3. Just put the queries separately in the data model, or concatenate the SQL against the same data source.
4. In each query, gate it on the parameter, something like:
5. 'SUMMARY' = :PARAM_REPORT_TYPE in the summary query
6. 'DETAIL' = :PARAM_REPORT_TYPE in the detail query
7. At runtime, only one of the queries will execute and fetch data.
8. Similarly, in the template you can add conditional rendering of layouts.
-
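The steps above can be sketched as two guarded queries; the table and column names here are hypothetical placeholders:

```sql
-- Summary query: returns rows only when the parameter is 'SUMMARY'
SELECT dept_no, SUM(amount) AS total_amount
FROM   orders
WHERE  'SUMMARY' = :PARAM_REPORT_TYPE
GROUP  BY dept_no;

-- Detail query: returns rows only when the parameter is 'DETAIL'
SELECT dept_no, order_id, amount
FROM   orders
WHERE  'DETAIL' = :PARAM_REPORT_TYPE;
```

Because the predicate compares the parameter to a literal and references no column, the non-matching query simply returns no rows.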
Single SQL query for deriving the customs declaration date for items in a Stock table
Dear all,
Please suggest a single SQL query for the below.
We have a Stock table as shown below:
STOCK_TABLE
ITEM_CODE (item code) | BAT_NO (lot no.) | TXN_CODE (transaction code) | DOC_NO (number) | BOE_DT (customs declaration date)
I1 | B1 | | |
I1 | | | |
I2 | | | |
I3 | B70 | | |
I4 | B80 | | |
I5 | B90 | T102 | 1234 | JULY 2, 2015
I6 | B100 | | |
We have to find the customs declaration date (i.e. the date when the items came into this particular table) for items that are not attached to any document (that is, those whose TXN_CODE, DOC_NO and BOE_DT fields have a NULL value).
For each such item in the stock table, the customs declaration date is calculated as follows.
- If the (item code, batch no.) combination is present in HISTORY_TABLE, the customs declaration date will be the UPDT_DT, provided the transaction code (TXN_CODE) is an IN or ALL transaction (which can be determined from the TRANSACTION table).
- If the (item code, batch no.) combination is NOT present in HISTORY_TABLE, or the transaction code for that item/batch combination is an OUT transaction, then the customs declaration date will be the document date (DOC_DT) that we get from whichever of the 3 IN_TABLE_HEAD tables contains the item of that particular batch.
- If both case 1 and case 2 fail, the customs declaration date will be the latest document date (DOC_DT) that we get from whichever of the 3 IN_TABLE_HEAD tables contains that particular item, and the BAT_NO in the expected results will be the one corresponding to that document if available, otherwise NULL.
- If case 1 or case 2 succeeds, the value of the last field (BATCH_YN, shown in the expected output further below) will be 'Y', because the batch matched. Otherwise it will be 'N'.
-
SQL query to represent the data in a bar graph
Hello
JDev 11.1.1.5.0
We must create a dashboard to track translation. We have created an ADF table with 'Create', 'Update' and 'Delete' buttons to manipulate the table.
Our DB table structure is
File_id (PK), File_Name, ToSpanish (YES/NO), ToSpanish-Date, ToChina (YES/NO), ToChina-Date... etc.
Once the file is translated to the required language, the user must update the DB table using the 'Update' button.
So far, we have implemented the requirement above.
We need to represent the translation status in a bar graph with the language as the X axis and the file count as the Y axis, e.g. Spanish - 100 files, China - 200 files.
Please suggest the SQL query to retrieve the necessary info from the DB table so that it can be represented in the bar graph. Also, it would be great if you could provide pointers on creating a bar chart.
Thanks in advance,
MSR.
If you set your major and minor increments to 1, then you won't show decimal points. You can try setting these to 10 or 100 to reach your goal.
Subtype = "BAR_VERT_CLUST" >
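A count-per-language query along the lines the poster asks for might look like this; the table name `translation_files` and the flag columns are assumptions based on the structure described in the question:

```sql
-- One row per language with the count of translated files,
-- usable as the data source for a clustered bar graph.
SELECT 'Spanish' AS language, COUNT(*) AS file_count
FROM   translation_files
WHERE  ToSpanish = 'YES'
UNION ALL
SELECT 'China' AS language, COUNT(*) AS file_count
FROM   translation_files
WHERE  ToChina = 'YES';
```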
-
After some testing today with a new installation and the subversion plugin in the latest edition of Data Modeler, this error happens on every start of the tool.
I have removed and unzipped the installation again without the error changing.
After that, I started the tool as another user on my computer, and the error does not occur.
Is there a system folder where I can remove my personal configuration, like for JDeveloper and SQL Developer?
I am losing the most important features, e.g. I have no browser and cannot open a design.
Here is the full error stack:
29 May 2015 22:17:40 oracle.ideimpl.extension.AddinManagerImpl
SEVERE: Exception initializing 'oracle.dbtools.crest.fcp.DataModelerAddin' extension 'Oracle SQL Developer Data Modeler'
java.lang.NullPointerException
at oracle.dbtools.crest.swingui.editor.UDPLibrariesPersistence.load(UDPLibrariesPersistence.java:220)
at oracle.dbtools.crest.model.design.DesignSet.createElement(DesignSet.java:56)
at oracle.dbtools.crest.swingui.ApplicationView.addDesign(ApplicationView.java:2497)
at oracle.dbtools.crest.swingui.ApplicationView.<init>(ApplicationView.java:435)
at oracle.dbtools.crest.swingui.ApplicationView.<init>(ApplicationView.java:389)
at oracle.dbtools.crest.swingui.ApplicationView.getInstance(ApplicationView.java:2258)
at oracle.dbtools.crest.fcp.DataModelerAddin.initialize(DataModelerAddin.java:553)
at oracle.ideimpl.extension.AddinManagerImpl.initializeAddin(AddinManagerImpl.java:496)
at oracle.ideimpl.extension.AddinManagerImpl.initializeAddin(AddinManagerImpl.java:483)
at oracle.ideimpl.extension.AddinManagerImpl.initializeAddins(AddinManagerImpl.java:299)
at oracle.ideimpl.extension.AddinManagerImpl.initProductAndUserAddins(AddinManagerImpl.java:160)
at oracle.ideimpl.extension.AddinManagerImpl.initProductAndUserAddins(AddinManagerImpl.java:143)
at oracle.ide.IdeCore.initProductAndUserAddinsAndActionRegistry(IdeCore.java:2294)
at oracle.ide.IdeCore.startupImpl(IdeCore.java:1817)
at oracle.ide.Ide.startup(Ide.java:772)
at oracle.ide.osgi.Activator.start(Activator.java:209)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711)
at java.security.AccessController.doPrivileged (Native Method)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683)
at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381)
at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:390)
at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1176)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:559)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:544)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:457)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:243)
at org.eclipse.osgi.framework.internal.core.EquinoxLauncher.internalStart(EquinoxLauncher.java:271)
at org.eclipse.osgi.framework.internal.core.EquinoxLauncher.start(EquinoxLauncher.java:241)
at org.eclipse.osgi.launch.Equinox.start(Equinox.java:258)
at org.netbeans.core.netigso.Netigso.start(Netigso.java:191)
at org.netbeans.NetigsoHandle.startFramework(NetigsoHandle.java:209)
at org.netbeans.ModuleManager.enable(ModuleManager.java:1352)
at org.netbeans.ModuleManager.enable(ModuleManager.java:1156)
at org.netbeans.core.startup.ModuleList.installNew (ModuleList.java:340)
at org.netbeans.core.startup.ModuleList.trigger (ModuleList.java:276)
at org.netbeans.core.startup.ModuleSystem.restore (ModuleSystem.java:301)
at org.netbeans.core.startup.Main.getModuleSystem (Main.java:181)
at org.netbeans.core.startup.Main.getModuleSystem (Main.java:150)
at org.netbeans.core.startup.Main.start (Main.java:307)
at org.netbeans.core.startup.TopThreadGroup.run(TopThreadGroup.java:123)
at java.lang.Thread.run(Thread.java:745)
Hi Torsten,
Thanks for reporting the problem. I have logged a bug.
You can check the 'System Types' directory setting under 'Preferences > Data Modeler'; the directory probably no longer exists. I guess that setting is empty when you start SQL Dev as the other user.
Philippe
-
How to create Spatial indexes in SQL Developer Data Modeler 4
Hello
What is the procedure to create a spatial index in SDDM v4? I found myself at an impasse because of the following problems:
(1) Reverse engineering misses the spatial indexes, so I need to add them manually.
(2) Adding an index does not allow me to check 'spatial', and does not allow me to add the sdo_geometry column to the index definition.
(3) When registering the spatial properties for a table, I cannot choose the name of the spatial index.
I came across an old post (Data Modeler - cannot create a spatial index), but that post is no longer valid because SDDM v4 does not allow me to create an index without columns (an incomplete index definition).
Help is appreciated,
Kind regards
Richard.
Hi Richard,
Thanks for your update. I'm afraid I was assuming you were using the standalone Data Modeler rather than SQL Developer.
I tried using SQL Developer 4.1.0.18.37 (EA2) with a thin connection and got the ClassCastException you found.
But when I used the standalone EA2 Data Modeler with the same 'Advanced' connection details, it worked OK and imported the spatial index.
(I'm puzzled as to why these two seem to behave differently, as the Data Modeler integrated with SQL Developer EA2 is the same EA2 version.)
I then connected and reverse engineered the same tables (again) into a new model.
The funny thing is that there is no error in the external log this time, but the spatial properties are still not reverse engineered correctly.
Is it possible that you have not set the option to import the spatial properties? I would expect either that the spatial properties would be imported or that the error would be logged.
SQL Developer EA2 imported the spatial properties and the spatial index properly when I used a basic connection and specified host, port, and SID or service name.
Another question: does SQL Developer rely on a certain JDK / JDBC library?
Any Java JDK 1.8 should be fine.
David
HISTORY_TABLE
ITEM_CODE | BAT_NO | TXN_CODE | DOC_NO | UPDT_DT
I1 | B1 | T1 | 1234 | JANUARY 3, 2015
I1 | B20 | T20 | 4567 | MARCH 3, 2015
I1 | B30 | T30 | 7890 | FEBRUARY 5, 2015
I2 | B40 | T20 | 1234 | JANUARY 1, 2015
TRANSACTION
TXN_CODE | TXN_TYPE
T1 | IN
T20 | OUT
T30 | ALL
T50 | IN
T80 | IN
T90 | IN
T60 | ALL
T70 | ALL
T40 | ALL
IN_TABLE_HEAD_1
H1_SYS_ID (primary key) | TXN_CODE | DOC_NO | DOC_DATE
H1ID1 | T1 | 1234 | JANUARY 1, 2015
H1ID2 | T70 | 1234 | FEBRUARY 1, 2015
IN_TABLE_ITEM_1
I1_SYS_ID | H1_SYS_ID (foreign key referencing H1_SYS_ID in IN_TABLE_HEAD_1) | ITEM_CODE
I1ID1 | H1ID1 | I1
I1ID2 | H1ID1 | I100
I1ID3 | H1ID2 | I3
IN_TABLE_BATCH_1
B1_SYS_ID | TXN_CODE (as in IN_TABLE_HEAD_1) | DOC_NO | BAT_NO
B1ID1 | T1 | 1234 | B1 (can be empty)
B1ID2 | T70 | 1234 | B70
IN_TABLE_HEAD_2
H2_SYS_ID (primary key) | TXN_CODE | DOC_NO | DOC_DATE
H2ID1 | T30 | 4567 | FEBRUARY 3, 2015
H2ID2 | T60 | 1234 | JANUARY 3, 2015
IN_TABLE_ITEM_2
I2_SYS_ID | H2_SYS_ID (foreign key referencing H2_SYS_ID in IN_TABLE_HEAD_2) | ITEM_CODE
I2ID1 | H2ID1 | I1
I2ID2 | H2ID1 | I200
I2ID3 | H2ID2 | I2
IN_TABLE_BATCH_2
B2_SYS_ID | I2_SYS_ID (foreign key referencing I2_SYS_ID in IN_TABLE_ITEM_2) | BAT_NO
B2ID1 | I2ID1 | B30 (or null)
B2ID2 | I2ID2 | B90
B2ID2 | I2ID3 | B60
IN_TABLE_HEAD_3

| H3_SYS_ID (primary key) | TXN_CODE | DOC_NO | DOC_DATE |
|---|---|---|---|
| H3ID1 | T50 | 1234 | 02-JAN-2015 |
| H3ID2 | T80 | 1234 | 03-JAN-2015 |
| H3ID3 | T90 | 1234 | 04-JAN-2015 |
| H3ID4 | T40 | 1234 | 05-AUG-2015 |
IN_TABLE_ITEM_3

| I3_SYS_ID | H3_SYS_ID (foreign key referencing H3_SYS_ID in IN_TABLE_HEAD_3) | ITEM_CODE | BAT_NO |
|---|---|---|---|
| I3ID1 | H3ID1 | I2 | B50 |
| I3ID2 | H3ID2 | I4 | B40 |
| I3ID3 | H3ID3 | I4 | |
| I3ID4 | H3ID4 | I6 | |
There is no IN_TABLE_BATCH_3
Please find below the expected results.
OUTPUT

| ITEM_CODE | BAT_NO | TXN_CODE | DOC_NO | BOE_DT | BATCH_YN |
|---|---|---|---|---|---|
| I1 | B1 | T1 | 1234 | 03-JAN-2015 | Y |
| I1 | B30 | T30 | 7890 | 05-FEB-2015 | N |
| I2 | B60 | T60 | 1234 | 03-JAN-2015 | N |
| I3 | B70 | T70 | 1234 | 01-FEB-2015 | Y |
| I4 | | T90 | 1234 | 04-JAN-2015 | N |
| I6 | | T40 | 1234 | 05-AUG-2015 | N |
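The core pattern behind the expected output is: for each stock row with missing columns, pull the missing values from a matching history row and keep the stock value when it is already present. A minimal, dialect-neutral sketch of that outer-join-plus-NVL fill-in (SQLite via Python purely for illustration; the table and column names here are simplified stand-ins, not the actual objects above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simplified stand-ins for stock_table and history_table above.
cur.executescript("""
CREATE TABLE stock   (item_code TEXT, bat_no TEXT, txn_code TEXT, doc_no TEXT);
CREATE TABLE history (item_code TEXT, bat_no TEXT, txn_code TEXT, doc_no TEXT);
INSERT INTO stock   VALUES ('I1', 'B1', NULL, NULL);
INSERT INTO history VALUES ('I1', 'B1', 'T1', '1234');
""")

# LEFT JOIN on the known columns; COALESCE (Oracle: NVL) fills the gaps
# from the matching history row.
rows = cur.execute("""
SELECT s.item_code,
       s.bat_no,
       COALESCE(s.txn_code, h.txn_code) AS txn_code,
       COALESCE(s.doc_no,   h.doc_no)   AS doc_no
  FROM stock s
  LEFT JOIN history h
    ON h.item_code = s.item_code
   AND h.bat_no    = s.bat_no
""").fetchall()
print(rows)  # [('I1', 'B1', 'T1', '1234')]
```

The real solution below layers several of these fill-in passes, one per source table, on top of each other.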
Database statements to create the tables above and insert the records:
CREATE TABLE stock_table (item_code VARCHAR2(80), bat_no VARCHAR2(80), txn_code VARCHAR2(80),
doc_no VARCHAR2(80), boe_dt DATE);
INSERT INTO stock_table VALUES ('I1', 'B1', '', '', '');
INSERT INTO stock_table VALUES ('I1', '', '', '', '');
INSERT INTO stock_table VALUES ('I2', '', '', '', '');
INSERT INTO stock_table VALUES ('I3', 'B70', '', '', '');
INSERT INTO stock_table VALUES ('I4', 'B80', '', '', '');
INSERT INTO stock_table VALUES ('I5', 'B90', 'T102', '1234', '02-JUL-2015');
INSERT INTO stock_table VALUES ('I6', 'B100', '', '', '');
SELECT * FROM stock_table;
CREATE TABLE history_table (item_code VARCHAR2(80), bat_no VARCHAR2(80), txn_code VARCHAR2(80),
doc_no VARCHAR2(80), updt_dt DATE);
INSERT INTO history_table VALUES ('I1', 'B1', 'T1', '1234', '03-JAN-2015');
INSERT INTO history_table VALUES ('I1', 'B20', 'T20', '4567', '03-MAR-2015');
INSERT INTO history_table VALUES ('I1', 'B30', 'T30', '7890', '05-FEB-2015');
INSERT INTO history_table VALUES ('I2', 'B40', 'T20', '1234', '01-JAN-2015');
SELECT * FROM history_table;
CREATE TABLE transaction1 (txn_code VARCHAR(80), txn_type VARCHAR(80));
INSERT INTO transaction1 VALUES ('T1', 'IN');
INSERT INTO transaction1 VALUES ('T20', 'OUT');
INSERT INTO transaction1 VALUES ('T30', 'ALL');
INSERT INTO transaction1 VALUES ('T40', 'ALL');
INSERT INTO transaction1 VALUES ('T50', 'IN');
INSERT INTO transaction1 VALUES ('T60', 'ALL');
INSERT INTO transaction1 VALUES ('T70', 'ALL');
INSERT INTO transaction1 VALUES ('T80', 'IN');
INSERT INTO transaction1 VALUES ('T90', 'IN');
SELECT * FROM transaction1;
CREATE TABLE in_table_head_1 (h1_sys_id VARCHAR2(80) PRIMARY KEY, txn_code VARCHAR2(80),
doc_no VARCHAR2(80), doc_dt DATE);
CREATE TABLE in_table_head_2 (h2_sys_id VARCHAR2(80) PRIMARY KEY, txn_code VARCHAR2(80),
doc_no VARCHAR2(80), doc_dt DATE);
CREATE TABLE in_table_head_3 (h3_sys_id VARCHAR2(80) PRIMARY KEY, txn_code VARCHAR2(80),
doc_no VARCHAR2(80), doc_dt DATE);
INSERT INTO in_table_head_1 VALUES ('H1ID1', 'T1', '1234', '01-JAN-2015');
INSERT INTO in_table_head_1 VALUES ('H1ID2', 'T70', '1234', '01-FEB-2015');
INSERT INTO in_table_head_2 VALUES ('H2ID1', 'T30', '4567', '03-FEB-2015');
INSERT INTO in_table_head_2 VALUES ('H2ID2', 'T60', '1234', '03-JAN-2015');
INSERT INTO in_table_head_3 VALUES ('H3ID1', 'T50', '1234', '02-JAN-2015');
INSERT INTO in_table_head_3 VALUES ('H3ID2', 'T80', '1234', '03-JAN-2015');
INSERT INTO in_table_head_3 VALUES ('H3ID3', 'T90', '1234', '05-JAN-2015');
INSERT INTO in_table_head_3 VALUES ('H3ID4', 'T40', '1234', '05-AUG-2015');
CREATE TABLE in_table_item_1 (i1_sys_id VARCHAR2(80) PRIMARY KEY,
h1_sys_id VARCHAR2(80) REFERENCES in_table_head_1 (h1_sys_id), item_code VARCHAR2(80));
CREATE TABLE in_table_item_2 (i2_sys_id VARCHAR2(80) PRIMARY KEY,
h2_sys_id VARCHAR2(80) REFERENCES in_table_head_2 (h2_sys_id), item_code VARCHAR2(80));
CREATE TABLE in_table_item_3 (i3_sys_id VARCHAR2(80) PRIMARY KEY,
h3_sys_id VARCHAR2(80) REFERENCES in_table_head_3 (h3_sys_id), item_code VARCHAR2(80),
bat_no VARCHAR2(80));
INSERT INTO in_table_item_1 VALUES ('I1ID1', 'H1ID1', 'I1');
INSERT INTO in_table_item_1 VALUES ('I1ID2', 'H1ID1', 'I100');
INSERT INTO in_table_item_1 VALUES ('I1ID3', 'H1ID2', 'I3');
INSERT INTO in_table_item_2 VALUES ('I2ID1', 'H2ID1', 'I1');
INSERT INTO in_table_item_2 VALUES ('I2ID2', 'H2ID1', 'I200');
INSERT INTO in_table_item_2 VALUES ('I2ID3', 'H2ID2', 'I2');
INSERT INTO in_table_item_3 VALUES ('I3ID1', 'H3ID1', 'I2', 'B50');
INSERT INTO in_table_item_3 VALUES ('I3ID2', 'H3ID2', 'I4', 'B40');
INSERT INTO in_table_item_3 VALUES ('I3ID3', 'H3ID3', 'I4', '');
INSERT INTO in_table_item_3 VALUES ('I3ID4', 'H3ID4', 'I6', '');
SELECT * FROM in_table_item_1;
SELECT * FROM in_table_item_2;
SELECT * FROM in_table_item_3;
CREATE TABLE in_table_batch_1 (b1_sys_id VARCHAR2(80) PRIMARY KEY,
txn_code VARCHAR2(80), doc_no VARCHAR2(80), bat_no VARCHAR2(80));
CREATE TABLE in_table_batch_2 (b2_sys_id VARCHAR2(80) PRIMARY KEY,
i2_sys_id VARCHAR2(80) REFERENCES in_table_item_2 (i2_sys_id), bat_no VARCHAR2(80));
INSERT INTO in_table_batch_1 VALUES ('B1ID1', 'T1', '1234', 'B1');
INSERT INTO in_table_batch_1 VALUES ('B1ID2', 'T70', '1234', 'B70');
INSERT INTO in_table_batch_2 VALUES ('B2ID1', 'I2ID1', 'B30');
INSERT INTO in_table_batch_2 VALUES ('B2ID2', 'I2ID2', 'B90');
INSERT INTO in_table_batch_2 VALUES ('B2ID3', 'I2ID3', 'B60');
Please advise a solution for the same.
Thank you and best regards,
Séverine Suresh
A somewhat contrived solution (subquery factoring is used to allow easy testing/verification; it may work for this test data only):
with
case_1 as
     (select s.item_code,
             s.bat_no,
             h.txn_code,
             h.doc_no,
             h.updt_dt boe_dt,
             case when s.bat_no = h.bat_no then 'Y' else 'N' end batch_yn,
             case when h.txn_code is not null
                   and h.doc_no is not null
                   and h.updt_dt is not null
                  then 'case 1'
             end refers_to
        from (select item_code, bat_no, txn_code, doc_no, boe_dt
                from w_stock_table
               where bat_no is null
                  or txn_code is null
                  or doc_no is null
                  or boe_dt is null
             ) s
        left outer join
             w_history_table h
          on s.item_code = h.item_code
         and s.bat_no = h.bat_no
         and exists (select null
                       from w_transaction1
                      where txn_code = nvl(s.txn_code, h.txn_code)
                        and txn_type in ('IN', 'ALL')
                    )
     ),
case_2 as
     (select s.item_code,
             nvl(s.bat_no, h.bat_no) bat_no,
             nvl(s.txn_code, h.txn_code) txn_code,
             nvl(s.doc_no, h.doc_no) doc_no,
             nvl(s.boe_dt, h.updt_dt) updt_dt,
             case when s.bat_no = h.bat_no then 'Y' else 'N' end batch_yn,
             case when h.txn_code is not null
                   and h.doc_no is not null
                   and h.updt_dt is not null
                  then 'case 2'
             end refers_to
        from (select item_code, bat_no, txn_code, doc_no, boe_dt
                from case_1
               where refers_to is null
             ) s
        left outer join
             w_history_table h
          on s.item_code = h.item_code
         and exists (select null
                       from w_transaction1
                      where txn_code = nvl(s.txn_code, h.txn_code)
                        and txn_type in ('IN', 'ALL')
                    )
         and not exists (select null
                           from case_1
                          where item_code = h.item_code
                            and bat_no = h.bat_no
                            and txn_code = h.txn_code
                            and doc_no = h.doc_no
                            and updt_dt = h.updt_dt
                        )
     ),
case_31 as
     (select s1.item_code,
             nvl(s1.bat_no, w1.bat_no) bat_no,
             nvl(s1.txn_code, w1.txn_code) txn_code,
             nvl(s1.doc_no, w1.doc_no) doc_no,
             nvl(s1.updt_dt, w1.doc_dt) updt_dt,
             case when s1.bat_no = w1.bat_no then 'Y' else 'N' end batch_yn,
             case when w1.txn_code is not null
                   and w1.doc_no is not null
                   and w1.doc_dt is not null
                  then 'case 31'
             end refers_to
        from (select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn, refers_to
                from case_2
               where refers_to is null
             ) s1
        left outer join
             (select i1.item_code, h1.txn_code, h1.doc_no, h1.doc_dt, b1.bat_no
                from w_in_table_item_1 i1
               inner join
                     w_in_table_head_1 h1
                  on i1.h1_sys_id = h1.h1_sys_id
               inner join
                     w_in_table_batch_1 b1
                  on h1.txn_code = b1.txn_code
                 and h1.doc_no = b1.doc_no
             ) w1
          on s1.item_code = w1.item_code
     ),
case_32 as
     (select s2.item_code,
             nvl(s2.bat_no, w2.bat_no) bat_no,
             nvl(s2.txn_code, w2.txn_code) txn_code,
             nvl(s2.doc_no, w2.doc_no) doc_no,
             nvl(s2.updt_dt, w2.doc_dt) updt_dt,
             case when s2.bat_no = w2.bat_no then 'Y' else 'N' end batch_yn,
             case when w2.txn_code is not null
                   and w2.doc_no is not null
                   and w2.doc_dt is not null
                  then 'case 32'
             end refers_to
        from (select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn, refers_to
                from case_2
               where refers_to is null
             ) s2
        left outer join
             (select i2.item_code, h2.txn_code, h2.doc_no, h2.doc_dt, b2.bat_no
                from w_in_table_item_2 i2
               inner join
                     w_in_table_head_2 h2
                  on i2.h2_sys_id = h2.h2_sys_id
               inner join
                     w_in_table_batch_2 b2
                  on i2.i2_sys_id = b2.i2_sys_id
             ) w2
          on s2.item_code = w2.item_code
     ),
case_33 as
     (select s3.item_code,
             w3.bat_no,
             nvl(s3.txn_code, w3.txn_code) txn_code,
             nvl(s3.doc_no, w3.doc_no) doc_no,
             nvl(s3.updt_dt, w3.doc_dt) updt_dt,
             case when s3.bat_no = w3.bat_no then 'Y' else 'N' end batch_yn,
             case when w3.txn_code is not null
                   and w3.doc_no is not null
                   and w3.doc_dt is not null
                  then 'case 33'
             end refers_to
        from (select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn, refers_to
                from case_2
               where refers_to is null
             ) s3
        left outer join
             (select i3.item_code, h3.txn_code, h3.doc_no, h3.doc_dt, i3.bat_no
                from w_in_table_item_3 i3
               inner join
                     w_in_table_head_3 h3
                  on i3.h3_sys_id = h3.h3_sys_id
             ) w3
          on s3.item_code = w3.item_code
     )
select item_code, bat_no, txn_code, doc_no, boe_dt, batch_yn
  from case_1
 where refers_to is not null
union all
select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn
  from case_2
 where refers_to is not null
union all
select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn
  from (select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn,
               row_number() over (partition by item_code order by updt_dt desc) rn
          from (select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn
                  from case_31
                 where refers_to is not null
                union all
                select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn
                  from case_32
                 where refers_to is not null
                union all
                select item_code, bat_no, txn_code, doc_no, updt_dt, batch_yn
                  from case_33
                 where refers_to is not null
               )
       )
 where rn = 1
| ITEM_CODE | BAT_NO | TXN_CODE | DOC_NO | BOE_DT | BATCH_YN |
|---|---|---|---|---|---|
| I1 | B1 | T1 | 1234 | 03-JAN-2015 | Y |
| I1 | B30 | T30 | 7890 | 05-FEB-2015 | N |
| I2 | B60 | T60 | 1234 | 03-JAN-2015 | N |
| I3 | B70 | T70 | 1234 | 01-FEB-2015 | Y |
| I4 | - | T90 | 1234 | 05-JAN-2015 | N |
| I6 | - | T40 | 1234 | 05-AUG-2015 | N |
Regards,
Etbin