Best way to detail data and force display of only the last result in a group

Hi all! First, just to let the whole world know that this is a great place to explore and learn Oracle - I've learned more here than I have in some classes. Coming to explore the forums and looking for an answer has led me to functions I wouldn't have known existed otherwise.

Here's what I'm trying to do now... Let's say I have a table that contains information about families - if two or more persons are associated with a family (determined by a separate table), then it should return the person's identification and then the details of the group.

For example, here is the data contained in the two tables, followed by the current results, and then the result that I'm looking for...
PERSONS TABLE
----------------------
PERSON         PERSON_ID           ADDRESS
John Smith     101                     1 Oracle Drive
Jane Smith     102                     1 Oracle Drive
RELATIONSHIPS TABLE
-----------------------------
PERSON_ID      RELATEDPERSON_ID
101                 102
102                 101
A simple query produces the following:
WITH PERSONS AS
(
  SELECT 'John Smith' AS person, 101 AS person_id, '101 Oracle Drive' AS address FROM dual union all
  SELECT 'Jane Smith', 102, '101 Oracle Drive' FROM dual
)
,    RELATIONSHIPS AS
(
  SELECT 101 AS person_id, 102 AS relatedperson_id FROM dual union all
  SELECT 102, 101 FROM dual
)
SELECT
    person
  , address
FROM  PERSONS p
JOIN  RELATIONSHIPS r ON r.person_id = p.person_id
RESULT
PERSON      ADDRESS
John Smith      101 Oracle Drive
Jane Smith      101 Oracle Drive
I'm looking to produce the following result, but I don't know how... I'm sure it's something simple.

DESIRED RESULT
PERSON      ADDRESS
John Smith     
Jane Smith      101 Oracle Drive
Note that the address for members of the family is not displayed until the last member of the family is returned. This repeats for each family.

Thank you all for any help you can provide!

Published by: nage62587 on October 16, 2012 20:20

Hello

nage62587 wrote:
I did a lot of searching on the forum and revised my question a bit to hopefully make things a little clearer... I took a query Frank wrote and revised it to meet my criteria.

Searching the forum (and other places on the web) is great! Not only do you learn things, but the people on this forum are more likely to help you when they see you doing everything you can.
If you find something that you are trying to adapt, post a link to it. Seeing the correct way to adapt it can be very instructive.

Essentially, if I can determine which family someone is a member of, then I can create a unique "FAMILY_ID" for them - once I have that, it would seem I could then use the FAMILY_ID to combine their addresses and other information.

The problem I have is that if a RELATEDPERSON_ID is linked back to a PERSON_ID (PERSON_ID linked to RELATEDPERSON_ID works very well), it assigns them a new FAMILY_ID, rather than including them in the correct family.

Here is my SQL:

WITH PERSONS AS
(
SELECT 'John Smith' AS person, 101 AS person_id, '1 Oracle Drive' AS address FROM dual union all
SELECT 'Jane Smith', 102, '1 Oracle Drive' FROM dual union all
SELECT 'Jack Smith', 103, '8 Oracle Drive' FROM dual union all
SELECT 'John Doe', 104, '10 Oracle Drive' FROM dual union all
SELECT 'Jane Doe', 105, '10 Oracle Drive' FROM dual union all
SELECT 'Pete Smith', 106, '1 Oracle Drive' FROM dual
)
,    RELATIONSHIPS AS
(
SELECT 101 AS person_id, 102 AS relatedperson_id FROM dual union all
SELECT 102, 101 FROM dual union all
SELECT 104, 105 FROM dual union all
SELECT 105, 104 FROM dual union all
SELECT 106, 101 FROM dual
)
, table_x
AS
(
SELECT   person_id         AS col1
,        relatedperson_id  AS col2
FROM     relationships
)
,     got_relatives     AS
(
     SELECT     col1
     ,     CONNECT_BY_ROOT col2     AS relative
     FROM     table_x
     CONNECT BY NOCYCLE     col1     =  col2
OR  col2  =  col1
)
SELECT       col1
,       DENSE_RANK () OVER ( ORDER BY  MIN (relative)
) AS family_id
FROM      got_relatives
GROUP BY  col1

I think the result is:

COL1   FAMILY_ID
-----   ----------
102     1
106     1
101     1
105     2
104     2

I suspect that whatever you copied originally had the PRIOR keyword somewhere in the CONNECT BY clause.

Here's a way to do what you want:

WITH all_relationships AS
(
    SELECT  person_id
    ,       relatedperson_id
    FROM    relationships
    UNION
    SELECT  relatedperson_id  AS person_id
    ,       person_id         AS relatedperson_id
    FROM    relationships
)
, got_relatives AS
(
    SELECT  CONNECT_BY_ROOT person_id  AS person_id
    ,       relatedperson_id
    FROM    all_relationships
    CONNECT BY NOCYCLE  person_id        = PRIOR relatedperson_id
                    OR  relatedperson_id = PRIOR person_id
)
, got_family_id AS
(
    SELECT  person_id
    ,       MIN (relatedperson_id)  AS family_id
    ,       ROW_NUMBER () OVER ( PARTITION BY  MIN (relatedperson_id)
                                 ORDER BY      person_id  DESC
                               )              AS r_num
    FROM    got_relatives
    GROUP BY person_id
)
SELECT  p.person
,       CASE
            WHEN  f.r_num = 1
            THEN  p.address
        END  AS address
,       p.person_id
,       f.family_id
FROM    got_family_id  f
JOIN    persons        p  ON  p.person_id = f.person_id
ORDER BY family_id
,        person_id
;

Output:

PERSON     ADDRESS          PERSON_ID  FAMILY_ID
---------- --------------- ---------- ----------
John Smith                        101        101
Jane Smith                        102        101
Pete Smith 1 Oracle Drive         106        101
John Doe                          104        104
Jane Doe   10 Oracle Drive        105        104

Obviously, you don't have to display all the columns I posted above. In your first post you said you wanted only person and address; in your last message you said you wanted only person_id and family_id. Change the main SELECT clause however you wish.

I used the lowest person_id in each family as the family_id. You can use DENSE_RANK if you really want the families numbered 1, 2, 3, ..., but I suspect you don't really care what the family_id is, as long as all members of a family have the same value.

Your relationships table has some symmetric relationships, such as

SELECT 104, 105 FROM dual union all
SELECT 105, 104 FROM dual union all

and a few asymmetric relations. For example, the only relationship involving person_id = 106 is

SELECT 106, 101 FROM dual

in other words, there is no mirror-image row:

-- SELECT 101, 106 FROM dual union all   -- THIS IS NOT IN THE SAMPLE DATA

I assumed that didn't matter here: as long as 101 and 106 appear on the same row, they are in the same family, regardless of which is the person_id and which is the relatedperson_id.
The first thing I did above was therefore to ensure that all mirror-image rows were represented. That is what all_relationships does.
The next subquery, got_relatives, is probably the one you meant to adapt, but you left out the PRIOR operators.
Got_family_id does the grouping and also computes r_num to determine which is the last member of each family. Only that family member's address will be displayed in the main query.
You could combine got_family_id and the main query; you don't really need a separate subquery here. It would be a little less code, but I wrote it this way because I think it's a little easier to understand and maintain.
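Conceptually, what the query above computes is the set of connected components of an undirected graph whose edges are the relationship rows. As a cross-check outside the database, here is a minimal Python sketch of the same idea (union-find over the thread's sample data; the minimum person_id per component plays the role of family_id) - this is an illustration of the logic, not the SQL itself:

```python
# Connected-components sketch of the family-grouping logic.
# Mirrors the SQL answer: symmetric edges, and the lowest
# person_id in each connected family serves as the family_id.

def find(parent, x):
    # Walk up to the root representative, compressing the path.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def family_ids(people, relationships):
    parent = {p: p for p in people}
    for a, b in relationships:
        ra, rb = find(parent, a), find(parent, b)
        if ra != rb:
            parent[rb] = ra  # union the two families
    # Group members by their root, then label each group by its min id.
    groups = {}
    for p in people:
        groups.setdefault(find(parent, p), []).append(p)
    result = {}
    for members in groups.values():
        fid = min(members)
        for m in members:
            result[m] = fid
    return result

people = [101, 102, 103, 104, 105, 106]
relationships = [(101, 102), (102, 101), (104, 105), (105, 104), (106, 101)]
print(family_ids(people, relationships))
# 101/102/106 share one family_id, 104/105 another, 103 stands alone
```

Note how 106 ends up in family 101 even though only the asymmetric row (106, 101) exists, just as in the SQL above.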

Tags: Database

Similar Questions

  • The best way to migrate data... Opinion please...

    Guys,

    I'm working on a solution for a customer who needs to move data from an existing SAN to a new SAN.

    The existing SAN is attached to the ESX server farm and is very close to the location of the new SAN. In your opinion, what is the best way to move the data without interruption to the ESX farm's VMs?

    I know we can do SAN replication, but right now I don't know if the new SAN is from the same vendor.

    I know we can do Storage vMotion. In this case, we would have to present the new SAN somehow to the existing ESX servers.

    I know that we can probably use Converter to migrate the virtual machines. This could take some time because we would be going over the WAN.

    What are your thoughts?

    Thanks in advance for your comments!

    With 4 TB of data, I wouldn't go with SAN replication or a WAN optimization solution; Storage vMotion can move 4 TB quickly, but it basically depends on how your WAN links are - if they're quick, go for it. There is no other free solution besides this, but if you can somehow take system-state backups of all the virtual machines, you can ship that backup data to the remote location and import it back into the clusters. Do the actual calculation: how fast can you transfer data from siteA -> siteB, in Mbps? That will give you a clearer picture of how long it will take, and you can plan from there. I would do it during maintenance or weekend windows that have the least impact on users and systems.

    If you found this information useful, please consider awarding points for 'Correct' or 'Helpful'. Thank you!!!

    Kind regards

    Stefan Nguyen

    VMware vExpert 2009

    iGeek Systems Inc.

    VMware, Citrix, Microsoft Consultant

  • The best way to pass data when navigating

    Take the case where an action method is called and it chooses to navigate to another page.

    What is the best way to pass data from the old backing bean to the new one (probably during execution of the action method)? I am currently using Spring session-scoped beans as the backing beans.

    Take a look at:

    It goes inside your commandLink or commandButton control:



  • Best way to migrate (RDBMS and Essbase) planning data between systems

    Hello. I need to migrate Planning data (Planning's relational fact data (dates and text comments) and Essbase data). Migrating Essbase is relatively simple (migrate the Essbase dimension data, then export and reload the Essbase fact data), but I'm not sure about the best way to migrate the relational data. Should I do a dump of the schema and then migrate it to the other Planning system? Thank you.

    This is on 11.1.1.3.

    Since you're on 11.1.1.3, I suggest using LCM to migrate the Planning artifacts.
    An example of how is available at: http://www.oracle.com/technology/obe/hyp_ss/SS11.1.1_PLNLCM/PLN_LCM_OBE.htm

    Cheers

    John
    http://John-Goodwin.blogspot.com/

  • The most effective way to log and read data simultaneously (DAQmx, TDMS) at high data rates

    Hello
     
    I want to acquire data from several cDAQ modules across several chassis at
    high data rates (100 k samples per second if possible). Let's say the measurement time is 10 minutes and we have a large number of channels (40, for example). The measured data is written to a TDMS file. I guess memory or hard disk speed is the limit. For the user, there must be a possibility to view a selection of channels in a graph during the measurement.

    My question: what is the best and most effective way to save and read data at the same time?

    First of all, I use a producer-consumer architecture and I don't want to write and display the data in the same loop. I see two possibilities:

    [1] Use 'DAQmx Configure Logging.vi' with the 'Log and Read' operation to write the data to a TDMS file. To display the data in a second loop, I would create a DVR (data value reference) of the acquired samples and 'send' the DVR to the second loop, where the data will be displayed in a graph. This method has the disadvantage that the data of all channels is copied into memory. Correct me if I'm wrong.

    [2] Use 'DAQmx Configure Logging.vi', but only with the 'Log' operation, to write the data to a TDMS file. To view the selected data, I would read a number of samples from the TDMS file in the second loop (while I'm still writing to the TDMS file). In this case, I copy only the data of the selected channels (not all of them), but more hard drive accesses will be necessary.

    What is the most effective and efficient solution in this case?

    Are there better ways to log and read data at high sampling rates?

    Thank you for your help.

    You say that the measurement time is 10 minutes. If you have 40 channels and you sample all channels at 100 kHz, that is quite a lot of values.

    In cases like this, I always try to approach it from the conditions of use. If a measurement is only 10 minutes, I would just log all the data to TDMS and create a graphing module that could be in the same consumer loop where you log the data. You can always work on the big raw data files offline afterwards, extracting all the information you need (have a look at the product called NI DIAdem: http://www.ni.com/diadem/)

    The main issue is what the user needs to see in the graph (or perhaps a chart can be useful too). Let's say the graph is 1024 pixels wide. It makes no sense to push more than 1024 data points to it, yes? Every second you will produce 100 k data points per channel. What is the useful information your user should see? It depends on the application. In similar cases, I usually use some kind of data reduction method: a moving average (Point By Point Mean.vi, for example) with an interval size of 100. This way you get 1000 data points out of 100 k per channel every second. If you feed your graph every second with these averaged values, it will store 1024 data points (the default) per channel (curve), which is a little more than 10 minutes, so the user will see the entire measurement.

    So it depends on the frequency at which you send data to the consumer. For example, you collect 1024 values per iteration of the producer and send them to the consumer. There you can do a normal mean calculation or a rolling one (according to your needs) and draw it on a graph. This way your graph will display only the values of the last 10 seconds...

    Once I programmed a module where I used a chart rather than a graph, and the user could specify the absolute timestamp interval to be plotted. If the data size is larger than the chart size in pixels, the module performs an averaging calculation to reduce the number of data points. Of course, if you need to see the raw data, you can specify a small interval. It all depends on how you program the zoom functions, etc. In my case I had a rate of 1 Hz, so I just kept all the data in RAM, limiting the arrays to hold 24 hours of data, so that technicians could monitor the system. In your case, given the enormous amount of data, only a file read/write approach can work if you really need access to all of the raw data on the fly. But I hope the moving average values will be enough?
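    The data-reduction step described above (averaging blocks of raw samples so the display receives only as many points as it can usefully show) can be sketched outside LabVIEW. This is a minimal Python illustration of the concept only - it is not NI's Point By Point Mean.vi:

```python
# Block-average decimation: reduce a high-rate sample stream so the
# display only receives one averaged point per `window` raw samples.
def decimate_mean(samples, window):
    # A trailing partial block is dropped - a simple policy choice.
    n = len(samples) // window
    return [sum(samples[i * window:(i + 1) * window]) / window
            for i in range(n)]

# 1 second of one 100 kS/s channel, reduced with a window of 100
raw = [float(i % 100) for i in range(100_000)]
display = decimate_mean(raw, 100)
print(len(display))  # 1000 points instead of 100000
```

    Whether you average in the producer (per iteration) or the consumer is a throughput trade-off, as the answer above discusses.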

  • Best way to send data

    Hello

    This is more of a server-side question, but I wanted to see what the BB 'best practice' is, and how the experts deal with this scenario.

    I intend to use servlets to serve data over HTTP to my BB application.  I need to retrieve a list of images along with associated names, descriptions, dates, etc.  To send a picture from a servlet, I know I can just send it as a byte array and the BB can read the HTTP input stream.  But what about sending the image and the data together? How can we separate the bytes of the image from the bytes of the data? Can I send a collection of images and associated data?

    What is the best way to address the issue?

    Thank you

    T

    I would answer the question by asking "what does your server-side infrastructure look like?"

    If it's .NET, then you will probably want to take the path of least resistance and use JSR 172 or kSOAP.

    If you have more flexibility on the server, I recommend XML or JSON (as Peter says). JSON is more compact, but the BB JSON tools are a bit sparse.

    On the other hand, the SAX parser libraries on the BB are good enough for XML.

    I would say that in every project I've ever worked on, I worked around the weaknesses of the server infrastructure, rather than the reverse.
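    One common way to keep image bytes and their metadata together in a single response, per the JSON suggestion above, is to base64-encode the image inside a JSON document. Here is a hedged Python sketch of the idea (the field names are invented for illustration; nothing here is a BlackBerry or servlet API):

```python
import base64
import json

def build_payload(images):
    # images: list of (name, description, raw_bytes) tuples.
    # Base64 turns raw bytes into text that is safe inside JSON,
    # which is how image bytes and metadata can travel together.
    return json.dumps({
        "images": [
            {
                "name": name,
                "description": desc,
                "data": base64.b64encode(raw).decode("ascii"),
            }
            for name, desc, raw in images
        ]
    })

def read_payload(payload):
    # The client side: parse the JSON, then decode each image's bytes.
    doc = json.loads(payload)
    return [(img["name"], base64.b64decode(img["data"]))
            for img in doc["images"]]

payload = build_payload([("logo.png", "Site logo", b"\x89PNG...")])
print(read_payload(payload)[0][0])  # logo.png
```

    The cost is the ~33% size overhead of base64; a multipart response avoids that, at the price of more parsing code on the device.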

  • Best way to move data from one page to another within the app

    Hi all - I started to write this out in full and it was way too complicated to explain without a novel.

    Brief description: I use a query string that begins in the parent window (from an href in JSON) and must then be moved into the iframe URL when moving to other pages, because the page and the iframe are on conflicting http and https and are not accessible to each other... It seems there must be a better way.

    So my question is: what is the best way, on an open platform, to move data from one page to another?

    You can do it this way - it's in the menu.json - but when you work within the iframe and load pages in the app using relative URLs, they carry across.
    Inspect the framework using the browser's tools and you can see the construction and scope of all the URLs used.

    The apps are sandboxed; you cannot access anything in the parent frame. This is a deliberate feature, not something you can work around.

    As I said, you have a few methods for storing your own data.

  • Best way to transfer data from a defective MacBook Pro?

    I just bought a new 15" 2.5 GHz retina MacBook Pro about 3 weeks ago and it has a faulty graphics card. I contacted Best Buy and they are willing to exchange it, but they want $100 to transfer the data. I'm not a techie, but I have access to my data and I certainly don't think I should pay $100 for this in any case. So I need to temporarily transfer the data to an external USB disk, and I'm weighing which method to use: Carbon Copy Cloner, or turning on Time Machine to back up my hard drive. I don't really need a bootable disk, since the new rMBP will have a fresh system, so I don't know if CCC is the best choice, and I wonder if it's easier to migrate my data from a TM backup. Any ideas or suggestions would be welcome.

    I love bootable clones.  They are perhaps not always necessary, but there are times when the system does not boot and you want to try booting externally, either to eliminate the internal drive as the issue, or to repair the internal drive from a fully bootable installation.

    I don't use TM, nor do I ever 'migrate', so I won't comment on those.

    I hope you will buy AppleCare during this first year.  As you have seen, sometimes bad parts slip through, and you want 3 years of hardware coverage for those times.

  • Best way to sort, organize and delete the RAW files?

    Hello world

    I am looking for a way to simply sort through a series of close to 600 raw files on a Mac.  I'm very new to this, so bear with me :)

    At first, I downloaded the raw files from my SD card into iPhoto, which converted all the files to JPEG.  So I decided that was no good, and tried putting the CR2 files in an organized folder and dragging the folder into Preview.  I find that Preview is verrrry slow.  I'm looking for the best way to sift through

    these files and choose the best/remove the worst.  What do you recommend? I plan on editing the files in Photoshop and Lightroom.

    Is there a place in Lightroom where I can do this sort of process I speak of? Or do I have to download a separate organizing program?

    Thank you

    If by "sorting" you mean the process of getting rid of the 'bad' images and keeping the 'good' ones, then yes, Lightroom can do this, as can many other programs.

    In Lightroom, you would import the 600 images together, and when the import process is complete (this may take a few minutes or more depending on your computer), you can scroll through each image; if you decide it is a 'bad' image that you don't want to keep, press X, then go to the next image. Once this is done, you can delete all the photos you flagged with X in a single action.

    Other programs will probably let you do this "sorting" faster, but I can't speak to them.

  • ESXi 5 - 2 hosts and the best way to configure vSwitch and NIC redundancy

    Hi all

    Could someone help me find the best way to configure the vNetwork on 2 ESXi 5 hosts for redundancy.  Once I've managed to correctly configure the 2 hosts I will look to use the same installation process for 6 hosts: 3 sites with 2 hosts at each site, all managed from vCenter Server.

    I have 2 DL380 G7 servers with ESXi 5 installed on a class 10, 8 GB SD card. I'm looking to install VSA on the 2 hosts, with 4 TB of internal storage in each server (8 x 600 GB 10k SAS).  Each server has an integrated 4-port gigabit NIC and I have installed a 2nd 4-port gigabit PCIe NIC; I also have 2 x 16-port switches with Layer 3 routing.  I am using the vSphere 5 Standard Acceleration Kit. I am looking to use vCenter Server for management, vMotion for maintenance, HA for failover, and 10-15 VMs (vCenter Server Std, SQL DB for vCenter Server, Exchange 2010, SQL Server 2008 R2, IIS Intranet, Helpdesk, Domain Controllers 1 and 2, AV Server, WSUS, SCCM Server, and Terminal Server).

    What would be the best way to install and configure the network for performance and redundancy, and am I missing something?

    My thoughts on teaming are:

    vCenter - vswitch0 - port1 on NIC1 and port1 on NIC 2 - port1 on NIC1 to physical switch 1 - port1 on NIC2 to physical switch 2

    vMotion - vswitch1 - port2 on NIC1 and port2 on NIC2 - port2 on NIC1 to physical switch 1 - port2 on NIC2 to physical switch 2

    HA - vswitch3 - port3 on NIC1 and port3 on NIC2 - port3 on NIC1 to physical switch 1 - port3 on NIC2 to physical switch 2

    VM - vswitch4 - port4 on NIC1 and port4 on NIC2 - port4 on NIC1 to physical switch 1 - port4 on NIC2 to physical switch 2

    or do I need an additional NIC in each server to handle the 12 VMs (6 VMs per 2 ports on 2 NICs), or is there something else I've missed?

    Thank you

    In your case, to keep it simple, from what I can tell here is what my recommendation would be:

    3 standard vSwitches

    vSwitch0:

    • Management - vmnic0, vmnic2

    vSwitch1:

    • vMotion - vmnic1, vmnic3

    vSwitch2:

    • VM network - vmnic4, vmnic5, vmnic6, vmnic7

    The only reason I didn't split the VM traffic across the other network adapters is the difference in adapter types between the DL380's onboard NIC and the PCIe quad-port.

  • What is the best way of updating OID and similar attributes via OIM/OAM LDAP?

    Our environment uses OIM provisioning to an OID LDAP that is used by OAM.

    For legacy reasons, we must populate both the Oracle "orcl*" attributes and the OAM "ob*" attributes in cases where they serve the same or a similar purpose.

    Example: when a user is disabled in OIM, we set orclisenabled = 'false' and obUserAccountControl = 'DISABLED' in OID.

    What is the best way to achieve this in OIM? My first thought was to write a custom adapter, similar to the out-of-the-box OID Modify User adapter, that supports changing multiple attributes.

    Is there a better way?

    You can create two tasks that will modify the two attributes in OID.

    On the Disable User task, call task1; on success of task1, call task2 (using the generate-task feature).

    You can do this with the OOTB connector alone.

  • Best way to store data on local RAID

    Hi all

    Simple question for you: I'm new to VMware and I want to be sure about the best way to set up my server.

    I have a ProLiant DL380 G5 with 5 x 300 GB SAS drives. I don't plan to attach any external storage hardware to this server.

    What is the best way to configure VMware ESXi 3.5 and my RAID arrays for 800 GB of file storage and 300 GB for the operating system images?

    I read about raw LUNs but I can't select one in my ESXi (because I have no SAN, or because it's ESXi?)

    Thx for the help.

    AlX

    With 5 x 300 GB drives, I'd say you'll have trouble fitting your VMs on here. If you use RAID-5, which allows the failure of 1 disk, you get 1200 GB of storage (a little less when formatted); you're asking for 1100 GB, but then you also need to allow for logs and VMware overhead such as vswp files and snapshots. I would add another 300 GB disk to that RAID-5 array if possible.

    If the virtual machines are disk-intensive, you should consider the IOPS requirement. If these are 15k disks, the array should give you around 600 IOPS - is that enough?

    Just format the local array with VMFS and use VMDKs for all the machines.
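    The capacity and IOPS arithmetic above is worth making explicit. This is a small Python sketch of the two rules of thumb used in the answer (RAID-5 usable space is n-1 disks; aggregate IOPS scales roughly with spindle count); the per-disk IOPS figure is an assumed ballpark, not a measured value:

```python
def raid5_usable_gb(disks, disk_gb):
    # RAID-5 dedicates one disk's worth of space to parity.
    return (disks - 1) * disk_gb

def approx_array_iops(disks, per_disk_iops):
    # Rough aggregate: IOPS scales with the number of spindles.
    return disks * per_disk_iops

print(raid5_usable_gb(5, 300))    # 1200 GB usable from 5 x 300 GB
print(approx_array_iops(5, 120))  # 600, assuming ~120 IOPS per 15k disk
```

    Formatted VMFS capacity and snapshot/vswp overhead come out of that usable figure, which is why the answer suggests adding a sixth disk.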

  • Best way to stack 2d and 3d in a Composite

    Hey everyone, apparently I'm really on a 3D binge with Photoshop CS6. It's just a blast! Anyway, I'm trying to figure out the best way to stack 2D and 3D elements in the same document while maximizing editability...

    I'm working on a digital décor piece, and I have parts of the room in 3D and parts in 2D. For example, the walls, ceiling, floor, table and chairs, and corner hutch are 3D, but the regular hutch and buffet server are 2D.

    -Table and chairs must be in the foreground

    -The regular hutch and buffet server are behind the table and chairs

    -The corner Hutch must be behind all the other pieces of furniture

    So, you may now be catching on to my little idea... I want all the 3D elements to have the same lighting and reflect one another; however, I need the 2D layers to fit somehow "in between" the 3D objects. The simple question is: how is that possible? Is there a way to mask the foreground 3D objects so that they appear to be in front of the 2D objects? My only current solution is the following:

    -Move the 3D layer (the scene) all the way to the back

    -Make the final decision on the position of the table and chairs

    -Hide everything else in the scene except the table and chairs

    -Render, so that the edges are clean

    -Load the newly rendered 3D layer as a selection (containing the table and chairs only)

    -Create a new layer from the selection of the rendered 3D layer, or hide the 3D layer, so that the table and chairs appear to be in the foreground

    I hadn't expected this post to be so long, so I apologize... but as you might assume, the above method means I can't edit/reposition/scale the 3D at all, so if there is a better way to stack these room elements I would be extremely grateful to anyone who knows how!

    Thank you!

    Andy

    First of all, I was wrong about layers needing to be hidden to prevent their appearance being included in a new postcard. I misunderstood something that happened when working with a 3D document earlier.

    Materials support transparency/opacity. A material has an opacity setting that can use a texture. The material's opacity percentage is multiplied by the opacity of the texture.

    When a postcard is created, its default material's opacity control will contain an instance of the texture that is in the diffuse control, which is a file containing the 2D layer content (pixels, smart object, or group) from which the postcard was created.

    Here is an example: two solid pyramid shapes and a postcard of a cloud layer:

    The shape layer from which the postcard was created is in the document that opens when you select the Edit buttons highlighted above in the drop-down list:

  • What is the best/fastest way to trim a folder of videos down to only the useful parts?

    Hello

    I have a folder of 30+ videos that have many unstable parts in them. What is the best way to cut out the unusable parts during transfer?

    Best

    Hi Gorazdr27768010,

    I have a folder of 30+ videos that have many unstable parts in them. What is the best way to cut out the unusable parts during transfer?

    Use the Media Browser and import only the required clips.

    Import media into Premiere Pro | Adobe Premiere Pro CC tutorials

    And you can apply In and Out points in the Source Monitor before adding clips to the timeline.

    Marking In and Out points in the Source Monitor, Premiere Pro CC - YouTube

    Please reply if this is helpful.

    Thank you

    Ilyes Singh

  • simple, best way to transfer data and music between computers, from PC to PC or laptop

    I want to transfer music from my PC to my phone; what is the best/easiest way?

    It depends on the computers involved and the amount of data. You can transfer data using a flash drive, by burning a CD/DVD-R, over the network, by pulling the hard drive and attaching it to the target machine, etc. There is no one answer, so if you want a more specific one, provide the missing details about your systems and your data. MS-MVP - Elephant Boy Computers - Don't Panic!
