Foursquare JSON data analysis
Hello,
I'm trying to parse JSON data I got from the Foursquare place search web service.
What I really need is only the city and the country from the venues returned by the search. I'm not trying to bind the data to a list.
Any thoughts? Thank you!!
The code below does not work:

JsonDataAccess jda;
QVariant jsonQVariant = jda.load(networkReply).value();
QVariantList venueList = jsonQVariant.toMap()["venues"].toList();
foreach (QVariant v, venueList) {
    QVariantMap venueMap = v.toMap();
    showToast(venueNameMap.value("city").toString());
}
Below is a sample of the Foursquare JSON I need to parse.
{"meta":{"code":200}, "response": {"venues": [{"id":"4cd8a50c15d8b60c4e31230e", "name":"HSBC Bank", "contact":{}, "location": {"lat":4.5850378697477656, "lng":92.70275974273682, "distance":77, "city":"Kuala Lumpur", "state":"Selangor", "country":"Malaysia", "cc":"MY"}, "canonicalUrl":"https:\/\/foursquare.com\/v\/kompleks-asia-mega-mas\/4cd8a50c15d8b60c4e326a0e", "categories": [{"id":"4bf58dd8d48988d130941735", "name":"Building", "pluralName":"Buildings", "shortName":"Building", "icon":{"prefix":"https:\/\/foursquare.com\/img\/categories_v2\/building\/default_","suffix":".png"}, "primary":true}], "verified":false, "restricted":true, "stats":{"checkinsCount":2872, "usersCount":858, "tipCount":6}, "specials":{"count":0,"items":[]}, "hereNow":{"count":1,"groups":[{"type":"others","name":"Other people here","count":1,"items":[]}]}, "referralId":"v-1372731815"}, .......
Try something like this:
QString jsonString = QString("{\"response\": {\"venues\": [{},{}]}}");
bb::data::JsonDataAccess jda;
QVariant parsed = jda.loadFromBuffer(jsonString);
QVariantMap result = parsed.toMap();
QVariantMap response = result.value("response").toMap();
QVariantList venueList = response.value("venues").toList();
foreach (const QVariant &venue, venueList) {
    QVariantMap venueMap = venue.toMap();
    // the city and country sit one level down, under "location"
    QVariantMap location = venueMap.value("location").toMap();
    showToast(location.value("city").toString());
}
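For reference, independent of the Cascades API, the fields the question asks for live at response.venues[n].location. A plain JavaScript sketch of the same traversal, using a trimmed copy of the sample JSON from the question:

```javascript
// Trimmed version of the Foursquare response from the question.
const reply = {
  meta: { code: 200 },
  response: {
    venues: [
      {
        name: "HSBC Bank",
        location: { city: "Kuala Lumpur", country: "Malaysia" }
      }
    ]
  }
};

// Same traversal as the Cascades code: response -> venues -> location.
function citiesAndCountries(data) {
  return data.response.venues.map(v => ({
    city: v.location.city,
    country: v.location.country
  }));
}
```

Each toMap()/toList() call in the Cascades code steps one level down this same structure.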
Tags: BlackBerry Developers
Similar Questions
-
HTML5 panel: passing JSON data to JavaScript
Hi Bruce,
I'll soon be working on an HTML5 panel that will receive JSON data via a URL call.
Is there a way to pass this JSON data to the JavaScript part of the panel, so I can access all the objects?
Thank you
Kelly
Thomas is right.
This can help, with passing JSON back from ExtendScript to JavaScript:
HTML Panel Tips: #5 passing JSX objects to HTML. Photoshop, etc.
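As a rough sketch of the pattern those tips describe: the host (ExtendScript) side can only hand the panel back a string, so objects travel as JSON — stringify on the host, parse in the panel. In the sketch below, evalScriptMock and getDocInfoAsJson are hypothetical stand-ins for CSInterface.evalScript and your own ExtendScript function:

```javascript
// evalScriptMock stands in for CSInterface.evalScript: the host runs the
// script and hands the panel back a string, so objects must travel as JSON.
function evalScriptMock(script, callback) {
  // pretend the ExtendScript side ran JSON.stringify on its result
  callback('{"layers": ["bg", "fg"], "count": 2}');
}

let doc;
evalScriptMock("getDocInfoAsJson()", function (result) {
  doc = JSON.parse(result); // back to a real object on the panel side
});
```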
-
Viewing JSON data in a custom list field
Hi, I have parsed the JSON and created the custom list field. Now I want to display only the parsed data in my custom list field. I'll post my parsed JSON data and the code for my custom list field below.
Parsed data:
I have three JSON strings and I want to show the title, content and date in the list field. I'll post a screenshot of my list.

JSONArray jsnarry = new JSONArray(responce);
System.out.println("\n--length----- " + jsnarry.length());
//System.out.println("....................................................=");
for (int i = 0; i < jsnarry.length(); i++) {
    JSONArray inerarray = jsnarry.getJSONArray(i);
    //System.out.println("\n-inerarray-values----- " + inerarray.getString(i1));
    String TITTLE = inerarray.getString(1);
    String CONTENT = inerarray.getString(2);
    String DATE = inerarray.getString(3);
    System.out.println("TITTLE= " + TITTLE);
    System.out.println("CONTENT= " + CONTENT);
    System.out.println("DATE= " + DATE);
}
output
[0.0] --length----- 2
[0.0] -innerarray-length----- 6
[0.0] TITTLE= BJP State President Sanjay Tandon's visit to Amita Shukla's Home
[0.0] CONTENT= BJP President Chandigarh Sanjay Tandon at Amita Shukla's Home
[0.0] DATE= 2013-01-04
[0.0] ................................................
[0.0] TITTLE= Sanjay Tandon at mahasamadhi of Satya Shri Sai baba.
[0.0] CONTENT= BJP Chandigarh President, Sanjay Tandon mahasmadhi of Sri Satya Sai Baba.(Andhra Pradesh)
[0.0] DATE= 2013-01-13
and my custom list field
super(NO_VERTICAL_SCROLL);
String TITTLE = "TITTLE";
String CONTENT = "CONTENT";
String DATE = "DATE";
v.addElement(new ListRander(listThumb, TITTLE, CONTENT, DATE, navBar));
myListView = new CustomListField(v) {
    protected boolean navigationClick(int status, int time) {
        //Dialog.alert(" time in milisec :" + time);
        return true;
    }
};
CustomListField.java
public class CustomListField extends ListField implements ListFieldCallback {
    private Vector _listData;
    private int _MAX_ROW_HEIGHT = 100;

    public CustomListField (Vector data) {
        _listData = data;
        setSize(_listData.size());
        setSearchable(true);
        setCallback(this);
        setRowHeight(_MAX_ROW_HEIGHT);
    }

    public int moveFocus (int amount, int status, int time) {
        this.invalidate(this.getSelectedIndex());
        return super.moveFocus(amount, status, time);
    }

    public void onFocus (int direction) {
        super.onFocus(direction);
    }

    protected void onUnfocus () {
        this.invalidate(this.getSelectedIndex());
    }

    public void refresh () {
        this.getManager().invalidate();
    }

    public void drawListRow (ListField listField, Graphics graphics, int index, int y, int w) {
        ListRander listRander = (ListRander) _listData.elementAt(index);
        graphics.setGlobalAlpha(255);
        graphics.setFont(Font.getDefault().getFontFamily().getFont(Font.PLAIN, 24));
        final int margin = 5;
        final Bitmap thumb = listRander.getListThumb();
        final String listHeading = listRander.getListTitle();
        final String listDesc = listRander.getListDesc();
        final String listDesc2 = listRander.getListDesc2();
        final Bitmap navBar = listRander.getNavBar();
        // list border
        graphics.setColor(Color.BLACK);
        graphics.drawRect(0, y, w, _MAX_ROW_HEIGHT);
        graphics.drawBitmap(margin, y + margin + 10, thumb.getWidth(), thumb.getHeight(), thumb, 0, 0);
        graphics.drawText(listHeading, 3 * margin + thumb.getWidth(), y + margin);
        graphics.setColor(Color.BLACK);
        graphics.drawText(listDesc, 3 * margin + thumb.getWidth(), y + margin + 30);
        graphics.drawText(listDesc2, 3 * margin + thumb.getWidth(), y + margin + 60);
    }

    public Object get (ListField listField, int index) {
        String rowString = (String) _listData.elementAt(index);
        return rowString;
    }

    public int indexOfList (ListField listField, String prefix, int start) {
        for (Enumeration e = _listData.elements(); e.hasMoreElements(); ) {
            String rowString = (String) e.nextElement();
            if (rowString.startsWith(prefix)) {
                return _listData.indexOf(rowString);
            }
        }
        return 0;
    }

    public int getPreferredWidth (ListField listField) {
        return 3 * listField.getRowHeight();
    }
}
ListRander.java
public class ListRander {
    private Bitmap listThumb = null;
    private Bitmap navBar = null;
    private String listTitle = null;
    private String listDesc = null;
    private String listDesc2 = null;

    public ListRander (Bitmap listThumb, String listTitle, String listDesc, String listDesc2, Bitmap navBar) {
        this.listDesc = listDesc;
        this.listDesc2 = listDesc2;
        this.listThumb = listThumb;
        this.listTitle = listTitle;
        this.navBar = navBar;
    }

    public Bitmap getListThumb() {
        return listThumb;
    }
    public void setListThumb (Bitmap listThumb) {
        this.listThumb = listThumb;
    }
    public Bitmap getNavBar() {
        return navBar;
    }
    public void setNavBar (Bitmap navBar) {
        this.navBar = navBar;
    }
    public String getListTitle() {
        return listTitle;
    }
    public void setListTitle (String listTitle) {
        this.listTitle = listTitle;
    }
    public String getListDesc() {
        return listDesc;
    }
    public void setListDesc (String listDesc) {
        this.listDesc = listDesc;
    }
    public String getListDesc2() {
        return listDesc2;
    }
    public void setListDesc2 (String listDesc2) {
        this.listDesc2 = listDesc2;
    }
}

You seem to have two problems here and are confusing them. You must break the problem into two parts:
(1) Extract the data from the input and create the objects you want to display.
(2) Display a set of objects in a list.
Let's get the first one sorted first.
I will suggest what to do here, but in practice you should actually think about this yourself, as part of the design phase of your application. You should do this, not me, because you have all the information available. At present I have just what you have told me, which is not much. So maybe what I'm telling you is not correct for your application; only you can decide that. And to be blunt, you should have decided this before you started coding. I don't want to lead you down the wrong path. You must think of your application as a house: the architect must design all the rooms, and how they will be built, before you start building the house. If you don't, then you are building the rooms on the fly. Who knows if they will fit the house?
In this case, I think you need to create an object that represents each of the elements in the inner array of news data. Call this object
NewsItem
This object will have attributes, such as its title, content, date, the linked image and so on, each of which will have get and set methods. As you process each inner element, you fetch the associated entry and update the object.
When you have finished the inner processing loop, you now have a complete
NewsItem
object, so you add it to a collection: an array of NewsItem objects; call this _newsItems. You will create it at the beginning; you know how many entries it needs because that is the number of entries in your outer array.
So before you start processing the JSON, create your array and set an 'index' value to 0.
Once you have created your NewsItem, add it to the array at position 'index' and increment 'index'.
And once you have parsed all the JSON, you will have a complete array. That is part 1 finished!
And note that in your drawListRow you are given an index; that is the index into your _newsItems array. So you can easily find which entry to display and display it correctly. But that is part 2 and is a separate issue.
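To make part 1 concrete, here is a minimal JavaScript sketch of the flow described above. The NewsItem fields and the 1..3 indices come from the question's getString(1..3) calls; everything else is an assumption:

```javascript
// Hypothetical NewsItem holding the fields the question's code extracts.
class NewsItem {
  constructor(title, content, date) {
    this.title = title;
    this.content = content;
    this.date = date;
  }
}

// Parse the outer array once, building one NewsItem per inner array.
function parseNews(response) {
  const outer = JSON.parse(response);
  const newsItems = new Array(outer.length); // sized from the outer array
  let index = 0;
  for (const inner of outer) {
    // Indices 1..3 match the question's getString(1..3) calls.
    newsItems[index] = new NewsItem(inner[1], inner[2], inner[3]);
    index++;
  }
  return newsItems;
}
```

drawListRow then only needs to look up newsItems[index] and call its getters.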
-
Reorganizing JSON data
Hi all
I need help reorganizing the data I receive from a JSON Unflatten.
I have a JSON string from an API call (run through JSON Unflatten) that contains a name (a string), a number, and a channel (also a string).
The JSON Unflatten automatically orders the data in the order it is received. The data is information about channels. The problem arises when not all channels are provided. If, out of 12 channels, the JSON receives information on only 4 channels, it orders them 1 to 4 in the array. However, entry 1 may not correspond to channel 1 and may actually be channel 5, entry 3 might not correspond to channel 3, and so on. I want to reorder the data based on the channel string (a field named channel). So if entry 4 in the array is actually channel 12 (based on the channel string, i.e. channel-12), I would link entry 4's data to slot 12 in a new array, so that the order of the channels depends on the channel string and not on the order of the JSON unflatten. Please help, as I am running out of ideas on how to solve this problem.
Thank you.
Is that the part that is tripping you up?
PS: Try to avoid multi-indexed tables; it is preferable to use arrays and loops.
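The reordering itself is only a few lines in any language. A JavaScript sketch of the array-and-loop approach, assuming each reading carries its channel name in the form "channel-12" and channel numbers are 1-based:

```javascript
// Place each reading at the slot its own channel name says it belongs to,
// instead of the arrival order the unflatten gave us.
function reorderByChannel(readings, totalChannels) {
  const ordered = new Array(totalChannels).fill(null); // null = channel not provided
  for (const r of readings) {
    // "channel-12" -> index 11 (channel numbers assumed 1-based)
    const n = parseInt(r.channel.split("-")[1], 10);
    ordered[n - 1] = r;
  }
  return ordered;
}
```

Missing channels simply stay null, so the slot-to-channel mapping is always stable.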
-
Help with several ListItemComponents with the JSON data
I can't find a way to use a specific ListItemComponent for a specific type of JSON element. I can only find examples for XML data, which work. But I tried to do the same thing for JSON and it does not work.
The examples that I found (nothing for JSON):
- https://supportforums.BlackBerry.com/T5/native-development-knowledge/ListItemComponent-types-when-us...
- https://developer.BlackBerry.com/native/reference/Cascades/bb__cascades__listview.html
Basically, I want to display a specific ListItemComponent for each JSON element in an array: one for photos, one for videos, etc.
QML
listItemComponents: [
    ListItemComponent {
        type: "photos"
        content: Container {
            horizontalAlignment: HorizontalAlignment.Fill
            background: Color.Green
            Label {
                text: ListItemData.text
                horizontalAlignment: HorizontalAlignment.Fill
            }
        }
    },
    ListItemComponent {
        type: "videos"
        content: Container {
            horizontalAlignment: HorizontalAlignment.Fill
            background: Color.Yellow
            Label {
                text: ListItemData.text
                horizontalAlignment: HorizontalAlignment.Fill
            }
        }
    },
    ListItemComponent {
        type: "status"
        content: Container {
            horizontalAlignment: HorizontalAlignment.Fill
            background: Color.Blue
            Label {
                text: ListItemData.text
                horizontalAlignment: HorizontalAlignment.Fill
            }
        }
    },
    ListItemComponent {
        type: "link"
        content: Container {
            horizontalAlignment: HorizontalAlignment.Fill
            background: Color.Red
            Label {
                text: ListItemData.text
                horizontalAlignment: HorizontalAlignment.Fill
            }
        }
    }
]
JSON
[
    { type: "photos", text: "this is a photo" },
    { type: "video", text: "this is a video" },
    { type: "status", text: "this is a status" },
    { type: "link", text: "this is a link" }
]
The data itself loads fine when I add it to a ListView dataModel (GroupDataModel or ArrayDataModel).
Can someone please help? Thank you very much in advance!
Redefine the itemType for the data model.
// Item type mapping
function itemType(data, indexPath) {
    if (indexPath.length == 1) {
        return 'header';
    } else {
        switch (Number(data.type)) {
            case 0: return 'this_item';
            case 1: return 'that_item';
            case 2: return 'another_item';
            case 3: return 'whos_item';
            case 4: return 'item_item';
            case 5: return 'yet_another_item';
            case 6: return 'last_item';
        }
    }
}
The dataModel is a GroupDataModel:
dataModel: statisticsModel
class ExampleModel : public bb::cascades::GroupDataModel
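For the question's JSON specifically, the type field already holds the string each ListItemComponent declares, so the mapping can be even simpler. (Note that the sample data says "video" where the component declares "videos"; those strings must match exactly.) A sketch:

```javascript
// For JSON whose elements already carry a string "type" field, itemType
// can return that field directly; each value must exactly match the
// "type" declared on one of the ListItemComponents.
function itemType(data, indexPath) {
  return data.type; // "photos", "videos", "status", "link"
}
```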
-
JSON data displays in Ripple but not on the BlackBerry simulator or device
Hello
I use Ripple to develop an application that makes jQuery/Ajax calls to a web service that returns JSON. The returned data shows in Ripple, but not on the emulator or the device. I use the Torch 9800 simulator, PHP to create the JSON, and jQuery.
Problem has been resolved. I made a mistake (a real rookie mistake) earlier in the code: I left out 'http://www.mydomain.com' when I made the Ajax call to my PHP page. Everything works now. Thanks to all for trying to help with this.
-
Problems loading JSON data
Hello
I have followed Simon Widjaja's (EDGEDOCKS) YouTube lesson on loading external JSON data, but I am not able to even log the data to the console.
I get this error: "Javascript error in event handler! Event Type = element".
content.json is located in the folder. The data in it is very simple:
[
    {
        "title": "TITLE 1",
        "description": "DESCRIPTION 1"
    },
    {
        "title": "TITLE 2",
        "description": "DESCRIPTION 2"
    }
]
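The data itself is fine as long as it is quoted with double quotes throughout (JSON does not allow single-quoted keys or strings). A quick sanity check in plain JavaScript:

```javascript
// The content.json from the question; JSON requires double quotes on
// both keys and string values.
const raw = '[{"title": "TITLE 1", "description": "DESCRIPTION 1"},' +
            ' {"title": "TITLE 2", "description": "DESCRIPTION 2"}]';
const items = JSON.parse(raw); // throws if the file is not valid JSON
const titles = items.map(item => item.title);
console.log(titles); // → [ 'TITLE 1', 'TITLE 2' ]
```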
And here's the code in edgeActions.js:
(function ($, Edge, compId) {
    var Composition = Edge.Composition, Symbol = Edge.Symbol; // aliases for commonly used Edge classes

    // Edge symbol: "stage"
    (function (symbolName) {
        Symbol.bindElementAction(compId, symbolName, "document", "compositionReady", function (sym, e) {
            // load external json data
            $.ajax({
                type: "GET",
                cache: false,
                url: "content.json",
                dataType: "json",
                success: function (data) { console.log("data:", data); },
                error: function () { console.log("something went wrong"); }
            });
        });
        // Edge binding end
    })("stage");
    // End of Edge symbol: "stage"
})(window.jQuery || AdobeEdge.$, AdobeEdge, "EDGE-11125477");
I also tried $.getJSON, as mentioned in the YouTube video.
Please note: "something went wrong" is also logged.
I use the free trial version. Is this a limitation of the free trial version?
Well, same question as here: loading external data using ajax
$.ajax() or $.getJSON() cannot run if the jQuery file is missing.
You must add the jQuery file as shown below.
See: http://jquery.com/download/
Note: without loading the jQuery file, you can still use these functions: Adobe Edge Animate CC JavaScript API
-
DeserializeJSON - is there a limit on the size of the JSON data that can be converted?
I have some valid JSON data which is converted successfully by DeserializeJSON... until it gets to a certain size, or at least that seems to be the case. The breaking point seems to be somewhere in the neighborhood of 35,000 characters, about 35 KB. When the conversion fails, it fails with a "JSON parsing failure: unexpected end of JSON string" message. And even when the conversion fails, the JSON data is deemed valid by tools like this one: http://www.freeformatter.com/json-validator.html.
So, is there a limit on the size of the JSON data that can be converted by DeserializeJSON?
Thank you!
Thank you, Carl.
The JSON arrives in its entirety, confirmed by Fiddler. And it is in fact being saved successfully to a SQL Server nvarchar field too; I can validate that saved JSON.
I'm actually grabbing the JSON to convert directly from SQL Server, and your comments and ideas put me on the path to a resolution.
It turns out that the JSON was indeed truncated before reaching the DeserializeJSON call, but it was the cfquery fetch that was truncating it. The fix was to enable "Long Text Retrieval (CLOB)" for this data source in CF Admin. I never would have found, or even known about, that setting.
Thanks again for your comments!
-
Hello
Oracle 10.2.0.4
Red Hat Linux
We have a report that is not scaling as the company increases in size, and I am looking for advice on tuning or better approaches to the problem.
The report needs to find employees who, while working in a dangerous environment (high/low temperature, wet conditions, etc.), did not have the relevant qualifications.
The report creates a temporary table containing the EMP_ID, the start date in a dangerous environment, and the end date in that environment. There are also about 200 bytes of other fields.
It then looks for any day during this period where there is no relevant qualification/certification in force. Certifications are stored in a simple table containing the EMP_ID, the qualification ID, a start date and an optional end date. If the end date is null, the qualification is assumed to be in place forever.
CREATE TABLE EMP_DANGER AS
SELECT A.EMP_ID
     , TO_DATE ('01-JAN-2005', 'dd-mon-yyyy') AS START_DT
     , TO_DATE ('01-JAN-2016', 'dd-mon-yyyy') AS END_DT
     , 'FILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLERFILLER' AS FILLER
FROM  (SELECT TO_CHAR (ROWNUM) AS EMP_ID FROM dual CONNECT BY LEVEL <= 100000) A
/
CREATE TABLE EMP_QUAL AS
SELECT A.EMP_ID
     , B.QUAL_ID
     , TO_DATE ('01-JAN-2005', 'dd-mon-yyyy') AS START_DT
     , TO_DATE ('01-JAN-2016', 'dd-mon-yyyy') AS END_DT
FROM  (SELECT TO_CHAR (ROWNUM) AS EMP_ID FROM dual CONNECT BY LEVEL <= 100000) A
    , (SELECT DECODE (ROWNUM, 1, 'A', 2, 'B', 3, 'C', 4, 'D', 5, 'E') AS QUAL_ID FROM dual CONNECT BY LEVEL <= 5) B
/
DELETE FROM EMP_QUAL WHERE EMP_ID = '3' AND QUAL_ID = 'B'
/
UPDATE EMP_QUAL SET END_DT = DATE '2013-01-01' WHERE EMP_ID = '3' AND QUAL_ID = 'A'
/
UPDATE EMP_QUAL SET START_DT = DATE '2013-01-03' WHERE EMP_ID = '3' AND QUAL_ID = 'C'
/
COMMIT
/
-- Assuming for the moment that qualifications A, B or C are required when working in a dangerous environment.
-- You can see that EMP_ID 3 does not have a valid qualification for January 2, 2013.
-- For the moment, we have some SQL that performs:
SELECT A.EMP_ID, B.DATE_SCAN, 'No valid qualification'
FROM   EMP_DANGER A
     , (SELECT TO_DATE ('01-JAN-1980', 'dd-mon-yyyy') + ROWNUM AS DATE_SCAN FROM dual CONNECT BY LEVEL <= 10000) B
WHERE  B.DATE_SCAN BETWEEN A.START_DT AND A.END_DT
AND    NOT EXISTS (SELECT 1 FROM EMP_QUAL C
                   WHERE  C.EMP_ID = A.EMP_ID
                   AND    C.QUAL_ID IN ('A', 'B', 'C')
                   AND    B.DATE_SCAN >= C.START_DT
                   AND    (B.DATE_SCAN <= C.END_DT OR C.END_DT IS NULL))
-- BUT now that we have a company where people have been working since the 1980s, the Cartesian join that generates DATE_SCAN
-- really starts to make the system grind slowly through the scans. Any suggestions?
Hello
Depending on your actual data, this could be much faster:
WITH coverage AS
(
    SELECT emp_id
         , DATE '1979-12-31'  AS start_dt
         , start_dt - 1       AS end_dt
    FROM   emp_danger
    UNION ALL
    SELECT emp_id
         , end_dt + 1         AS start_dt
         , DATE '9999-12-30'  AS end_dt
    FROM   emp_danger
    UNION ALL
    SELECT emp_id
         , start_dt
         , NVL (end_dt, DATE '9999-12-30') AS end_dt
    FROM   emp_qual
    WHERE  qual_id IN ('A', 'B', 'C')
)
, got_gap_length AS
(
    SELECT emp_id
         , start_dt - 1 AS gap_end_dt
         , start_dt - ( MAX (end_dt) OVER ( PARTITION BY emp_id
                                            ORDER BY start_dt
                                            RANGE BETWEEN UNBOUNDED PRECEDING
                                                      AND 0.5 PRECEDING
                                          )
                        + 1
                      ) AS gap_length
    FROM   coverage
)
SELECT d.emp_id    -- or whatever columns you want
     , g.gap_end_dt - (g.gap_length - 1) AS gap_start_dt
     , g.gap_end_dt                      AS gap_end_dt
FROM   got_gap_length g
JOIN   emp_danger d  ON  d.emp_id   =  g.emp_id
                    AND  d.start_dt <= g.gap_end_dt
                    AND  d.end_dt   >= g.gap_end_dt - g.gap_length
WHERE  gap_length > 0
ORDER BY emp_id
       , gap_end_dt
;
This produces one row of output for each group of consecutive non-covered dangerous days, for each emp_id. With your sample data:

EMP_ID     GAP_START_DT GAP_END_DT
---------- ------------ ------------
3          02-JAN-2013  02-JAN-2013

If you want one row of output for every single non-covered day, the query above can easily be adapted.
Solomon and I seem to disagree about your requirements.
-
Determining the size of the data storage array flexibly
I am trying to use the CVI functions below to load data from a .csv file into an array that I intend to use later:

filereturnvalue = FileSelectPopup ("c:\\Users\\Desktop\\FileDirectory", "*.csv", "", "Select the data file", VAL_LOAD_BUTTON, 0, 0, 1, 0, LoadFilePath);
FileToArray (DATAArray, VAL_CHAR, LoadFilePath, MaxDataSize, 1, VAL_GROUPS_TOGETHER, VAL_GROUPS_AS_COLUMNS, VAL_ASCII);

For a "predetermined" data array it is quite straightforward, because the size of the array is already set. However, I wonder if there is any way I could get CVI to import the data directly and determine the size of the array on the fly? That way the user need not know the length of the data, and it could be determined while importing the file.
Hello
For ASCII files, you can do so by first reading through the file and counting the newline characters, something like:
int file_handle;
int index;
int size = 0;
char line_buffer[80];

if ((file_handle = OpenFile (...)) > 0)
{
    while (ReadLine (...) > 0)
    {
        size++;
    }
}
CloseFile (file_handle);

This way you can determine the number of rows you have, but you must have an idea of how long a line may be (one number per line, or 32000), because you must specify the size of your line_buffer.
In the next step you would scan one line for occurrences of your column separator (say a ';') to determine the number of columns (separators found + 1).
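The two passes (count the lines, then split one line on the separator) look like this in JavaScript; the same logic maps directly onto the ReadLine loop above:

```javascript
// First pass: number of rows = number of non-empty lines.
// Second pass: number of columns = separators in one line + 1
// (which is what split() gives us directly).
function csvDimensions(text, separator = ";") {
  const lines = text.split("\n").filter(line => line.length > 0);
  const rows = lines.length;
  const cols = rows > 0 ? lines[0].split(separator).length : 0;
  return { rows, cols };
}
```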
-
Data analysis for max peak after sampling offsets
In the attached VI, each channel measures an acceleration spike, which appears as a global maximum on the chart. Ideally, I want both arrays to be searched for their peak (global max) and the time difference between these two peaks to be shown on screen after the program has ended. However, what I have right now is a sort of real-time measurement. Any suggestions on how to go about doing this?
Thank you
John
Hey Johnny,
Every time you run through the loop, you recalculate the value of the difference. When the loop runs again, it clears the value you had before. There are tons of ways to solve the problem. A lot depends on whether you need the data analysis while you're in the loop. If you want to check it after it's done running, you can just wire the difference to the edge of the while loop and turn on indexing by right-clicking on the tunnel. This will save the value of the difference for each iteration of the while loop. You could also save it conditionally (maybe you only care about the difference when it is between certain values) by writing to a text file with Write to Text File, or by building an array inside a case structure. These are just some thoughts.
-
How to add the JSON.jar file in bb OS 4.5?
Hi all
I have a problem parsing a JSON file on BlackBerry OS 4.5 and 5.0. I added the JSON.jar file to my project, but when I try to run the project it throws "Exception: java.lang.NoClassDefFoundError". I think the jar file is not being picked up correctly. How do I solve this problem?
Thank you
Yes, add org folder to the source of your application.
-
The use of SQLite to store the JSON object
Hello
Can someone give guidance on how to store a JSON object returned by the Adobe-provided JSON parser, which returns a native object,
as in:
resultobject = JSON.decode(data);
where data is a JSON string.
The resultobject is a complex object, but I would like to persist it as-is in the SQLite database.
I tried BLOB for the column type, but it doesn't seem to work for storing this object. Ideally, I would prefer to do it this way to avoid redesigning the main code, as I am currently using the SharedObject method.
Any ideas?
Can you just store the data string itself, and decode it again on read?
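A minimal sketch of that suggestion: persist the raw JSON text (a TEXT column is enough) and rebuild the object when reading it back. The resultobject below is a made-up example, not the asker's actual data:

```javascript
// A stand-in for the complex object returned by the JSON parser.
const resultobject = { user: { name: "Ada", scores: [1, 2, 3] } };

// Write path: serialize back to text before the INSERT.
const toStore = JSON.stringify(resultobject);

// Read path: decode again after the SELECT.
const restored = JSON.parse(toStore);
```

Since the object came from a JSON string in the first place, this round-trip is lossless for plain data (no functions or class instances).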
-
Parsing a JSON response on BlackBerry
Hi, I want to parse the response from the server, which is in JSON format.
I did some searching on Google but I cannot find any library or jar of that kind; everywhere, the code is supplied as an open-source zip file.
How can I achieve this? And if there is no jar available for BlackBerry, then how do I use a package that ships as open-source code in my application?
I used an org.json jar file, but it gives me the "module not found" exception.
Help, please...
Thanks in advance.
Download the org.json.me zip package and import it into your project...
You can use it like this:
JSONObject outer = new JSONObject(resp); // resp is the JSON response I get
JSONArray ja = outer.getJSONArray("DATA");
String element = ja.getString(0); // to access each element in the JSON array
If the response contains a JSON array, it will be of the form:
"DATA": ["abc", "def"] // DATA is an array, and "abc" and "def" are elements of the array
-
How to calculate daily and hourly averages for a very large data set?
SOME_TIMESTAMP                           PARAMETER1    PARAMETER2
01-JAN-2015 02:00:00.000000 AM -07:00    2341.4534     676341.4534
01-JAN-2015 02:00:01.000000 AM -07:00    2341.4533     676341.3
01-JAN-2015 03:04:01.000000 PM -07:00    5332.3533     676341.53
01-JAN-2015 03:00:01.000046 PM -07:00    23.34         36434.4345
02-JAN-2015 05:06:01.000236 AM -07:00    352.33        43543.4353
02-JAN-2015 09:00:01.000026 AM -07:00    234.45        3453.54
03-FEB-2015 10:00:01.000026 PM -07:00    3423.353      4634.45
04-FEB-2015 11:08:01.000026 AM -07:00    324.35325     34534.53

We have data as above in a table, and we want to calculate daily and hourly averages over a very large data set (almost 100 million rows). Is it possible to use SQL analytic functions for better performance, instead of the ordinary AVG and GROUP BY? Or is there any other better-performing way?
I don't know if it performs better, but instead of using to_char you could use trunc. Try something like this:
select trunc(some_timestamp,'DD'), avg(parameter1) from bigtable where some_timestamp between systimestamp-180 and systimestamp group by trunc(some_timestamp,'DD');
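For hourly buckets the same pattern applies with trunc(some_timestamp, 'HH24') in both the select list and the GROUP BY. As an illustration of the grouping logic only (not a replacement for doing the aggregation in SQL), a JavaScript sketch:

```javascript
// Group timestamps to the hour and average one parameter, mirroring
// trunc(some_timestamp, 'HH24') + AVG(...) + GROUP BY from the SQL above.
function hourlyAverages(rows) {
  const groups = new Map();
  for (const { ts, value } of rows) {
    const d = new Date(ts);
    // Truncate to the hour (in UTC, to keep the bucketing unambiguous).
    const key = Date.UTC(d.getUTCFullYear(), d.getUTCMonth(),
                         d.getUTCDate(), d.getUTCHours());
    const g = groups.get(key) || { sum: 0, n: 0 };
    g.sum += value;
    g.n += 1;
    groups.set(key, g);
  }
  return [...groups.entries()].map(([key, g]) => ({
    hour: new Date(key).toISOString(),
    avg: g.sum / g.n
  }));
}
```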
Hope that helps,
dhalek