Option for Data Viz data sets to be in-memory columnar storage
Organization Name
HSBC Plc
Description
Hi, we received feedback from our users that the responsiveness/performance of Data Viz was noticeably slower than their Qlik dashboards. We therefore asked for a demonstration of those Qlik dashboards, which were built on an underlying data set of 3.5M rows with about 40-50 metrics/attributes.
The performance was lightning fast in the web browser. When a user drills or selects a filter, the refresh is so fast it is easy to miss; even complex filters and time-series functions responded instantly, literally in the blink of an eye.
They pointed out that the performance is mainly due to the way Qlik stores its data set as an in-memory columnar data store.
So our idea is for Oracle Data Viz to provide the option of delivering its data sets as an in-memory columnar data store, to optimise performance over the existing row-based data sets.
Thanks!
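For readers less familiar with the row-based vs. columnar distinction the idea refers to, here is a minimal, hypothetical sketch in plain Python. It is not Oracle's or Qlik's actual engine, and the column names (region, amount) are made up purely for illustration; it only shows why a columnar in-memory layout tends to speed up the filter-and-aggregate pattern behind dashboard drills and filters, since only the columns a query touches are scanned.

```python
# Illustrative sketch only (assumed/hypothetical data, not any vendor's engine):
# compares filter+aggregate over a row-oriented store vs. a columnar store.

import random
import time

N = 500_000  # rows; the idea above cites a 3.5M-row, 40-50 column data set

# Row-oriented store: one dict per row, so every query touches whole rows.
rows = [
    {"region": random.choice("NESW"), "amount": random.random(), "pad": "x" * 20}
    for _ in range(N)
]

# Column-oriented store: one list per column, so a query scans only what it needs.
region_col = [r["region"] for r in rows]
amount_col = [r["amount"] for r in rows]

def total_row_store(region):
    # Must walk every row object and do per-row field lookups.
    return sum(r["amount"] for r in rows if r["region"] == region)

def total_column_store(region):
    # Walks only the two relevant columns, laid out contiguously.
    return sum(a for g, a in zip(region_col, amount_col) if g == region)

for fn in (total_row_store, total_column_store):
    t0 = time.perf_counter()
    fn("N")
    print(f"{fn.__name__}: {time.perf_counter() - t0:.3f}s")
```

Real columnar engines add compression, dictionary encoding and vectorised execution on top of this layout, which is where the "blink of an eye" response times come from.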
Use Case and Business Need
Additional capability to compete with other visualization tools, and improved user engagement through lightning-fast response times.
Original Idea Number: 51fdfba0ec
Comments
- This feature is really a must when you want to convince business users to go for Oracle Data Visualization instead of Tableau/Qlik. Data storage for "external" data sets in DV (on-premises) is the file system, which is pretty slow.
- Must have.
- Customers are always comparing tools on performance, and we are behind in many cases.
- What-If use cases involving re-adjustment of an existing pattern and analysis of a large dataset scored against that pattern would benefit a lot from in-memory operations. The analysis is interactive and involves a lot of back-and-forth user actions, such as checking and unchecking options for many attributes, with each action triggering a pattern/query transformation followed by a scoring action against the larger transaction dataset.
- Strongly agree with this. It would be nice to be able to define a data set built from a subject area (local subject area) with extreme performance. There are still a lot of OAC/OAS/OBIEE accounts that want performance and go to another tool. Giving them extreme performance in DV would help diminish their extraction behaviour and kill two birds with one stone (slow down treating OA as an ETL tool and reduce reliance on other visualization platforms).
- This should be a must-have. It is quite easy for end users to compare tools, and there is high pressure from them and from the business to have at least equivalent performance.
- This is a no-brainer. It's extremely difficult to position OAC/OAS DV competitively when it comes to front-end response times. Even with smaller Excel-based data sets, the performance does not feel snappy, and snappy is what the user community has already seen elsewhere and come to expect.
- We have a GL drilldown report with 100M+ rows and OAC struggles with it. It would be nice to see this in-memory.
- We want this feature.
- Definitely a must-have. Faster response time is always the key!