Ability to Manage User Access to Data Flows

Organization Name
Motability Operations Ltd
Description
Currently, all users with the DV Authors application role can access Data Flows and create new ones. We need an option to limit user access to the Data Flow pages. One possible solution is a new user privilege specific to Data Flows.
Use Case and Business Need
The situation is that some users are authors and should be able to create, for example, data sets, but we don't want the same group to also create or add Data Flows.
By restricting these users' privileges, we keep the server healthier, with less unnecessary load on resources such as memory and disk.
Original Idea Number: 2f475c6f08
Comments
- Would you envision this role being able to run a data flow that someone else created, or having no access at all?
- In an ideal scenario, we would benefit from access levels of Full Control, Read-Write, and Read-Only on Data Flows.
  Thanks,
  Farnaz
- It looks like you want to control which features end users can use. Is there a specific goal? Data flows are useful even for uploaded content such as an Excel spreadsheet. It would be good to understand your goal to inform our approach.
- As mentioned in the main post, restricting user access keeps the server healthier, with less unnecessary load on resources such as memory and disk.
  The situation is that we have connections to our Snowflake data lake defined in OAS DV. Business users are then given access to create data sets and projects as part of their discovery, as they please (note that server admins currently have no option to enforce filters or row limits).
  We have noticed that it is actually not very difficult to create a heavy load on the OAS server by selecting too many rows and/or running different scripts on them. This can crash the server, cause out-of-memory errors, and trigger other resource issues that stop the BI services from working.
  On the other hand, we have business users with different levels of SQL knowledge, or still learning DV features, whom we ideally don't want testing, for example, an ML algorithm within a Data Flow on a 400M-row cached data set ...
  Hope this answers your question.