It's because of the huge temporary files it generates, depending on the amount of data you are trying to pull into the reports.
How can I avoid this problem? Is it possible? By changing the settings files, or some other method? Thanks.
Hi, check the link below; it should help you.
This basically means that the write operation into the cache file fails.
First of all, check whether the [Cache] section in your NQSConfig.ini is configured properly (path, size, etc.).
Then check whether the user account running the BI Server service has the rights to write into that folder structure.
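To test the write permissions concretely, you can try creating a file in the cache directory while running as the service account. A minimal sketch; the path below is an assumption, so substitute whatever your DATA_STORAGE_PATHS actually points to:

```shell
# Check whether a given cache directory is writable by the current user.
# Run this as the account that runs the BI Server service (e.g. via sudo -u).
check_cache_writable() {
    dir="$1"
    if [ -d "$dir" ] && touch "$dir/.nqs_write_test" 2>/dev/null; then
        rm -f "$dir/.nqs_write_test"
        echo "writable"
    else
        echo "NOT writable"
    fi
}

# Hypothetical path -- substitute your DATA_STORAGE_PATHS value from NQSConfig.ini.
check_cache_writable "/u01/app/obiee/instances/instance1/tmp/cache"
```

If this reports "NOT writable", fix the ownership or permissions on the directory before looking anywhere else.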
If you have doubts about the data quantity, get the actual queries fired against the source database from the log files to see what the resulting data set looks like.
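On Linux, for example, the physical SQL can usually be grepped out of the query log. A sketch only: the `nqquery.log` file name and the "Sending query to database" marker are assumptions that vary by OBIEE version, and physical SQL is only written at query log level 2 or higher:

```shell
# Print each physical-SQL marker plus the lines that follow it,
# which normally contain the generated SQL statement.
NQ_LOG="${NQ_LOG:-nqquery.log}"

if [ -f "$NQ_LOG" ]; then
    grep -A 20 "Sending query to database" "$NQ_LOG"
else
    echo "log not found: $NQ_LOG"
fi
```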
Also, try running a test with caching turned off.
Please mark as helpful/correct if this answers your question.
I have turned off the cache in NQSConfig.ini, so the cache should not exist on the local machine. Why does this still happen?
[CACHE]
ENABLE = NO; # This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control
# A comma separated list of <directory maxSize> pair(s).
# These are relative to the process instance directory.
# e.g. DATA_STORAGE_PATHS = "nQSCache" 500 MB;
# resolves to
DATA_STORAGE_PATHS = "cache" 500 MB;
MAX_ROWS_PER_CACHE_ENTRY = 100000; # 0 is unlimited size
MAX_CACHE_ENTRY_SIZE = 100 MB; # This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control
MAX_CACHE_ENTRIES = 10000; # This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control
POPULATE_AGGREGATE_ROLLUP_HITS = NO;
USE_ADVANCED_HIT_DETECTION = NO;
MAX_SUBEXPR_SEARCH_DEPTH = 7;
DISABLE_SUBREQUEST_CACHING = NO;
# Cluster-aware cache.
# Note that since this is a network share, the directory should not be
GLOBAL_CACHE_STORAGE_PATH = "" 0 MB; # This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control
MAX_GLOBAL_CACHE_ENTRIES = 1000;
CACHE_POLL_SECONDS = 300;
CLUSTER_AWARE_CACHE_LOGGING = NO;
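As a quick sanity check on a fragment like the one above, the simple `KEY = VALUE;` lines can be parsed with a short script. This is a sketch only: NQSConfig.ini is not standard INI syntax, so this ignores section headers and multi-line constructs:

```python
import re

def parse_nqs_settings(text):
    """Parse simple KEY = VALUE; lines from an NQSConfig.ini fragment,
    ignoring comments. Not a full parser -- just enough to inspect
    flat settings like the [CACHE] section shown above."""
    settings = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()       # drop comments
        m = re.match(r'(\w+)\s*=\s*(.+?);', line)
        if m:
            settings[m.group(1)] = m.group(2).strip()
    return settings

fragment = '''
ENABLE = NO;
DATA_STORAGE_PATHS = "cache" 500 MB;
MAX_ROWS_PER_CACHE_ENTRY = 100000; # 0 is unlimited size
'''
cfg = parse_nqs_settings(fragment)
print(cfg['ENABLE'])               # NO
print(cfg['DATA_STORAGE_PATHS'])   # "cache" 500 MB
```

This makes it easy to confirm that ENABLE really is NO in the running configuration before investigating other sources of temporary files.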