1 Reply Latest reply on Jun 3, 2014 5:33 AM by Jjun.Tan

    Limitation of Oracle with respect to Big Data



      We are working on an Enterprise Data Warehousing solution where we need to capture web visitor event data and store it in an Oracle 11g database. The estimated volume is 500 MB to 1 GB per day.

      Here I want to know what limitations Oracle Database has in handling such a data volume, and whether there is any need to use Big Data strategies for it.

      I would also be interested in understanding what solutions Oracle provides for handling such volumes and how these differ from Big Data strategies.


      Any advice on this is highly appreciated.




        • 1. Re: Limitation of Oracle with respect to Big Data



          500 MB to 1 GB of data per day is definitely fine for Oracle Database, but do note that the whole architecture comes into play here: can your network infrastructure, storage capacity and server technology cope with the increased volume?


          Big Data is often associated with unstructured data, which companies aim to capture and store. So, does the visitor event data vary a lot, or does it follow a fixed set of columns/rows as in an RDBMS? If the latter, an Oracle RDBMS will do. Hence, we need to look at what kind of data the visitor events provide; good examples where you would need NoSQL to acquire big data are collecting Facebook posts, Twitter updates or internet logs.
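          If the event data does follow a fixed schema, one common way to keep daily volumes of this size manageable in Oracle 11g is interval partitioning, so each day's load lands in its own partition. A minimal sketch (table and column names are purely illustrative, not from the original post):

```sql
-- Hypothetical example: daily web-visitor events in an
-- interval-partitioned table (an Oracle 11g feature).
-- Oracle creates a new daily partition automatically as data arrives.
CREATE TABLE visitor_events (
    event_id    NUMBER         NOT NULL,
    event_time  TIMESTAMP      NOT NULL,
    visitor_id  VARCHAR2(64),
    page_url    VARCHAR2(2000),
    event_type  VARCHAR2(30)
)
PARTITION BY RANGE (event_time)
INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
(
    -- one explicit initial partition is required;
    -- subsequent daily partitions are created on demand
    PARTITION p_initial VALUES LESS THAN (TIMESTAMP '2014-01-01 00:00:00')
);
```

          This keeps queries over recent days fast via partition pruning, and lets old partitions be compressed or dropped cheaply as the warehouse grows.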


          Warmest Regards,


          1 person found this helpful