What is the best way to read a large dataset from an Oracle DB into a pandas DataFrame?

User_CK8A6 Member Posts: 5 Green Ribbon

In the production DB, we have a few transaction (txn) tables with more than 400 million records. I have to read the data and process it to prepare an analytics report. I have tried pandas.read_sql and pandas.read_sql_query with a chunk size, but this is very slow. Please suggest if there are different ways to do this optimally.
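For reference, this is roughly the chunked approach I am using now (the connection string, credentials, and table name below are placeholders, not the real production values):

import pandas as pd
import sqlalchemy

# Placeholder connection string -- adjust driver, host, port, service name, and credentials.
# (The oracle+oracledb dialect needs SQLAlchemy 2.0+; older versions use oracle+cx_oracle.)
engine = sqlalchemy.create_engine(
    "oracle+oracledb://scott:tiger@dbhost:1521/?service_name=ORCLPDB1"
)

# txn_table is a placeholder for one of the large transaction tables.
query = "SELECT * FROM txn_table"

processed_chunks = []
# chunksize makes read_sql_query return an iterator of DataFrames
# instead of loading all 400M+ rows into memory at once.
for chunk in pd.read_sql_query(query, engine, chunksize=100_000):
    # per-chunk processing for the analytics report goes here
    processed_chunks.append(chunk)

df = pd.concat(processed_chunks, ignore_index=True)

Even with the chunked iteration shown above, the overall read is very slow, which is why I am looking for a better approach.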


Thanks,

Manoj Kumar

Answers

This discussion has been closed.