


Reading large table using cx_Oracle

User619492-OCJul 9 2020 — edited Jul 11 2020

Hi Experts,

I have a requirement to export a large table (100M rows) to CSV. I am trying to do this using cx_Oracle, which is working fine (tested with 10% of the data). I have a few questions about how it consumes server resources when using SessionPool.

My code:

import csv
import cx_Oracle
import db_config

pool = cx_Oracle.SessionPool(db_config.user, db_config.pw, db_config.dsn,
                             min=5, max=6, increment=1)

with open("file.csv", "w", newline="") as file:
    output = csv.writer(file, delimiter=",", lineterminator="\n")
    with pool.acquire() as con:
        cur = con.cursor()
        cur.arraysize = 2000
        cur.execute('select * from 100MRowsTable')
        while True:
            rows = cur.fetchmany()  # up to arraysize rows per call
            if not rows:
                break
            output.writerows(rows)

My questions are below:

1. For each batch of 2000 records (arraysize), does it establish a new connection through the pool?

2. Does it establish 5 connections initially and read in parallel, with each connection fetching 2000 records at a time? Does it then reuse the same connections to fetch the next set of records?

3. If I want to improve performance, can I use threaded=True and pass a chunk of 100MRowsTable to each thread to speed up the extraction? If yes, does each thread need to create a separate SessionPool, or should they all use the same pool?
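To make question 3 concrete, here is a sketch of what I have in mind, not a working solution: one shared SessionPool created with threaded=True, and each thread acquiring its own session from that pool to export one ID range into its own part file. The `id` column, the 4-way split, and the `part_N.csv` names are assumptions for illustration only.

```python
import csv
import threading


def chunk_ranges(total, parts):
    """Split [0, total) into `parts` contiguous half-open [lo, hi) ranges."""
    step = -(-total // parts)  # ceiling division
    return [(i * step, min((i + 1) * step, total)) for i in range(parts)]


def export_chunk(pool, lo, hi, path):
    """Write rows whose (assumed) id column falls in [lo, hi) to path."""
    with pool.acquire() as con:  # borrows a session; returned on exit
        cur = con.cursor()
        cur.arraysize = 2000
        cur.execute(
            "select * from 100MRowsTable where id >= :lo and id < :hi",
            lo=lo, hi=hi)
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            while True:
                rows = cur.fetchmany()  # up to arraysize rows per round trip
                if not rows:
                    break
                writer.writerows(rows)


def run_export():
    # Imported here so the helpers above stay importable without an
    # Oracle client installed.
    import cx_Oracle
    import db_config  # same config module as in my code above

    # threaded=True so sessions can be used from multiple threads.
    pool = cx_Oracle.SessionPool(db_config.user, db_config.pw, db_config.dsn,
                                 min=5, max=6, increment=1, threaded=True)
    threads = [threading.Thread(target=export_chunk,
                                args=(pool, lo, hi, "part_%d.csv" % i))
               for i, (lo, hi) in enumerate(chunk_ranges(100_000_000, 4))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Calling run_export() would launch four threads that share the one pool, each writing its own part file; the parts could be concatenated afterwards.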

Appreciate your help!! Thanks!!!

This post has been answered by Anthony Tuininga-Oracle on Jul 9 2020