
python - How can I optimize the speed of pandas' read_sql?

I need to query and correlate data across multiple databases, so I chose pandas: read the data with read_sql, process it in a DataFrame, and generate the target data directly. But I've hit a problem: read_sql is very slow. For example, it takes four and a half minutes to read a table of about 370,000 rows (22 fields) from an Oracle database into a DataFrame. The code is as follows:

import pandas as pd
import sqlalchemy as sql

ora_engine = sql.create_engine('oracle://test01:test01@test01db')
ora_df1 = pd.read_sql('select * from target_table1', ora_engine)

It took 4 minutes and 32 seconds

Even a simple and crude alternative is much faster than read_sql. The code is as follows:

import pandas as pd
import sqlalchemy as sql

ora_engine = sql.create_engine('oracle://test01:test01@test01db')
conn = ora_engine.raw_connection()
cursor = conn.cursor()
cursor.execute('select * from target_table1')
# Column names are the first element of each entry in cursor.description
columns = [col[0] for col in cursor.description]
df_data = cursor.fetchall()
ora_df1 = pd.DataFrame(df_data, columns=columns)

It took 1 minute and 31 seconds
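
For what it's worth, the raw-connection fetch can often be pushed further by raising the cursor's arraysize, which controls how many rows the Oracle driver fetches per network round trip. A minimal sketch, assuming the cx_Oracle driver and a batch size of 5000 picked purely for illustration:

import pandas as pd
import sqlalchemy as sql

ora_engine = sql.create_engine('oracle://test01:test01@test01db')
conn = ora_engine.raw_connection()
cursor = conn.cursor()
# Fetch more rows per round trip; the cx_Oracle default is much smaller,
# so a larger batch can noticeably cut fetch time on big result sets.
cursor.arraysize = 5000  # assumed value, tune for your network and row width
cursor.execute('select * from target_table1')
columns = [col[0] for col in cursor.description]
ora_df1 = pd.DataFrame(cursor.fetchall(), columns=columns)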

I would like to ask everyone here if there is any way to optimize and improve the speed of read_sql in pandas. Thank you very much~

淡淡烟草味 · asked 2710 days ago

Replies (1)

  • 世界只因有你 · 2017-06-28 09:24:30

    Try read_sql_table
    
    http://pandas.pydata.org/pandas-docs/stable/generated/pandas.read_sql_table.html#pandas.read_sql_table
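
    A minimal sketch of that suggestion, reusing the engine from the question (read_sql_table takes the table name directly and lets SQLAlchemy reflect the column types instead of parsing a textual query):

    import pandas as pd
    import sqlalchemy as sql

    ora_engine = sql.create_engine('oracle://test01:test01@test01db')
    # Read the whole table by name; column types come from SQLAlchemy's
    # table reflection rather than from inspecting the query results.
    ora_df1 = pd.read_sql_table('target_table1', ora_engine)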
