Pandas DataFrame to SQLite


Are you working with pandas in Python and need to transfer the data over to a SQLite database? What if you eventually need to move the data into a SQL Server database? Could you move it into a SQLite database first and then generate a SQL script to execute in SSMS (SQL Server Management Studio)?

The pandas documentation covers this under the DataFrame.to_sql method.

I created a project in Jupyter Notebook called PandasDataFrametoSQLite. Feel free to copy the code here for your own test project on a non-production server. You should know the following before beginning: the name of the SQLite database file, the location of the database file, the name of the table, and the data itself. The data needs to already be in a pandas DataFrame.

import pandas as pd
import sqlite3 as sq

# sample data: four people and an amount for each
data = {'firstname': ['Bob', 'Sally', 'Suzie', 'Rowan'],
        'amount': [12, 67, 33, 44]}
df = pd.DataFrame(data)
df

The code below creates a new SQLite database file in that directory (if one does not already exist) and replaces the table if it already exists.

table_name = "friends"
file_name = "peopleDB"
conn = sq.connect(r'D:\Test\{}.db'.format(file_name)) # creates the file if it does not exist (raw string so the backslashes are not treated as escapes)
df.to_sql(table_name, conn, if_exists='replace', index=False) # writes the DataFrame to the table
conn.close() # close the connection

Below is a screenshot of the results. I’m using DB Browser.
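If you want to confirm the write without leaving the notebook, you can read the table back with pd.read_sql_query. A minimal sketch, using an in-memory database in place of the file on D:\Test so it runs anywhere:

```python
import pandas as pd
import sqlite3 as sq

df = pd.DataFrame({'firstname': ['Bob', 'Sally', 'Suzie', 'Rowan'],
                   'amount': [12, 67, 33, 44]})

# an in-memory database stands in for the D:\Test\peopleDB.db file
conn = sq.connect(':memory:')
df.to_sql('friends', conn, if_exists='replace', index=False)

# read the table back to verify the write
df_check = pd.read_sql_query('SELECT * FROM friends', conn)
print(df_check)
conn.close()
```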

What code would you write to add another table to the same database?
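One answer to that question: call to_sql again on the same connection with a different table name. A sketch, again using an in-memory database in place of the peopleDB.db file, with a hypothetical second DataFrame called cities:

```python
import pandas as pd
import sqlite3 as sq

friends = pd.DataFrame({'firstname': ['Bob', 'Sally', 'Suzie', 'Rowan'],
                        'amount': [12, 67, 33, 44]})
# a made-up second DataFrame for the new table
cities = pd.DataFrame({'city': ['Springfield', 'Riverdale'],
                       'zipcode': ['11111', '22222']})

conn = sq.connect(':memory:')  # stands in for the peopleDB.db file
friends.to_sql('friends', conn, if_exists='replace', index=False)
cities.to_sql('cities', conn, if_exists='replace', index=False)  # second table, same database

# list the tables now in the database
tables = pd.read_sql_query(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name", conn)
print(tables)
conn.close()
```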

if_exists has three options: {'fail', 'replace', 'append'}, and the default is 'fail'.
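Returning to the SQL Server question at the top: sqlite3 connections have an iterdump() method that yields the SQL statements needed to recreate the database, which you could save to a .sql file as a starting point for SSMS. Keep in mind that SQLite's dialect is not T-SQL, so expect to edit the script before running it on SQL Server. A sketch, using an in-memory database in place of the peopleDB.db file:

```python
import pandas as pd
import sqlite3 as sq

df = pd.DataFrame({'firstname': ['Bob', 'Sally'],
                   'amount': [12, 67]})

conn = sq.connect(':memory:')  # stands in for the peopleDB.db file
df.to_sql('friends', conn, if_exists='replace', index=False)

# iterdump() yields the CREATE TABLE and INSERT statements for the whole database
script = '\n'.join(conn.iterdump())
print(script)
conn.close()
```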
