You need a database for your Python project, but setting up MySQL or PostgreSQL feels like overkill. SQLite is a lightweight, serverless database that comes built into Python — no installation, no configuration, no separate server process. Just import sqlite3 and start storing data in a single file.
Python’s built-in sqlite3 module provides a full-featured interface to SQLite databases. It supports SQL queries, transactions, parameterized statements, and works identically on every platform. Your database is just a file on disk that you can copy, backup, or share like any other file.
In this tutorial, you will learn how to create databases and tables, insert and query data, use parameterized queries to prevent SQL injection, handle transactions, work with context managers, and build a complete project. By the end, you will be able to add persistent data storage to any Python project in minutes.
SQLite in Python: Quick Example
Here is a complete working example that creates a database, inserts data, and queries it — all in under 15 lines.
```python
# quick_sqlite.py
import sqlite3

conn = sqlite3.connect("example.db")
cursor = conn.cursor()
cursor.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
cursor.execute("INSERT INTO users (name, email) VALUES (?, ?)", ("Alice", "alice@example.com"))
cursor.execute("INSERT INTO users (name, email) VALUES (?, ?)", ("Bob", "bob@example.com"))
conn.commit()
cursor.execute("SELECT * FROM users")
for row in cursor.fetchall():
    print(row)
conn.close()
```

Output:

```
(1, 'Alice', 'alice@example.com')
(2, 'Bob', 'bob@example.com')
```
The ? placeholders in the INSERT statement are parameterized queries — they prevent SQL injection by keeping data separate from SQL code. Never use f-strings or string concatenation to build SQL queries.
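To see why this matters, here is a small sketch (using a throwaway in-memory database) contrasting an unsafe f-string query with a parameterized one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("Alice",))

# UNSAFE: attacker-controlled input becomes part of the SQL itself
user_input = "' OR '1'='1"
unsafe_sql = f"SELECT * FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe_sql).fetchall())  # matches every row

# SAFE: the driver treats the value purely as data, never as SQL
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)  # [] because no user is literally named "' OR '1'='1"
conn.close()
```

The injected input turns the unsafe query's WHERE clause into a condition that is always true, returning the whole table; the parameterized version simply finds no match.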
What Is SQLite and When Should You Use It?
SQLite is a self-contained, file-based relational database engine. Unlike MySQL or PostgreSQL, it does not require a separate server process. The entire database lives in a single file on your filesystem.
| Feature | SQLite | PostgreSQL | MySQL |
|---|---|---|---|
| Server required | No | Yes | Yes |
| Setup complexity | Zero | Medium | Medium |
| Concurrent writers | Limited | Excellent | Good |
| Max database size | 281 TB | Unlimited | Unlimited |
| Best for | Apps, prototypes, scripts | Web apps, analytics | Web apps, CMS |
Use SQLite when you need persistent storage for a single-user application, prototype, CLI tool, data processing script, or any project where simplicity matters more than concurrent access. Switch to PostgreSQL when you need multiple simultaneous writers or a web application with many users.
Creating Databases and Tables
Connecting to a database file creates it automatically if it does not exist. Use CREATE TABLE IF NOT EXISTS to make your scripts idempotent.
```python
# create_tables.py
import sqlite3

conn = sqlite3.connect("bookstore.db")
cursor = conn.cursor()
cursor.execute("""
    CREATE TABLE IF NOT EXISTS books (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        title TEXT NOT NULL,
        author TEXT NOT NULL,
        price REAL DEFAULT 0.0,
        published_year INTEGER,
        in_stock INTEGER DEFAULT 1
    )
""")
cursor.execute("""
    CREATE TABLE IF NOT EXISTS categories (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        name TEXT UNIQUE NOT NULL
    )
""")
conn.commit()
print("Tables created successfully")

# Verify tables exist (excluding SQLite's internal tables such as
# sqlite_sequence, which AUTOINCREMENT creates automatically)
cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%'")
tables = cursor.fetchall()
print("Tables:", [t[0] for t in tables])
conn.close()
```

Output:

```
Tables created successfully
Tables: ['books', 'categories']
```
The AUTOINCREMENT keyword makes SQLite assign a unique incrementing ID to each new row. NOT NULL prevents inserting rows with missing required fields.
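A quick sketch of what NOT NULL enforcement looks like at runtime: inserting None for a required column raises sqlite3.IntegrityError.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (id INTEGER PRIMARY KEY AUTOINCREMENT, title TEXT NOT NULL)")

conn.execute("INSERT INTO books (title) VALUES (?)", ("Fluent Python",))
try:
    # title is declared NOT NULL, so inserting None is rejected
    conn.execute("INSERT INTO books (title) VALUES (?)", (None,))
except sqlite3.IntegrityError as e:
    print("Rejected:", e)  # NOT NULL constraint failed: books.title
conn.close()
```

Catching IntegrityError lets your application report a clear validation message instead of crashing.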
Inserting Data Safely
Always use parameterized queries (the ? placeholder pattern) when inserting data. This prevents SQL injection and handles special characters in your data automatically.
```python
# insert_data.py
import sqlite3

conn = sqlite3.connect("bookstore.db")
cursor = conn.cursor()

# Single insert with parameterized query
cursor.execute(
    "INSERT INTO books (title, author, price, published_year) VALUES (?, ?, ?, ?)",
    ("Python Crash Course", "Eric Matthes", 35.99, 2023)
)

# Bulk insert with executemany
books = [
    ("Fluent Python", "Luciano Ramalho", 55.99, 2022),
    ("Automate the Boring Stuff", "Al Sweigart", 29.99, 2019),
    ("Clean Code", "Robert C. Martin", 39.99, 2008),
]
cursor.executemany(
    "INSERT INTO books (title, author, price, published_year) VALUES (?, ?, ?, ?)",
    books
)
conn.commit()

# rowcount reflects only the executemany() call (3 rows); add 1 for the single insert
print(f"Inserted {cursor.rowcount + 1} books")
conn.close()
```

Output:

```
Inserted 4 books
```
The executemany() method is significantly faster than calling execute() in a loop because it prepares the statement once and avoids per-call Python overhead. Either way, all of the inserts belong to the same transaction until you call commit().
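If you want to measure the difference yourself, here is a rough timing sketch (the row count is arbitrary and the timings are illustrative, not a benchmark):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nums (n INTEGER)")
rows = [(i,) for i in range(100_000)]

# Bulk insert in one call
start = time.perf_counter()
conn.executemany("INSERT INTO nums (n) VALUES (?)", rows)
conn.commit()
print(f"executemany: {time.perf_counter() - start:.3f}s")

conn.execute("DELETE FROM nums")
conn.commit()

# Same inserts, one execute() call per row
start = time.perf_counter()
for row in rows:
    conn.execute("INSERT INTO nums (n) VALUES (?)", row)
conn.commit()
print(f"loop:        {time.perf_counter() - start:.3f}s")
conn.close()
```

On a typical machine the executemany() version finishes noticeably faster, and the gap widens as the row count grows.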
Querying Data
SQLite supports full SQL query syntax. Use fetchall() for all results, fetchone() for a single row, or iterate directly over the cursor.
```python
# query_data.py
import sqlite3

conn = sqlite3.connect("bookstore.db")
conn.row_factory = sqlite3.Row  # Enable column name access
cursor = conn.cursor()

# Select all books
cursor.execute("SELECT * FROM books ORDER BY price DESC")
print("All books by price:")
for row in cursor.fetchall():
    print(f"  {row['title']} by {row['author']} - ${row['price']}")

# Filtered query with parameter
cursor.execute("SELECT * FROM books WHERE price < ? ORDER BY title", (40.0,))
print("\nBooks under $40:")
for row in cursor.fetchall():
    print(f"  {row['title']} (${row['price']})")

# Aggregate query
cursor.execute("SELECT COUNT(*) as count, AVG(price) as avg_price FROM books")
stats = cursor.fetchone()
print(f"\nTotal: {stats['count']} books, Average price: ${stats['avg_price']:.2f}")
conn.close()
```

Output:

```
All books by price:
  Fluent Python by Luciano Ramalho - $55.99
  Clean Code by Robert C. Martin - $39.99
  Python Crash Course by Eric Matthes - $35.99
  Automate the Boring Stuff by Al Sweigart - $29.99

Books under $40:
  Automate the Boring Stuff ($29.99)
  Clean Code ($39.99)
  Python Crash Course ($35.99)

Total: 4 books, Average price: $40.49
```
Setting conn.row_factory = sqlite3.Row lets you access columns by name instead of index. This makes your code more readable and resilient to schema changes.
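A short sketch of what sqlite3.Row gives you beyond name access: index access still works, keys() lists the column names, and a row converts cleanly to a plain dict.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("CREATE TABLE books (title TEXT, price REAL)")
conn.execute("INSERT INTO books VALUES (?, ?)", ("Fluent Python", 55.99))

row = conn.execute("SELECT * FROM books").fetchone()
print(row["title"])  # access by column name
print(row[1])        # positional access still works
print(row.keys())    # ['title', 'price']
print(dict(row))     # {'title': 'Fluent Python', 'price': 55.99}
conn.close()
```

The dict conversion is handy when serializing query results to JSON.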
Using Context Managers
Using the connection as a context manager ensures your transaction is handled correctly even if an error occurs: it commits automatically on success and rolls back on failure. Note that it manages the transaction only, not the connection itself.
```python
# context_manager.py
import sqlite3

# The connection as context manager handles commits and rollbacks
with sqlite3.connect("bookstore.db") as conn:
    conn.row_factory = sqlite3.Row
    cursor = conn.cursor()
    cursor.execute(
        "INSERT INTO books (title, author, price, published_year) VALUES (?, ?, ?, ?)",
        ("Learning Python", "Mark Lutz", 59.99, 2013)
    )
    # Commit happens automatically when exiting the with block
    cursor.execute("SELECT COUNT(*) as count FROM books")
    print(f"Total books: {cursor.fetchone()['count']}")

# The with block commits the transaction but does NOT close the connection
conn.close()
```

Output:

```
Total books: 5
```
The with statement commits the transaction when the block exits normally and rolls it back if an exception occurs. Unlike a file object, however, the connection is not closed when the block exits; call close() yourself, or wrap the connection in contextlib.closing.
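One way to get closing handled automatically as well, sketched with contextlib.closing from the standard library (the CREATE TABLE here just makes the snippet self-contained):

```python
import sqlite3
from contextlib import closing

# closing() guarantees conn.close(); the inner "with conn" scopes the transaction
with closing(sqlite3.connect("bookstore.db")) as conn:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS books (title TEXT NOT NULL, author TEXT, price REAL)"
    )
    with conn:  # commits on success, rolls back if an exception escapes
        conn.execute(
            "INSERT INTO books (title, author, price) VALUES (?, ?, ?)",
            ("Effective Python", "Brett Slatkin", 44.99),
        )
# conn is now both committed and closed
```

Nesting the two context managers keeps transaction scope and connection lifetime explicit and separate.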
Real-Life Example: Building a Contact Book CLI
```python
# contact_book.py
import sqlite3

class ContactBook:
    def __init__(self, db_path="contacts.db"):
        self.conn = sqlite3.connect(db_path)
        self.conn.row_factory = sqlite3.Row
        self.conn.execute("""
            CREATE TABLE IF NOT EXISTS contacts (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                name TEXT NOT NULL,
                phone TEXT,
                email TEXT,
                created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
        self.conn.commit()

    def add(self, name, phone=None, email=None):
        self.conn.execute(
            "INSERT INTO contacts (name, phone, email) VALUES (?, ?, ?)",
            (name, phone, email)
        )
        self.conn.commit()
        print(f"Added: {name}")

    def search(self, query):
        cursor = self.conn.execute(
            "SELECT * FROM contacts WHERE name LIKE ? OR email LIKE ?",
            (f"%{query}%", f"%{query}%")
        )
        return cursor.fetchall()

    def list_all(self):
        cursor = self.conn.execute("SELECT * FROM contacts ORDER BY name")
        return cursor.fetchall()

    def delete(self, contact_id):
        self.conn.execute("DELETE FROM contacts WHERE id = ?", (contact_id,))
        self.conn.commit()

    def close(self):
        self.conn.close()

# Demo usage
book = ContactBook(":memory:")
book.add("Alice Smith", "555-0101", "alice@example.com")
book.add("Bob Jones", "555-0102", "bob@example.com")
book.add("Charlie Brown", "555-0103", "charlie@example.com")

print("\nAll contacts:")
for c in book.list_all():
    print(f"  {c['name']} - {c['phone']} - {c['email']}")

print("\nSearch 'ali':")
for c in book.search("ali"):
    print(f"  {c['name']} - {c['email']}")
book.close()
```

Output:

```
Added: Alice Smith
Added: Bob Jones
Added: Charlie Brown

All contacts:
  Alice Smith - 555-0101 - alice@example.com
  Bob Jones - 555-0102 - bob@example.com
  Charlie Brown - 555-0103 - charlie@example.com

Search 'ali':
  Alice Smith - alice@example.com
```
Using ":memory:" creates an in-memory database for testing. Replace it with a filename like "contacts.db" for persistent storage. The class encapsulates all database operations, making it easy to extend with update, export, or import features.
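As a sketch of one such extension, a hypothetical update helper could build its SET clause from only the fields the caller provided. Note that the column names come from fixed strings in the code, while all values stay parameterized, so this stays injection-safe.

```python
import sqlite3

def update_contact(conn, contact_id, phone=None, email=None):
    """Update only the provided fields (a hypothetical extension of ContactBook)."""
    fields, params = [], []
    if phone is not None:
        fields.append("phone = ?")
        params.append(phone)
    if email is not None:
        fields.append("email = ?")
        params.append(email)
    if not fields:
        return 0  # nothing to change
    params.append(contact_id)
    # Column names are hard-coded above; only values travel as parameters
    cursor = conn.execute(
        f"UPDATE contacts SET {', '.join(fields)} WHERE id = ?", params
    )
    conn.commit()
    return cursor.rowcount  # number of rows actually changed

# Demo against a throwaway in-memory table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, name TEXT, phone TEXT, email TEXT)")
conn.execute("INSERT INTO contacts (name, phone) VALUES (?, ?)", ("Alice", "555-0101"))
print(update_contact(conn, 1, phone="555-9999"))  # 1
conn.close()
```

The returned rowcount makes it easy to report "contact not found" when no row matched the id.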
Frequently Asked Questions
Is SQLite thread-safe in Python?
By default, a SQLite connection should only be used in the thread that created it. Set check_same_thread=False if you need to share a connection across threads, but be aware that concurrent writes can cause issues. For multi-threaded applications, give each thread its own connection.
How much data can SQLite handle?
SQLite can handle databases up to 281 terabytes. In practice, it works well for databases up to a few gigabytes. Performance depends more on query patterns than raw size; proper indexing makes a bigger difference than database engine choice for most workloads.
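To see indexing pay off, you can ask SQLite for its query plan before and after creating an index. A sketch (the plan's exact wording varies between SQLite versions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, author TEXT, price REAL)")
conn.executemany(
    "INSERT INTO books (author, price) VALUES (?, ?)",
    [(f"Author {i}", i * 1.5) for i in range(1000)],
)

query = "SELECT * FROM books WHERE author = ?"

# Without an index, SQLite scans every row
before = conn.execute("EXPLAIN QUERY PLAN " + query, ("Author 500",)).fetchone()
print(before[3])  # e.g. "SCAN books" (older versions say "SCAN TABLE books")

conn.execute("CREATE INDEX idx_books_author ON books (author)")

# With the index, SQLite searches it directly instead of scanning
after = conn.execute("EXPLAIN QUERY PLAN " + query, ("Author 500",)).fetchone()
print(after[3])  # e.g. "SEARCH books USING INDEX idx_books_author (author=?)"
conn.close()
```

The fourth column of each EXPLAIN QUERY PLAN row is the human-readable plan detail; seeing SEARCH with your index name confirms the index is being used.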
How do I back up a SQLite database?
The simplest backup is to copy the database file when no writes are happening. For live databases, use Python's conn.backup() method which creates a consistent snapshot even during active writes. Schedule backups with cron or the schedule library.
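A minimal sketch of Connection.backup() (the live.db and backup.db filenames are illustrative):

```python
import sqlite3

# A hypothetical live database with some data in it
src = sqlite3.connect("live.db")
src.execute("CREATE TABLE IF NOT EXISTS logs (msg TEXT)")
src.execute("INSERT INTO logs VALUES (?)", ("hello",))
src.commit()

# backup() copies a consistent snapshot into the target database,
# even while other connections are writing to the source
dest = sqlite3.connect("backup.db")
src.backup(dest)
dest.close()
src.close()
```

For large databases, backup() also accepts a pages argument to copy in chunks and a progress callback to report status.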
How do I change the schema of an existing table?
SQLite has limited ALTER TABLE support. You can add columns with ALTER TABLE books ADD COLUMN rating REAL, but you cannot remove or rename columns in older SQLite versions. For complex changes, create a new table, copy data over, drop the old table, and rename the new one.
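The new-table, copy, drop, rename recipe can be sketched like this, here dropping a hypothetical legacy_code column; wrapping it in a transaction keeps the table intact if any step fails:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, legacy_code TEXT)")
conn.execute("INSERT INTO books (title, legacy_code) VALUES (?, ?)", ("Fluent Python", "X1"))

# Goal: remove legacy_code. SQLite DDL is transactional, so the whole
# migration either succeeds or rolls back as a unit.
with conn:
    conn.execute("CREATE TABLE books_new (id INTEGER PRIMARY KEY, title TEXT)")
    conn.execute("INSERT INTO books_new (id, title) SELECT id, title FROM books")
    conn.execute("DROP TABLE books")
    conn.execute("ALTER TABLE books_new RENAME TO books")

print(conn.execute("SELECT * FROM books").fetchall())  # [(1, 'Fluent Python')]
conn.close()
```

On SQLite 3.35+ you could instead run ALTER TABLE books DROP COLUMN legacy_code directly, but the copy recipe works everywhere and handles changes DROP COLUMN cannot.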
Should I use SQLite directly or through an ORM?
For simple scripts and small projects, the built-in sqlite3 module is perfect. For larger projects with complex queries and relationships, consider SQLAlchemy which provides an ORM layer. Start with raw SQL to understand the fundamentals, then add an ORM if the project grows.
Conclusion
SQLite with Python gives you a full relational database with zero configuration. We covered creating databases and tables, safe data insertion with parameterized queries, querying with filtering and aggregation, context managers for clean resource handling, and building a complete contact book application.
For the full sqlite3 module documentation, visit docs.python.org/3/library/sqlite3. For SQLite's SQL syntax, see sqlite.org/lang.