Intermediate
A config file is a flat file used for reading and writing settings that affect the behaviour of your application. These files are incredibly useful: you put individual settings inside a human-editable file and have your application read them at runtime. This lets you configure the application the way you need without having to change the application code.
Typically the user edits the config file in a simple text editor, then the application runs and reads it. If the config file changes, the application normally has to be restarted to pick up the new settings (depending on how the code is written).
Some of the considerations for using a config file as a “data store” include:
- Setup: No setup is required for files. Use one of the available Python config-management libraries to make manipulating config files easier.
- Volume: Small-ish file size (less than 5-10 MB)
- Record access: You don’t need to search within the file to extract just a portion of the records; you load or save all the data in the file in one go
- Data writes: Applications don’t generally write to a config file, although it can be done. Instead, the config file is edited externally in a text editor
- Data formats: Normally the data is structured and record-based (such as comma-separated values (CSV) or tab-delimited), or a more complex structure such as Windows-style .INI files or even JSON
- Editability: You generally want to allow direct editing of the file by users
- Redundancy: There’s no inbuilt redundancy. If anything fails (data corruption, the server holding the file goes down), you’re out of luck. You need to set up your own mechanisms (e.g. automatically replicate the file to another server)
Code examples to read and write config files, including configparser
Setting up a config file is actually not much harder than simply creating constants inside your application. Your main decision will be which configuration file format to use, as there are quite a few to choose from. Here are some options and samples:
| File type | Python library | Example config file |
|---|---|---|
| 1. Simple text file, tab-delimited | none | `records_per_page    10` (tab-separated) |
| 2. Properties file with key-value pairs | none | `#webpage display` followed by `records_per_page = 10` |
| 3. INI file format | configparser | `[database]` sections with `key = value` pairs |
| 4. JSON file format | json | `{"records_per_page": 10, "logo_icon": "/images/company_log.jpg"}` |
Example 1: Simple text file which is tab-delimited
You can see a full article on how to read a text file in our “Storing Data in Files in Python” article. The short version for opening a tab-delimited file is as follows:
Suppose you have a configuration file as follows, where each row has two fields separated by a tab:
config_data.txt
records_per_page 10
logo_icon /images/company_log.jpg
You can load the data into a python dictionary like the following:
config = {}
file_handler = open('config_data.txt', 'r')
for rec in file_handler:
    config.update( [ tuple( rec.strip().split('\t') ) ] )
file_handler.close()
print(config)
The output will be as follows:
{'records_per_page': '10', 'logo_icon': '/images/company_log.jpg'}
Some explanation of the code may help. First, the for loop reads the file line by line: each iteration reads one line into the variable rec until the whole file has been read.
The following code is a little tricky, but the intent is to take the two columns in the tab delimited file and create a dictionary key value pair.
config.update( [ tuple( rec.strip().split('\t') ) ] )
It works by the following:
- It first removes the newline character from the end of the line (through `rec.strip()`)
- The resulting string is then split by the tab character (denoted by `'\t'`) using `split()`
- The result of this is a two-field list, which is then converted into a tuple
- The tuple is then wrapped in a list using the `[ ]` brackets
- The dictionary’s `.update()` method is used to finally add the key-value pair
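If the one-liner feels dense, a more readable equivalent uses plain tuple unpacking. This sketch runs on an in-memory sample string so it stands alone, but the loop body is identical for a file handle:

```python
# Same two-column config data, held in a string so the sketch is self-contained
sample = "records_per_page\t10\nlogo_icon\t/images/company_log.jpg"

config = {}
for rec in sample.splitlines():
    key, value = rec.strip().split('\t')  # unpack the two tab-separated fields
    config[key] = value                   # add the key-value pair directly

print(config)
```

This produces the same dictionary as the `config.update()` version, and makes the split-then-store steps explicit.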
Example 2: A properties file with key value pair
If you have fairly simple configuration needs with just key-value pairs, then a properties-type file will work for you, where each line has <config name> = <config value>. This can easily be loaded as a text file and the key-value pairs read into a dictionary.
Imagine this was the config file: config_data.txt
#webpage display
records_per_page =10
logo_icon =/images/company_log.jpg
The following code could easily load this configuration:
config = {}
with open('config_data.txt', 'r') as file_handler:
    for rec in file_handler:
        if rec.startswith('#'):
            continue
        key, value = rec.strip().split('=', 1)
        key = key.strip()
        if key:
            config[key] = value.strip()
print( config )
Here the code skips any comment lines (i.e. lines starting with a ‘#’), then splits each remaining line on the ‘=’ sign and loads the resulting key-value pairs into the ‘config’ dictionary.
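Writing the dictionary back out as a properties file is just as easy. A short sketch (the output filename is illustrative):

```python
# Settings to persist; same keys as the earlier examples
config = {'records_per_page': '10', 'logo_icon': '/images/company_log.jpg'}

with open('config_data_out.txt', 'w') as file_handler:
    file_handler.write('#webpage display\n')      # comment lines start with '#'
    for key, value in config.items():
        file_handler.write(f'{key} = {value}\n')  # one key-value pair per line
```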
Example 3: INI file format using configparser
You can see a full article on how the configparser library works in our earlier article. The short version is as follows.
Suppose you have a configuration file as follows:
test.ini
[DEFAULT]
name = development
host = 192.168.1.1
port = 31
username = admin
password = admin
[database]
name = production
host = 144.101.1.1
You can then read the file with the following simple code:
import configparser
config = configparser.ConfigParser()
#Read the INI file from disk
config.read('test.ini')
print( config['database']['name'] ) #This will output 'production'
print( config['database']['port'] ) #This will output '31'. As there is no port under
# the database section, the value from the default section is used
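configparser can also create INI files programmatically rather than by hand. A sketch that builds the same structure as test.ini (note that configparser stores all values as strings):

```python
import configparser

config = configparser.ConfigParser()
# The DEFAULT section supplies fallback values to every other section
config['DEFAULT'] = {'name': 'development', 'host': '192.168.1.1',
                     'port': '31', 'username': 'admin', 'password': 'admin'}
config['database'] = {'name': 'production', 'host': '144.101.1.1'}

# Write the INI file; the with block ensures the handle is closed
with open('test.ini', 'w') as file_handler:
    config.write(file_handler)
```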
Example 4: Reading Config values from a JSON file
With JSON being so popular, this is another format you could use to keep all your config data in. It is also very easy to load.
Assume your config file is as follows: config_data.txt
{
"records_per_page":10,
"logo_icon": "/images/company_log.jpg"
}
Then the following code can be used to bring these into a dictionary:
import json
with open('config_data.txt', 'r') as file_handler:
    config = json.load( file_handler )
print(config)
Where the output would be:
{'records_per_page': 10, 'logo_icon': '/images/company_log.jpg'}
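Writing the dictionary back out to a JSON config file is equally short with json.dump():

```python
import json

config = {'records_per_page': 10, 'logo_icon': '/images/company_log.jpg'}

# indent=4 pretty-prints the output so the file stays human-editable
with open('config_data.txt', 'w') as file_handler:
    json.dump(config, file_handler, indent=4)
```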
Summary
A config file is a great option if you are looking to store settings for your applications. These are usually loaded at the start of the application and then can be loaded into a dictionary which can then serve as a set of constants which your application can use. This will both avoid the need to hardcode settings and also allow you to change the behaviour of your application without having to touch the code.
How To Use Python Decorators: A Complete Guide
Intermediate
You’ve probably seen the @ symbol above function definitions in Python code and wondered what it does. That’s a decorator — one of Python’s most powerful and elegant features. Decorators let you wrap a function with additional behavior (logging, caching, access control, rate limiting, timing) without modifying the function’s code. They’re the reason you can add authentication to a Flask route with a single line, or enable caching with @functools.lru_cache.
Decorators are a pure Python feature — no installation required. They’re built on Python’s first-class functions (functions that can be passed as arguments and returned from other functions). Once you understand how decorators work mechanically, you’ll be able to read and write the patterns used by virtually every Python framework, from Django’s @login_required to FastAPI’s @app.get() to pytest’s @pytest.fixture.
In this tutorial, you’ll learn how decorators work from first principles, how to use functools.wraps to preserve function metadata, how to write parameterized decorators (decorators that take arguments), how to stack multiple decorators, how to use class-based decorators, and how to apply these techniques in real-world scenarios like timing, retry logic, and access control.
Decorators: Quick Example
Here’s the simplest useful decorator — one that logs when a function is called:
# decorator_quick.py
import functools
def log_calls(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
print(f"Calling {func.__name__}({args}, {kwargs})")
result = func(*args, **kwargs)
print(f"{func.__name__} returned {result}")
return result
return wrapper
@log_calls
def add(a, b):
return a + b
# This is equivalent to: add = log_calls(add)
result = add(3, 4)
print(f"Final result: {result}")
# The function's identity is preserved
print(f"Function name: {add.__name__}")
Output:
Calling add((3, 4), {})
add returned 7
Final result: 7
Function name: add
The @log_calls syntax is shorthand for add = log_calls(add). The decorator receives the original function, returns a new wrapper function that adds behavior before and after calling the original, and replaces the name add with the wrapper. The @functools.wraps(func) line copies the original function’s name, docstring, and other metadata onto the wrapper — always include this.
How Decorators Work: First Principles
To truly understand decorators, you need to understand that in Python, functions are objects — they can be passed as arguments and returned from other functions. This is called “first-class functions.” Decorators are just a syntax shortcut for a function transformation pattern.
# first_class_functions.py
# Functions can be passed as arguments
def apply_twice(func, value):
return func(func(value))
def double(x):
return x * 2
result = apply_twice(double, 3)
print(f"Apply twice: {result}") # 3 -> 6 -> 12
# Functions can be returned from other functions
def make_multiplier(n):
def multiplier(x):
return x * n
return multiplier # Returns the inner function
triple = make_multiplier(3)
print(f"Triple 5: {triple(5)}") # 15
# The decorator pattern manually, without @ syntax
def shout(func):
def wrapper(*args, **kwargs):
result = func(*args, **kwargs)
return result.upper() + "!!!"
return wrapper
def greet(name):
return f"Hello, {name}"
# Without @ syntax -- same result
greet = shout(greet)
print(greet("alice")) # HELLO, ALICE!!!
Output:
Apply twice: 12
Triple 5: 15
HELLO, ALICE!!!
The key insight: @shout above a function definition is exactly equivalent to writing greet = shout(greet) after the definition. The @ syntax just makes it more readable and places the decoration visually near the function definition where it belongs.
Always Use functools.wraps
Without @functools.wraps(func), your decorator replaces the original function’s metadata with the wrapper’s. This causes problems with debugging, documentation, and tools that inspect function names. Always include it:
# wraps_example.py
import functools
# WITHOUT functools.wraps -- breaks function identity
def bad_decorator(func):
def wrapper(*args, **kwargs):
return func(*args, **kwargs)
return wrapper
# WITH functools.wraps -- preserves identity
def good_decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
return func(*args, **kwargs)
return wrapper
@bad_decorator
def my_function_bad():
"""This function does something important."""
pass
@good_decorator
def my_function_good():
"""This function does something important."""
pass
print(f"Bad decorator name: {my_function_bad.__name__}")
print(f"Bad decorator docstr: {my_function_bad.__doc__}")
print()
print(f"Good decorator name: {my_function_good.__name__}")
print(f"Good decorator docstr: {my_function_good.__doc__}")
Output:
Bad decorator name: wrapper
Bad decorator docstr: None
Good decorator name: my_function_good
Good decorator docstr: This function does something important.
Practical Decorator Examples
Timing Functions
A timer decorator measures how long a function takes to execute — great for performance monitoring and identifying bottlenecks:
# timer_decorator.py
import functools
import time
def timer(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
start = time.perf_counter()
result = func(*args, **kwargs)
elapsed = time.perf_counter() - start
print(f"{func.__name__} took {elapsed:.4f} seconds")
return result
return wrapper
@timer
def slow_function():
time.sleep(0.1)
return "done"
@timer
def sum_million():
return sum(range(1_000_000))
slow_function()
result = sum_million()
print(f"Sum result: {result:,}")
Output:
slow_function took 0.1002 seconds
sum_million took 0.0312 seconds
Sum result: 499,999,500,000
Retry Logic
A retry decorator automatically re-runs a function if it raises an exception — essential for network calls, database operations, and any code that can fail transiently:
# retry_decorator.py
import functools
import time
import random
def retry(max_attempts=3, delay=1.0, exceptions=(Exception,)):
"""Decorator factory: retries a function up to max_attempts times."""
def decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
last_error = None
for attempt in range(1, max_attempts + 1):
try:
return func(*args, **kwargs)
except exceptions as e:
last_error = e
print(f"Attempt {attempt}/{max_attempts} failed: {e}")
if attempt < max_attempts:
time.sleep(delay)
raise last_error
return wrapper
return decorator
# Simulated unreliable function (fails 70% of the time)
call_count = 0
@retry(max_attempts=5, delay=0.1, exceptions=(ValueError,))
def unreliable_api_call():
global call_count
call_count += 1
if random.random() < 0.7:
raise ValueError(f"API timeout on call #{call_count}")
return f"Success on call #{call_count}"
random.seed(42)
result = unreliable_api_call()
print(f"Final result: {result}")
Output:
Attempt 1/5 failed: API timeout on call #1
Attempt 2/5 failed: API timeout on call #2
Attempt 3/5 failed: API timeout on call #3
Attempt 4/5 failed: API timeout on call #4
Final result: Success on call #5
Notice the decorator factory pattern: retry(max_attempts=5, delay=0.1) returns a decorator, which then returns a wrapper. This is a three-level nesting -- outer function configures, middle function receives the function to decorate, inner function is what actually runs. This is the standard pattern for parameterized decorators.
Parameterized Decorators
When your decorator needs configuration (like the number of retries in the example above), you add one more level of nesting -- a "decorator factory" that takes the parameters and returns the actual decorator:
# parameterized_decorator.py
import functools
def repeat(n):
"""Call the decorated function n times."""
def decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
results = []
for _ in range(n):
results.append(func(*args, **kwargs))
return results
return wrapper
return decorator
@repeat(3)
def say_hello(name):
return f"Hello, {name}!"
results = say_hello("Alice")
for r in results:
print(r)
Output:
Hello, Alice!
Hello, Alice!
Hello, Alice!
Stacking Multiple Decorators
You can apply multiple decorators to the same function by stacking them. They apply from bottom to top (closest to the function first):
# stacking_decorators.py
import functools
import time
def timer(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
start = time.perf_counter()
result = func(*args, **kwargs)
print(f" [timer] {func.__name__}: {time.perf_counter()-start:.4f}s")
return result
return wrapper
def log_result(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
result = func(*args, **kwargs)
print(f" [log] {func.__name__} returned: {result}")
return result
return wrapper
# Applied bottom-up: log_result wraps the original,
# then timer wraps log_result's wrapper
@timer
@log_result
def compute(x, y):
return x ** y
result = compute(2, 10)
print(f"Final result: {result}")
Output:
[log] compute returned: 1024
[timer] compute: 0.0001s
Final result: 1024
Real-Life Example: Access Control Decorators
Here's a practical access control system using decorators -- the same pattern used by web frameworks for route authentication:
# access_control.py
import functools
# Simulated current user session
current_user = {'name': 'alice', 'roles': ['user', 'editor'], 'logged_in': True}
def login_required(func):
"""Decorator that requires the user to be logged in."""
@functools.wraps(func)
def wrapper(*args, **kwargs):
if not current_user.get('logged_in'):
print(f"Access denied: login required for {func.__name__}")
return None
return func(*args, **kwargs)
return wrapper
def require_role(role):
"""Decorator factory: requires the user to have a specific role."""
def decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
if role not in current_user.get('roles', []):
print(f"Access denied: '{role}' role required for {func.__name__}")
return None
return func(*args, **kwargs)
return wrapper
return decorator
@login_required
def view_dashboard():
return f"Dashboard for {current_user['name']}"
@login_required
@require_role('admin')
def delete_user(user_id):
return f"Deleted user {user_id}"
@login_required
@require_role('editor')
def publish_post(post_id):
return f"Published post {post_id}"
# Alice is logged in and has 'editor' but not 'admin'
print(view_dashboard())
print(delete_user(42))
print(publish_post(101))
# Simulate a logged-out user
current_user['logged_in'] = False
print(view_dashboard())
Output:
Dashboard for alice
Access denied: 'admin' role required for delete_user
Published post 101
Access denied: login required for view_dashboard
This is the exact pattern used by Flask's @login_required and Django's @permission_required. The decorators are reusable across any number of functions -- add access control to a new function by adding one line above its definition. The stacked @login_required @require_role('admin') means the user must pass both checks: logged in AND has the required role.
Frequently Asked Questions
When should I use a decorator instead of a helper function?
Use a decorator when you want to add the same cross-cutting behavior (logging, timing, validation, caching) to multiple functions without repeating the logic. If you find yourself writing the same "before" and "after" code in many functions, that's a strong signal to extract it into a decorator. For one-off or highly specific behavior, a regular helper function is simpler.
Can I use a class as a decorator?
Yes -- any callable can be a decorator. A class with a __call__ method works as a decorator. Class-based decorators are useful when you need to maintain state between calls (like call counts or cached results). Define __init__(self, func) to receive the function and __call__(self, *args, **kwargs) to wrap it. Use functools.update_wrapper(self, func) in __init__ to preserve the wrapped function's metadata (the class-based equivalent of @functools.wraps).
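As a minimal sketch, here is a class-based decorator that counts calls (the class name CountCalls is illustrative):

```python
import functools

class CountCalls:
    """Class-based decorator that counts how many times a function is called."""
    def __init__(self, func):
        functools.update_wrapper(self, func)  # class-based equivalent of @functools.wraps
        self.func = func
        self.count = 0                        # state preserved between calls

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

@CountCalls
def greet(name):
    return f"Hello, {name}!"

greet("alice")
greet("bob")
print(greet.count)  # 2
```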
Do decorators work on class methods?
Yes, but with one caveat: the first argument of instance methods is self. Since decorators use *args, **kwargs, this is handled automatically. However, @staticmethod and @classmethod are themselves decorators. When stacking with them, always place @staticmethod or @classmethod outermost (closest to the def).
What is @functools.lru_cache and when should I use it?
@functools.lru_cache(maxsize=128) memoizes a function's return values -- if the function is called again with the same arguments, it returns the cached result instead of recomputing. Use it for pure functions (no side effects) that are called repeatedly with the same inputs. It's especially powerful for recursive functions like Fibonacci where the same sub-problems repeat many times.
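For example, memoizing a recursive Fibonacci turns an exponential computation into a linear one:

```python
import functools

@functools.lru_cache(maxsize=None)  # unbounded cache; maxsize=128 is the default
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(35))  # 9227465 -- instant, since each sub-problem is computed only once
```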
Why does my IDE show wrong type hints after applying a decorator?
Without @functools.wraps, the decorated function's signature shows as (*args, **kwargs) -- losing the original type hints. With @functools.wraps, the function identity is preserved, but the signature the type checker sees is still the wrapper's. For full type hint preservation in decorated functions, use typing.ParamSpec and typing.Concatenate (Python 3.10+) to annotate the wrapper correctly.
Conclusion
Decorators are one of Python's most powerful code-reuse mechanisms. In this tutorial, you learned how Python's first-class functions make decorators possible, why @functools.wraps(func) is essential in every decorator, how to write practical decorators for timing, retry logic, and logging, how to create parameterized decorators using a decorator factory pattern, how to stack multiple decorators on a single function, and how the access control pattern mirrors real framework implementations.
The access control project is a foundation you can extend: add role inheritance, time-based access restrictions, or rate limiting. Every web framework you'll encounter -- Flask, Django, FastAPI -- relies heavily on decorators for its most important features.
For deeper coverage, see the functools module documentation and PEP 318 which introduced decorator syntax to Python.
Related Articles
- How To Manage Python Environment Variables With dotenv and os.environ
- How To Read and Write JSON Files in Python 3
- How To Split And Organise Your Source Code Into Multiple Files in Python 3
Further Reading: For more details, see the Python configparser documentation.
Frequently Asked Questions
What is the best way to store settings in Python?
For simple key-value settings, use INI files with ConfigParser. For nested data, use JSON or TOML. For environment-specific settings, use .env files with python-dotenv. The best choice depends on your complexity needs and whether non-developers will edit the settings.
How do I create a config file in Python?
Use ConfigParser to create INI files: instantiate the parser, add sections and key-value pairs with config['section'] = {'key': 'value'}, then write the file with config.write(f) inside a with open('config.ini', 'w') as f block. For JSON, use json.dump().
Should I use environment variables or config files?
Use environment variables for sensitive data (API keys, passwords) and deployment-specific settings. Use config files for application-level settings that rarely change. Many projects combine both: a config file for defaults and environment variables for overrides and secrets.
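A minimal sketch of that combination, where the APP_-prefixed variable names are illustrative:

```python
import os

# Simulate a deployment-specific override (normally set in the shell, not in code)
os.environ['APP_RECORDS_PER_PAGE'] = '25'

# Defaults as they might come from a config file
defaults = {'records_per_page': '10', 'logo_icon': '/images/company_log.jpg'}

# Environment variables win over file defaults
config = {key: os.environ.get(f'APP_{key.upper()}', value)
          for key, value in defaults.items()}

print(config)  # records_per_page comes from the environment; logo_icon from the file
```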
How do I prevent config files from being committed to Git?
Add your config file names to .gitignore (e.g., config.ini, .env). Provide a config.example.ini template in the repository so other developers know what settings are needed without exposing actual values.
Can I use YAML for Python configuration files?
Yes. Install PyYAML with pip install pyyaml and use yaml.safe_load() to read YAML files. YAML supports nested structures, lists, and comments, making it more expressive than INI. However, it is not part of Python’s standard library.