Advanced Features That Will Make Your Code Reviews More Annoying
💡
TL;DR - Python Features You're Probably Not Using (But Should)
• Walrus Operator (:=): Assign and use values in one expression—perfect for while loops and comprehensions
• Pattern Matching: Structural pattern matching that makes complex conditionals readable and powerful
• functools Magic: lru_cache for memoization, singledispatch for function overloading
• Context Managers: Create your own with contextlib for cleaner resource management
• Type System: Beyond basic hints—Protocol, TypedDict, Literal for better type safety
• Descriptors: The secret sauce behind @property, enabling computed attributes
• Data Classes: Post-init processing, field factories, and frozen instances
• pathlib: Object-oriented file paths that make os.path feel prehistoric

Python has been around for over 30 years, and its standard library is massive. Yet most of us use maybe 20% of what's available. We stick to the same patterns, the same modules, the same approaches we learned when we first picked up the language.
But Python's standard library is full of gems that can make your code cleaner, faster, and more expressive. These aren't experimental features or third-party packages—they're built-in tools that have been battle-tested for years.
Here are 13 Python features that are criminally underused. Each one solves real problems, and once you know about them, you'll start seeing opportunities to use them everywhere.
The Walrus Operator: Assignment Expressions That Save the Day
The walrus operator (:=) was introduced in Python 3.8. It allows you to assign values inside expressions. The name comes from the visual similarity of := to a walrus lying on its side.

The primary benefit is eliminating repeated function calls and making certain patterns more concise, particularly in while loops and list comprehensions.
Before: The Repetitive Pattern
```python
# Reading file chunks - the old way
chunk = file.read(8192)
while chunk:
    process(chunk)
    chunk = file.read(8192)  # Repeated call

# Filtering on an expensive computation - the loop way
results = []
for item in data:
    value = expensive_function(item)
    if value > threshold:
        results.append(value)
```
After: Elegant and DRY
```python
# Reading file chunks - with walrus
while chunk := file.read(8192):
    process(chunk)

# List comprehension that computes each value once
results = [value for item in data
           if (value := expensive_function(item)) > threshold]

# Regex matching in conditionals
if match := pattern.search(text):
    print(f"Found: {match.group()}")
```
Real-World Example: API Pagination
The walrus operator is particularly useful for pagination patterns:
```python
# Fetching paginated API results
def fetch_all_users(api_client):
    users = []
    page = 1

    while data := api_client.get_users(page=page):
        users.extend(data['users'])
        if not data['has_next']:
            break
        page += 1

    return users
```
This eliminates duplicate api_client.get_users() calls and removes the need to check whether data exists before using it.

⚠️
The walrus operator is best suited for while loops and filtered comprehensions. Overuse can reduce code readability, particularly when chaining multiple assignment expressions.
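The snippets above reference external objects, so here is a small self-contained sketch you can run as-is (the data, threshold, and expensive_function are made up for illustration):

```python
import re

# Hypothetical "expensive" computation for illustration
def expensive_function(x):
    return x * 10

data = [1, 5, 3, 8]
threshold = 30

# The walrus binds each computed value once, then filters on it
results = [value for item in data
           if (value := expensive_function(item)) > threshold]
print(results)  # [50, 80]

# And in a conditional: bind the match object and test it in one step
if match := re.search(r"\d+", "order 42 shipped"):
    print(match.group())  # 42
```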
Pattern Matching: Switch Statements on Steroids
Python 3.10 introduced structural pattern matching with match/case. This feature goes beyond simple switch statements, providing powerful pattern matching capabilities for complex data structures.

Pattern matching excels at destructuring nested data, eliminating verbose if-elif chains that check types, keys, and values.
Basic Matching
```python
def http_status_message(status_code):
    match status_code:
        case 200:
            return "OK"
        case 404:
            return "Not Found"
        case 500 | 502 | 503:  # Multiple patterns
            return "Server Error"
        case _:  # Default case
            return "Unknown Status"
```
Structural Pattern Matching
Where pattern matching truly shines is destructuring complex data:
```python
def process_command(command):
    match command:
        # Match dictionary structure
        case {"action": "create", "type": "user", "data": {"name": str(name)}}:
            return create_user(name)

        # Match with guards
        case {"action": "delete", "id": int(id)} if id > 0:
            return delete_item(id)

        # Match sequences
        case ["move", x, y] if isinstance(x, int) and isinstance(y, int):
            return move_to(x, y)

        # Match objects
        case Point(x=0, y=0):
            return "Origin"

        # Capture patterns
        case ["copy", *files, "to", destination]:
            return copy_files(files, destination)
```
Real-World Example: JSON API Response Handler
```python
def handle_api_response(response):
    match response:
        case {"status": "success", "data": data}:
            return process_data(data)

        case {"status": "error", "code": "AUTH_FAILED"}:
            refresh_token()
            return retry_request()

        case {"status": "error", "code": code, "message": msg}:
            logger.error(f"API Error {code}: {msg}")
            return None

        case {"status": "partial", "data": data, "errors": errors}:
            log_errors(errors)
            return process_partial_data(data)

        case _:
            raise ValueError(f"Unexpected response format: {response}")
```
functools: The Swiss Army Knife Module
The functools module provides higher-order functions and operations on callable objects. Beyond the commonly used wraps decorator, it contains several powerful utilities that can significantly improve code performance and design.

lru_cache: Automatic Memoization
The lru_cache decorator implements a Least Recently Used cache for function results. When decorated functions are called with the same arguments, cached results are returned immediately instead of recomputing.

This is particularly effective for recursive functions with overlapping subproblems, such as the classic Fibonacci sequence calculation.
```python
from functools import lru_cache
import time

# Without caching - slow recursive implementation
def fibonacci_slow(n):
    if n < 2:
        return n
    return fibonacci_slow(n-1) + fibonacci_slow(n-2)

# With caching - lightning fast
@lru_cache(maxsize=128)
def fibonacci_fast(n):
    if n < 2:
        return n
    return fibonacci_fast(n-1) + fibonacci_fast(n-2)

# Performance comparison
start = time.time()
result1 = fibonacci_slow(35)  # Takes a few seconds
print(f"Slow: {time.time() - start:.2f}s")

start = time.time()
result2 = fibonacci_fast(35)  # Effectively instant
print(f"Fast: {time.time() - start:.5f}s")

# Cache statistics
print(fibonacci_fast.cache_info())
# CacheInfo(hits=33, misses=36, maxsize=128, currsize=36)
```
✨
lru_cache is perfect for expensive computations with repeated inputs. Use it for API calls, database queries, or complex calculations. Python 3.9+ also has @cache for an unbounded cache.

singledispatch: Function Overloading in Python
The singledispatch decorator enables function overloading based on the type of the first argument. This eliminates lengthy if-elif chains that check object types and dispatch to different logic.

Unlike traditional function overloading in statically typed languages, singledispatch provides a Pythonic approach to polymorphism through registration of type-specific implementations.

```python
from functools import singledispatch
from datetime import datetime
import json

@singledispatch
def serialize(obj):
    """Default serialization - just convert to string"""
    return str(obj)

@serialize.register(dict)
def _(obj):
    """Serialize dictionaries to JSON"""
    return json.dumps(obj)

@serialize.register(list)
@serialize.register(tuple)
def _(obj):
    """Serialize sequences as JSON arrays"""
    return json.dumps(list(obj))

@serialize.register(datetime)
def _(obj):
    """Serialize datetime as ISO format"""
    return obj.isoformat()

# Custom class registration (User here is your own model class)
@serialize.register(User)
def _(obj):
    """Serialize User objects"""
    return {
        'id': obj.id,
        'username': obj.username,
        'created': serialize(obj.created_at)  # Recursive!
    }

# Usage
print(serialize({"name": "Alice"}))  # JSON
print(serialize([1, 2, 3]))          # JSON array
print(serialize(datetime.now()))     # ISO format
print(serialize(42))                 # Falls back to str()
```
To add serialization for new types, simply register them with the decorator. The original function remains unmodified, demonstrating the open/closed principle in action.
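Since User isn't defined in the snippet above, here's a minimal variant you can run as-is (Python 3.7+ lets you register implementations via the type annotation instead of passing the type to register):

```python
from functools import singledispatch
from datetime import date

@singledispatch
def describe(obj):
    return f"object: {obj!r}"  # fallback for unregistered types

@describe.register
def _(obj: int):
    return f"int: {obj}"

@describe.register
def _(obj: date):
    return f"date: {obj.isoformat()}"

print(describe(42))                # int: 42
print(describe(date(2024, 1, 2)))  # date: 2024-01-02
print(describe(3.5))               # object: 3.5
```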
partial: Function Currying
The partial function creates partial applications by fixing some arguments of a function. This is useful when repeatedly calling functions with common parameters, such as database connections with consistent host and port values.

Partial application allows pre-setting arguments, creating specialized versions of more general functions.
```python
from functools import partial
import logging

# Create specialized logging functions
log_debug = partial(logging.log, logging.DEBUG)
log_error = partial(logging.log, logging.ERROR)

# Partial application for configuration
def connect_to_db(host, port, username, password, database):
    return f"Connected to {username}@{host}:{port}/{database}"

# Create specialized connectors
connect_to_prod = partial(connect_to_db,
                          host="prod.db.com",
                          port=5432)
connect_to_dev = partial(connect_to_db,
                         host="localhost",
                         port=5432,
                         username="dev")

# Use them
prod_conn = connect_to_prod(username="admin",
                            password="secret",
                            database="myapp")
dev_conn = connect_to_dev(password="devpass",
                          database="myapp_dev")
```
This transforms a 5-parameter function into specialized 2-3 parameter versions, reducing configuration errors and eliminating credential mix-ups between environments.
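As a quick sanity check of the idea, here's a tiny runnable example (power is a made-up helper):

```python
from functools import partial

def power(base, exponent):
    return base ** exponent

# Fix the keyword argument, leave the rest open
square = partial(power, exponent=2)
cube = partial(power, exponent=3)

print(square(4))  # 16
print(cube(2))    # 8

# The fixed arguments remain inspectable on the partial object
print(square.keywords)  # {'exponent': 2}
```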
Context Managers: Beyond 'with open()'
While the with open() pattern is widely known for file handling, Python allows creation of custom context managers for any resource that requires setup and cleanup operations.

The contextlib module provides decorators and utilities to create context managers without implementing the full protocol.

The contextmanager Decorator
The @contextmanager decorator converts a generator function into a context manager. Code before the yield statement executes on entry, and code after executes on exit, even if exceptions occur.

```python
from contextlib import contextmanager
import time
import os

@contextmanager
def timed_operation(name):
    """Measure and report operation time"""
    print(f"Starting {name}...")
    start = time.time()
    try:
        yield start  # This is where the 'with' block executes
    finally:
        elapsed = time.time() - start
        print(f"{name} took {elapsed:.2f} seconds")

# Usage
with timed_operation("data processing") as start_time:
    process_large_dataset()
    # Any exception here will still trigger the finally block

@contextmanager
def temporary_env_var(key, value):
    """Temporarily set an environment variable"""
    old_value = os.environ.get(key)
    os.environ[key] = value
    try:
        yield
    finally:
        if old_value is None:
            os.environ.pop(key, None)
        else:
            os.environ[key] = old_value

# Usage
with temporary_env_var("DEBUG", "true"):
    run_tests()  # Tests run with DEBUG=true
# DEBUG is restored to its original value
```
ExitStack: Dynamic Context Management
```python
from contextlib import ExitStack

def process_multiple_files(filenames):
    with ExitStack() as stack:
        # Open all files
        files = [
            stack.enter_context(open(fname))
            for fname in filenames
        ]

        # Process all files
        results = []
        for f in files:
            results.append(process_file(f))

        return results
    # All files are automatically closed

# Conditional context managers
def maybe_profile(should_profile):
    with ExitStack() as stack:
        if should_profile:
            stack.enter_context(profiler())

        # Always use timer
        stack.enter_context(timed_operation("processing"))

        return expensive_operation()
```
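process_file and profiler above are placeholders; this self-contained sketch shows ExitStack closing a runtime-determined number of real temporary files:

```python
import os
import tempfile
from contextlib import ExitStack

paths = []
with ExitStack() as stack:
    # Enter one context manager per file - the count is decided at runtime
    files = [stack.enter_context(tempfile.NamedTemporaryFile("w", delete=False))
             for _ in range(3)]
    for i, f in enumerate(files):
        f.write(f"file {i}")
        paths.append(f.name)

# Leaving the with-block closed every file, in reverse order of entry
print(all(f.closed for f in files))  # True

for p in paths:  # clean up the temp files
    os.remove(p)
```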
Type Hints: Beyond Basic Annotations
Python's type system has evolved far beyond simple int and str annotations.

Protocol: Structural Subtyping (Duck Typing with Types)
```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class Drawable(Protocol):
    """Anything that can be drawn"""
    def draw(self) -> None: ...

class Circle:
    def draw(self) -> None:
        print("Drawing circle")

class Square:
    def draw(self) -> None:
        print("Drawing square")

def render(shape: Drawable) -> None:
    shape.draw()

# Both work without inheritance!
render(Circle())  # OK
render(Square())  # OK

# Runtime checking
assert isinstance(Circle(), Drawable)  # True
```
TypedDict: Typed Dictionaries
```python
from typing import TypedDict, NotRequired  # NotRequired: Python 3.11+ (or typing_extensions)

class UserDict(TypedDict):
    id: int
    username: str
    email: str
    is_active: bool
    metadata: NotRequired[dict]  # Optional key

class APIResponse(TypedDict, total=False):  # All keys optional
    data: list[UserDict]
    error: str
    timestamp: float

# Type checker knows the structure
def process_user(user: UserDict) -> str:
    return f"User {user['username']} ({user['email']})"

# Type checker catches errors
user: UserDict = {
    'id': 1,
    'username': 'alice',
    'email': 'alice@example.com',
    'is_active': True
}
```
Literal and Union Types
```python
from typing import Literal, Union, overload

LogLevel = Literal["DEBUG", "INFO", "WARNING", "ERROR"]

def log(message: str, level: LogLevel = "INFO") -> None:
    print(f"[{level}] {message}")

# Type checker ensures valid values
log("Starting", "DEBUG")    # OK
log("Problem", "CRITICAL")  # Type error!

# Overloaded function signatures
@overload
def process(data: str) -> str: ...

@overload
def process(data: int) -> int: ...

@overload
def process(data: list) -> list: ...

def process(data: Union[str, int, list]):
    if isinstance(data, str):
        return data.upper()
    elif isinstance(data, int):
        return data * 2
    else:
        return data[::-1]
```
Descriptors: The Magic Behind Properties
Descriptors are what make @property, @staticmethod, and @classmethod work. You can create your own!

```python
class ValidatedAttribute:
    """A descriptor that validates values"""
    def __init__(self, validator):
        self.validator = validator
        self.name = None

    def __set_name__(self, owner, name):
        self.name = f"_{name}"

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, self.name, None)

    def __set__(self, obj, value):
        if not self.validator(value):
            raise ValueError(f"Invalid value: {value}")
        setattr(obj, self.name, value)

class User:
    # Descriptors for validation
    age = ValidatedAttribute(lambda x: 0 <= x <= 150)
    email = ValidatedAttribute(lambda x: "@" in x)

    def __init__(self, age, email):
        self.age = age      # Calls descriptor's __set__
        self.email = email  # Calls descriptor's __set__

# Usage
user = User(25, "alice@example.com")  # OK
user.age = -5  # Raises ValueError!
```
Lazy Properties with Descriptors
```python
class LazyProperty:
    """Compute property value only once"""
    def __init__(self, func):
        self.func = func
        self.name = func.__name__

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self

        # Compute and cache the value
        value = self.func(obj)
        # Replace descriptor with computed value
        setattr(obj, self.name, value)
        return value

class DataAnalysis:
    def __init__(self, data):
        self.data = data

    @LazyProperty
    def expensive_computation(self):
        print("Computing... (this only happens once)")
        return sum(x ** 2 for x in self.data)

# Usage
analysis = DataAnalysis(range(1000000))
print(analysis.expensive_computation)  # Computes
print(analysis.expensive_computation)  # Returns cached value
```
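One detail worth calling out: this works because LazyProperty is a non-data descriptor (it defines no __set__), so the cached instance attribute shadows it on later lookups. A compact check with a call counter (Report is a made-up example class):

```python
class LazyProperty:
    """Non-data descriptor: the cached value shadows it after first access"""
    def __init__(self, func):
        self.func = func
        self.name = func.__name__

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.func(obj)
        setattr(obj, self.name, value)  # instance attribute now wins the lookup
        return value

class Report:
    computations = 0  # counts how many times the body runs

    @LazyProperty
    def total(self):
        Report.computations += 1
        return sum(range(5))

r = Report()
print(r.total, r.total)     # 10 10
print(Report.computations)  # 1 - computed only once
```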
Data Classes: More Than Just __init__ Generators
Data classes (Python 3.7+) are often underutilized beyond basic usage.
Advanced Data Class Features
```python
from dataclasses import dataclass, field, InitVar
from typing import ClassVar
import uuid

@dataclass
class Product:
    name: str
    price: float

    # Class variable (not in __init__)
    tax_rate: ClassVar[float] = 0.08

    # Field with factory function
    id: str = field(default_factory=lambda: str(uuid.uuid4()))

    # Passed through __init__ to __post_init__, but not stored as a field
    validate: InitVar[bool] = True

    # Field with a mutable default
    tags: list[str] = field(default_factory=list)

    def __post_init__(self, validate):
        if validate and self.price < 0:
            raise ValueError("Price cannot be negative")

        # Computed attribute
        self.price_with_tax = self.price * (1 + self.tax_rate)

    # Derived property
    @property
    def display_price(self):
        return f"${self.price:.2f}"

# Frozen (immutable) data classes
@dataclass(frozen=True)
class Point:
    x: float
    y: float

    def distance_from_origin(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5

# Usage
p1 = Point(3, 4)
# p1.x = 5  # Error! Frozen dataclass
```
Data Classes with Inheritance
```python
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    make: str
    model: str
    year: int

@dataclass
class Car(Vehicle):
    doors: int = 4

@dataclass
class Truck(Vehicle):
    payload_capacity: float
    doors: int = 2

# Comparison and sorting
@dataclass(order=True)
class Person:
    # sort_index is used for comparisons
    sort_index: tuple = field(init=False, repr=False)
    name: str
    age: int

    def __post_init__(self):
        # Sort by age, then name
        self.sort_index = (self.age, self.name)

people = [
    Person("Alice", 30),
    Person("Bob", 25),
    Person("Charlie", 30)
]
people.sort()  # Automatically uses sort_index
```
pathlib: File Paths as Objects
Stop concatenating strings with os.path.join()! pathlib (Python 3.4+) provides an object-oriented approach to file paths.

```python
import json
from pathlib import Path

# Creating paths
project_dir = Path.home() / "projects" / "myapp"
config_file = project_dir / "config.json"

# Path operations
if config_file.exists():
    data = json.loads(config_file.read_text())

# Writing files
output_file = project_dir / "output" / "results.csv"
output_file.parent.mkdir(parents=True, exist_ok=True)
output_file.write_text("col1,col2\n1,2\n")

# Iterating over files
for python_file in project_dir.rglob("*.py"):
    print(f"Found: {python_file.relative_to(project_dir)}")

# Path properties
print(config_file.suffix)     # .json
print(config_file.stem)       # config
print(config_file.parent)     # /home/user/projects/myapp
print(config_file.is_file())  # True
```
pathlib vs os.path
```python
import json

# Old way with os.path
import os
base_dir = os.path.dirname(os.path.abspath(__file__))
config_path = os.path.join(base_dir, "config", "settings.json")
if os.path.exists(config_path):
    with open(config_path) as f:
        config = json.load(f)

# New way with pathlib
from pathlib import Path
base_dir = Path(__file__).parent.absolute()
config_path = base_dir / "config" / "settings.json"
if config_path.exists():
    config = json.loads(config_path.read_text())
```
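To try the pathlib side without touching real config files, here's the same round trip inside a throwaway temporary directory:

```python
import json
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    cfg = Path(tmp) / "config" / "settings.json"
    cfg.parent.mkdir(parents=True, exist_ok=True)  # create nested dirs
    cfg.write_text(json.dumps({"debug": True}))

    loaded = json.loads(cfg.read_text())
    print(loaded)                # {'debug': True}
    print(cfg.suffix, cfg.stem)  # .json settings
```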
💡
pathlib automatically handles platform differences (Windows vs Unix paths), making your code more portable.

collections: Beyond dict and list

The collections module has several underused gems.

ChainMap: Layered Dictionaries
```python
from collections import ChainMap

# Configuration with defaults and overrides
defaults = {'debug': False, 'port': 8080, 'host': 'localhost'}
environment = {'port': 9000, 'debug': True}
command_line = {'host': '0.0.0.0'}

# ChainMap searches in order
config = ChainMap(command_line, environment, defaults)

print(config['host'])   # '0.0.0.0' (from command_line)
print(config['port'])   # 9000 (from environment)
print(config['debug'])  # True (from environment)

# Updates go to the first map
config['new_key'] = 'value'
print(command_line)  # {'host': '0.0.0.0', 'new_key': 'value'}
```
defaultdict with Factories
```python
from collections import defaultdict
from datetime import datetime

# Nested defaultdict for 2D structures
matrix = defaultdict(lambda: defaultdict(int))
matrix[0][0] = 1
matrix[5][5] = 2
print(matrix[3][3])  # 0 (automatically created)

# Tracking with timestamps
access_log = defaultdict(lambda: {'count': 0, 'first': datetime.now()})

def log_access(user_id):
    access_log[user_id]['count'] += 1
    if access_log[user_id]['count'] == 1:
        access_log[user_id]['last'] = access_log[user_id]['first']
    else:
        access_log[user_id]['last'] = datetime.now()
```
itertools: Combinatorial Power
itertools provides memory-efficient tools for creating iterators.

```python
from itertools import (
    chain, cycle, repeat, count,
    accumulate, groupby, permutations,
    combinations, product, islice
)

# Chain multiple iterables
all_users = chain(
    get_active_users(),
    get_inactive_users(),
    get_pending_users()
)

# Sliding window
def sliding_window(iterable, n):
    """Generate sliding windows of size n"""
    it = iter(iterable)
    window = list(islice(it, n))
    yield tuple(window)
    for item in it:
        window.append(item)
        window.pop(0)
        yield tuple(window)

# Usage
data = [1, 2, 3, 4, 5]
for window in sliding_window(data, 3):
    print(window)  # (1,2,3), (2,3,4), (3,4,5)

# Grouping data
data = [
    {'type': 'A', 'value': 1},
    {'type': 'A', 'value': 2},
    {'type': 'B', 'value': 3},
    {'type': 'A', 'value': 4},
]

# Must sort first for groupby!
data.sort(key=lambda x: x['type'])
for key, group in groupby(data, key=lambda x: x['type']):
    items = list(group)
    print(f"{key}: {items}")
```
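Here's the grouping idea trimmed down to something runnable end to end (the get_*_users helpers above are placeholders), plus accumulate for running totals:

```python
from itertools import accumulate, groupby

data = [
    {'type': 'A', 'value': 1},
    {'type': 'B', 'value': 3},
    {'type': 'A', 'value': 2},
]

# groupby only groups *consecutive* items, so sort by the key first
data.sort(key=lambda d: d['type'])
grouped = {k: [d['value'] for d in g]
           for k, g in groupby(data, key=lambda d: d['type'])}
print(grouped)  # {'A': [1, 2], 'B': [3]}

# Running totals without an explicit loop
print(list(accumulate([1, 2, 3, 4])))  # [1, 3, 6, 10]
```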
Named Tuples: Lightweight Classes
Named tuples provide a memory-efficient alternative to classes for simple data containers.
```python
from typing import NamedTuple
from collections import namedtuple

# Modern approach with typing
class Point(NamedTuple):
    x: float
    y: float

    def distance_from_origin(self) -> float:
        return (self.x ** 2 + self.y ** 2) ** 0.5

# Classic approach
Color = namedtuple('Color', ['red', 'green', 'blue', 'alpha'],
                   defaults=[255])  # alpha defaults to 255

# Usage
p = Point(3, 4)
print(p.x, p.y)    # Access by name
print(p[0], p[1])  # Access by index
x, y = p           # Unpacking

color = Color(128, 0, 255)  # alpha=255 (default)
```
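Named tuples also ship a few helper methods the snippet doesn't show; a quick sketch:

```python
from collections import namedtuple

Color = namedtuple('Color', ['red', 'green', 'blue', 'alpha'], defaults=[255])

c = Color(128, 0, 255)
print(c.alpha)  # 255 (default applied)

# _replace returns a modified copy - the original stays immutable
print(c._replace(alpha=0).alpha)  # 0

# _asdict converts to a regular dict, handy for JSON serialization
print(c._asdict())  # {'red': 128, 'green': 0, 'blue': 255, 'alpha': 255}
```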
__slots__: Memory Optimization
For classes with many instances, __slots__ can significantly reduce memory usage.

```python
class RegularPoint:
    def __init__(self, x, y):
        self.x = x
        self.y = y

class SlottedPoint:
    __slots__ = ['x', 'y']

    def __init__(self, x, y):
        self.x = x
        self.y = y

# Memory comparison
import sys
regular = RegularPoint(1, 2)
slotted = SlottedPoint(1, 2)

print(sys.getsizeof(regular.__dict__))  # Per-instance dict overhead (exact size varies by Python version)
# slotted has no __dict__, saving memory on every instance

# Slotted classes also get slightly faster attribute access
# thanks to their fixed memory layout
```
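Beyond the size numbers (which vary by Python version), the behavioral difference is easy to verify: slotted instances have no __dict__ and reject unexpected attributes:

```python
class SlottedPoint:
    __slots__ = ('x', 'y')

    def __init__(self, x, y):
        self.x = x
        self.y = y

p = SlottedPoint(1, 2)
print(hasattr(p, '__dict__'))  # False - no per-instance dict

try:
    p.z = 3  # not listed in __slots__
except AttributeError:
    print("dynamic attributes are rejected")
```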
⚠️
Use __slots__ carefully. It prevents dynamic attribute addition and can complicate inheritance. Best for classes with many instances and fixed attributes.

Ellipsis (...): Not Just for Type Hints
The ... (Ellipsis) literal has several uses beyond type hints.

```python
# In numpy-style slicing
class Matrix:
    def __getitem__(self, key):
        if key is ...:
            return "All elements"
        return f"Specific: {key}"

m = Matrix()
print(m[...])  # "All elements"

# As a placeholder
def not_implemented_yet():
    ...  # More explicit than 'pass'

# In type hints
from typing import Callable
Handler = Callable[..., None]  # Any args, returns None

# As a sentinel value
MISSING = ...  # Distinct from None

def get_value(key, default=MISSING):
    if default is not MISSING:
        return cache.get(key, default)
    return cache[key]  # Raise KeyError if missing

# In stub files and protocols
class MyProtocol(Protocol):
    def method(self) -> int: ...  # Just the signature
```
Conclusion
These 13 Python features represent powerful tools for writing more efficient, readable, and maintainable code. From the walrus operator's concise assignments to pattern matching's elegant data destructuring, each feature addresses specific programming challenges.
The standard library contains many more underutilized features. Understanding and applying these tools appropriately can significantly improve code quality and developer productivity.
Consider gradually incorporating these features into your codebase where they provide clear benefits. Start with simple applications like @lru_cache for expensive computations or pathlib for file operations, then explore more advanced features as needed.