Reading Environment Variables in Python Scripts
Environment variables serve as the backbone of modern application configuration, acting as a bridge between your code and the systems it runs on. Whether you're building a simple script or deploying complex microservices across multiple platforms, understanding how to properly read and manage environment variables in Python isn't just a technical skill—it's a fundamental requirement for writing secure, portable, and maintainable code. The way you handle sensitive information like API keys, database credentials, and configuration parameters can make the difference between a robust application and a security nightmare.
At its core, an environment variable is a named value, stored outside your application code, that can affect the way running processes behave on a computer. In Python, these variables provide a clean separation between code and configuration, allowing you to deploy the same codebase across development, staging, and production environments without changing a single line of code. This separation of concerns aligns with the twelve-factor app methodology and represents best practice in modern software development.
Throughout this comprehensive guide, you'll discover multiple approaches to reading environment variables in Python, from the built-in os module to sophisticated third-party libraries. You'll learn practical patterns for handling missing variables, setting defaults, type conversion, and securing sensitive data. We'll explore real-world scenarios, common pitfalls, and professional techniques that will transform how you manage configuration in your Python projects.
The Standard Library Approach: Using the os Module
Python's built-in os module provides the most straightforward way to access environment variables without requiring any external dependencies. The os.environ object behaves like a dictionary, giving you immediate access to all environment variables available to your Python process. This approach is perfect for simple scripts and applications where you need quick access to configuration values.
The basic syntax for reading an environment variable looks like this:
```python
import os

api_key = os.environ['API_KEY']
database_url = os.environ['DATABASE_URL']
```

However, this direct access method has a significant drawback: if the environment variable doesn't exist, your application will crash with a KeyError exception. In production environments, this abrupt failure can be catastrophic, especially during deployment or when configuration changes haven't propagated correctly.
"The most elegant code is worthless if it fails silently in production or crashes without graceful degradation."
A more defensive approach uses the get() method, which allows you to specify a default value if the variable is missing:
```python
import os

api_key = os.environ.get('API_KEY', 'default-api-key')
debug_mode = os.environ.get('DEBUG', 'False')
max_connections = os.environ.get('MAX_CONNECTIONS', '10')
```

This pattern prevents crashes but introduces another consideration: all environment variables are strings. When you need boolean flags, integers, or other data types, you must explicitly convert them:
```python
import os

debug_mode = os.environ.get('DEBUG', 'False').lower() in ('true', '1', 'yes')
max_connections = int(os.environ.get('MAX_CONNECTIONS', '10'))
timeout = float(os.environ.get('TIMEOUT', '30.0'))
```

Checking Variable Existence
Sometimes you need to know whether an environment variable exists before attempting to use it. The in operator works perfectly for this purpose:
```python
import os

if 'API_KEY' in os.environ:
    api_key = os.environ['API_KEY']
    print(f"Using API key: {api_key[:4]}...")
else:
    print("Warning: API_KEY not set, using mock service")
```

Listing All Environment Variables
During debugging or logging initialization, you might want to inspect all available environment variables. The os.environ object supports dictionary operations:
```python
import os

for key, value in os.environ.items():
    if not key.startswith('SECRET_'):
        print(f"{key}: {value}")
```

Advanced Pattern: The dotenv Library
While environment variables work beautifully in production environments, local development presents challenges. Manually setting dozens of environment variables before running your application becomes tedious and error-prone. The python-dotenv library solves this problem by loading environment variables from a .env file during development.
First, install the library:
```bash
pip install python-dotenv
```

Create a .env file in your project root:
```
DATABASE_URL=postgresql://localhost/mydb
API_KEY=sk_test_1234567890
DEBUG=True
MAX_WORKERS=4
REDIS_HOST=localhost
REDIS_PORT=6379
```

"Configuration should be stored in the environment, but developers need convenient ways to manage that configuration locally."
Then load these variables in your Python script:
```python
import os
from dotenv import load_dotenv

load_dotenv()

database_url = os.environ.get('DATABASE_URL')
api_key = os.environ.get('API_KEY')
debug = os.environ.get('DEBUG', 'False').lower() == 'true'
```

The load_dotenv() function reads the .env file and sets the variables in os.environ. Crucially, it doesn't override existing environment variables, which means production environment variables take precedence over your development defaults. This behavior ensures your application works correctly across different environments.
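This precedence can be demonstrated with plain os.environ: for each entry in the file, load_dotenv() effectively performs a setdefault, so a variable already set by the platform wins over the .env value. A minimal sketch using only the standard library (the variable names and values are illustrative):

```python
import os

# Simulate a variable already set by the hosting platform
os.environ['API_KEY'] = 'from-production'
os.environ.pop('DEBUG', None)  # make sure DEBUG starts unset

# load_dotenv() behaves like setdefault for each .env entry:
# an existing variable is left untouched, a missing one is filled in
os.environ.setdefault('API_KEY', 'from-dotenv')  # existing value wins
os.environ.setdefault('DEBUG', 'True')           # not set elsewhere, .env value applies

print(os.environ['API_KEY'])  # from-production
print(os.environ['DEBUG'])    # True
```

If you do want .env values to win (for example in throwaway local experiments), python-dotenv accepts `load_dotenv(override=True)`.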
Organizing Environment Files
Professional projects often maintain multiple environment files:
- ✨ .env — Default development settings
- 🧪 .env.test — Testing configuration
- 🎭 .env.staging — Staging environment settings
- 📋 .env.example — Template with dummy values (committed to git)
- 🔒 .env.local — Personal overrides (never committed)
Load specific environment files based on context:
```python
import os
from dotenv import load_dotenv

environment = os.environ.get('ENVIRONMENT', 'development')
dotenv_path = f'.env.{environment}'

if os.path.exists(dotenv_path):
    load_dotenv(dotenv_path)
else:
    load_dotenv()  # fall back to .env
```

Building a Configuration Class
As applications grow, scattering os.environ.get() calls throughout your codebase becomes unmaintainable. A centralized configuration class provides structure, validation, and type safety. This pattern is especially valuable in larger applications where configuration management becomes complex.
| Approach | Pros | Cons | Best For |
|---|---|---|---|
| Direct os.environ | Simple, no dependencies, fast | No validation, scattered code, type conversion needed | Small scripts, prototypes |
| Configuration Class | Centralized, type-safe, validated | More code, requires maintenance | Medium to large applications |
| Pydantic Settings | Automatic validation, type hints, IDE support | External dependency, learning curve | Professional applications, APIs |
| python-decouple | Simple API, type conversion, .ini support | Another dependency | Django/Flask projects |
Here's a robust configuration class implementation:
```python
import os

class Config:
    def __init__(self):
        self.database_url: str = self._get_required('DATABASE_URL')
        self.api_key: str = self._get_required('API_KEY')
        self.debug: bool = self._get_bool('DEBUG', False)
        self.max_workers: int = self._get_int('MAX_WORKERS', 4)
        self.timeout: float = self._get_float('TIMEOUT', 30.0)
        self.redis_host: str = os.environ.get('REDIS_HOST', 'localhost')
        self.redis_port: int = self._get_int('REDIS_PORT', 6379)

    def _get_required(self, key: str) -> str:
        value = os.environ.get(key)
        if value is None:
            raise ValueError(f"Required environment variable {key} is not set")
        return value

    def _get_bool(self, key: str, default: bool = False) -> bool:
        value = os.environ.get(key, str(default))
        return value.lower() in ('true', '1', 'yes', 'on')

    def _get_int(self, key: str, default: int) -> int:
        value = os.environ.get(key, str(default))
        try:
            return int(value)
        except ValueError:
            raise ValueError(f"Environment variable {key} must be an integer, got: {value}")

    def _get_float(self, key: str, default: float) -> float:
        value = os.environ.get(key, str(default))
        try:
            return float(value)
        except ValueError:
            raise ValueError(f"Environment variable {key} must be a float, got: {value}")

config = Config()
```

This configuration class provides several advantages. Type conversion happens in one place with proper error handling. Required variables throw clear exceptions during initialization rather than causing mysterious failures later. The configuration object becomes a single source of truth that can be imported anywhere in your application:
```python
from config import config

def connect_to_database():
    return create_connection(config.database_url)

def start_workers():
    for i in range(config.max_workers):
        start_worker(i)
```

"Centralized configuration isn't just about convenience—it's about creating a single point of validation that catches errors before they reach production."
Using Pydantic for Validation
The Pydantic library takes configuration management to the next level with automatic validation, type coercion, and excellent IDE support through type hints. If you're building APIs with FastAPI or need robust configuration validation, Pydantic's BaseSettings class is invaluable.
Install Pydantic. Note that the examples below use the Pydantic v1 API; in Pydantic v2, BaseSettings moved to the separate pydantic-settings package:

```bash
pip install "pydantic[dotenv]"
```

Create a settings class:
```python
from pydantic import BaseSettings, Field, validator

class Settings(BaseSettings):
    database_url: str = Field(..., env='DATABASE_URL')
    api_key: str = Field(..., env='API_KEY')
    debug: bool = Field(False, env='DEBUG')
    max_workers: int = Field(4, env='MAX_WORKERS', ge=1, le=100)
    timeout: float = Field(30.0, env='TIMEOUT', gt=0)
    redis_host: str = Field('localhost', env='REDIS_HOST')
    redis_port: int = Field(6379, env='REDIS_PORT', ge=1, le=65535)
    allowed_hosts: list[str] = Field(default_factory=list, env='ALLOWED_HOSTS')

    @validator('database_url')
    def validate_database_url(cls, v):
        if not v.startswith(('postgresql://', 'mysql://', 'sqlite://')):
            raise ValueError('Invalid database URL scheme')
        return v

    @validator('allowed_hosts', pre=True)
    def parse_allowed_hosts(cls, v):
        if isinstance(v, str):
            return [host.strip() for host in v.split(',')]
        return v

    class Config:
        env_file = '.env'
        env_file_encoding = 'utf-8'
        case_sensitive = False

settings = Settings()
```

Pydantic automatically handles type conversion, validates constraints, and provides helpful error messages when validation fails. The Field function allows you to specify constraints like minimum and maximum values, default factories for complex types, and custom environment variable names.
Advanced Validation Patterns
Complex applications often need sophisticated validation logic. Pydantic validators can access other fields and perform cross-field validation:
```python
from typing import Optional
from pydantic import BaseSettings, root_validator

class Settings(BaseSettings):
    use_ssl: bool = False
    ssl_cert_path: Optional[str] = None
    ssl_key_path: Optional[str] = None

    @root_validator
    def validate_ssl_config(cls, values):
        use_ssl = values.get('use_ssl')
        cert_path = values.get('ssl_cert_path')
        key_path = values.get('ssl_key_path')
        if use_ssl and (not cert_path or not key_path):
            raise ValueError('SSL enabled but certificate paths not provided')
        return values
```

Working with Secret Management
Storing secrets in environment variables is better than hardcoding them, but production systems often require more sophisticated secret management. Integration with services like AWS Secrets Manager, HashiCorp Vault, or Azure Key Vault provides additional security layers.
"Environment variables are a starting point for configuration, not the final destination for sensitive production secrets."
Here's a pattern that falls back to environment variables during development but uses a secret manager in production:
```python
import os
from typing import Optional

class SecretManager:
    def __init__(self):
        self.environment = os.environ.get('ENVIRONMENT', 'development')
        self._cache = {}

    def get_secret(self, key: str, default: Optional[str] = None) -> Optional[str]:
        if key in self._cache:
            return self._cache[key]
        if self.environment == 'development':
            value = os.environ.get(key, default)
        else:
            value = self._fetch_from_vault(key) or default
        if value is not None:
            self._cache[key] = value
        return value

    def _fetch_from_vault(self, key: str) -> Optional[str]:
        # Implement your secret manager integration here
        # This is where you'd call AWS Secrets Manager, Vault, etc.
        pass

secrets = SecretManager()
```

Environment Variable Naming Conventions
Consistent naming conventions make configuration management more maintainable and reduce errors. While no universal standard exists, several patterns have emerged across the industry:
- 🔤 Use UPPERCASE_WITH_UNDERSCORES for all environment variables
- 🏷️ Prefix related variables with a namespace (e.g., DB_HOST, DB_PORT, DB_NAME)
- 🚫 Avoid special characters except underscores
- 📝 Use descriptive names that clearly indicate purpose
- 🔢 Include units in names when relevant (TIMEOUT_SECONDS, MAX_SIZE_MB)
Good naming examples:
```
DATABASE_URL=postgresql://localhost/mydb
API_KEY_STRIPE=sk_test_abc123
CACHE_TTL_SECONDS=3600
MAX_UPLOAD_SIZE_MB=10
FEATURE_FLAG_NEW_UI=true
LOG_LEVEL=INFO
WORKER_CONCURRENCY=4
```

Poor naming examples to avoid:
```
db=localhost        # Too short, lowercase
API-KEY=test        # Uses hyphens
max_size=10         # Unclear what unit
NewFeature=1        # Mixed case
key=abc123          # Too generic
```

Type Conversion and Parsing
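These conventions are easy to enforce mechanically. The regex below is one possible encoding of the rules above; the helper is hypothetical, not part of any library:

```python
import re

# UPPERCASE words separated by single underscores, starting with a letter
ENV_NAME_RE = re.compile(r'^[A-Z][A-Z0-9]*(_[A-Z0-9]+)*$')

def is_valid_env_name(name: str) -> bool:
    """Return True when `name` follows the naming conventions above."""
    return ENV_NAME_RE.fullmatch(name) is not None

print(is_valid_env_name('CACHE_TTL_SECONDS'))  # True
print(is_valid_env_name('API-KEY'))            # False (hyphen)
print(is_valid_env_name('NewFeature'))         # False (mixed case)
```

A check like this can run in CI against your .env.example file to catch naming drift before it spreads.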
Since environment variables are always strings, proper type conversion is essential. Beyond simple boolean and integer conversion, you'll often need to parse complex data structures like lists, dictionaries, or JSON.
| Data Type | Environment Variable Example | Python Conversion | Notes |
|---|---|---|---|
| Boolean | DEBUG=true | value.lower() in ('true', '1', 'yes') | Case-insensitive, multiple formats |
| Integer | MAX_WORKERS=4 | int(value) | Handle ValueError exceptions |
| Float | TIMEOUT=30.5 | float(value) | Useful for timeouts, rates |
| List | ALLOWED_HOSTS=host1,host2 | value.split(',') | Comma-separated is common |
| JSON | CONFIG={"key":"value"} | json.loads(value) | For complex structures |
Here's a comprehensive type conversion utility:
```python
import os
import json
from typing import Any, Dict, List, Optional

class EnvParser:
    @staticmethod
    def get_string(key: str, default: Optional[str] = None) -> Optional[str]:
        return os.environ.get(key, default)

    @staticmethod
    def get_bool(key: str, default: bool = False) -> bool:
        value = os.environ.get(key, str(default)).lower()
        return value in ('true', '1', 'yes', 'on', 't')

    @staticmethod
    def get_int(key: str, default: int = 0) -> int:
        value = os.environ.get(key, str(default))
        try:
            return int(value)
        except ValueError:
            raise ValueError(f"Cannot convert {key}={value} to integer")

    @staticmethod
    def get_float(key: str, default: float = 0.0) -> float:
        value = os.environ.get(key, str(default))
        try:
            return float(value)
        except ValueError:
            raise ValueError(f"Cannot convert {key}={value} to float")

    @staticmethod
    def get_list(key: str, default: Optional[List[str]] = None,
                 separator: str = ',') -> List[str]:
        if default is None:
            default = []
        value = os.environ.get(key)
        if value is None:
            return default
        return [item.strip() for item in value.split(separator) if item.strip()]

    @staticmethod
    def get_json(key: str, default: Optional[Dict] = None) -> Dict:
        if default is None:
            default = {}
        value = os.environ.get(key)
        if value is None:
            return default
        try:
            return json.loads(value)
        except json.JSONDecodeError as e:
            raise ValueError(f"Cannot parse {key} as JSON: {e}")

    @staticmethod
    def get_enum(key: str, enum_class: type, default: Any) -> Any:
        value = os.environ.get(key)
        if value is None:
            return default
        try:
            return enum_class[value.upper()]
        except KeyError:
            valid_values = ', '.join(e.name for e in enum_class)
            raise ValueError(f"{key}={value} is not valid. Must be one of: {valid_values}")
```

Usage example:
```python
from enum import Enum

class LogLevel(Enum):
    DEBUG = 'debug'
    INFO = 'info'
    WARNING = 'warning'
    ERROR = 'error'

debug = EnvParser.get_bool('DEBUG', False)
max_workers = EnvParser.get_int('MAX_WORKERS', 4)
allowed_hosts = EnvParser.get_list('ALLOWED_HOSTS', ['localhost'])
log_level = EnvParser.get_enum('LOG_LEVEL', LogLevel, LogLevel.INFO)
```

Testing with Environment Variables
Testing code that depends on environment variables requires careful setup and teardown to avoid test pollution. The unittest.mock module provides excellent tools for temporarily modifying environment variables during tests.
"Tests should be isolated and repeatable. Environment variables that leak between tests create flaky, unreliable test suites."
Using context managers for clean test isolation:
```python
import os
import unittest
from unittest.mock import patch

class TestConfiguration(unittest.TestCase):
    def test_with_environment_variable(self):
        with patch.dict(os.environ, {'API_KEY': 'test-key-123'}):
            from config import config
            self.assertEqual(config.api_key, 'test-key-123')

    def test_missing_required_variable(self):
        with patch.dict(os.environ, {}, clear=True):
            with self.assertRaises(ValueError):
                from config import Config
                Config()

    def test_boolean_conversion(self):
        test_cases = [
            ('true', True),
            ('True', True),
            ('1', True),
            ('false', False),
            ('False', False),
            ('0', False),
        ]
        for value, expected in test_cases:
            with patch.dict(os.environ, {'DEBUG': value}):
                result = os.environ.get('DEBUG', 'False').lower() in ('true', '1', 'yes')
                self.assertEqual(result, expected)
```

For pytest users, fixtures provide an elegant way to manage test environments:
```python
import os
import pytest

@pytest.fixture
def mock_env(monkeypatch):
    monkeypatch.setenv('DATABASE_URL', 'postgresql://test/db')
    monkeypatch.setenv('API_KEY', 'test-key')
    monkeypatch.setenv('DEBUG', 'True')
    yield
    # Automatic cleanup after the test

def test_configuration(mock_env):
    from config import config
    assert config.database_url == 'postgresql://test/db'
    assert config.debug is True

@pytest.fixture
def clean_env(monkeypatch):
    # Remove all environment variables to test defaults
    for key in list(os.environ.keys()):
        monkeypatch.delenv(key, raising=False)
    yield

def test_defaults(clean_env):
    from config import Config
    config = Config()
    assert config.max_workers == 4  # default value
```

Security Considerations
Environment variables containing sensitive information require careful handling. Logging, error messages, and debugging output can inadvertently expose secrets if you're not cautious.
Never log environment variables directly:
```python
import os

# DANGEROUS - logs the actual secret
print(f"API_KEY: {os.environ.get('API_KEY')}")

# SAFE - masks sensitive data
api_key = os.environ.get('API_KEY', '')
masked = api_key[:4] + '*' * (len(api_key) - 4) if api_key else 'not set'
print(f"API_KEY: {masked}")
```

Create a logging-safe configuration representation:
```python
class Config:
    SENSITIVE_KEYS = {'API_KEY', 'DATABASE_URL', 'SECRET_KEY', 'PASSWORD'}

    def __repr__(self):
        items = []
        for key, value in self.__dict__.items():
            if key.upper() in self.SENSITIVE_KEYS or 'password' in key.lower():
                display_value = '***REDACTED***'
            else:
                display_value = value
            items.append(f"{key}={display_value}")
        return f"Config({', '.join(items)})"

    def to_dict(self, include_sensitive: bool = False):
        result = {}
        for key, value in self.__dict__.items():
            if include_sensitive or key.upper() not in self.SENSITIVE_KEYS:
                result[key] = value
            else:
                result[key] = '***REDACTED***'
        return result
```

Preventing Environment Variable Injection
When accepting environment variable names from external sources, validate them rigorously to prevent injection attacks:
```python
import os
import re

def safe_get_env(key: str, default: str = '') -> str:
    # Only allow uppercase alphanumeric characters and underscores
    if not re.match(r'^[A-Z0-9_]+$', key):
        raise ValueError(f"Invalid environment variable name: {key}")
    # Block access to dangerous variables
    dangerous_vars = {'PATH', 'LD_PRELOAD', 'LD_LIBRARY_PATH'}
    if key in dangerous_vars:
        raise ValueError(f"Access to {key} is not allowed")
    return os.environ.get(key, default)
```

Docker and Container Environments
Containers have become the standard deployment model, and Docker provides several mechanisms for passing environment variables to your Python applications. Understanding these patterns ensures smooth deployment across different orchestration platforms.
In a Dockerfile, set default values:
```dockerfile
FROM python:3.11-slim

ENV DEBUG=False
ENV MAX_WORKERS=4
ENV TIMEOUT=30.0

COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . /app
WORKDIR /app

CMD ["python", "main.py"]
```

Override values at runtime using docker-compose.yml:
```yaml
version: '3.8'
services:
  app:
    build: .
    environment:
      - DEBUG=True
      - DATABASE_URL=postgresql://db:5432/mydb
      - API_KEY=${API_KEY}  # from host environment
    env_file:
      - .env.production
    ports:
      - "8000:8000"
```

For Kubernetes deployments, use ConfigMaps and Secrets:
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: app-config
data:
  DEBUG: "False"
  MAX_WORKERS: "4"
---
apiVersion: v1
kind: Secret
metadata:
  name: app-secrets
type: Opaque
stringData:
  API_KEY: "your-secret-key"
  DATABASE_URL: "postgresql://..."
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: python-app
spec:
  template:
    spec:
      containers:
        - name: app
          image: myapp:latest
          envFrom:
            - configMapRef:
                name: app-config
            - secretRef:
                name: app-secrets
```

Common Pitfalls and Solutions
Even experienced developers encounter challenges when working with environment variables. Understanding these common pitfalls helps you avoid frustrating debugging sessions.
Pitfall 1: Forgetting type conversion
```python
import os

# WRONG - the condition is always true because the non-empty string '0' is truthy
if os.environ.get('WORKERS', '0'):
    start_workers()  # Always executes!

# CORRECT - convert to int before comparing
workers = int(os.environ.get('WORKERS', '0'))
if workers > 0:
    start_workers()
```

Pitfall 2: Not handling missing variables gracefully
```python
# WRONG - crashes in production
database_url = os.environ['DATABASE_URL']

# CORRECT - explicit error with context
database_url = os.environ.get('DATABASE_URL')
if not database_url:
    raise RuntimeError(
        "DATABASE_URL environment variable is required. "
        "Please set it before starting the application."
    )
```

Pitfall 3: Committing .env files to version control
Always add .env files to .gitignore:
```
# .gitignore
.env
.env.local
.env.*.local
*.env

# But DO commit the example
!.env.example
```

"The moment you commit a secret to git, consider it compromised. Git history is permanent, and secrets can't be truly deleted from repository history."
Pitfall 4: Using environment variables for non-configuration data
Environment variables work best for configuration that changes between environments. Don't use them for application state, user data, or anything that needs to persist. They're not a database replacement.
Performance Considerations
While reading environment variables is generally fast, repeatedly accessing os.environ in hot code paths can add unnecessary overhead. Cache configuration values during initialization:
```python
import os

# INEFFICIENT - reads the environment on every request
def handle_request(request):
    timeout = int(os.environ.get('TIMEOUT', '30'))
    max_retries = int(os.environ.get('MAX_RETRIES', '3'))
    # ... rest of handler

# EFFICIENT - read once at startup
class Config:
    def __init__(self):
        self.timeout = int(os.environ.get('TIMEOUT', '30'))
        self.max_retries = int(os.environ.get('MAX_RETRIES', '3'))

config = Config()

def handle_request(request):
    timeout = config.timeout
    max_retries = config.max_retries
    # ... rest of handler
```

For applications that need to reload configuration without restarting, implement a reload mechanism:
```python
import os
import signal
import threading
from typing import Any, Dict

class ReloadableConfig:
    def __init__(self):
        self._config: Dict[str, Any] = {}
        self._lock = threading.Lock()
        self.reload()

    def reload(self):
        with self._lock:
            self._config = {
                'timeout': int(os.environ.get('TIMEOUT', '30')),
                'max_workers': int(os.environ.get('MAX_WORKERS', '4')),
                'debug': os.environ.get('DEBUG', 'False').lower() == 'true',
            }

    def get(self, key: str, default: Any = None) -> Any:
        with self._lock:
            return self._config.get(key, default)

config = ReloadableConfig()

# Reload configuration on SIGHUP (Unix only) or via an API endpoint
signal.signal(signal.SIGHUP, lambda signum, frame: config.reload())
```

Integration with Popular Frameworks
Different Python frameworks have established patterns for configuration management. Aligning with these conventions makes your code more maintainable and familiar to other developers.
Django Configuration
```python
import os
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

SECRET_KEY = os.environ.get('DJANGO_SECRET_KEY')
if not SECRET_KEY:
    raise ValueError('DJANGO_SECRET_KEY environment variable is required')

DEBUG = os.environ.get('DJANGO_DEBUG', 'False').lower() == 'true'
ALLOWED_HOSTS = os.environ.get('DJANGO_ALLOWED_HOSTS', 'localhost').split(',')

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME', 'mydb'),
        'USER': os.environ.get('DB_USER', 'postgres'),
        'PASSWORD': os.environ.get('DB_PASSWORD', ''),
        'HOST': os.environ.get('DB_HOST', 'localhost'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}
```

Flask Configuration
```python
import os
from flask import Flask

app = Flask(__name__)
app.config.update(
    DEBUG=os.environ.get('FLASK_DEBUG', 'False').lower() == 'true',
    SECRET_KEY=os.environ.get('FLASK_SECRET_KEY', 'dev-secret-key'),
    DATABASE_URI=os.environ.get('DATABASE_URL'),
    MAX_CONTENT_LENGTH=int(os.environ.get('MAX_UPLOAD_SIZE', 16 * 1024 * 1024)),
)

# Or load from an object
class Config:
    DEBUG = os.environ.get('FLASK_DEBUG', 'False').lower() == 'true'
    SECRET_KEY = os.environ.get('FLASK_SECRET_KEY')
    SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL')

app.config.from_object(Config)
```

FastAPI with Pydantic
```python
from fastapi import FastAPI
from pydantic import BaseSettings

class Settings(BaseSettings):
    app_name: str = "My API"
    debug: bool = False
    database_url: str
    api_key: str

    class Config:
        env_file = ".env"

settings = Settings()
app = FastAPI(title=settings.app_name, debug=settings.debug)

@app.get("/config")
async def get_config():
    return {
        "app_name": settings.app_name,
        "debug": settings.debug,
        # Never expose sensitive values in API responses
    }
```

How do I set environment variables on different operating systems?
On Linux and macOS, you can set environment variables in your terminal using export VARIABLE_NAME=value. For permanent settings, add this line to your shell configuration file (~/.bashrc, ~/.zshrc). On Windows, use set VARIABLE_NAME=value in Command Prompt or $env:VARIABLE_NAME="value" in PowerShell. For permanent Windows variables, use System Properties > Advanced > Environment Variables. For development, using a .env file with python-dotenv is the most convenient cross-platform approach.
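However the variable is set, Python sees it the same way. One portable technique worth knowing is passing a variable explicitly to a child process through subprocess, which behaves identically on every OS:

```python
import os
import subprocess
import sys

# Build a child environment: inherit the parent's variables, add one more
child_env = {**os.environ, 'GREETING': 'hello'}

result = subprocess.run(
    [sys.executable, '-c', "import os; print(os.environ['GREETING'])"],
    env=child_env, capture_output=True, text=True,
)
print(result.stdout.strip())  # hello
```

Because `env=` replaces the child's entire environment, merging in `os.environ` first keeps essentials like PATH intact.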
What's the difference between os.environ['KEY'] and os.environ.get('KEY')?
The bracket notation os.environ['KEY'] raises a KeyError exception if the environment variable doesn't exist, which can crash your application. The get() method returns None by default if the variable is missing, and allows you to specify a default value as a second argument: os.environ.get('KEY', 'default'). In production code, always use get() with appropriate defaults or explicit error handling to prevent unexpected crashes.
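The difference is easy to demonstrate in a few lines:

```python
import os

os.environ.pop('MISSING_KEY', None)  # make sure the variable is unset

try:
    os.environ['MISSING_KEY']        # bracket access on a missing key
except KeyError:
    print('KeyError raised')

print(os.environ.get('MISSING_KEY'))              # None
print(os.environ.get('MISSING_KEY', 'fallback'))  # fallback
```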
Should I commit my .env file to version control?
Never commit .env files containing real secrets or credentials to version control. These files often contain sensitive information like API keys, database passwords, and encryption keys. Instead, commit a .env.example file with dummy values that documents what environment variables your application needs. Add .env to your .gitignore file. If you accidentally commit secrets, consider them compromised and rotate them immediately—removing them from git history isn't sufficient because they may have been cloned or cached.
How can I validate that all required environment variables are set before my application starts?
Create a configuration class or function that runs during application initialization and checks for required variables. Raise clear exceptions with helpful messages if variables are missing. Using Pydantic's BaseSettings with required fields (no defaults) automatically validates presence. You can also create a startup script that validates configuration before launching your main application. This fail-fast approach prevents mysterious runtime errors and makes configuration problems immediately obvious during deployment.
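A minimal fail-fast check might look like this; the variable list is illustrative, so adapt it to whatever your application actually requires:

```python
import os

REQUIRED_VARS = ['DATABASE_URL', 'API_KEY']  # hypothetical list for this app

def check_required(required):
    """Raise immediately if any required variable is unset or empty."""
    missing = [key for key in required if not os.environ.get(key)]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}"
        )

# Call this at the very top of your entry point, before anything else runs.
# (Here we set dummy values so the demo check passes.)
os.environ['DATABASE_URL'] = 'postgresql://localhost/mydb'
os.environ['API_KEY'] = 'sk_test_123'
check_required(REQUIRED_VARS)  # passes silently when everything is set
```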
What's the best way to handle environment-specific configuration in CI/CD pipelines?
Most CI/CD platforms (GitHub Actions, GitLab CI, Jenkins) provide secure ways to store environment variables or secrets that get injected during builds and deployments. Use these platform-specific secret management features rather than storing secrets in your repository. For different environments (development, staging, production), use separate secret groups or namespaces. Your application should only need to read from environment variables—the CI/CD platform handles setting them appropriately for each environment. Consider using tools like Terraform or Ansible to manage environment variables consistently across infrastructure.
How do I handle complex configuration that doesn't fit well in environment variables?
For complex configuration structures, consider using environment variables to point to configuration files or using JSON-encoded environment variables. You can set an environment variable like CONFIG_PATH=/etc/myapp/config.json and load that file, or encode JSON directly: FEATURE_FLAGS='{"new_ui":true,"beta_features":false}'. Another approach is using a configuration service like Consul or etcd, with environment variables only storing the service URL and credentials. For very complex scenarios, consider a hierarchical configuration system that merges defaults, environment variables, and configuration files in order of precedence.
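The JSON-encoded approach is a one-liner to parse; the flag names and values below are illustrative:

```python
import json
import os

# A complex structure packed into a single variable
os.environ['FEATURE_FLAGS'] = '{"new_ui": true, "beta_features": false}'

flags = json.loads(os.environ['FEATURE_FLAGS'])
print(flags['new_ui'])         # True
print(flags['beta_features'])  # False
```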