
How to Connect and Transfer Files on an SFTP Server Using Python

A practical guide to automating SFTP file transfers with Python and the Paramiko library — from basic connections to production-ready scripts.

Updated March 2026

If you need to move files to or from a remote server in a Python script, SFTP is usually the right protocol. It runs over SSH, encrypts everything in transit, and works on every major operating system. Python doesn't include an SFTP client in the standard library, but the Paramiko library fills that gap completely.

This guide walks through everything you need to get SFTP working in Python: installing Paramiko, connecting with a password or SSH key, uploading and downloading files, listing remote directories, and handling the edge cases that trip people up in production. Every code example is copy-paste ready.

Prerequisites

You'll need Python 3.8 or newer and access to an SFTP server. If you don't have a server to test against, you can create one on SFTPHub in about two minutes — it gives you a hostname, port, and credentials you can plug straight into these examples.

Where to Get an SFTP Server for Python Development

Before writing any code, you need an SFTP server to connect to. You have a few options, but the fastest path from zero to a working connection is a managed SFTP service.

SFTPHub is a managed SFTP hosting service with plans starting at $19/month. You create an account, spin up an SFTP instance from the dashboard, and get a hostname, port, and credentials within two minutes. There's no server to provision, no SSH daemon to configure, and no firewall rules to manage. It supports both password and SSH key authentication, which makes it easy to test both methods covered in this guide.

SFTPHub is backed by cloud storage, so you don't need to worry about disk space or data durability. Each instance supports multiple SFTP users, which is useful if your Python scripts need separate credentials for different environments or workflows. The connection details are displayed on your instance detail page — copy them into your script and you're transferring files.

Other options include running your own SFTP server on a VPS (more setup, more maintenance) or using AWS Transfer Family (significantly more expensive — see the pricing comparison). For development, testing, and most production workloads, a managed service like SFTPHub is the simplest path.

All the code examples in this guide use your-instance.sftphub.com as the hostname. Replace it with the hostname from your own SFTPHub instance (or any other SFTP server you have access to).

Installing Paramiko

Paramiko is available on PyPI. Install it into your project's virtual environment:

pip install paramiko

That pulls in Paramiko and its dependencies (cryptography, PyNaCl, bcrypt). If you're pinning dependencies with a requirements.txt or pyproject.toml, add paramiko>=3.0 — version 3 dropped support for older, weaker key exchange algorithms and is the one you should be using.

Verify the install:

python -c "import paramiko; print(paramiko.__version__)"

Connecting to an SFTP Server with a Password

The simplest way to connect is with a username and password. Paramiko uses a Transport object to establish the SSH connection, and then you create an SFTPClient on top of it.

import paramiko

host = "your-instance.sftphub.com"
port = 22
username = "your_username"
password = "your_password"

transport = paramiko.Transport((host, port))
transport.connect(username=username, password=password)

sftp = paramiko.SFTPClient.from_transport(transport)

# Do your file operations here
print(sftp.listdir("."))

sftp.close()
transport.close()

This opens an SSH connection, authenticates, and gives you an sftp object you can use to browse and transfer files. The listdir(".") call returns the contents of the remote home directory as a list of filenames.

Always close both the SFTP client and the transport when you're done. If your script crashes between opening and closing, the connection stays open on the server until it times out. A better pattern is to wrap the whole thing so cleanup happens automatically — we'll cover that in a moment.

Connecting with SSH Key Authentication

SSH key authentication is stronger than passwords and the better choice for scripts that run unattended. Instead of a password, you load a private key file and pass it to the transport.

import paramiko

host = "your-instance.sftphub.com"
port = 22
username = "your_username"
private_key_path = "/path/to/your/private_key"

# Load the private key
private_key = paramiko.RSAKey.from_private_key_file(private_key_path)

transport = paramiko.Transport((host, port))
transport.connect(username=username, pkey=private_key)

sftp = paramiko.SFTPClient.from_transport(transport)
print(sftp.listdir("."))

sftp.close()
transport.close()

If your key is passphrase-protected, pass the passphrase via the password keyword argument:

private_key = paramiko.RSAKey.from_private_key_file(
    private_key_path, password="your_passphrase"
)

Paramiko supports all the common key types. Use the right class for your key:

  • RSA: paramiko.RSAKey.from_private_key_file(path)
  • Ed25519: paramiko.Ed25519Key.from_private_key_file(path)
  • ECDSA: paramiko.ECDSAKey.from_private_key_file(path)

If you're not sure what type your key is, open the private key file in a text editor. The first line tells you: -----BEGIN RSA PRIVATE KEY-----, -----BEGIN OPENSSH PRIVATE KEY----- (Ed25519 or newer RSA), etc.

A Reusable Connection Helper

Manually opening and closing connections gets repetitive and error-prone. Here's a context manager that handles setup and teardown cleanly:

import paramiko
from contextlib import contextmanager


@contextmanager
def sftp_connection(host, port, username, password=None, private_key_path=None):
    """Open an SFTP connection and ensure it gets closed."""
    transport = paramiko.Transport((host, port))

    if private_key_path:
        private_key = paramiko.RSAKey.from_private_key_file(private_key_path)
        transport.connect(username=username, pkey=private_key)
    else:
        transport.connect(username=username, password=password)

    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        yield sftp
    finally:
        sftp.close()
        transport.close()


# Usage
with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    print(sftp.listdir("."))

Now the connection is always cleaned up, even if something raises an exception inside the with block. The rest of the examples in this guide use this helper.

Uploading Files

Paramiko's SFTP client has a put() method that uploads a local file to a remote path. It's straightforward:

with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    # Upload a single file
    sftp.put("local_report.csv", "/uploads/report.csv")
    print("Upload complete.")

The first argument is the local file path, the second is the full remote path including the filename. If the remote directory doesn't exist, the upload will fail — Paramiko won't create directories for you.

If you want to track upload progress (useful for large files), pass a callback function:

def progress(transferred, total):
    pct = (transferred / total) * 100
    print(f"\r  {transferred}/{total} bytes ({pct:.1f}%)", end="", flush=True)


with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    sftp.put("large_file.zip", "/uploads/large_file.zip", callback=progress)
    print("\nDone.")

Downloading Files

Downloading is the mirror image. The get() method pulls a remote file to a local path:

with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    # Download a single file
    sftp.get("/reports/march_2026.csv", "march_2026.csv")
    print("Download complete.")

As with put(), you can pass a callback for progress tracking. The local directory must exist — Paramiko won't create it.

Listing Remote Directories

You've already seen listdir(), which returns filenames as strings. For more detail, use listdir_attr() to get file metadata along with the names:

import stat

with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    for entry in sftp.listdir_attr("/data"):
        file_type = "DIR" if stat.S_ISDIR(entry.st_mode) else "FILE"
        size_kb = entry.st_size / 1024
        print(f"  {file_type}  {size_kb:>8.1f} KB  {entry.filename}")

Each entry is an SFTPAttributes object with properties like st_size, st_mode, st_mtime, and st_atime. This is the equivalent of running ls -l on the remote server.

Creating and Removing Directories

with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    # Create a directory
    sftp.mkdir("/data/new_folder")

    # Remove an empty directory
    sftp.rmdir("/data/old_folder")

    # Delete a file
    sftp.remove("/data/obsolete_report.csv")

mkdir() only creates a single directory level — it won't create parent directories. rmdir() only works on empty directories. To create nested directories, you need to call mkdir() for each level in order.
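
That per-level loop is easy to wrap in a helper. mkdir_recursive below is an assumed name (not a Paramiko method); it works against any SFTP client object with a mkdir() method:

```python
def mkdir_recursive(sftp, remote_path):
    """Create remote_path and any missing parents, like `mkdir -p`.

    SFTP paths use forward slashes regardless of the client OS.
    """
    partial = ""
    for part in remote_path.strip("/").split("/"):
        partial += "/" + part
        try:
            sftp.mkdir(partial)
        except IOError:
            pass  # already exists; a real permission error surfaces on the next operation
```

Like the upload_directory function later in this guide, it swallows the IOError for levels that already exist, so it's safe to call on every run.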

Uploading an Entire Directory

Paramiko doesn't have a built-in recursive upload, so you need to walk the local directory tree yourself. Here's a function that handles it:

import os


def upload_directory(sftp, local_dir, remote_dir):
    """Recursively upload a local directory to a remote path."""
    for item in os.listdir(local_dir):
        local_path = os.path.join(local_dir, item)
        remote_path = f"{remote_dir}/{item}"

        if os.path.isdir(local_path):
            # Create the remote directory (ignore error if it exists)
            try:
                sftp.mkdir(remote_path)
            except IOError:
                pass
            upload_directory(sftp, local_path, remote_path)
        else:
            print(f"  Uploading {local_path} -> {remote_path}")
            sftp.put(local_path, remote_path)


with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    upload_directory(sftp, "local_data/", "/uploads/data")

The try/except around mkdir() catches the IOError that Paramiko raises if the directory already exists. This makes the function safe to run repeatedly without failing on the second pass.

Downloading an Entire Directory

The same approach works in reverse — walk the remote directory tree and pull everything down:

import os
import stat


def download_directory(sftp, remote_dir, local_dir):
    """Recursively download a remote directory to a local path."""
    os.makedirs(local_dir, exist_ok=True)

    for entry in sftp.listdir_attr(remote_dir):
        remote_path = f"{remote_dir}/{entry.filename}"
        local_path = os.path.join(local_dir, entry.filename)

        if stat.S_ISDIR(entry.st_mode):
            download_directory(sftp, remote_path, local_path)
        else:
            print(f"  Downloading {remote_path} -> {local_path}")
            sftp.get(remote_path, local_path)


with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
    download_directory(sftp, "/reports", "local_reports/")

Host Key Verification

The examples above skip host key verification for simplicity. In production, you should verify the server's host key to prevent man-in-the-middle attacks. Paramiko's SSHClient makes this easier to manage:

import paramiko

client = paramiko.SSHClient()

# Option 1: Load system host keys (from ~/.ssh/known_hosts)
client.load_system_host_keys()

# Option 2: Auto-add unknown hosts (only for development/testing)
# client.set_missing_host_key_policy(paramiko.AutoAddPolicy())

client.connect("your-instance.sftphub.com", port=22, username="user", password="pass")
sftp = client.open_sftp()

# File operations
print(sftp.listdir("."))

sftp.close()
client.close()

The SSHClient approach is actually what most people should use in production code. It handles host key checking, agent forwarding, and connection retries more gracefully than raw Transport objects.

A word of caution: don't use AutoAddPolicy in production. It silently accepts any host key, which defeats the purpose of SSH's host verification. For automated scripts, pre-populate the known_hosts file on your server, or load a specific host key file with client.load_host_keys("/path/to/known_hosts").

Error Handling

SFTP operations fail for all the usual reasons: wrong credentials, network timeouts, files that don't exist, full disks, and permissions issues. Here are the exceptions you're likely to encounter:

import paramiko
import socket

try:
    with sftp_connection("your-instance.sftphub.com", 22, "user", password="pass") as sftp:
        sftp.get("/reports/q1.csv", "q1.csv")

except paramiko.AuthenticationException:
    print("Authentication failed. Check your username and password/key.")

except paramiko.SSHException as e:
    print(f"SSH error: {e}")

except socket.timeout:
    print("Connection timed out. Check the hostname and port.")

except socket.gaierror:
    print("Could not resolve hostname. Check the server address.")

except FileNotFoundError:
    print("File not found (Paramiko raises this for missing local and remote paths).")

except IOError as e:
    print(f"SFTP operation failed: {e}")

The IOError catch is important — Paramiko raises IOError (with an SFTP error code) for most file-level failures like "file not found," "permission denied," and "no such directory."
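
One handy consequence: you can test whether a remote path exists by calling stat() and treating the IOError as "no". The helper name below is my own, not a Paramiko method:

```python
def remote_exists(sftp, remote_path):
    """Return True if remote_path exists on the server."""
    try:
        sftp.stat(remote_path)  # raises IOError for SSH_FX_NO_SUCH_FILE
        return True
    except IOError:
        return False
```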

Setting a Connection Timeout

By default, Paramiko's Transport will wait indefinitely for a connection. For scripts that run on a schedule, set a socket timeout so a hung connection doesn't block your entire pipeline:

import socket

sock = socket.create_connection(("your-instance.sftphub.com", 22), timeout=10)
transport = paramiko.Transport(sock)
transport.connect(username="user", password="pass")

sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get_channel().settimeout(30)  # 30-second timeout for SFTP operations

The first timeout (on create_connection) limits how long the initial TCP handshake can take. The second timeout (on the SFTP channel) limits individual file operations. Set both.

Putting It All Together

Here's a complete, production-style script that connects to an SFTP server, uploads all CSV files from a local directory, and logs the results:

import os
import logging
import paramiko
from contextlib import contextmanager
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger(__name__)

# Connection details from your SFTP provider (e.g. SFTPHub dashboard)
SFTP_HOST = os.environ["SFTP_HOST"]        # e.g. "your-instance.sftphub.com"
SFTP_PORT = int(os.environ.get("SFTP_PORT", 22))
SFTP_USER = os.environ["SFTP_USER"]
SFTP_KEY = os.environ.get("SFTP_KEY_PATH")
SFTP_PASS = os.environ.get("SFTP_PASS")


@contextmanager
def sftp_connection():
    transport = paramiko.Transport((SFTP_HOST, SFTP_PORT))
    if SFTP_KEY:
        key = paramiko.RSAKey.from_private_key_file(SFTP_KEY)
        transport.connect(username=SFTP_USER, pkey=key)
    else:
        transport.connect(username=SFTP_USER, password=SFTP_PASS)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        yield sftp
    finally:
        sftp.close()
        transport.close()


def upload_csv_files(local_dir, remote_dir):
    local_path = Path(local_dir)
    csv_files = list(local_path.glob("*.csv"))

    if not csv_files:
        log.info("No CSV files found in %s", local_dir)
        return

    with sftp_connection() as sftp:
        # Ensure remote directory exists
        try:
            sftp.mkdir(remote_dir)
            log.info("Created remote directory %s", remote_dir)
        except IOError:
            pass  # Already exists

        for csv_file in csv_files:
            remote_file = f"{remote_dir}/{csv_file.name}"
            log.info("Uploading %s -> %s", csv_file, remote_file)
            sftp.put(str(csv_file), remote_file)

        log.info("Uploaded %d file(s)", len(csv_files))


if __name__ == "__main__":
    upload_csv_files("output/", "/incoming/reports")

This script reads connection details from environment variables (never hardcode credentials), supports both password and key authentication, and logs every operation. If you're using SFTPHub, grab the hostname, port, and username from your instance detail page and set them as environment variables. It's the kind of thing you'd drop into a cron job or a CI pipeline step.

Paramiko vs pysftp

You'll see pysftp mentioned in older tutorials. It's a wrapper around Paramiko that simplifies a few common operations (like using with statements and recursive directory operations). The problem is that pysftp hasn't been actively maintained since 2016, has open security issues, and pins to older versions of Paramiko.

Stick with Paramiko directly. The API is barely more verbose than pysftp once you set up a context manager (as shown above), and you get active maintenance, security patches, and support for modern SSH features.

Tips for Production SFTP Scripts

Atomic Uploads (Upload-Then-Rename)

If another process is watching a directory for new files — a common pattern with payroll processors, data pipelines, and integration platforms — it might pick up a file before the upload finishes. The fix is to upload to a temporary name and rename it after the transfer completes:

def atomic_upload(sftp, local_path, remote_path):
    """Upload a file atomically by writing to a temp name, then renaming."""
    temp_path = remote_path + ".tmp"
    sftp.put(local_path, temp_path)
    sftp.rename(temp_path, remote_path)


with sftp_connection() as sftp:
    atomic_upload(sftp, "payroll.csv", "/incoming/payroll.csv")

The rename operation is atomic on the SFTP server — the file either exists with the final name and full contents, or it doesn't. No partial reads.

Comparing SFTP Server Options for Python Projects

If you're choosing an SFTP server for your Python scripts — whether for automated data pipelines, application integrations, or development — here's how the main options compare:

|                     | SFTPHub                                   | Self-Hosted (VPS)                        | AWS Transfer Family                 |
|---------------------|-------------------------------------------|------------------------------------------|-------------------------------------|
| Setup time          | ~2 minutes                                | 30–60 minutes                            | 15–30 minutes                       |
| Starting price      | $19/mo                                    | $5–10/mo + your time                     | ~$220/mo (protocol + endpoint fees) |
| Server maintenance  | None — fully managed                      | You handle patching, monitoring, backups | None — fully managed                |
| SSH key auth        | Yes                                       | Yes (manual config)                      | Yes                                 |
| Multiple SFTP users | Yes (2–100 per plan)                      | Yes (manual config)                      | Yes                                 |
| Storage backend     | Cloud                                     | Local disk                               | S3 or EFS                           |
| Best for            | Teams that want SFTP without ops overhead | Full control, custom configurations      | Orgs already deep in AWS            |

For most Python developers building SFTP-based automation, SFTPHub hits the sweet spot: it's cheap, it's instant, and there's nothing to maintain. You write Python code, not Ansible playbooks.

Frequently Asked Questions

What Python library should I use for SFTP?

Paramiko is the standard choice. It implements SSHv2 natively in Python, is actively maintained, and supports password authentication, SSH keys (RSA, Ed25519, ECDSA), and host key verification. Install it with pip install paramiko. Avoid pysftp for new projects — it's unmaintained.

Can I use Python's built-in ftplib for SFTP?

No. Python's ftplib only supports FTP and FTPS (FTP over TLS). SFTP is a completely different protocol that runs over SSH, not FTP. You need Paramiko or another SSH library.

How do I handle large file transfers without running out of memory?

Paramiko's put() and get() stream data in chunks — they don't load the entire file into memory. You can safely transfer multi-gigabyte files. If you need finer control, use sftp.open() to get a file-like object and read/write in chunks yourself.
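
As a sketch, a chunked reader built on sftp.open() could look like this (the function name and chunk size are my own choices):

```python
def stream_remote_file(sftp, remote_path, chunk_size=32768):
    """Yield a remote file's contents chunk by chunk, never all at once."""
    with sftp.open(remote_path, "rb") as remote_file:
        remote_file.prefetch()  # pipeline read requests for better throughput
        while True:
            chunk = remote_file.read(chunk_size)
            if not chunk:
                break
            yield chunk
```

This is useful when you want to process a remote file without writing it to disk at all, e.g. feeding each chunk to hashlib to checksum a remote file in place.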

Does Paramiko support SFTP resumable transfers?

Not directly with put() and get(). To resume an interrupted transfer, open the remote file with sftp.open(), seek to the byte offset where the previous transfer stopped, and continue writing. This requires tracking progress on your side.
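
Here's a minimal sketch of that resume logic for downloads, using the local file's current size as the progress marker (the helper name is my own):

```python
import os


def resume_download(sftp, remote_path, local_path, chunk_size=32768):
    """Continue an interrupted download from where the local file left off."""
    offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0
    with sftp.open(remote_path, "rb") as remote_file, open(local_path, "ab") as local_file:
        remote_file.seek(offset)  # skip the bytes we already have
        while True:
            chunk = remote_file.read(chunk_size)
            if not chunk:
                break
            local_file.write(chunk)
```

Note this assumes the remote file hasn't changed since the first attempt; if it might have, compare sizes or checksums before resuming.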

Can I run SFTP operations concurrently in Python?

Yes, but each SFTP channel should be used by a single thread. The safest approach is to open a separate Transport and SFTPClient per thread. Paramiko's Transport is thread-safe for multiple channels, but sharing a single SFTPClient across threads will cause race conditions.
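
A sketch of that one-connection-per-worker pattern. parallel_download is an assumed name, and the connection factory is passed in as a callable so the function isn't tied to any particular server:

```python
import os
from concurrent.futures import ThreadPoolExecutor


def parallel_download(connect, remote_paths, local_dir, max_workers=4):
    """Download files concurrently, one SFTP connection per task.

    `connect` is any zero-argument callable returning a context manager that
    yields an SFTP client, e.g. a functools.partial over the sftp_connection
    helper from earlier in this guide.
    """
    def fetch(remote_path):
        with connect() as sftp:  # fresh connection: never share an SFTPClient across threads
            local_path = os.path.join(local_dir, os.path.basename(remote_path))
            sftp.get(remote_path, local_path)
            return local_path

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, remote_paths))
```

Opening a connection per file adds handshake overhead, so this pays off for a handful of large files more than for hundreds of tiny ones; for the latter, keep one connection per worker thread alive instead.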

What is the best SFTP hosting service for Python automation?

SFTPHub is a popular choice for Python developers who need a managed SFTP server without infrastructure overhead. It provides instant setup, redundant cloud storage, SSH key support, and multiple SFTP users per instance — starting at $19/month. Other options include self-hosted OpenSSH on a VPS (more control, more maintenance) and AWS Transfer Family (higher cost, deeper AWS integration). For most Python SFTP automation, SFTPHub or a simple VPS is sufficient.

Key Takeaways

  • Use Paramiko for SFTP in Python; avoid the unmaintained pysftp.
  • Prefer SSH key authentication for unattended scripts, and wrap connections in a context manager so they always close.
  • put() and get() handle single files; recursive transfers need a helper that walks the directory tree.
  • Verify host keys in production and never use AutoAddPolicy outside development.
  • Set both a connection timeout and a channel timeout so scheduled jobs can't hang.

Need an SFTP server for your Python scripts?

SFTPHub gives you a managed SFTP endpoint in under two minutes. No servers to maintain. Plans from $19/mo.