How do I upload to IPFS using S3?
Want to upload files to IPFS without switching up your stack? Filebase gives you an S3-compatible API that plugs straight into IPFS.

Bridging centralized workflows and decentralized storage just got easier. If you’ve worked with AWS S3 before, you're 90% of the way there, and no blockchain knowledge is required.
Here’s how to upload files to IPFS using familiar S3 tools.
Why S3-Compatible APIs with IPFS?
Let’s talk real benefits — why go this route?
- Familiar workflows: Use the same AWS SDKs and tools you already know (like `boto3`, `aws-sdk`, etc.).
- Faster onboarding: No need to rewrite existing infra; just point your S3 client at a new endpoint.
- Hybrid-friendly: Filebase gives you IPFS-backed persistence plus advanced features like AI-based threat detection, pre-signed URLs, and more.
- Easy pinning: Uploading via S3 means Filebase automatically pins your files to IPFS for you, so no manual `pin add` is needed.
Perfect for developers looking to gradually decentralize without burning everything down.
Getting Started with Filebase
First, you’ll need a Filebase account. Free tier gets you started quickly.
1. Sign Up
   Head to filebase.com and create an account.
2. Create a Bucket
   Think of it like an S3 bucket: name it, choose “IPFS” as the backend, done.
3. Grab Your Access Keys
   From the dashboard:
   ➤ Go to Access Keys
   ➤ Copy your `Access Key ID` and `Secret Access Key`
   ➤ Store them safely (use `.env` for local dev)
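Once your keys are in the environment, it helps to fail fast if they're missing rather than discover it mid-upload. A minimal sketch in Node (`loadFilebaseCredentials` is an illustrative helper name; `FILEBASE_KEY` and `FILEBASE_SECRET` are the same variable names used in the client setup later in this guide):

```js
// Read Filebase credentials from the environment and fail loudly if absent.
// Passing `env` as a parameter keeps the helper easy to test.
function loadFilebaseCredentials(env = process.env) {
  const accessKeyId = env.FILEBASE_KEY;
  const secretAccessKey = env.FILEBASE_SECRET;
  if (!accessKeyId || !secretAccessKey) {
    throw new Error('Missing FILEBASE_KEY or FILEBASE_SECRET in environment');
  }
  return { accessKeyId, secretAccessKey };
}
```

Call it once at startup and pass the result into your S3 client config, so a bad deploy fails immediately with a clear message.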
Uploading Files to IPFS via Filebase
Let’s walk through a sample upload using the AWS SDK for JavaScript. You can adapt this for Python (`boto3`) or Go later.
Step 1: Install Dependencies
```shell
npm install aws-sdk
```
Or if you're using ES modules:
```shell
npm install @aws-sdk/client-s3
```
Step 2: Set Up Your S3 Client
```js
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  endpoint: 'https://s3.filebase.com',
  accessKeyId: process.env.FILEBASE_KEY,
  secretAccessKey: process.env.FILEBASE_SECRET,
  region: 'us-east-1',
  signatureVersion: 'v4'
});
```
Yes, Filebase uses the `v4` signature just like AWS. You're just pointing at a new endpoint.
Step 3: Upload a File to IPFS
```js
const fs = require('fs');

const fileStream = fs.createReadStream('./example.txt');

const params = {
  Bucket: 'your-bucket-name',
  Key: 'example.txt',
  Body: fileStream
};

s3.upload(params, (err, data) => {
  if (err) {
    console.error('Upload failed ❌', err);
    return;
  }
  console.log('✅ File uploaded to IPFS via Filebase!');
  console.log('📦 S3 URL:', data.Location);
});
```
After upload, Filebase handles the pinning. To retrieve the file via IPFS:
```
https://<your-gateway-name>.myfilebase.com/ipfs/<CID>
```
You can also get the CID directly from the Filebase dashboard or via API.
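Since the gateway URL above follows a fixed pattern, you can build it in code once you have the CID. A small sketch (`buildGatewayUrl` is an illustrative name; the gateway name is whatever you configured in your Filebase dashboard):

```js
// Build the public IPFS gateway URL for a pinned file.
// `gateway` is your dedicated gateway name from the Filebase dashboard;
// `cid` is the content identifier assigned when the file was pinned.
function buildGatewayUrl(gateway, cid) {
  if (!gateway || !cid) {
    throw new Error('Both a gateway name and a CID are required');
  }
  return `https://${gateway}.myfilebase.com/ipfs/${cid}`;
}
```

Handy for generating share links right after an upload completes.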
Bonus: Using Boto3 (Python)
If you prefer Python:
```python
import boto3

session = boto3.session.Session()
s3 = session.client(
    service_name='s3',
    endpoint_url='https://s3.filebase.com',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)

s3.upload_file('example.txt', 'your-bucket-name', 'example.txt')
```
Same principle, different flavor.
Conclusion
If you want decentralized storage without rearchitecting everything — S3-compatible APIs like Filebase are the sweet spot.
- No IPFS CLI.
- No new SDKs.
- Just S3... with a twist.
This hybrid approach is ideal for teams migrating off centralized infra, one piece at a time. Start uploading today, then explore more advanced IPFS tooling later (like CIDs, IPNS, and DAG structures).
👉 Check out the full docs: Filebase S3 API Reference
➤ Want to upload larger files?
Check out our guide on How to Upload Large Files to IPFS and learn how to work around common size limits with multi-part uploads.
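As a rough idea of what multi-part uploads involve: S3-style multipart splits a file into fixed-size parts, and S3 enforces a 5 MiB minimum for every part except the last. A quick sketch of the chunking math (`partCount` is an illustrative helper; the actual upload mechanics are covered in the guide above):

```js
const MIN_PART_SIZE = 5 * 1024 * 1024; // 5 MiB, the S3 minimum for non-final parts

// Work out how many parts a multi-part upload of `fileSize` bytes would need.
function partCount(fileSize, partSize = MIN_PART_SIZE) {
  if (partSize < MIN_PART_SIZE) {
    throw new Error('Part size must be at least 5 MiB');
  }
  // Even an empty file needs one part to complete the upload.
  return Math.max(1, Math.ceil(fileSize / partSize));
}
```

For example, a 100 MiB file with 10 MiB parts needs 10 parts, while anything at or under 5 MiB goes up as a single part.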