Migration Guide
44s uses standard protocols — migrate in minutes, not months.
Zero Code Changes Required
44s speaks native Redis, PostgreSQL, and S3 protocols. Your existing client libraries, ORMs, and tools work unchanged.
⚡ Redis → 44s Cache
Redis Protocol Compatible
~2 minutes
44s Cache speaks native Redis protocol. Use your existing redis-py, ioredis, Jedis, or any Redis client.
1. Update Connection String
Before (Redis)
REDIS_URL=redis://localhost:6379
# or
REDIS_URL=redis://my-redis.aws.com:6379
After (44s)
REDIS_URL=redis://api.44s.io:6379
# With auth:
REDIS_URL=redis://:YOUR_API_KEY@api.44s.io:6379
2. That's It
No code changes. Your existing Redis commands work identically:
import redis
# Same code, just different host
r = redis.Redis(host='api.44s.io', port=6379, password='YOUR_API_KEY')
r.set('user:123', 'data') # ✓ Works
r.get('user:123') # ✓ Works
r.hset('hash', 'field', 'val') # ✓ Works
r.lpush('list', 'item') # ✓ Works
r.sadd('set', 'member') # ✓ Works
Supported Commands
- All string operations (GET, SET, MGET, MSET, INCR, etc.)
- Hash operations (HGET, HSET, HMGET, HGETALL, etc.)
- List operations (LPUSH, RPUSH, LPOP, LRANGE, etc.)
- Set operations (SADD, SMEMBERS, SINTER, etc.)
- Sorted sets (ZADD, ZRANGE, ZRANK, etc.)
- Key expiration (EXPIRE, TTL, PEXPIRE), shown in the sketch below
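The snippet above covers strings, hashes, lists, and plain sets; sorted sets and expiration work the same way through redis-py. A minimal sketch (key names are illustrative):

import redis

r = redis.Redis(host='api.44s.io', port=6379, password='YOUR_API_KEY')

# Sorted set: keep a leaderboard ordered by score
r.zadd('leaderboard', {'alice': 150, 'bob': 90})
top10 = r.zrange('leaderboard', 0, 9, desc=True, withscores=True)

# Expiration: cache a value for five minutes
r.set('session:abc', 'payload')
r.expire('session:abc', 300)
remaining = r.ttl('session:abc')  # seconds left before the key expires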
🗄️ PostgreSQL → 44s Database
PostgreSQL Wire Protocol
~5 minutes
44s Database speaks PostgreSQL wire protocol. Use psycopg2, SQLAlchemy, Prisma, or any Postgres client.
1. Update Connection String
Before (PostgreSQL)
DATABASE_URL=postgres://user:pass@localhost:5432/mydb
# or
DATABASE_URL=postgres://user:pass@rds.aws.com:5432/db
After (44s)
DATABASE_URL=postgres://api:YOUR_API_KEY@api.44s.io:5432/default
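Because 44s answers on the wire protocol, any Postgres client can smoke-test the new connection string before you touch application code. A minimal sketch with psycopg2:

import psycopg2

# Same URL as DATABASE_URL above; psycopg2/libpq accept the URI form
conn = psycopg2.connect('postgres://api:YOUR_API_KEY@api.44s.io:5432/default')
with conn.cursor() as cur:
    cur.execute('SELECT 1')
    print(cur.fetchone())  # (1,) means the wire-protocol handshake works
conn.close()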
2. Migrate Schema
# Export from existing Postgres
pg_dump -h old-host -U user -s mydb > schema.sql
# Import to 44s (via HTTP API)
curl -X POST https://api.44s.io:8600/query \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"sql": "CREATE TABLE users (id SERIAL PRIMARY KEY, name TEXT)"}'
3. Migrate Data
# Export data
pg_dump -h old-host -U user --data-only mydb > data.sql

# Import via your ORM or direct SQL
python migrate_data.py  # Your existing migration scripts work
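migrate_data.py is your own script; since both sides speak the Postgres wire protocol, a small backfill can be as simple as reading from the old database and inserting through a 44s connection. A rough sketch for an illustrative users table:

import psycopg2

old = psycopg2.connect('postgres://user:pass@old-host:5432/mydb')
new = psycopg2.connect('postgres://api:YOUR_API_KEY@api.44s.io:5432/default')

with old.cursor() as src, new.cursor() as dst:
    # Illustrative table; for large tables batch with fetchmany()/executemany() or COPY
    src.execute('SELECT id, name FROM users')
    for row in src:
        dst.execute('INSERT INTO users (id, name) VALUES (%s, %s)', row)

new.commit()
old.close()
new.close()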
SQLAlchemy / Prisma / Django ORM
Just change the DATABASE_URL environment variable. Your models, migrations, and queries work unchanged.
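With SQLAlchemy, for instance, the engine simply picks up the new URL; the only wrinkle is that SQLAlchemy 1.4+ expects the postgresql:// scheme rather than postgres://. A minimal sketch, with an illustrative users table:

import os
from sqlalchemy import create_engine, text

# SQLAlchemy 1.4+ wants postgresql://, so rewrite the scheme if needed
url = os.environ['DATABASE_URL'].replace('postgres://', 'postgresql://', 1)
engine = create_engine(url)

# Models, sessions, and queries are unchanged; only the URL moved
with engine.connect() as conn:
    rows = conn.execute(text('SELECT id, name FROM users')).fetchall()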
📦 S3 → 44s Object Store
S3-Compatible API
~5 minutes
44s Object Store is S3-compatible. Use boto3, AWS CLI, or any S3 client.
1. Update SDK Configuration
Before (AWS S3)
import boto3
s3 = boto3.client('s3')
s3.upload_file('file.txt', 'my-bucket', 'file.txt')
After (44s)
import boto3
s3 = boto3.client('s3',
    endpoint_url='https://api.44s.io:9000',
    aws_access_key_id='YOUR_API_KEY',
    aws_secret_access_key='YOUR_API_KEY'
)
s3.upload_file('file.txt', 'my-bucket', 'file.txt')
2. Sync Existing Data
# Using AWS CLI with custom endpoint
aws s3 sync s3://old-bucket s3://new-bucket \
  --endpoint-url https://api.44s.io:9000

# Or using rclone
rclone sync aws:old-bucket 44s:new-bucket
Supported Operations
- PutObject, GetObject, DeleteObject
- ListObjects, ListObjectsV2
- CreateBucket, DeleteBucket, ListBuckets
- Multipart uploads
- Pre-signed URLs (see the sketch below)
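Pre-signed URLs, for example, come straight from boto3 once the endpoint is overridden. A minimal sketch using the client configuration from step 1:

import boto3

s3 = boto3.client('s3',
    endpoint_url='https://api.44s.io:9000',
    aws_access_key_id='YOUR_API_KEY',
    aws_secret_access_key='YOUR_API_KEY'
)

# Time-limited download link for an existing object
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'file.txt'},
    ExpiresIn=3600,  # seconds
)
print(url)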
📡 Kafka → 44s Streaming
HTTP Streaming API
~10 minutes
44s Streaming uses a simple HTTP API. Migrate producers and consumers individually.
1. Update Producer
Before (Kafka)
from kafka import KafkaProducer
producer = KafkaProducer(
    bootstrap_servers=['kafka:9092']
)
producer.send('events', b'message')
After (44s)
import requests
requests.post(
    'https://api.44s.io:5470/topics/events/produce',
    headers={'Authorization': 'Bearer KEY'},
    json={'messages': [{'value': 'message'}]}
)
2. Update Consumer
Before (Kafka)
from kafka import KafkaConsumer
consumer = KafkaConsumer('events')
for msg in consumer:
    process(msg.value)
After (44s)
import requests
while True:
    resp = requests.get(
        'https://api.44s.io:5470/topics/events/consume',
        headers={'Authorization': 'Bearer KEY'},
        params={'group': 'my-group'}
    )
    for msg in resp.json()['messages']:
        process(msg['value'])
Note: 44s Streaming uses HTTP, not the Kafka protocol. You'll need to update client code, but the concepts (topics, partitions, consumer groups) are the same.
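One way to keep that change small is to hide the transport behind a thin wrapper so call sites don't care whether a Kafka producer or the 44s HTTP endpoint sits underneath. A sketch (the EventProducer class and its send method are illustrative, not part of any SDK):

import requests

class EventProducer:
    """Illustrative wrapper so call sites only ever see send(topic, value)."""

    def __init__(self, api_key, base_url='https://api.44s.io:5470'):
        self.api_key = api_key
        self.base_url = base_url

    def send(self, topic, value):
        resp = requests.post(
            f'{self.base_url}/topics/{topic}/produce',
            headers={'Authorization': f'Bearer {self.api_key}'},
            json={'messages': [{'value': value}]},
        )
        resp.raise_for_status()

# Swapping the transport later only touches this class, not the callers
producer = EventProducer('YOUR_API_KEY')
producer.send('events', 'message')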
🎯 Pinecone → 44s Vector
Vector Search API
~10 minutes
44s Vector provides a simple HTTP API for vector similarity search.
1. Create Collection
curl -X POST https://api.44s.io:9002/collections \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"name": "embeddings", "dimension": 1536}'
2Export & Import Vectors
Before (Pinecone)
import pinecone
index = pinecone.Index('my-index')
# Export vectors (pagination required)
results = index.query(
    vector=[0]*1536,
    top_k=10000,
    include_values=True
)
After (44s)
import requests
# Upsert to 44s
requests.post(
    'https://api.44s.io:9002/collections/embeddings/upsert',
    headers={'Authorization': 'Bearer KEY'},
    json={'vectors': [
        {'id': 'vec1', 'values': [...], 'metadata': {...}}
    ]}
)
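Putting the two halves together, a rough export-and-upsert loop could look like the sketch below. It assumes the legacy Pinecone client shown above, where query results expose matches with id, values, and metadata attributes; for a full migration, paginating by ID lists is more reliable than one oversized query.

import requests
import pinecone

API_KEY = 'YOUR_API_KEY'
index = pinecone.Index('my-index')

# Pull a batch of vectors out of Pinecone with values and metadata
export = index.query(vector=[0] * 1536, top_k=1000,
                     include_values=True, include_metadata=True)

# Reshape matches into the 44s upsert payload and send in chunks of 100
vectors = [
    {'id': m.id, 'values': list(m.values), 'metadata': dict(m.metadata or {})}
    for m in export.matches
]
for i in range(0, len(vectors), 100):
    requests.post(
        'https://api.44s.io:9002/collections/embeddings/upsert',
        headers={'Authorization': f'Bearer {API_KEY}'},
        json={'vectors': vectors[i:i + 100]},
    ).raise_for_status()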
3. Update Query Code
# 44s Vector query
response = requests.post(
    'https://api.44s.io:9002/collections/embeddings/query',
    headers={'Authorization': 'Bearer YOUR_API_KEY'},
    json={
        'vector': embedding,  # Your query vector
        'top_k': 10,
        'include_metadata': True
    }
)
results = response.json()['results']
Migration Checklist
- Get API Key — Sign up on the dashboard or purchase a founding member slot
- Test Connection — Verify you can connect to 44s endpoints
- Migrate Schema — For Database, create tables/indexes first
- Run Dual-Write — Write to both old and new systems during transition (see the sketch after this checklist)
- Backfill Data — Copy historical data to 44s
- Switch Reads — Point reads to 44s, verify correctness
- Switch Writes — Point all writes to 44s
- Decommission Old — Turn off old infrastructure
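The dual-write step is usually the trickiest to get right. For the cache it can be as small as a helper that writes to both clients until reads have moved; a sketch, with the old Redis host as a placeholder:

import redis

old_cache = redis.Redis(host='my-redis.aws.com', port=6379)
new_cache = redis.Redis(host='api.44s.io', port=6379, password='YOUR_API_KEY')

def cache_set(key, value):
    # Old system stays authoritative; 44s failures are logged, not fatal,
    # until reads have been switched over
    old_cache.set(key, value)
    try:
        new_cache.set(key, value)
    except redis.RedisError as exc:
        print(f'44s write failed for {key}: {exc}')

cache_set('user:123', 'data')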