📦 Object Store
S3-compatible object storage with fast metadata operations, built on a lock-free architecture.
S3 Endpoint
api.44s.io:9000
S3-compatible API
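A quick way to confirm the endpoint and your credentials work is to list your buckets. A minimal sketch using boto3, with the same placeholder key as the examples below:

import boto3

# Point the client at the 44s endpoint
s3 = boto3.client(
    's3',
    endpoint_url='https://api.44s.io:9000',
    aws_access_key_id='44s_your_api_key',
    aws_secret_access_key='44s_your_api_key'
)

# Any successful response confirms connectivity and credentials
for bucket in s3.list_buckets().get('Buckets', []):
    print(bucket['Name'])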
Quick Start
Using AWS CLI
# Configure credentials
aws configure set aws_access_key_id 44s_your_api_key
aws configure set aws_secret_access_key 44s_your_api_key
# Create a bucket
aws --endpoint-url https://api.44s.io:9000 s3 mb s3://my-bucket
# Upload a file
aws --endpoint-url https://api.44s.io:9000 s3 cp myfile.txt s3://my-bucket/
# List objects
aws --endpoint-url https://api.44s.io:9000 s3 ls s3://my-bucket/
# Download a file
aws --endpoint-url https://api.44s.io:9000 s3 cp s3://my-bucket/myfile.txt ./downloaded.txt
# Delete a file
aws --endpoint-url https://api.44s.io:9000 s3 rm s3://my-bucket/myfile.txt
Python (boto3)
import boto3
# Create client
s3 = boto3.client(
    's3',
    endpoint_url='https://api.44s.io:9000',
    aws_access_key_id='44s_your_api_key',
    aws_secret_access_key='44s_your_api_key'
)
# Create bucket
s3.create_bucket(Bucket='my-bucket')
# Upload file
s3.upload_file('local_file.txt', 'my-bucket', 'remote_file.txt')
# Upload bytes
s3.put_object(
    Bucket='my-bucket',
    Key='data.json',
    Body=b'{"hello": "world"}',
    ContentType='application/json'
)
# Download file
s3.download_file('my-bucket', 'remote_file.txt', 'local_copy.txt')
# Get object
response = s3.get_object(Bucket='my-bucket', Key='data.json')
data = response['Body'].read()
# List objects
response = s3.list_objects_v2(Bucket='my-bucket', Prefix='uploads/')
for obj in response.get('Contents', []):
    print(f"{obj['Key']}: {obj['Size']} bytes")
# Generate presigned URL
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'file.pdf'},
    ExpiresIn=3600  # 1 hour
)
print(f"Download link: {url}")
Node.js
const { S3Client, PutObjectCommand, GetObjectCommand } = require('@aws-sdk/client-s3');
const s3 = new S3Client({
  endpoint: 'https://api.44s.io:9000',
  region: 'us-east-1',
  credentials: {
    accessKeyId: '44s_your_api_key',
    secretAccessKey: '44s_your_api_key'
  },
  forcePathStyle: true
});
// Upload (run inside an async function, or use ESM top-level await)
await s3.send(new PutObjectCommand({
  Bucket: 'my-bucket',
  Key: 'uploads/image.png',
  Body: imageBuffer,
  ContentType: 'image/png'
}));
// Download
const response = await s3.send(new GetObjectCommand({
  Bucket: 'my-bucket',
  Key: 'uploads/image.png'
}));
const data = await response.Body.transformToByteArray();
Supported S3 Operations
Bucket Operations
| Operation | Description |
|---|---|
| CreateBucket | Create a new bucket |
| DeleteBucket | Delete an empty bucket |
| ListBuckets | List all buckets |
| HeadBucket | Check if bucket exists |
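HeadBucket is the cheapest existence check, since it returns headers only. A sketch, assuming the boto3 client from Quick Start:

from botocore.exceptions import ClientError

def bucket_exists(bucket):
    try:
        s3.head_bucket(Bucket=bucket)  # HeadBucket: metadata only, no body
        return True
    except ClientError as e:
        if e.response['Error']['Code'] == '404':
            return False
        raise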
Object Operations
| Operation | Description |
|---|---|
| PutObject | Upload an object |
| GetObject | Download an object |
| DeleteObject | Delete an object |
| HeadObject | Get object metadata |
| ListObjectsV2 | List objects in bucket |
| CopyObject | Copy object within/between buckets |
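CopyObject runs server-side, so the object body never round-trips through your client. A sketch with hypothetical bucket and key names:

# Copy within a bucket (no download/upload round trip)
s3.copy_object(
    Bucket='my-bucket',
    Key='archive/report.pdf',
    CopySource={'Bucket': 'my-bucket', 'Key': 'uploads/report.pdf'}
)

# Copy between buckets
s3.copy_object(
    Bucket='backup-bucket',
    Key='report.pdf',
    CopySource={'Bucket': 'my-bucket', 'Key': 'uploads/report.pdf'}
)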
Multipart Upload
| Operation | Description |
|---|---|
| CreateMultipartUpload | Start multipart upload |
| UploadPart | Upload a part |
| CompleteMultipartUpload | Complete upload |
| AbortMultipartUpload | Cancel upload |
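The four calls chain together: create the upload, send one UploadPart per chunk, then complete (or abort on failure so the store can reclaim the parts). A sketch, assuming the client from Quick Start and S3's usual 5 MB minimum size for every part except the last:

def multipart_upload(bucket, key, path, part_size=8 * 1024 * 1024):
    upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
    upload_id = upload['UploadId']
    parts = []
    try:
        with open(path, 'rb') as f:
            part_number = 1
            while chunk := f.read(part_size):
                response = s3.upload_part(
                    Bucket=bucket, Key=key,
                    PartNumber=part_number, UploadId=upload_id,
                    Body=chunk
                )
                parts.append({'ETag': response['ETag'], 'PartNumber': part_number})
                part_number += 1
        s3.complete_multipart_upload(
            Bucket=bucket, Key=key, UploadId=upload_id,
            MultipartUpload={'Parts': parts}
        )
    except Exception:
        # Abort so the store can reclaim the already-uploaded parts
        s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id)
        raise

Note that boto3's upload_file handles multipart automatically for large files; the explicit form is useful when you need control over part size or retry behavior.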
Use Cases
File Uploads
# Handle user uploads in your app
def upload_user_file(user_id: str, file: bytes, filename: str):
    key = f"users/{user_id}/files/{filename}"
    s3.put_object(
        Bucket='app-uploads',
        Key=key,
        Body=file,
        Metadata={
            'uploaded-by': user_id,
            'original-name': filename
        }
    )
    # Generate download URL
    return s3.generate_presigned_url(
        'get_object',
        Params={'Bucket': 'app-uploads', 'Key': key},
        ExpiresIn=86400  # 24 hours
    )
Static Assets
# Serve static files for your web app
with open('dist/app.bundle.js', 'rb') as f:
    s3.put_object(
        Bucket='static-assets',
        Key='js/app.bundle.js',
        Body=f.read(),
        ContentType='application/javascript',
        CacheControl='public, max-age=31536000'  # 1 year cache
    )
<!-- In your HTML -->
<script src="https://api.44s.io:9000/static-assets/js/app.bundle.js"></script>
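To publish an entire build directory, the same call can be looped with content types guessed per file. A sketch, assuming a local dist/ folder and the static-assets bucket above:

import mimetypes
from pathlib import Path

dist = Path('dist')
for path in dist.rglob('*'):
    if path.is_file():
        content_type, _ = mimetypes.guess_type(str(path))
        with open(path, 'rb') as f:
            s3.put_object(
                Bucket='static-assets',
                Key=path.relative_to(dist).as_posix(),
                Body=f.read(),
                ContentType=content_type or 'application/octet-stream',
                CacheControl='public, max-age=31536000'
            )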
Backups
# Automated database backups
import gzip
from datetime import datetime
def backup_database():
    timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
    # Dump and compress
    dump = get_database_dump()
    compressed = gzip.compress(dump)
    # Upload to object store
    s3.put_object(
        Bucket='backups',
        Key=f'database/{timestamp}.sql.gz',
        Body=compressed,
        StorageClass='STANDARD'
    )
    # Clean up old backups (keep last 30)
    response = s3.list_objects_v2(Bucket='backups', Prefix='database/')
    objects = sorted(response.get('Contents', []), key=lambda x: x['LastModified'])
    for obj in objects[:-30]:
        s3.delete_object(Bucket='backups', Key=obj['Key'])
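Restore is the inverse: fetch the newest key under the prefix and decompress it. A sketch; restore_database_dump is a hypothetical counterpart to get_database_dump above:

def restore_latest_backup():
    response = s3.list_objects_v2(Bucket='backups', Prefix='database/')
    objects = sorted(response.get('Contents', []), key=lambda x: x['LastModified'])
    if not objects:
        raise RuntimeError('No backups found')
    latest = objects[-1]['Key']
    body = s3.get_object(Bucket='backups', Key=latest)['Body'].read()
    restore_database_dump(gzip.decompress(body))  # hypothetical helper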
Performance
| Metric | 44s Object Store | AWS S3 |
|---|---|---|
| List latency | <1ms | ~50-200ms |
| Head latency | <1ms | ~20-100ms |
| Small object PUT | <5ms | ~50-200ms |
The gains come from the lock-free metadata index; data transfer speed still depends on network conditions and object size.
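You can spot-check the metadata latency figures against your own deployment by timing HeadObject in a loop; measured numbers will include network round-trip time, so run from a nearby host. An illustrative sketch:

import time

def time_head(bucket, key, n=100):
    s3.head_object(Bucket=bucket, Key=key)  # warm up the connection
    start = time.perf_counter()
    for _ in range(n):
        s3.head_object(Bucket=bucket, Key=key)
    return (time.perf_counter() - start) / n * 1000  # avg ms per call

print(f"HeadObject: {time_head('my-bucket', 'data.json'):.2f} ms avg")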