📦 Object Store

S3-compatible object storage with fast metadata operations, built on a lock-free architecture.

S3 Endpoint

https://api.44s.io:9000 (S3-compatible API)

Quick Start

Using AWS CLI

# Configure credentials
aws configure set aws_access_key_id 44s_your_api_key
aws configure set aws_secret_access_key 44s_your_api_key

# Create a bucket
aws --endpoint-url https://api.44s.io:9000 s3 mb s3://my-bucket

# Upload a file
aws --endpoint-url https://api.44s.io:9000 s3 cp myfile.txt s3://my-bucket/

# List objects
aws --endpoint-url https://api.44s.io:9000 s3 ls s3://my-bucket/

# Download a file
aws --endpoint-url https://api.44s.io:9000 s3 cp s3://my-bucket/myfile.txt ./downloaded.txt

# Delete a file
aws --endpoint-url https://api.44s.io:9000 s3 rm s3://my-bucket/myfile.txt

Python (boto3)

import boto3

# Create client
s3 = boto3.client(
    's3',
    endpoint_url='https://api.44s.io:9000',
    aws_access_key_id='44s_your_api_key',
    aws_secret_access_key='44s_your_api_key'
)

# Create bucket
s3.create_bucket(Bucket='my-bucket')

# Upload file
s3.upload_file('local_file.txt', 'my-bucket', 'remote_file.txt')

# Upload bytes
s3.put_object(
    Bucket='my-bucket',
    Key='data.json',
    Body=b'{"hello": "world"}',
    ContentType='application/json'
)

# Download file
s3.download_file('my-bucket', 'remote_file.txt', 'local_copy.txt')

# Get object
response = s3.get_object(Bucket='my-bucket', Key='data.json')
data = response['Body'].read()

# List objects
response = s3.list_objects_v2(Bucket='my-bucket', Prefix='uploads/')
for obj in response.get('Contents', []):
    print(f"{obj['Key']}: {obj['Size']} bytes")

# Generate presigned URL
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'file.pdf'},
    ExpiresIn=3600  # 1 hour
)
print(f"Download link: {url}")

Node.js

import { S3Client, PutObjectCommand, GetObjectCommand } from '@aws-sdk/client-s3';
import { readFile } from 'node:fs/promises';

const s3 = new S3Client({
  endpoint: 'https://api.44s.io:9000',
  region: 'us-east-1',
  credentials: {
    accessKeyId: '44s_your_api_key',
    secretAccessKey: '44s_your_api_key'
  },
  forcePathStyle: true
});

// Upload (top-level await requires an ES module, e.g. a .mjs file)
const imageBuffer = await readFile('image.png');
await s3.send(new PutObjectCommand({
  Bucket: 'my-bucket',
  Key: 'uploads/image.png',
  Body: imageBuffer,
  ContentType: 'image/png'
}));

// Download
const response = await s3.send(new GetObjectCommand({
  Bucket: 'my-bucket',
  Key: 'uploads/image.png'
}));
const data = await response.Body.transformToByteArray();

Supported S3 Operations

Bucket Operations

| Operation | Description |
|-----------|-------------|
| CreateBucket | Create a new bucket |
| DeleteBucket | Delete an empty bucket |
| ListBuckets | List all buckets |
| HeadBucket | Check if a bucket exists |
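
These map directly onto boto3 methods; for instance, HeadBucket is head_bucket, which raises a ClientError when the bucket is missing. A small sketch reusing the client from Quick Start:

from botocore.exceptions import ClientError

# List all buckets (ListBuckets)
for bucket in s3.list_buckets()['Buckets']:
    print(bucket['Name'])

# Check whether a bucket exists (HeadBucket)
def bucket_exists(name: str) -> bool:
    try:
        s3.head_bucket(Bucket=name)
        return True
    except ClientError:
        return False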

Object Operations

| Operation | Description |
|-----------|-------------|
| PutObject | Upload an object |
| GetObject | Download an object |
| DeleteObject | Delete an object |
| HeadObject | Get object metadata |
| ListObjectsV2 | List objects in a bucket |
| CopyObject | Copy an object within or between buckets |
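
CopyObject and HeadObject are the two operations not covered in Quick Start. A brief sketch; the key names are illustrative:

# Copy an object within (or between) buckets (CopyObject)
s3.copy_object(
    Bucket='my-bucket',
    Key='archive/data.json',
    CopySource={'Bucket': 'my-bucket', 'Key': 'data.json'}
)

# Fetch metadata without downloading the body (HeadObject)
head = s3.head_object(Bucket='my-bucket', Key='archive/data.json')
print(head['ContentLength'], head['ContentType'])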

Multipart Upload

| Operation | Description |
|-----------|-------------|
| CreateMultipartUpload | Start a multipart upload |
| UploadPart | Upload a part |
| CompleteMultipartUpload | Complete the upload |
| AbortMultipartUpload | Cancel the upload |
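
Note that boto3's upload_file handles multipart transfers automatically for large files, but the low-level calls map one-to-one onto the operations above. A minimal sketch (the bucket, key, and 8 MiB part size are illustrative; S3 requires parts of at least 5 MiB except the last):

upload = s3.create_multipart_upload(Bucket='my-bucket', Key='big.bin')
parts = []
try:
    with open('big.bin', 'rb') as f:
        part_number = 1
        while chunk := f.read(8 * 1024 * 1024):  # 8 MiB parts
            resp = s3.upload_part(
                Bucket='my-bucket',
                Key='big.bin',
                UploadId=upload['UploadId'],
                PartNumber=part_number,
                Body=chunk
            )
            parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})
            part_number += 1
    s3.complete_multipart_upload(
        Bucket='my-bucket',
        Key='big.bin',
        UploadId=upload['UploadId'],
        MultipartUpload={'Parts': parts}
    )
except Exception:
    # On failure, abort so the server can reclaim the uploaded parts
    s3.abort_multipart_upload(Bucket='my-bucket', Key='big.bin',
                              UploadId=upload['UploadId'])
    raise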

Use Cases

File Uploads

# Handle user uploads in your app
def upload_user_file(user_id: str, file: bytes, filename: str):
    key = f"users/{user_id}/files/{filename}"
    
    s3.put_object(
        Bucket='app-uploads',
        Key=key,
        Body=file,
        Metadata={
            'uploaded-by': user_id,
            'original-name': filename
        }
    )
    
    # Generate download URL
    return s3.generate_presigned_url(
        'get_object',
        Params={'Bucket': 'app-uploads', 'Key': key},
        ExpiresIn=86400  # 24 hours
    )

Static Assets

# Serve static files for your web app
with open('dist/app.bundle.js', 'rb') as f:
    s3.put_object(
        Bucket='static-assets',
        Key='js/app.bundle.js',
        Body=f,
        ContentType='application/javascript',
        CacheControl='public, max-age=31536000'  # 1 year cache
    )

# In your HTML
<script src="https://api.44s.io:9000/static-assets/js/app.bundle.js"></script>
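
For a whole build directory, the Content-Type can be guessed per file instead of hard-coded. A sketch assuming the build output lives in dist/:

import mimetypes
from pathlib import Path

# Upload every file under dist/, guessing Content-Type from the extension
for path in Path('dist').rglob('*'):
    if path.is_file():
        content_type = mimetypes.guess_type(path.name)[0] or 'application/octet-stream'
        s3.upload_file(
            str(path),
            'static-assets',
            path.relative_to('dist').as_posix(),
            ExtraArgs={
                'ContentType': content_type,
                'CacheControl': 'public, max-age=31536000'
            }
        )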

Backups

# Automated database backups
import gzip
from datetime import datetime

def backup_database():
    timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
    
    # Dump and compress
    dump = get_database_dump()
    compressed = gzip.compress(dump)
    
    # Upload to object store
    s3.put_object(
        Bucket='backups',
        Key=f'database/{timestamp}.sql.gz',
        Body=compressed,
        StorageClass='STANDARD'
    )
    
    # Clean up old backups (keep the 30 most recent)
    # Note: list_objects_v2 returns at most 1000 keys per call;
    # use a paginator if you keep more backups than that.
    response = s3.list_objects_v2(Bucket='backups', Prefix='database/')
    objects = sorted(response.get('Contents', []), key=lambda x: x['LastModified'])
    
    for obj in objects[:-30]:
        s3.delete_object(Bucket='backups', Key=obj['Key'])
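
The matching restore path just picks the newest key and decompresses it. A sketch under the same assumptions (loading the dump back into the database happens outside this snippet):

def restore_latest_backup() -> bytes:
    response = s3.list_objects_v2(Bucket='backups', Prefix='database/')
    contents = response.get('Contents', [])
    if not contents:
        raise RuntimeError('no backups found')
    # Pick the most recently modified dump
    latest = max(contents, key=lambda x: x['LastModified'])
    body = s3.get_object(Bucket='backups', Key=latest['Key'])['Body'].read()
    return gzip.decompress(body)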

Performance

| Metric | 44s Object Store | AWS S3 |
|--------|------------------|--------|
| List latency | <1 ms | ~50-200 ms |
| Head latency | <1 ms | ~20-100 ms |
| Small object PUT | <5 ms | ~50-200 ms |

The gains come from the lock-free metadata index; data transfer speed still depends on network bandwidth and object size.
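
You can sanity-check these numbers against your own deployment with a rough timing loop. This is not a rigorous benchmark; it measures round-trip latency from wherever you run it:

import time

def avg_ms(fn, n=100):
    # Average wall-clock time per call, in milliseconds
    start = time.perf_counter()
    for _ in range(n):
        fn()
    return (time.perf_counter() - start) / n * 1000

print(f"HEAD: {avg_ms(lambda: s3.head_object(Bucket='my-bucket', Key='data.json')):.2f} ms")
print(f"LIST: {avg_ms(lambda: s3.list_objects_v2(Bucket='my-bucket')):.2f} ms")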