
Building a Self-Hosted Expo OTA Update Server with Django and Tigris

Jared Lynskey
Emerging leader and software engineer based in Seoul, South Korea

Over-the-air (OTA) updates let you push JavaScript changes, bug fixes, and new features to users without waiting for App Store review. Expo’s hosted service (EAS Update) works well, but there are good reasons to self-host: cost control at scale, data sovereignty requirements, or just wanting to own your infrastructure.

This post covers how I built a self-hosted Expo update server using Django REST Framework and Tigris S3. It’s what powers OTA updates for my Curtain Estimator app on both iOS and Android.

Why Self-Host Expo Updates?
#

Cost Control
#

EAS Update pricing scales with usage. If you’re pushing frequent updates to a large user base, self-hosting with cheap S3-compatible storage like Tigris saves real money.

Data Sovereignty
#

Some industries require full control over where application assets are stored. Self-hosting ensures all update bundles stay in your infrastructure.

Custom Business Logic
#

Need to roll out updates to specific user segments? Want to A/B test different bundles? With a custom server, you control the entire update flow.

No Vendor Lock-In
#

Your update infrastructure isn’t tied to Expo’s service availability or pricing changes.

Architecture Overview
#

The system consists of four main components:

  1. Django Backend: Serves update manifests and stores metadata
  2. Tigris S3 Storage: Hosts the actual bundle and asset files
  3. Publishing Pipeline: Script that exports, uploads, and registers updates
  4. Mobile App: Configured to check for updates from your server

Here’s how it works:

┌─────────────────┐
│   Mobile App    │
│  (expo-updates) │
└────────┬────────┘
         │ 1. Request manifest
         │    (with headers: platform, runtime-version)
┌─────────────────┐
│  Django Server  │
│  /api/expo-     │◄─── 2. Query DB for latest update
│   updates/      │
│   manifest/     │
└────────┬────────┘
         │ 3. Generate presigned URLs
┌─────────────────┐
│   Tigris S3     │
│  (Asset Files)  │◄─── 4. App downloads bundles directly
└─────────────────┘

Implementation Details
#

Django Models
#

The foundation is two Django models: ExpoUpdate and ExpoUpdateAsset.

ExpoUpdate stores metadata for each update:

import uuid

from django.db import models


class ExpoUpdate(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4)
    runtime_version = models.CharField(max_length=50, db_index=True)
    platform = models.CharField(
        max_length=10,
        choices=[("ios", "iOS"), ("android", "Android")],
        db_index=True
    )
    is_active = models.BooleanField(default=True, db_index=True)
    manifest_data = models.JSONField()
    description = models.TextField(blank=True)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        indexes = [
            models.Index(fields=["runtime_version", "platform", "is_active", "-created_at"])
        ]

Key design decisions:

  • runtime_version: Matches the runtimeVersion in your app.json. This is critical—clients only download updates matching their runtime version.
  • platform: Separate updates for iOS and Android since bundles differ.
  • is_active: Supports rollbacks by deactivating problematic updates.
  • manifest_data: Stores the complete Expo Updates v1 protocol manifest as JSON.

ExpoUpdateAsset tracks individual files:

class ExpoUpdateAsset(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4)
    update = models.ForeignKey(ExpoUpdate, on_delete=models.CASCADE, related_name="assets")
    hash = models.CharField(max_length=255, db_index=True)
    key = models.CharField(max_length=255)
    content_type = models.CharField(max_length=100)
    file_extension = models.CharField(max_length=10)
    file_path = models.CharField(max_length=500)
    file_size = models.IntegerField(default=0)

Assets are referenced by their SHA-256 hash, making them immutable and cacheable. The same asset can be shared across multiple updates.
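A minimal sketch of that hashing step (helper names here are illustrative, not from the codebase): the hex digest is what the `hash` column stores, while the Expo Updates v1 manifest expects an unpadded base64url encoding of the same digest.

```python
import base64
import hashlib


def sha256_hex(content: bytes) -> str:
    """Hex digest, suitable for storing on ExpoUpdateAsset.hash."""
    return hashlib.sha256(content).hexdigest()


def sha256_base64url(content: bytes) -> str:
    """Unpadded base64url digest, the encoding the Expo v1 manifest uses."""
    digest = hashlib.sha256(content).digest()
    return base64.urlsafe_b64encode(digest).decode().rstrip("=")
```

Because both forms derive from the same digest, you can store hex in the database and re-encode on the fly when building the manifest.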

The Manifest Endpoint
#

The /api/expo-updates/manifest/ endpoint is the heart of the system. It implements the Expo Updates v1 protocol.

@action(detail=False, methods=["get"], url_path="manifest")
def manifest(self, request):
    # Extract required headers
    protocol_version = request.META.get("HTTP_EXPO_PROTOCOL_VERSION")
    platform = request.META.get("HTTP_EXPO_PLATFORM")
    runtime_version = request.META.get("HTTP_EXPO_RUNTIME_VERSION")

    # Validate protocol version
    if protocol_version != "1":
        return Response(
            {"error": f"Unsupported protocol version: {protocol_version}"},
            status=400
        )

    # Find latest active update for this runtime + platform
    update = ExpoUpdate.objects.filter(
        runtime_version=runtime_version,
        platform=platform,
        is_active=True,
    ).order_by("-created_at").first()

    # No update available - client uses embedded bundle
    if not update:
        response = Response(status=204)
        response["expo-protocol-version"] = "1"
        return response

    # Generate presigned URLs for all assets
    manifest_data = self._generate_manifest_with_presigned_urls(update)

    # The protocol header is required on 200 responses too
    response = Response(manifest_data, status=200)
    response["expo-protocol-version"] = "1"
    return response

Critical details:

  1. 204 No Content: When no update is found, return 204. The app will use its embedded bundle.
  2. Presigned URLs: Generate time-limited URLs that allow the app to download directly from Tigris CDN—much faster than proxying through Django.
  3. Headers: The expo-protocol-version header is required in the response.
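The presigned-URL step can be sketched as a standalone function (a simplification of `_generate_manifest_with_presigned_urls`, which isn't shown in full; the `launchAsset`/`assets` field names follow the Expo Updates v1 manifest shape, and `presign` stands in for the boto3 presigning call covered later):

```python
from typing import Callable


def build_manifest(manifest_data: dict, assets: list, presign: Callable[[str], str]) -> dict:
    """Rewrite a stored manifest so every asset points at a fresh presigned URL.

    manifest_data: the JSON stored on ExpoUpdate.manifest_data
    assets:        dicts with "key" and "file_path" (mirroring ExpoUpdateAsset)
    presign:       maps an S3 object key to a time-limited URL
    """
    url_by_key = {a["key"]: presign(a["file_path"]) for a in assets}

    manifest = dict(manifest_data)  # Shallow copy; don't mutate the stored JSON
    launch = dict(manifest["launchAsset"])
    launch["url"] = url_by_key[launch["key"]]
    manifest["launchAsset"] = launch
    manifest["assets"] = [
        {**asset, "url": url_by_key[asset["key"]]}
        for asset in manifest.get("assets", [])
    ]
    return manifest
```

Keeping the presigner injectable makes the function trivial to unit-test with a stub in place of the real S3 client.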

Publishing Updates
#

The publishing workflow is automated through a Django management command and a shell script wrapper.

The management command (publish_expo_update.py) handles:

  1. Reading the output from expo export
  2. Calculating SHA-256 hashes for all assets
  3. Uploading bundles and assets to Tigris S3 in parallel
  4. Creating database records for the update and assets
  5. Optionally uploading an import JSON for production sync

Here’s the core flow:

def _publish_platform(self, platform, runtime_version, export_dir, ...):
    # 1. Find the bundle file
    bundle_files = list(bundle_dir.glob("entry-*.hbc"))
    bundle_file = bundle_files[0]

    # 2. Calculate hash
    with open(bundle_file, "rb") as f:
        bundle_content = f.read()
    bundle_hash = self._calculate_hash(bundle_content)

    # 3. Collect all assets and their hashes
    for asset_file in assets_dir.rglob("*"):
        # Calculate hash, determine content type...
        assets_metadata.append({...})

    # 4. Upload to S3 in parallel
    with ThreadPoolExecutor(max_workers=10) as executor:
        futures = {executor.submit(upload_asset, a): a for a in assets_metadata}
        for future in as_completed(futures):
            future.result()  # Surface any upload failures

    # 5. Create database records
    with transaction.atomic():
        # Deactivate previous updates
        ExpoUpdate.objects.filter(
            runtime_version=runtime_version,
            platform=platform,
            is_active=True
        ).update(is_active=False)

        # Create new update
        update = ExpoUpdate.objects.create(...)

The shell script (publish-ota-update.sh) provides a user-friendly interface:

# Publish to local environment
./scripts/publish-ota-update.sh ios

# Publish to production
./scripts/publish-ota-update.sh ios --production

# Dry run to validate
./scripts/publish-ota-update.sh --dry-run

Key features:

  • Automatically exports the app with production environment variables
  • Uploads assets in parallel for speed
  • Supports platform-specific or multi-platform updates
  • Syncs to production via API endpoint

Production Sync via API
#

For production deployments, the system uses a clever two-step process:

  1. Local publish: Upload assets to Tigris and create a JSON snapshot
  2. Remote import: Call production API with the S3 path to import metadata

This approach avoids having production credentials on your local machine:

@action(detail=False, methods=["post"], url_path="import-update")
def import_update(self, request):
    # Authenticate via Bearer token
    secret = settings.OTA_IMPORT_SECRET
    token = request.META.get("HTTP_AUTHORIZATION", "")[7:]  # Strip "Bearer "
    if not hmac.compare_digest(token, secret):
        return Response({"error": "Invalid token"}, status=401)

    # Download import JSON from Tigris
    s3_key = request.data.get("s3_key")
    obj = s3_client.get_object(Bucket=bucket_name, Key=s3_key)
    data = json.loads(obj["Body"].read())

    # Import to production database
    with transaction.atomic():
        ExpoUpdate.objects.update_or_create(id=data["id"], defaults={...})
        for asset_data in data["assets"]:
            ExpoUpdateAsset.objects.update_or_create(...)

    # Clean up the import JSON
    s3_client.delete_object(Bucket=bucket_name, Key=s3_key)

Mobile App Configuration
#

In your app.json, configure the update URL and runtime version:

{
  "expo": {
    "runtimeVersion": "1.0.0",
    "updates": {
      "url": "https://your-server.com/api/expo-updates/manifest/"
    }
  }
}

Important: The runtimeVersion must match between your app and server. When you change native code or upgrade Expo SDK, increment the runtime version and publish new updates.

Storage with Tigris
#

Tigris is an S3-compatible object storage service that’s significantly cheaper than AWS S3, with global edge caching included.

Configuration in Django:

# settings.py
BUCKET_NAME = os.getenv("BUCKET_NAME")
AWS_ENDPOINT_URL_S3 = os.getenv("AWS_ENDPOINT_URL_S3")
AWS_ACCESS_KEY_ID = os.getenv("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = os.getenv("AWS_SECRET_ACCESS_KEY")
AWS_REGION = os.getenv("AWS_REGION", "auto")

Creating the S3 client:

import boto3

def create_s3_client(endpoint_url, region, access_key, secret_key):
    return boto3.client(
        "s3",
        endpoint_url=endpoint_url,
        region_name=region,
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )

Asset downloads use presigned GET URLs so clients fetch directly from the Tigris CDN:

presigned_url = s3_client.generate_presigned_url(
    "get_object",
    Params={"Bucket": bucket_name, "Key": asset.file_path},
    ExpiresIn=3600,  # 1 hour
)

Security Considerations
#

Authentication
#

The manifest endpoint is unauthenticated by design—mobile apps need updates before users log in. However, the import endpoint requires a shared secret:

OTA_IMPORT_SECRET = os.getenv("OTA_IMPORT_SECRET")

# Constant-time comparison prevents timing attacks
if not hmac.compare_digest(token, secret):
    return Response({"error": "Invalid token"}, status=401)

Asset Integrity
#

All assets are validated by SHA-256 hash. If an asset is tampered with, the hash won’t match and the update fails.

Presigned URLs
#

URLs expire after 1 hour, preventing long-term unauthorized access to bundles.

Performance Optimizations
#

Database Indexes
#

The composite index on (runtime_version, platform, is_active, -created_at) ensures the manifest query is fast even with thousands of updates:

class Meta:
    indexes = [
        models.Index(fields=["runtime_version", "platform", "is_active", "-created_at"])
    ]

Parallel Uploads
#

The publishing script uses ThreadPoolExecutor to upload assets concurrently:

from concurrent.futures import ThreadPoolExecutor, as_completed

with ThreadPoolExecutor(max_workers=10) as executor:
    futures = {executor.submit(upload_asset, asset): asset for asset in assets}
    for future in as_completed(futures):
        future.result()  # Raise if an upload failed; track progress here

For a typical update with 50 assets, this reduces publish time from 2 minutes to 15 seconds.

CDN Distribution
#

Tigris includes global edge caching. Assets are automatically distributed to edge locations near your users, reducing download times.

Operational Workflow
#

Daily Development
#

# 1. Make code changes in mobile app
cd mobile-app && git commit -am "Fix bug"

# 2. Publish OTA update to local environment
yarn publish-update:ios

# 3. Test on device
# App automatically downloads and applies update

Production Deployment
#

# 1. Publish to production
yarn publish-update:prod:ios

# 2. Monitor
# Check Django admin for update records
# Verify assets in Tigris dashboard

Rollback
#

# Mark problematic update as inactive in Django admin
# or via management shell:
python manage.py shell

>>> from jobs.models import ExpoUpdate
>>> bad_update = ExpoUpdate.objects.get(id="uuid-here")
>>> bad_update.is_active = False
>>> bad_update.save()

# Clients will now receive the previous active update

Cost Analysis
#

For my Curtain Estimator app with ~500 active users:

Tigris Storage:

  • Storage: ~200 MB (historical updates) = $0.02/month
  • Egress: ~50 GB/month (update downloads) = $1.00/month
  • Total: ~$1/month

Django Hosting (Fly.io):

  • Included in existing app hosting
  • Incremental cost: $0

Total OTA infrastructure cost: ~$1/month

Compare this to EAS Update’s pricing, which would be $300-500/year for similar usage. The self-hosted approach pays for itself immediately.

Monitoring and Debugging
#

Logging
#

The ViewSet logs all manifest requests:

logger.info(f"Manifest request: platform={platform}, runtime={runtime_version}")

Django Admin
#

Register the models in Django admin for easy inspection:

@admin.register(ExpoUpdate)
class ExpoUpdateAdmin(admin.ModelAdmin):
    list_display = ["platform", "runtime_version", "is_active", "created_at"]
    list_filter = ["platform", "is_active", "runtime_version"]
    search_fields = ["description"]

Client-Side Debugging
#

Enable update logs in your app:

import * as Updates from 'expo-updates';

Updates.checkForUpdateAsync().then(update => {
  console.log('Update available:', update.isAvailable);
  console.log('Manifest:', update.manifest);
});

Limitations and Gotchas
#

Runtime Version Matching
#

The most common issue: clients only download updates matching their runtime version. If your app is on runtime 1.0.0 but you publish for 1.0.1, updates won’t be delivered.

Solution: Keep runtime version in sync with your app builds, and increment it only when native code changes.

createdAt Timestamp
#

The expo-updates client compares the manifest’s createdAt timestamp against the embedded bundle’s commitTime. Updates are only applied if createdAt is newer.

Fix: The viewset overrides createdAt with the database timestamp:

manifest_data["createdAt"] = update.created_at.strftime("%Y-%m-%dT%H:%M:%S.%fZ")

For local development, if you build a binary after publishing an OTA, re-publish the OTA to ensure a newer timestamp.

Asset Cleanup
#

Old updates accumulate in Tigris. Currently, cleanup is manual:

# Delete updates older than 30 days
from datetime import timedelta
from django.utils import timezone

cutoff = timezone.now() - timedelta(days=30)
old_updates = ExpoUpdate.objects.filter(created_at__lt=cutoff, is_active=False)

for update in old_updates:
    # Delete assets from S3
    for asset in update.assets.all():
        s3_client.delete_object(Bucket=bucket_name, Key=asset.file_path)
    # Delete DB records
    update.delete()

Consider adding this to a scheduled task.
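One way to make the cleanup schedulable is to factor the loop into a function that a cron job or Celery beat task can call. The function and its injectable `delete_object` parameter below are illustrative, not from the codebase, but injecting the S3 deletion makes the logic testable without a real bucket:

```python
def cleanup_old_updates(old_updates, delete_object) -> int:
    """Delete S3 objects, then DB records, for each inactive update.

    old_updates:   iterable of ExpoUpdate-like objects (e.g. the queryset above)
    delete_object: called with each asset's file_path; in production this would
                   wrap s3_client.delete_object(Bucket=..., Key=...)
    Returns the number of updates removed.
    """
    removed = 0
    for update in old_updates:
        # Remove storage first so a crash leaves DB rows pointing at nothing
        # rather than orphaned S3 objects with no DB record
        for asset in update.assets.all():
            delete_object(asset.file_path)
        update.delete()
        removed += 1
    return removed
```

A management command or scheduled task can then build the queryset with the 30-day cutoff and hand it to this function.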

Future Enhancements
#

Rollout Control
#

Add a rollout_percentage field to gradually release updates:

rollout_percentage = models.IntegerField(default=100)

# In the manifest view (assumes `import hashlib` and a stable per-device
# user_id, e.g. sent by the client in a custom header):
if update.rollout_percentage < 100:
    # Hash the user ID and check whether they fall inside the rollout group
    user_hash = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    if (user_hash % 100) >= update.rollout_percentage:
        return Response(status=204)  # No update for this device
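The bucketing logic is easy to unit-test if it lives in its own helper (a sketch; the view would call it with a stable per-device identifier so a user's bucket never changes as the percentage grows):

```python
import hashlib


def in_rollout(user_id: str, rollout_percentage: int) -> bool:
    """Deterministically assign a user to the rollout group.

    The same user_id always maps to the same bucket in [0, 100), so a
    device that has received an update keeps receiving it as the
    percentage is raised.
    """
    user_hash = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return (user_hash % 100) < rollout_percentage
```

Raising `rollout_percentage` from 10 to 50 only ever adds devices to the group; it never drops a device that already updated.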

Multi-Environment Support
#

Add an environment field to serve different updates for staging vs production:

environment = models.CharField(max_length=20, default="production")

# Client sends environment in custom header
environment = request.META.get("HTTP_X_UPDATE_ENVIRONMENT", "production")
update = ExpoUpdate.objects.filter(environment=environment, ...).first()

Analytics
#

Track download metrics:

class ExpoUpdateDownload(models.Model):
    update = models.ForeignKey(ExpoUpdate, on_delete=models.CASCADE)
    user_id = models.CharField(max_length=255, null=True)
    platform = models.CharField(max_length=10)
    downloaded_at = models.DateTimeField(auto_now_add=True)

Conclusion
#

Self-hosting Expo OTA updates with Django and S3-compatible storage is simpler than you’d expect. The whole thing is:

  • ~150 lines of Django models and views
  • ~200 lines of publishing script
  • ~$1/month in infrastructure costs

The code examples here are from my production Curtain Estimator app. Adapt them to whatever you’re building.


For the full spec, check out the Expo Updates Protocol Specification.