Data Engineering at Supabase


Job Description

Supabase is the Postgres development platform, built by developers for developers. We provide a complete backend solution including Database, Auth, Storage, Edge Functions, Realtime, and Vector Search. All services are deeply integrated and designed for growth.

Supabase is looking for an experienced data engineer to join our growing data engineering team. You'll be responsible for building and maintaining our data infrastructure that powers analytics, machine learning, and business intelligence across the company. Your work will directly impact both our internal operations and our developer community through data-driven insights and products.

We believe in giving engineers the autonomy to work efficiently while maintaining high performance standards. As we scale rapidly, we're seeking teammates who share our commitment to open source, know how to ship impactful features, and will contribute to our developer-focused culture.

The Stack

  • Data Warehouse: BigQuery (primary), PostgreSQL

  • Orchestration: Apache Airflow (Cloud Composer), Meltano

  • Data Modeling: dbt

  • Infrastructure: Google Cloud Platform, Pulumi

  • Analytics: Hex, PostHog

  • Reverse ETL: Hightouch

  • Languages: Python, SQL

The Role

Core Responsibilities

Data Pipeline Development & Maintenance

  • Design, build, and maintain scalable ETL/ELT pipelines using Airflow and Meltano

  • Develop and optimize dbt models following our established data warehouse architecture

  • Implement data quality monitoring, testing, and alerting across all pipelines

  • Manage data ingestion from 15+ sources including GitHub, HubSpot, Stripe, PostHog, Sentry, and internal PostgreSQL databases
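The data quality monitoring mentioned above can be sketched as a minimal, dependency-free batch check run before loading extracted records downstream. Field names, thresholds, and the function itself are illustrative assumptions, not Supabase's actual tooling:

```python
def check_batch(rows, required_fields, max_null_rate=0.05):
    """Validate one extracted batch before loading it downstream.

    rows: list of dicts from a source such as Stripe or HubSpot.
    Returns (ok, failures) where failures lists any checks that failed.
    The 5% null-rate threshold is an illustrative default.
    """
    failures = []
    if not rows:
        return False, ["empty batch"]

    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(
                f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.1%}"
            )

    return (not failures), failures


# Example: a batch where half the emails are missing fails the check.
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
]
ok, report = check_batch(batch, required_fields=["id", "email"])
```

In practice a check like this would run as a task between extract and load, with failures routed to alerting rather than silently loaded.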

Infrastructure & Operations

  • Manage BigQuery datasets, reservations, and slot allocation across dev/staging/prod environments

  • Deploy and maintain Airflow DAGs using Cloud Composer with custom Docker images

  • Implement infrastructure as code using Pulumi for GCP resources

  • Monitor pipeline performance and optimize for cost and efficiency

Data Architecture & Modeling

  • Build and maintain our multi-layered data warehouse architecture with standardized naming conventions

  • Design and implement data governance policies and documentation standards

  • Optimize BigQuery performance through partitioning, clustering, and materialization strategies
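The partitioning, clustering, and materialization strategies above come together in a dbt model config; the sketch below uses dbt's BigQuery adapter options, with model, table, and column names that are hypothetical rather than Supabase's actual warehouse:

```sql
-- models/marts/fct_daily_events.sql (illustrative model name)
{{ config(
    materialized='incremental',
    incremental_strategy='insert_overwrite',
    partition_by={'field': 'event_date', 'data_type': 'date'},
    cluster_by=['project_id', 'event_type']
) }}

select
    date(created_at) as event_date,
    project_id,
    event_type,
    count(*) as event_count
from {{ ref('stg_events') }}
{% if is_incremental() %}
  -- only reprocess recent partitions on incremental runs
  where date(created_at) >= date_sub(current_date(), interval 3 day)
{% endif %}
group by 1, 2, 3
```

Partitioning by date keeps incremental runs cheap (BigQuery prunes untouched partitions), while clustering narrows scans for the most common filter columns.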

Reverse ETL & Data Activation

  • Develop and maintain reverse ETL pipelines to sync data to HubSpot, Customer.io, and other downstream systems

  • Build attribution models and customer journey analytics

  • Create automated triggers for sales and marketing outreach based on data milestones

  • Implement data quality checks and monitoring for all reverse ETL processes
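The milestone-based triggers described above can be sketched as a small idempotent function that emits new events to sync downstream (e.g. to HubSpot or Customer.io). Milestone names, thresholds, and the payload shape are hypothetical:

```python
# Illustrative milestone triggers for a reverse ETL sync.
# Names and thresholds are hypothetical, not Supabase's actual milestones.
MILESTONES = [
    ("first_project_created", lambda m: m.get("projects", 0) >= 1),
    ("db_above_1gb", lambda m: m.get("db_size_gb", 0.0) >= 1.0),
]


def milestone_events(customer):
    """Return new milestone events for one customer.

    customer: dict with a "metrics" dict and a "fired" set of already
    emitted milestones, so each trigger fires at most once (idempotency
    matters because reverse ETL jobs re-run on a schedule).
    """
    fired = customer.setdefault("fired", set())
    events = []
    for name, predicate in MILESTONES:
        if name not in fired and predicate(customer["metrics"]):
            fired.add(name)
            events.append({"customer_id": customer["id"], "milestone": name})
    return events


customer = {"id": "org_123", "metrics": {"projects": 2, "db_size_gb": 0.4}}
first = milestone_events(customer)   # first_project_created fires
second = milestone_events(customer)  # nothing new on a repeated run
```

The "fired" set stands in for state the real pipeline would keep in the warehouse, so scheduled re-runs don't re-trigger outreach.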

Collaboration & Documentation

  • Work closely with Analytics Engineers, Data Scientists, and business stakeholders

  • Maintain comprehensive documentation for all data models and pipelines

  • Participate in code reviews and establish best practices for the team

  • Support other team members in using new datasets and analytical tools

Your Experience

Required Skills

  • 3+ years of production experience with Python and SQL

  • Strong experience with dbt for data modeling and transformation

  • Experience with Apache Airflow for workflow orchestration

  • Proficiency with cloud data warehouses (BigQuery preferred, but Snowflake/Redshift acceptable)

  • Experience with infrastructure as code (Terraform, Pulumi, or similar)

  • Strong understanding of data warehouse design patterns and best practices

  • Experience with Git, CI/CD pipelines, and collaborative development workflows

Bonus Points

  • Experience with Pulumi for infrastructure management

  • Knowledge of PostHog, HubSpot, or other marketing/sales tools

  • Experience with real-time data processing and streaming

  • Background in developer tools or B2B SaaS companies

  • Contributions to open source data engineering projects

What We Offer

  • Fully Remote

    We hire globally. We believe you can do your best work from anywhere. There are no Supabase offices, but we provide a WeWork membership or co-working allowance you can use anywhere in the world.

  • ESOP

    Every team member receives ESOP (equity ownership) in the company. We want everyone to share in the upside of what we’re building together.

  • Tech Allowance

    Use this budget to set up your ideal work environment—laptop, monitor, headphones, or whatever helps you do your best work.

  • Health Benefits

    Supabase covers 100% of health insurance for employees and 80% for dependents, wherever you are. Your wellbeing and your family’s health are important to us.

  • Annual Off-Sites

    Once a year, the entire company gathers in a new city for a week of connection, collaboration, and fun. It’s a highlight of our year.

  • Flexible Work

    We operate asynchronously and trust you to manage your own time. You know what needs to be done and when.

  • Professional Development

    Every team member receives an annual education allowance to spend on learning—courses, books, conferences, or anything that supports your growth.

About the Team

Supabase was born-remote and open-source-first. We believe our globally distributed team is our secret weapon in building tools developers love.

  • 120+ team members

  • 35+ countries

  • 15+ languages spoken

  • $396M raised

  • 350,000+ community members

  • 20,000+ memes posted (and counting)

We move fast, build in public, and use what we ship. If it’s in your project, we probably use it in ours too. We believe deeply in the open-source ecosystem and strive to support—not replace—existing tools and communities.

Hiring Process

We keep things simple, async-friendly, and respectful of your time:

  1. Apply – Our team will review your application.

  2. Intro Call – A short video chat to get to know each other.

  3. Interviews – Up to four calls with:

    • Founders

    • Future teammates

    • Someone cross-functional from product, growth, or engineering (depending on the role)

  4. Decision – We may follow up with a final question or go straight to offer.

All communication is remote and we aim to move fast.



About the job

  • Posted on: Sep 25, 2025

  • Apply before: Oct 25, 2025

  • Job type: Full-Time

  • Location: Worldwide
