Job Url: https://www.remoterocketship.com/company/northbeam/jobs/senior-software-engineer-python-data-infrastructure-united-states-remote

Northbeam is a marketing intelligence platform that specializes in multi-touch attribution and media mix modeling, powered by machine learning, to enhance profitable growth. The platform offers tools like omnichannel planning, real-time data analytics, and creative analytics to optimize ad spend, improve cross-channel performance, and drive growth for ecommerce brands. Trusted by top brands, Northbeam provides data-driven insights into customer journeys and ad performance with a focus on first-party data. Its solutions help businesses understand conversion paths, optimize budget mixes, and tackle creative fatigue for maximum return on ad spend.

11 - 50 employees · Founded 2020 · 🛍️ eCommerce · ☁️ SaaS · 🤖 Artificial Intelligence · 💰 $15M Series A on 2022-08

Senior Software Engineer, Python, Data & Infrastructure
🇺🇸 United States – Remote · 💵 $170k - $200k / year · ⏰ Full Time · 🟠 Senior · 🔙 Backend Engineer
Skills: Airflow, AWS, BigQuery, Cloud, Distributed Systems, Docker, ERP, ETL, Google Cloud Platform, GraphQL, Kubernetes, Python, SQL, Terraform

📋 Description
• Northbeam is fundamentally a data product - the whole company. We don’t sell shoes, or ads, or games. We sell data: quality integrations with a variety of platforms, fresh and reliable data pulls, robust data ingest APIs, correct aggregations, and algorithmic insights on top of that data—all packaged into a user-facing application.
• High-quality, reliable data integration is at the core of what we do, and your work will directly shape the company’s success. But building great data products also means building great infrastructure: scalable systems, resilient pipelines, and reliable platforms that empower everything else.
• We are looking for a Senior Software Engineer with experience in data integration, API-based ETL pipelines, and cloud-native architecture, plus a strong interest in infrastructure engineering. You’ll help us not just ship data pipelines, but also design the systems that make them observable, secure, and resilient at scale.
• You’ll work with a small engineering team to create a platform that consolidates third-party data from advertising platforms, e-commerce systems, customer data warehouses, ERP, POS, and CRM systems. Along the way, you’ll tackle questions of scalability, multi-tenancy, system reliability, data validation, cost optimization, and developer ergonomics.
• Curiosity, willingness to do the hard thing, and enjoyment of a startup pace of development will be key to success in this role.
• This is a startup. The only constant is change. Early on, you can expect to:
• Design and implement scalable, high-performance data pipelines to ingest and transform data from a variety of sources, with reliability and observability baked in (a rough extract-and-load sketch appears at the end of this listing).
• Engineer the infrastructure behind those pipelines, including containerized workloads, orchestration, monitoring, and CI/CD that enables the team to move quickly without breaking things.
• Build and maintain APIs that enable flexible, secure, and tenant-aware data integrations with external systems.
• Balance event-driven and batch processing architectures, ensuring data freshness, correctness, and cost efficiency.
• Implement observability, monitoring, and alerting to track system health, failures, and performance issues—covering both data quality and infrastructure reliability.
• Contribute to platform resilience by designing for fault tolerance, autoscaling, and graceful failure handling in a multi-tenant cloud environment.
• Collaborate across data engineering, infrastructure, and product teams to ensure that the integration platform is flexible and extensible, and makes it easy to onboard new data sources.

🎯 Requirements
• 5+ years of experience in data engineering, software engineering, or infrastructure-focused engineering, with a focus on ETL, APIs, and cloud-native orchestration.
• Strong proficiency in Python.
• Experience with API-based ETL, handling REST, GraphQL, and webhooks.
• Experience implementing authentication flows.
• Proficiency in SQL and BigQuery.
• Experience with orchestration frameworks (e.g., Airflow, Prefect) to manage and monitor complex workflows (a minimal orchestration sketch appears at the end of this listing).
• Familiarity with containerization (Docker, Kubernetes) and cloud infrastructure (GCP/AWS) to deploy and scale workloads.
• Strong grounding in infrastructure as code (Terraform, Pulumi, CloudFormation) for repeatable, auditable environments.
• Ability to drive rapid development while ensuring maintainability, balancing short-term delivery with long-term platform stability.

🏖️ Benefits
• Equity package
• Comprehensive healthcare benefits (medical, dental, and vision)
• 401(k) plan
• Flexible PTO policy
• 12 company-paid holidays
• 12 weeks of paid parental leave
• $500 work-from-home stipend
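As a rough illustration of the API-based ETL work the role describes, here is a minimal Python sketch of a paginated REST extraction with retries that lands rows in BigQuery. It is a sketch under assumptions, not Northbeam's actual code: the endpoint, pagination scheme, and table name are hypothetical, and it relies on the requests and google-cloud-bigquery libraries.

```python
"""Minimal sketch: pull paginated records from a (hypothetical) REST API
and load them into BigQuery. Endpoint, auth, and table names are made up
for illustration; error handling is deliberately simple."""
import time
from typing import Iterator

import requests
from google.cloud import bigquery

API_URL = "https://api.example-ads-platform.com/v1/campaign_stats"  # hypothetical endpoint
BQ_TABLE = "my-project.raw_ads.campaign_stats"                      # hypothetical table
MAX_RETRIES = 3


def fetch_pages(api_token: str) -> Iterator[list[dict]]:
    """Yield one page of records at a time, retrying transient failures."""
    cursor = None
    while True:
        params = {"limit": 500, **({"cursor": cursor} if cursor else {})}
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                resp = requests.get(
                    API_URL,
                    headers={"Authorization": f"Bearer {api_token}"},
                    params=params,
                    timeout=30,
                )
                resp.raise_for_status()
                break
            except requests.RequestException:
                if attempt == MAX_RETRIES:
                    raise
                time.sleep(2 ** attempt)  # simple exponential backoff
        payload = resp.json()
        yield payload["records"]
        cursor = payload.get("next_cursor")
        if not cursor:  # no more pages
            return


def load_to_bigquery(api_token: str) -> int:
    """Stream each page into BigQuery and return the total row count loaded."""
    client = bigquery.Client()
    total = 0
    for page in fetch_pages(api_token):
        errors = client.insert_rows_json(BQ_TABLE, page)  # streaming insert
        if errors:
            raise RuntimeError(f"BigQuery rejected rows: {errors}")
        total += len(page)
    return total
```

A production pipeline would likely use batched load jobs instead of streaming inserts, parameterize credentials and tables per tenant, and emit metrics around each page; the sketch only shows the shape of the extract-and-load step.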
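Similarly, here is a loose Airflow 2-style sketch of how such a job might be orchestrated, with retries and a failure callback standing in for the monitoring and alerting the requirements mention. DAG and task names, the alert hook, and the freshness check are assumptions for illustration, not a description of Northbeam's setup.

```python
"""Minimal Airflow sketch: schedule the (hypothetical) extract-and-load job
with retries and a failure callback that stands in for real alerting."""
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    """Placeholder alert hook; a real setup might page on-call or post to Slack."""
    print(f"Task {context['task_instance'].task_id} failed: {context.get('exception')}")


def run_extract_and_load(**_):
    # Would call something like load_to_bigquery() from the previous sketch.
    ...


def check_freshness(**_):
    # Would query BigQuery and raise if the newest row is older than expected.
    ...


default_args = {
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,
}

with DAG(
    dag_id="ads_platform_ingest",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_and_load = PythonOperator(
        task_id="extract_and_load",
        python_callable=run_extract_and_load,
    )
    freshness_check = PythonOperator(
        task_id="freshness_check",
        python_callable=check_freshness,
    )
    extract_and_load >> freshness_check  # only check freshness after a successful load
```

The two tasks map onto the responsibilities above in the simplest possible way: an ingest step followed by a data-quality gate, with retries and the failure callback standing in for fuller observability.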