Company Name: Relay (relay.link)
Job Details: Hiring Remotely in USA · $200K Annually · Senior level
Job Url: https://builtin.com/job/senior-data-engineer/7759407

Job Description:

About Us

Relay is a cross-chain payments system that provides instant, low-cost cross-chain bridging, swaps, and transactions. We're the fastest and cheapest way to bridge and transact across chains, serving over 5 million users who have completed 59+ million transactions with $6+ billion in volume across 85+ chains.

Our company mission is to make transacting across chains as fast, cheap, and reliable as online payments. We are building the core infrastructure to abstract chains from end-user payments, enabling the next billion users to experience the benefits of blockchain without the UX burdens. We are bringing the Relay Network to market through our consumer app (Relay.Link) and major ecosystem partners, including Opensea, Alchemy, LiFi, Metamask, Coinbase, and more.

Role Overview

We're seeking a talented Data Engineer to build the data infrastructure that powers analytics and decision-making across Relay.
You'll be the critical bridge between our TypeScript production systems and our data warehouse, enabling our analytics team to derive insights from billions in transaction volume across 85+ chains. This is a unique opportunity to build data pipelines for one of the fastest-growing cross-chain infrastructure platforms, working with high-volume blockchain data and establishing foundational data practices for the company.

Key Responsibilities

Data Pipeline Development
- Design and build reliable ETL/ELT pipelines to extract data from our TypeScript production systems into our data warehouse
- Instrument event tracking and data emission from our cross-chain relayer and application backend
- Process and transform blockchain transaction data, cross-chain events, and user activity across 85+ chains
- Build data models that support analytics on transaction volume, capital efficiency, bridge performance, and user behavior
- Ensure data quality, consistency, and reliability across all pipelines

Infrastructure & Tooling
- Establish data orchestration infrastructure using tools like Airflow, Dagster, or Prefect
- Implement monitoring, alerting, and observability for data pipelines
- Optimize query performance and data warehouse costs
- Build self-service data tools and documentation for the analytics team

Cross-Functional Collaboration
- Partner with backend engineers to understand data schemas and implement proper data capture
- Work closely with our analytics lead to understand reporting needs and design appropriate data models
- Collaborate with product and business teams to define metrics and ensure data availability
- Document data flows, schemas, and best practices

Required Qualifications

Technical Expertise
- 3-5+ years of experience building data pipelines and working with data warehouses
- Strong proficiency in SQL and data modeling techniques
- Experience with Python for data processing and pipeline orchestration
- Familiarity with TypeScript/Node.js applications and how to extract data from them (or willingness to learn quickly)
- Experience with modern data stack tools (dbt, Airflow/Dagster/Prefect, Fivetran/Airbyte, etc.)
- Knowledge of data warehouse platforms (Snowflake, BigQuery, Redshift, or similar)

What We Offer
- Competitive base salary ($200K+ depending on experience)
- Equity package
- Comprehensive health, dental, and vision insurance
- Annual company offsite
- Unlimited PTO policy with an encouraged minimum of 2 weeks annually
- Remote-first culture with emphasis on asynchronous communication and flexibility
- Opportunity to build foundational data infrastructure for the leading cross-chain payments platform
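To give candidates a feel for the pipeline responsibilities described above, here is a miniature extract → transform → quality-check → load flow in Python. Everything in it (field names, sample events, the per-chain volume aggregation) is a hypothetical illustration for this posting, not Relay's actual schema or codebase:

```python
from collections import defaultdict

# Hypothetical sample of cross-chain transaction events, standing in for
# records extracted from a TypeScript production backend.
RAW_EVENTS = [
    {"tx_id": "0xa1", "chain": "base", "amount_usd": "125.50", "status": "success"},
    {"tx_id": "0xb2", "chain": "arbitrum", "amount_usd": "80.00", "status": "success"},
    {"tx_id": "0xc3", "chain": "base", "amount_usd": "10.25", "status": "failed"},
]

def transform(events):
    """Keep successful transactions and cast amounts to numeric types."""
    rows = []
    for e in events:
        if e["status"] != "success":
            continue
        rows.append({"tx_id": e["tx_id"], "chain": e["chain"],
                     "amount_usd": float(e["amount_usd"])})
    return rows

def quality_check(rows):
    """Fail loudly on missing keys or negative amounts before loading."""
    for r in rows:
        assert r["tx_id"] and r["chain"], f"missing key fields: {r}"
        assert r["amount_usd"] >= 0, f"negative amount: {r}"
    return rows

def load(rows):
    """Aggregate volume per chain (a stand-in for a warehouse data model)."""
    volume = defaultdict(float)
    for r in rows:
        volume[r["chain"]] += r["amount_usd"]
    return dict(volume)

volume_by_chain = load(quality_check(transform(RAW_EVENTS)))
print(volume_by_chain)  # {'base': 125.5, 'arbitrum': 80.0}
```

In production this same shape would typically be expressed as orchestrated tasks (Airflow/Dagster/Prefect) and warehouse models with dbt tests, rather than in-process Python.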