Job Title: Lead Analytics Engineer
Company Name: Sundays for Dogs
Job Url: https://www.simplyhired.com/job/jH4nloe_77dciOuMC5s5Emh3_0pspXY_4tE1PF5NfIOT5Fzv61e_Nw
Job Details: Full-time, Remote

Full Job Description

About Sundays for Dogs

Sundays for Dogs is a founder-led, direct-to-consumer brand reimagining pet food so dog owners can spend more quality time with their dogs — now and later. We make human-grade, air-dried food that supports long, healthy lives, and we build our company around shared core values that guide how we work and how we treat each other. We're headquartered in Cleveland, Ohio, with a primarily remote team of ~50 full-time employees across the U.S. While we work distributed day to day, we come together in person quarterly because relationships matter. Our business today is 100% DTC, with our subscription model at the core of how we serve customers — and we're building to scale with intention. We're just getting started.

About your team and position

Sundays is a high-growth, category-defining brand that has built a sizable business without a formal analytics engineering foundation. Now we're investing in doing data right. We're hiring our first Lead Analytics Engineer to build our transformation layer from scratch, establish trusted definitions across the company, and create a scalable dbt-powered foundation that turns rich customer data into durable business advantage.
Reporting to our newly appointed VP of Data, this is a rare opportunity to architect the analytics stack at a company that already knows its customers deeply — and is ready to get dramatically smarter. This role is fully remote within the U.S., with travel to our Cleveland headquarters for onboarding and quarterly in-person collaboration to stay connected as a team.

Your day-to-day:
- Develop and maintain high-quality dbt models, tests, documentation, and workflows to create reliable transformation pipelines.
- Build and optimize dimensional and semantic data models that power business-critical reporting and analysis.
- Partner with data consumers and stakeholders to translate business questions into well-defined datasets and semantic definitions.
- Design, build, and maintain the semantic layer for visualization tools such as Omni Analytics, Looker, or Sigma.
- Ensure data quality and trust by implementing automated testing, monitoring, and observability best practices.
- Collaborate with Engineering to scale pipelines, improve performance, and reduce latency in the analytics stack.
- Own the data modeling strategy and governance standards, including conventions, documentation, and onboarding practices.
- Mentor and guide other analytics engineers and data team members on best practices.

We'd love to hear from you if you have:
- 6+ years of experience in analytics engineering, data engineering, or related roles.
- Deep expertise with dbt, including advanced modeling, macros, and testing frameworks.
- Strong SQL skills and experience with cloud data warehouses (e.g., Snowflake, BigQuery, Databricks).
- Significant hands-on experience developing in modern visualization platforms such as Omni Analytics, Looker, or Sigma.
- A solid understanding of DTC and ecommerce metrics such as CAC, LTV, cohort analysis, retention, and forecasting.
- Experience translating business needs into structured, reusable analytics models.
- Excellent communication skills and the ability to present technical concepts to business stakeholders.

Nice to Have:
- Experience with data observability and monitoring tools.
- Familiarity with reverse ETL tools and workflows.
- Python experience for orchestration or transformation tasks.
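For candidates gauging the expected comfort level with the DTC metrics named above (CAC, LTV, cohort retention), here is a minimal illustrative sketch in Python. All figures, function names, and formulas are hypothetical examples for orientation, not company definitions — in practice these would live as dbt models over warehouse data.

```python
# Hypothetical sketches of core DTC/ecommerce metrics; not company definitions.

def cac(marketing_spend: float, new_customers: int) -> float:
    """Customer Acquisition Cost: total spend divided by customers acquired."""
    return marketing_spend / new_customers

def ltv(avg_order_value: float, orders_per_year: float,
        gross_margin: float, avg_lifespan_years: float) -> float:
    """Simple lifetime-value estimate: margin-adjusted annual revenue x lifespan."""
    return avg_order_value * orders_per_year * gross_margin * avg_lifespan_years

def cohort_retention(active_by_month: list[int]) -> list[float]:
    """Retention curve: share of the starting cohort still active each month."""
    start = active_by_month[0]
    return [round(n / start, 3) for n in active_by_month]

if __name__ == "__main__":
    print(cac(50_000, 1_000))                       # 50.0 per customer
    print(ltv(60.0, 12, 0.4, 2.5))                  # 720.0
    print(cohort_retention([1000, 620, 480, 410]))  # [1.0, 0.62, 0.48, 0.41]
```

In a production analytics stack, the same logic would typically be expressed as tested, documented dbt models so definitions stay consistent across reporting tools.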