Job Title: Sr. Data Engineer
Company Name: Akido
Location: Los Angeles or United States (Remote)
Compensation: $177k-$195k/yr
Employment Type: Full-Time
Job URL: https://hiring.cafe/viewjob/lk67aez1wavln47g

Responsibilities: Design pipelines, lead sessions, collaborate with teams
Requirements Summary: 5+ years software engineering; 4+ years data engineering; experience with Snowflake, Airflow, DBT; Python and SQL; AWS; ability to design data pipelines and collaborate with teams.
Technical Tools Mentioned: Snowflake, Airflow, DBT, AWS, Python, SQL, Git

About Akido
Akido builds AI-powered doctors. Akido is the first AI-native care provider, combining cutting-edge technology with a nationwide medical network to address America’s physician shortage and make exceptional healthcare universal. Its AI empowers doctors to deliver faster, more accurate, and more compassionate care. Serving 500K+ patients across California, Rhode Island, and New York, Akido offers primary and specialty care in 26 specialties, from serving unhoused communities in Los Angeles to ride-share drivers in New York. Founded in 2015 (YC W15), Akido is expanding its risk-bearing care models and scaling ScopeAI, its breakthrough clinical AI platform. Read more about Akido’s $60M Series B. More info at Akidolabs.com.

The Opportunity
Akido is hiring a Senior Data Engineer to design, build, and own our modern data platform.
What you’ll do
- Design, build, and own data pipelines using DBT, Airflow, and Snowflake
- Lead architectural design sessions for the modern data stack, focusing on solutions that integrate seamlessly with our technology stack: Snowflake, Airflow, DBT, and AWS data services
- Work with our data science and product management teams to design, rapidly prototype, and productize new data products and capabilities
- Collaborate cross-functionally with product, data science, and engineering leadership
- Participate in code reviews to ensure code quality and spread knowledge
- Mentor teammates and contribute to a culture of continuous learning

Who you are
- Minimum 5 years of professional software engineering experience; bachelor’s degree in computer science or a related field (or equivalent practical experience)
- 4+ years of experience in a data engineering role with deep exposure to a modern data stack such as Snowflake, Airflow, DBT, and AWS data services (or similar)
- Expertise in applying Python and SQL to execute complex data operations, customize ETL/ELT processes, and perform advanced data transformations across the platform
- Experience partnering with analytics and business stakeholders to translate requirements into scalable data solutions
- Strong experience with version control systems (GitHub, GitLab)
- Demonstrated ability to effectively leverage AI coding agents (such as Cursor or Copilot) while maintaining high standards for code quality, security, and correctness
- Top-notch communication skills
- A mission-oriented mindset and a strong desire to continuously learn and improve
- Familiarity with AWS environments

Preferred Qualifications/Skills
- Kafka, Sigma, and experience working with healthcare data
- Expertise in establishing data quality assurance frameworks
- Hands-on experience with AWS infrastructure and data services
- Hands-on experience integrating external and internal systems via REST APIs, including ingesting and transforming JSON-based payloads at scale

Benefits
- Stock-options package
- Health benefits, including medical, dental, and vision
- 401K
- Long-term disability
- Unlimited PTO
- Life insurance
- Paid Leave Program

Salary range: $176,875-$194,688 USD

Akido Labs, Inc. is an equal opportunity employer, and we encourage qualified applicants of every background, ability, and life experience to contact us about appropriate employment opportunities.