Job Url: https://jobs.lever.co/gravie/8f3059e3-a256-4edb-bd92-173621797676

Job Description: Senior Data Engineer
Minneapolis, MN / Remote | Engineering | Full Time | Hybrid

Hi, we’re Gravie. Our mission is to improve the way people purchase and access healthcare through innovative, consumer-centric health benefit solutions that people can actually use. Our industry-changing products and services are developed and delivered by a diverse group of unique people. We encourage you to be your authentic self - we like you that way.

A Little More About the Role:
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team within the healthcare insurance industry. In this critical role, you will be instrumental in designing, building, and maintaining robust and scalable data pipelines, data models, and data solutions that power our analytics, reporting, and operational systems. You will leverage your expertise in data engineering best practices, cloud technologies (AWS), orchestration tools (Airflow), and visualization platforms (Tableau) to deliver high-quality, reliable, and performant data assets. Your deep understanding of healthcare data and the insurance landscape will be key to your success.

You will:
· Lead the design and implementation of efficient, scalable, and well-documented data models (dimensional, relational, and potentially NoSQL) for various analytical and reporting needs within a healthcare insurance context.
· Develop, optimize, and maintain complex data pipelines using various tools and technologies, ensuring data quality, integrity, and timely delivery.
· Implement and enforce data governance best practices, including data lineage, metadata management, data quality checks, and data security protocols specific to healthcare regulations (e.g., HIPAA).
· Design, build, and optimize data solutions within the AWS ecosystem (e.g., S3, Redshift, Glue, EMR, Athena, Lambda, Kinesis), ensuring cost-effectiveness and performance.
· Develop, manage, and monitor data workflows using Apache Airflow to automate data ingestion, transformation, and loading processes.
· Collaborate closely with data analysts, business intelligence developers, and business stakeholders to understand reporting and analytical requirements and provide robust data foundations.
· Identify and resolve data performance bottlenecks, ensuring efficient query execution and data retrieval for various use cases.
· Promote and enforce best practices in data engineering, including version control, testing, documentation, and code reviews.
· Troubleshoot complex data-related issues, identify root causes, and implement sustainable solutions.

You bring:
· Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related quantitative field.
· 5+ years of progressive experience as a Data Engineer, with a strong focus on data modeling, data pipeline development, and data management.
· 2+ years of experience working specifically within the healthcare insurance industry, with a solid understanding of claims, enrollment, provider, clinical, and related healthcare datasets.
· Proficiency in designing and implementing data models (e.g., star schemas, snowflake schemas, 3NF).
· Extensive experience with AWS data services (e.g., S3, Redshift, Glue, EMR, Athena, Lambda, Kinesis, Lake Formation).
· Strong experience with Apache Airflow for workflow orchestration, scheduling, and monitoring.
· Proficiency in SQL for complex data manipulation, querying, and optimization.
· Experience with at least one scripting/programming language for data engineering tasks (e.g., Python, Scala).
· Hands-on experience supporting reporting and analytics teams with data preparation and optimization for tools like Tableau (direct Tableau development experience is a plus, but the primary focus is data enablement).
· Experience with data warehousing concepts and technologies.
· Familiarity with data governance principles and healthcare data privacy regulations (e.g., HIPAA).
· Excellent problem-solving skills and attention to detail.
· Strong communication and collaboration skills, with the ability to translate technical concepts for non-technical stakeholders.
· Eligibility to work in the United States without need of sponsorship, now or in the future.

Extra credit:
· Experience with other BI tools or data visualization platforms.
· Knowledge of Big Data technologies beyond AWS native services (e.g., Spark, Hadoop).
· Experience with real-time data streaming technologies.
· Familiarity with CI/CD practices for data pipelines.