Company Name: Partners Health Management
Job Details: Remote, Full Time
Job Url: https://hiring.cafe/viewjob/xlnkhsyqz13yf50n

Job Description:

Senior Data Engineer (Remote) @ Partners Health Management
Locations: Elkin, Davidson, Mocksville, Albemarle, Winston-Salem, Monroe, Kannapolis, Statesville, Hickory, or Gastonia
Remote | Full Time

Responsibilities: Architect data pipelines, optimize queries, lead data governance
Requirements Summary: Senior data engineer with extensive Snowflake expertise, Azure data services, SQL Server, and analytics leadership.
Technical Tools Mentioned: Snowflake, Snowpark, SQL, Power BI, Azure DevOps, GitHub Actions, Kafka, Event Hubs, Python, Scala

Competitive Compensation & Benefits Package! Position eligible for:
- Annual incentive bonus plan
- Medical, dental, and vision insurance with low deductible/low cost health plan
- Generous vacation and sick time accrual
- 12 paid holidays
- State Retirement (pension plan)
- 401(k) Plan with employer match
- Company paid life and disability insurance
- Wellness Programs
- Public Service Loan Forgiveness Qualifying Employer

See attachment for additional details.

Location: Remote option; flexible for any of Partners locations
Projected Hiring Range: Depending on experience
Closing Date: Open until filled

Primary Purpose of Position:
Partners Health Management is establishing a new Enterprise Data Ecosystem to modernize and unify our data platform capabilities. We are seeking a highly skilled and motivated Senior Enterprise Data Engineer with expert-level Snowflake development and optimization skills, complemented by strong Azure Data Services and SQL Server experience. This role will lead the architecture, design, and implementation of advanced Snowflake solutions, including complex data models, transformations, and performance tuning, ensuring seamless integration with Azure-based services and on-premise SQL environments.
This role will be instrumental in enabling data-driven decisions across the organization by developing scalable, secure, and well-governed data solutions.

Role and Responsibilities:
This position will play a key role in building a scalable and secure data ecosystem across cloud and hybrid environments. The Enterprise Data Engineer will collaborate with technical and business stakeholders to deliver high-quality data assets, reusable models, and actionable insights. The engineer will also help establish enterprise-wide data practices that support analytics, reporting, and strategic operations.

Architecture & Solution Design
- Architect, design, and implement advanced, enterprise-grade Snowflake data pipelines and integrations leveraging features such as tasks, streams, external tables, and secure data sharing.
- Lead Snowflake-centric data modeling (dimensional, data vault, and hybrid) to deliver high-performance, cost-optimized solutions.
- Design frameworks for real-time and batch data processing.
- Develop and implement a comprehensive analytics and reporting strategy aligned with Partners and NC DHHS goals and initiatives.

Advanced Data Engineering & Analytics Enablement
- Develop advanced transformation logic directly in Snowflake using SQL, Snowpark (Python or Scala), and stored procedures.
- Build and maintain reusable semantic models in Power BI for self-service analytics.
- Implement materialized views, clustering keys, and query optimization techniques for maximum performance.
- Deliver reusable Snowflake datasets to power enterprise analytics in Power BI and other BI tools.
- Develop automated monitoring and alerting for data quality and pipeline health.

Data Governance, Security & Compliance
- Establish and enforce standards for data governance, lineage, metadata, and privacy.
- Implement Snowflake role-based access control (RBAC), dynamic data masking, and row-level security.
- Apply advanced Snowflake governance capabilities, including object tagging, data classification, and audit logging, to meet HIPAA, GDPR, and other regulatory requirements.

DevOps & Continuous Improvement
- Lead CI/CD for Snowflake deployments using Git-based workflows with Azure DevOps or GitHub Actions.
- Implement automated testing and schema change management for Snowflake environments.
- Continuously monitor Snowflake performance and storage usage for cost optimization.
- Perform performance tuning, cost optimization, and scalability planning.

Leadership & Collaboration
- Mentor junior data engineers and provide technical guidance.
- Collaborate with analysts, developers, and business leaders to translate business needs into scalable solutions.
- Communicate technical concepts to non-technical stakeholders effectively.

Knowledge, Skills and Abilities:
- Expert-level SQL with deep Snowflake-specific expertise in query optimization, performance tuning, and architecture.
- Strong understanding of data warehousing, ETL/ELT, and modern cloud data architectures.
- Strong knowledge of integrating Snowflake with BI platforms and related tools.
- Proven track record of designing high-performance, cost-efficient data solutions.
- Hands-on experience with Snowpark, Snowflake tasks, streams, and time travel.
- Knowledge of event streaming platforms (Kafka, Event Hubs) is a plus.
- Strong leadership, problem-solving, and communication skills.

Education/Experience Required:
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field; or equivalent experience.
- 5+ years of experience in data engineering, with at least 3 years of direct Snowflake architecture and development experience.
- Proven track record of delivering enterprise-grade Snowflake solutions integrated with Azure.
Education/Experience Preferred:
- SnowPro Advanced Architect or SnowPro Core certification.
- Experience with Snowpark, Snowflake security administration, and advanced cost optimization techniques.
- Healthcare or Medicaid data experience.

Licensure/Certification Requirements: N/A
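One responsibility above calls for automated monitoring and alerting for data quality and pipeline health. A minimal sketch of such a threshold-based rule check follows; all names, thresholds, and table statistics are hypothetical illustrations, not taken from the posting:

```python
# Hypothetical sketch of an automated data-quality check: evaluate simple
# rules (minimum row count, maximum null rate) against thresholds and
# collect alert messages for downstream alerting.
from dataclasses import dataclass


@dataclass
class TableStats:
    name: str
    row_count: int
    null_rate: float  # fraction of nulls in key columns, 0.0-1.0


def check_quality(stats: TableStats,
                  min_rows: int = 1,
                  max_null_rate: float = 0.05) -> list[str]:
    """Return a list of alert messages; an empty list means the table passed."""
    alerts = []
    if stats.row_count < min_rows:
        alerts.append(f"{stats.name}: row count {stats.row_count} "
                      f"below minimum {min_rows}")
    if stats.null_rate > max_null_rate:
        alerts.append(f"{stats.name}: null rate {stats.null_rate:.1%} "
                      f"exceeds {max_null_rate:.1%}")
    return alerts


if __name__ == "__main__":
    ok = TableStats("claims_fact", row_count=120_000, null_rate=0.01)
    bad = TableStats("member_dim", row_count=0, null_rate=0.20)
    print(check_quality(ok))   # passes: no alerts
    print(check_quality(bad))  # fails both rules
```

In practice the statistics would be gathered from warehouse metadata queries and the alert list routed to a notification channel; this sketch only illustrates the rule-evaluation shape.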