Job URL: https://www.linkedin.com/jobs/search/?currentJobId=4348098558&distance=25&f_AL=true&f_TPR=r86400&f_WT=2&geoId=103644278&keywords=software%20engineer&origin=JOB_SEARCH_PAGE_JOB_FILTER&refresh=true&spellCorrectionEnabled=true&start=25

Job Description:

Data Engineer
GENNTE Technologies · United States (Remote) · Full-time
Posted 20 hours ago · Over 100 applicants · Promoted by hirer
Job poster: Namrata T., IT & Healthcare Recruiter at GENNTE Technologies

About the job

Job Summary:
We are seeking experienced Data Engineers to support a high-impact data modernization initiative. The ideal candidate will build and maintain scalable data pipelines and the foundational components of a data warehouse. This role requires strong technical expertise, the ability to work independently, and close collaboration with client teams.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines aligned with medallion architecture principles.
- Source and integrate data from diverse systems, including Oracle, MSSQL, MarkLogic, SAS, DB2, and Mainframe.
- Collaborate with client stakeholders to gather requirements and deliver solutions independently or as part of a team.
- Ensure data quality, consistency, and reliability across ingestion and transformation processes.
- Participate in code reviews, performance tuning, and documentation of data workflows.

Required Skills & Qualifications:
- 10+ years of hands-on experience in data engineering.
- Proficiency in building data pipelines using AWS, Snowflake, and Flexter.
- Strong knowledge of ETL processes and data integration from multiple sources.
- Familiarity with medallion architecture and modern data lakehouse concepts.
- Experience with scripting and programming languages such as Python and SQL.
- Understanding of MDM (master data management) and reference data management concepts.
- Excellent communication and problem-solving skills.
- Insurance domain experience is a plus but not mandatory.
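For candidates unfamiliar with the medallion architecture mentioned above, the core idea is a layered pipeline: raw "bronze" records are cleaned into validated "silver" records before aggregation into "gold" tables. The sketch below is a minimal, hypothetical Python illustration of a bronze-to-silver step (deduplication, type casting, dropping invalid rows); the record shape and field names are invented for this example and are not part of the job posting or any specific employer stack.

```python
from datetime import date

# Hypothetical raw "bronze" records, as they might arrive from a source
# system such as Oracle or MSSQL (fields are illustrative only).
bronze = [
    {"policy_id": "P-001", "premium": "1200.50", "start": "2024-01-15"},
    {"policy_id": "P-001", "premium": "1200.50", "start": "2024-01-15"},  # duplicate
    {"policy_id": "P-002", "premium": "not-a-number", "start": "2024-03-01"},
]

def to_silver(rows):
    """Bronze -> silver: deduplicate on key, cast types, drop invalid rows."""
    seen, silver = set(), []
    for row in rows:
        key = row["policy_id"]
        if key in seen:
            continue  # keep the first occurrence only
        try:
            silver.append({
                "policy_id": key,
                "premium": float(row["premium"]),          # string -> numeric
                "start": date.fromisoformat(row["start"]), # string -> date
            })
            seen.add(key)
        except ValueError:
            # A production pipeline would route bad rows to a quarantine
            # table rather than silently dropping them.
            pass
    return silver

silver = to_silver(bronze)  # one clean record survives from three raw rows
```

In a real engagement the same validate-and-promote logic would typically run as SQL or Snowpark transformations inside Snowflake rather than in-process Python, but the layering principle is identical.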