Job Title: Senior Data Engineer
Company Name: tapouts
Job Details: Remote, Full-Time
Job Url: https://hiring.cafe/viewjob/vfuiljhj5hbqplu6
Posted: 1 week ago
Responsibilities: design pipelines, build dashboards, develop queries
Requirements Summary: 5+ years of data engineering with SQL and Python; cloud platforms (AWS/GCP/Azure); ETL/ELT; data warehousing; orchestration tools; big data frameworks
Technical Tools Mentioned: SQL, Python, AWS, GCP, Azure, Apache Airflow, Prefect, Apache Spark, Kafka, Flink, Snowflake, BigQuery, Redshift, dbt, Delta Lake, Apache Iceberg

Senior Data Engineer
Location: Remote (Venice Beach or United States)
Type: Full-Time
Note: We will only consider candidates who are located in Latin America.

About Us
At tapouts, we believe in the boundless potential of every child. Our mission goes beyond teaching skills; we are dedicated to nurturing the emotional and psychological well-being of the next generation. Imagine being part of a team that transforms the lives of a million children and their families. By joining tapouts, you are not just taking on a job but using your talents for a deeply rewarding cause.

About the Role
We are looking for a Senior Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining scalable data infrastructure that powers our analytics, AI initiatives, and business operations.
This is a hands-on role for someone who thrives in fast-paced environments, thinks like a platform architect, and is passionate about building data systems that matter.

Key Responsibilities
- Design, build, and maintain robust, scalable data pipelines (batch and real-time/streaming)
- Design and develop dashboards that surface key business metrics and enable strategic, data-informed decision-making
- Develop and optimize complex SQL queries, stored procedures, and data models
- Write clean, production-grade Python code for data ingestion, transformation, and automation
- Build and manage cloud-native data infrastructure on AWS, GCP, or Azure
- Implement and maintain data lakehouse architectures (e.g., Delta Lake, Apache Iceberg)
- Support ML workflows, including feature engineering, model-training pipelines, and MLOps integration
- Ensure data quality, governance, and lineage tracking across all data assets
- Collaborate with data scientists and analysts to deliver trusted, well-documented datasets
- Monitor pipeline performance, troubleshoot issues, and optimize for cost and efficiency
- Contribute to the development of internal data platform tools and frameworks
- Apply data governance best practices and ensure compliance with data privacy regulations (GDPR, LGPD)

What We're Looking For
- A platform-first mindset: you think beyond individual pipelines and consider ownership, reliability, and long-term maintainability
- A data-driven approach: you use metrics to measure pipeline health and continuously improve
- Strong communication skills: you can collaborate with technical and non-technical stakeholders
- Comfort working in ambiguous, fast-moving environments and bringing structure to chaos
- A passion for continuous learning: you stay current with the latest tools and trends in data engineering

Requirements

Must-Have:
- 5+ years of experience in data engineering or a related field
- Strong proficiency in SQL: writing complex queries, optimizing performance, and data modeling
- Strong proficiency in Python: building ETL/ELT pipelines, scripting, and automation
- Experience with cloud platforms: AWS, GCP, or Azure
- Hands-on experience with data orchestration tools (Apache Airflow, Prefect, or similar)
- Experience with big data frameworks (Apache Spark, Kafka, Flink, or similar)
- Familiarity with data warehousing solutions (Snowflake, BigQuery, Redshift, or similar)
- Strong understanding of data modeling, schema design, and data architecture principles

Nice to Have:
- Experience with dbt (data build tool) and the modern data stack
- Familiarity with streaming and event-driven architectures
- Knowledge of MLOps and AI pipeline support
- Experience with data mesh or data platform engineering
- Familiarity with data governance frameworks and tools (data lineage, data cataloging)

Benefits
tapouts is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will be considered regardless of race, color, religion, gender, sexual orientation, national origin, genetics, disability, or age.

Join us in our mission to empower children with the social and emotional skills they need to succeed!
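For candidates wondering what the SQL-and-Python pipeline work described above looks like in practice, here is a purely illustrative sketch of a minimal ELT step. It uses SQLite from the Python standard library as a stand-in for a real warehouse, and the table names and schema are hypothetical, not tapouts' actual stack:

```python
import sqlite3

# Toy ELT step: load raw rows, then transform them with SQL inside the
# "warehouse" (SQLite here; a real role would target Snowflake/BigQuery/etc.).
# Schema and table names are hypothetical, for illustration only.

def run_daily_rollup(conn: sqlite3.Connection) -> list[tuple]:
    """Load raw events, then build a daily summary table with SQL."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_events "
        "(user_id TEXT, event_date TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO raw_events VALUES (?, ?, ?)",
        [
            ("u1", "2024-01-01", 10.0),
            ("u1", "2024-01-01", 5.0),
            ("u2", "2024-01-02", 7.5),
        ],
    )
    # The "T" in ELT: aggregate inside the database rather than in Python.
    conn.execute(
        """
        CREATE TABLE daily_summary AS
        SELECT event_date, COUNT(*) AS events, SUM(amount) AS total
        FROM raw_events
        GROUP BY event_date
        ORDER BY event_date
        """
    )
    return conn.execute("SELECT * FROM daily_summary").fetchall()

if __name__ == "__main__":
    rows = run_daily_rollup(sqlite3.connect(":memory:"))
    print(rows)  # [('2024-01-01', 2, 15.0), ('2024-01-02', 1, 7.5)]
```

In a production setting, a step like this would typically be scheduled and retried by an orchestrator such as Apache Airflow or Prefect rather than run by hand.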
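The responsibilities also mention ensuring data quality before datasets are handed to analysts. As a minimal sketch of that idea, the function below counts rows failing simple null/range checks; the rule names and row shape are hypothetical, not a real framework:

```python
# Illustrative data-quality gate: count rows failing basic validity checks
# before a load is promoted. Hypothetical rules, for illustration only.

def quality_report(rows: list[dict]) -> dict:
    """Return counts of rows failing simple null and range checks."""
    failures = {"missing_user_id": 0, "negative_amount": 0}
    for row in rows:
        if not row.get("user_id"):
            failures["missing_user_id"] += 1
        if row.get("amount", 0.0) < 0:
            failures["negative_amount"] += 1
    return failures

if __name__ == "__main__":
    batch = [
        {"user_id": "u1", "amount": 12.5},
        {"user_id": "", "amount": -3.0},
    ]
    print(quality_report(batch))  # {'missing_user_id': 1, 'negative_amount': 1}
```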