Job URL: https://www.remoterocketship.com/company/akubeinc/jobs/senior-data-engineer-united-states-remote/

aKUBE is a recruitment firm that specializes in connecting top-tier IT professionals with innovative companies. It emphasizes a quality-focused approach to recruiting, ensuring a strong match between candidates and company values. By leveraging advanced technology and a refined process, aKUBE streamlines staffing solutions that align with client timelines, offering contract staffing, direct-hire placements, and payroll management. Its commitment to performance and client partnerships positions aKUBE as a trusted partner in achieving business success through tailored staffing.

51 - 200 employees · 🎯 Recruiter · ☁️ SaaS

Senior Data Engineer
Posted September 12
🏄 California – Remote
💵 $92/hour
⏳ Contract/Temporary
🟠 Senior
🚰 Data Engineer

Tags: Airflow, Amazon Redshift, Apache, AWS, BigQuery, Cloud, EC2, ETL, Hadoop, HDFS, Java, PySpark, Python, Scala, Spark, SQL

📋 Description
• Design, build, and optimize large-scale data pipelines and warehousing solutions
• Develop ETL workflows in Big Data environments across cloud, on-prem, or hybrid setups
• Collaborate with Data Product Managers, Architects, and Engineers to deliver scalable and reliable data solutions
• Define data models and frameworks for data warehouses and marts supporting analytics and audience engagement
• Maintain strong documentation practices for data governance and quality standards
• Ensure solutions meet SLAs and operational-efficiency targets, and support analytics/data science teams
• Contribute to Agile/Scrum processes and continuously drive team improvements

🎯 Requirements
• 6+ years of experience in data engineering with large, distributed data systems
• Strong SQL expertise and experience with MPP databases (Snowflake, Redshift, or BigQuery)
• Expertise in Big Data engineering pipelines
• Hands-on experience with Apache Spark (PySpark, Scala) and the Hadoop ecosystem (HDFS, Hive, Presto)
• Proficiency in Python, Scala, or Java
• Experience with cloud environments (AWS – S3, EMR, EC2)
• Experience with orchestration/ETL tools such as Airflow
• Data warehousing and data modeling knowledge
• Familiarity with Agile methodologies
• Bachelor's degree in STEM required
• Work authorization: Green Card, US Citizen, or valid EAD (OPT and CPT excluded; H-1B not accepted)
• No C2C, 1099, or subcontractors; W2 only

🏖️ Benefits
• Remote work
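As a rough illustration of the extract-transform-load workflow the role centers on, here is a minimal, self-contained Python sketch. All names and data are hypothetical and chosen for illustration only; in the actual role this kind of logic would typically run on PySpark against Redshift, Snowflake, or BigQuery, with SQLite standing in here for the warehouse so the example runs anywhere:

```python
import sqlite3

# Toy "raw events" standing in for files landed in S3 (illustrative data only).
RAW_EVENTS = [
    {"user": "a", "country": "US", "amount": "10.5"},
    {"user": "b", "country": "us", "amount": "3.0"},
    {"user": "a", "country": "US", "amount": None},   # dirty row to filter out
    {"user": "c", "country": "DE", "amount": "7.25"},
]

def extract(events):
    """Extract: in production this might read Parquet from S3 via Spark."""
    return list(events)

def transform(rows):
    """Transform: drop dirty rows, normalize casing and types, then
    aggregate revenue per country (a typical mart-level rollup)."""
    totals = {}
    for row in rows:
        if row["amount"] is None:
            continue  # basic data-quality filter
        country = row["country"].upper()
        totals[country] = totals.get(country, 0.0) + float(row["amount"])
    return sorted(totals.items())

def load(aggregates, conn):
    """Load: write the rollup to a warehouse table (SQLite as a stand-in)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS revenue_by_country "
        "(country TEXT PRIMARY KEY, revenue REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO revenue_by_country VALUES (?, ?)", aggregates
    )
    conn.commit()

def run_pipeline():
    """Wire the three stages together and return the loaded rows."""
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_EVENTS)), conn)
    return conn.execute(
        "SELECT country, revenue FROM revenue_by_country ORDER BY country"
    ).fetchall()

if __name__ == "__main__":
    print(run_pipeline())
```

In a production pipeline each stage would be a separate task in an orchestrator such as Airflow, so failures can be retried per-stage and runs can be backfilled against SLAs.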