Job Url: https://wellfound.com/jobs/3627181-data-engineer

Data Engineer
$50k – $70k • No equity | Remote (Everywhere) | 2 years of exp | Full Time
Posted: yesterday • Recruiter recently active

Hires remotely in: Everywhere
Remote Work Policy: Remote only
Company Location: New York City
Visa Sponsorship: Not Available
Preferred Timezones: Pacific Time, Eastern Time
Relocation: Not Allowed

Skills: Python, Nginx, Kubernetes, CI/CD, Apache Airflow, AWS/GCP/Azure, FastAPI, LLMs

About the job

Job Description – Data Engineer
Location: Remote in the US
Team: Data & Engineering
Seniority: Mid-Level (3–5 years)

About Acquco
Acquco is a tech-driven e-commerce company focused on acquiring, operating, and scaling third-party marketplace brands. Our engineering and data teams build the automation, intelligence, and infrastructure that drive our acquisition engine and the operational excellence of our portfolio. We are looking for a Data Engineer who is passionate about designing robust data systems, building scalable backend services, and leveraging modern AI models to power internal tools and decision-making.

What You Will Do
• Design, build, and maintain data pipelines and ETL/ELT workflows to support analytics, automation, and operational tools.
• Develop and maintain Python-based microservices and backend systems.
• Build and operate cloud infrastructure on AWS, Azure, or GCP, including storage, compute, serverless, and managed data services.
• Deploy, manage, and monitor services on Kubernetes, ensuring scalability, security, and reliability.
• Work with Airflow to schedule and orchestrate complex workflows.
• Build high-performance APIs using FastAPI, and manage reverse-proxy and routing configurations through nginx.
• Integrate and operationalize LLMs, either via APIs (OpenAI, Anthropic, etc.) or via private/self-hosted models (HuggingFace models deployed on Kubernetes).
• Implement and maintain CI/CD pipelines using GitHub Actions, with strong deployment and testing automation.
• Collaborate with data scientists, analysts, engineers, and product teams to bring new data-powered features to life.

What We're Looking For (Requirements)
• 3–5 years of experience in data engineering, backend engineering, or related roles.
• Strong programming experience in Python, including libraries for data processing and API development.
• Hands-on experience with at least one major cloud provider: AWS, Azure, or GCP.
• Experience deploying and managing workloads on Kubernetes.
• Familiarity with workflow orchestration tools, especially Airflow.
• Experience building APIs with FastAPI, and working with nginx for routing/proxying.
• Practical experience using Large Language Models (LLMs):
  - via public APIs (OpenAI, etc.), or
  - running private models (e.g., HuggingFace) in a Kubernetes environment.
• Experience with CI/CD pipelines, particularly GitHub Actions.

Bonus Points (Good to Have)
• Experience with frontend development (JavaScript, modern frameworks).
• Experience working in or with e-commerce businesses, marketplaces, or retail ops.

Who You Are
• You care about creating reliable, scalable systems.
• You enjoy end-to-end ownership, from design to deployment to monitoring.
• You communicate clearly with both technical and non-technical stakeholders.
• You are curious, self-driven, and know when to ship fast vs. when to engineer for the long term.
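The nginx reverse-proxy responsibility mentioned above might look like the following minimal site config, which fronts a FastAPI service. The hostname, port, and upstream process are hypothetical placeholders.

```nginx
# Hypothetical nginx config: proxy public traffic to a FastAPI/uvicorn
# service listening on localhost:8000. Names are illustrative only.
server {
    listen 80;
    server_name api.example.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```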
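The GitHub Actions CI/CD requirement could be satisfied by a workflow along these lines. This is a generic sketch, not the company's actual pipeline; the branch name, Python version, and test command are assumptions.

```yaml
# .github/workflows/ci.yml — hypothetical workflow; steps are illustrative.
name: ci
on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: pytest
```

A deployment job (e.g., building a container image and rolling it out to Kubernetes) would usually be added as a second job gated on the test job passing.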
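The ETL/ELT pipeline work described above can be sketched in plain Python. This is only an illustrative toy, not Acquco's actual pipeline: the stage names, the sample CSV fields (`sku`, `units`, `price`), and the in-memory "warehouse" are all assumptions made for the example.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dicts (stand-in for a real source)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize types and derive a revenue field per row."""
    return [
        {
            "sku": r["sku"],
            "units": int(r["units"]),
            "revenue": round(int(r["units"]) * float(r["price"]), 2),
        }
        for r in rows
    ]

def load(rows: list[dict], sink: list) -> None:
    """Load: append transformed rows to a destination (here, a plain list)."""
    sink.extend(rows)

# Usage: run the three stages end to end on sample data.
raw = "sku,units,price\nA-1,3,19.99\nB-2,1,5.00\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
print(warehouse)
# → [{'sku': 'A-1', 'units': 3, 'revenue': 59.97}, {'sku': 'B-2', 'units': 1, 'revenue': 5.0}]
```

In a production setting each stage would typically become an Airflow task, with the sink replaced by cloud storage or a warehouse table.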