Job Title: Data Engineer
Company Name: Koddi
Job URL: https://wellfound.com/jobs/3875337-data-engineer

Koddi: Powering the next evolution of travel advertising

Data Engineer
Remote (Fort Worth) | 2 years of exp | Full Time
Hires remotely in: Fort Worth
Remote Work Policy: Remote only
Company Locations: Austin • Ann Arbor • Oakland • New York • Fort Worth • Düsseldorf
Visa Sponsorship: Not Available
Relocation: Allowed
Skills: Hadoop, SQL, PostgreSQL, ETL, ELT, Spark, Databricks, Iceberg

About Koddi
We are a global technology company with software and services that help top digital marketplaces effectively monetize their first-party audiences through industry-leading commerce media technology and strategy. Our enterprise platforms leverage first-party data to drive marketplace revenue and profit by improving the user experience and targeting shoppers throughout the purchase path. Koddi's platforms enable any advertiser, in any marketplace, in any industry to increase awareness, generate demand, and drive revenue.

We are committed to driving innovation and growth in the commerce media space. As part of that commitment, we've launched Koddi Academy, a learning platform designed to provide expert knowledge, competitive insights, and practical strategies for thriving in this fast-growing industry. Explore our latest certification course to deepen your expertise and stay ahead in the evolving world of commerce media. You can also sign up for our waiting list to be notified when new courses are released. Get started today at Koddi Academy.

Job Summary
Koddi, Inc. is hiring Data Engineers in Fort Worth, TX to develop, implement, test, and operate large-scale, high-performance data structures to support analytics and reporting needs. Telecommuting is permitted from anywhere in the United States. Multiple openings. Full time.
Equal Opportunity / Affirmative Action Employer.

What You'll Do
• Build and maintain efficient, scalable ETL/ELT pipelines to ingest, transform, and integrate data from multiple structured and unstructured sources into a unified data platform.
• Implement data models and architectures using best practices in relational (PostgreSQL), distributed (Databricks/Spark), and NoSQL environments, ensuring data quality, consistency, and accessibility.
• Partner with product, business, and software teams to gather requirements, analyze source data, and deliver solutions that enable data-driven decision-making.
• Design and implement data solutions that scale with growing data volumes and support high-performance querying and analytics.
• Produce and maintain comprehensive dataset documentation, metadata, and technical specifications to ensure transparency and reproducibility.
• Engage in the full development lifecycle, from requirements gathering, design, implementation, and testing through deployment, documentation, and ongoing support.
• Evaluate proposed data solutions, tools, and platforms to ensure alignment with organizational standards and future scalability.
• Share best practices in data modeling, pipeline development, and performance optimization.

What You'll Bring
• A Master's degree in Computer Science, Computer Engineering, Data Science, or Analytics.
• 24 months of experience in the job offered or a closely related technical occupation.
• Experience in the following: big data engineering using Hadoop; SQL querying; building multi-dimensional reports and dashboards; big data analytics using Spark and Databricks; automating API processes; Iceberg installation and implementation; and working with log data such as NetFlow, Cisco, SolarWinds, Palo Alto, and Systempulse.
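The posting itself contains no code, but the first responsibility above (extract, transform, and load data from raw sources into a unified platform) can be illustrated with a minimal, self-contained sketch. This is a toy example, not Koddi's stack: a production pipeline of the kind described would run on Spark or Databricks, and every name here (the campaign_stats table, the CSV columns, the data-quality rule) is invented for illustration.

```python
import csv
import io
import sqlite3

# Toy raw feed standing in for one of many structured sources.
RAW_CSV = """campaign_id,clicks,spend_usd
c-001,120,45.50
c-002,,12.00
c-003,310,88.25
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw feed into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: enforce a basic data-quality rule and cast types."""
    cleaned = []
    for row in rows:
        if not row["clicks"]:  # drop incomplete records
            continue
        cleaned.append((row["campaign_id"], int(row["clicks"]), float(row["spend_usd"])))
    return cleaned

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned records into the analytics table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS campaign_stats "
        "(campaign_id TEXT PRIMARY KEY, clicks INTEGER, spend_usd REAL)"
    )
    conn.executemany("INSERT INTO campaign_stats VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total_clicks = conn.execute("SELECT SUM(clicks) FROM campaign_stats").fetchone()[0]
```

The same three-stage shape carries over to a distributed setting: extract becomes a Spark reader over object storage, transform becomes DataFrame operations, and load becomes a write to a warehouse table (e.g., an Iceberg table).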