Job Url: https://www.remoterocketship.com/company/qode-world/jobs/lead-java-pyspark-developer-united-states-remote

Job Description:

qode.world (Website | LinkedIn | All Job Openings)

qode.world is a company that leverages artificial intelligence to revolutionize the recruiting process. Its platform lets users find candidates by sourcing data from billions of data points worldwide and provides data-driven insights. Users can connect with candidates directly through the platform, conduct customized AI-led interviews, and receive comprehensive assessments. The service also integrates easily with LinkedIn, enlarging the talent pool and enabling direct communication with candidates listed there. qode.world offers additional recruiting services to assist in hiring for niche or senior roles, and is praised for streamlining the hiring process and delivering quick results.

11–50 employees | 🤖 Artificial Intelligence | 👥 HR Tech | 🎯 Recruitment

Lead Java PySpark Developer
Posted 4 days ago
Ohio – Remote | Pennsylvania – Remote | +1 more state | ⏰ Full Time | Senior | Backend Engineer

Skills: AWS, Azure, Cloud, Docker, Google Cloud Platform, Hadoop, HDFS, Java, Kubernetes, Microservices, PySpark, Python, Spark, Spring, Spring Boot, SQL

📋 Description
• Job Title: Lead Java + PySpark Developer
• Location: Dallas, TX; Pittsburgh, PA; Cleveland, OH
• Employment Type: Full-time

Job Summary:
Incedo is seeking a skilled and experienced Java + PySpark Developer to join its data engineering team. The ideal candidate will work on scalable data pipeline development, data integration, and backend microservices, contributing to analytics and business-insights platforms.

Key Responsibilities:
• Design, develop, and maintain scalable data processing pipelines using PySpark on big data platforms such as Hadoop or Databricks.
• Build and enhance Java-based microservices/APIs to support data ingestion and transformation.
• Collaborate with data scientists, data analysts, and architects to deliver high-quality, secure, and performant solutions.
• Optimize data workflows and troubleshoot performance bottlenecks.
• Participate in code reviews and design discussions, and provide technical leadership when needed.
• Implement data validation, quality checks, and error-handling mechanisms.
• Ensure adherence to data security and compliance standards within financial environments.

Required Skills:
• 6+ years of hands-on experience in Java (Spring Boot preferred).
• 3+ years of experience with PySpark in distributed data processing.
• Experience with the Hadoop ecosystem, Hive, HDFS, and Spark SQL.
• Solid understanding of RESTful APIs, JSON/XML, and integration patterns.
• Strong knowledge of SQL and data modeling.
• Familiarity with CI/CD tools, Git, and Agile methodology.

Preferred Skills:
• Experience with AWS, Azure, or GCP cloud services.
• Prior work in the banking/finance domain (especially risk or fraud analytics).
• Knowledge of containerization using Docker/Kubernetes.

Educational Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.