Job Url: https://usource.ripplehire.com/candidate/?token=8YxWjpwDhdL62DFYUIcQ&lang=en&source=USTLINKEDIN&ref=USTLINKEDIN#detail/job/51954

Job Description: Python Developer
1 Opening | Chicago

Role description: Python Developer Lead I - Software Engineering

Who We Are:
Born digital, UST transforms lives through the power of technology. We walk alongside our clients and partners, embedding innovation and agility into everything they do. We help them create transformative experiences and human-centered solutions for a better world. UST is a mission-driven group of 29,000+ practical problem solvers and creative thinkers in more than 30 countries. Our entrepreneurial teams are empowered to innovate, act nimbly, and create a lasting and sustainable impact for our clients, their customers, and the communities in which we live. With us, you'll create a boundless impact that transforms your career, and the lives of people across the world. Visit us at UST.com.

You Are:
UST is seeking a Python developer with solid Python fundamentals and strong hands-on expertise in PySpark for building scalable data processing pipelines on AWS.

The opportunity:
·       Design, build, and optimize PySpark-based data pipelines (batch & streaming) on AWS.
·       Tune Spark jobs for performance, reliability, and cost efficiency; monitor using the Spark UI/CloudWatch.
·       Collaborate with platform, data, and application teams to integrate pipelines with Glue/EMR/Lambda/Step Functions.
·       Establish CI/CD for data workflows and ensure test coverage and deployment automation.
·       Contribute to coding standards, documentation, and Agile ceremonies.

This position description identifies the responsibilities and tasks typically associated with the performance of the position. Other relevant essential functions may be required.

What you need:
·       4+ years of professional experience in software development with a strong focus on Python.
·       Solid understanding of core Python concepts, data structures, algorithms, and design patterns.
·       Proficiency in Python for scripting, automation, backend services, and data-processing workflows.
·       Data modeling for analytics (medallion architecture: bronze/silver/gold); Parquet/Avro/JSON best practices.
·       Hands-on expertise with PySpark, including:
·       Working with DataFrames/Datasets and Spark SQL.
·       ETL/ELT pipeline development for large-scale batch and near-real-time workloads.
·       Hands-on expertise in performance tuning & optimization.
·       Hands-on experience with Spark Streaming.
·       Excellent knowledge of Lakehouse & table formats: Delta Lake (preferred), Apache Hudi, or Apache Iceberg.
·       Expertise in data quality & validation.
·       Excellent knowledge of Pandas.
·       Hands-on AWS experience with a strong understanding of cloud principles, including:
·       AWS Glue (ETL jobs, Spark jobs, Glue Studio/Workflows, Glue Data Catalog) and AWS Lambda for serverless integrations.
·       Amazon EMR (cluster sizing, autoscaling, cost optimization with Spot, versioned runtimes).
·       Amazon S3 (data lake layout, partitioning, lifecycle policies).
·       Orchestration & monitoring: AWS Step Functions, Amazon MWAA/Airflow, CloudWatch Logs/Metrics/Alarms.
·       Experience with Agile development methodologies.
·       Familiarity with CI/CD concepts and tooling such as AWS CodePipeline/CodeBuild/CodeDeploy; infrastructure as code (CloudFormation/Terraform) is a plus.
·       Testing & code quality: unit/integration testing for Spark (pytest, chispa), code reviews, PEP 8, type hints/mypy.
·       Strong problem-solving, analytical, and communication skills.
·       Ability to work independently and collaboratively in a team environment.

Nice to Have:
·       Knowledge of Java and the Spring framework.
·       Databricks on AWS: Jobs, clusters, notebooks, Repos, Delta Live Tables, Unity Catalog.
·       Experience with catalog governance and row/column-level security.
·       Exposure to cost/performance governance (e.g., file compaction, small-files mitigation, Z-Ordering for Delta).
·       Knowledge of REST API integration and message-based architectures.

Compensation can differ depending on factors including but not limited to the specific office location, role, skill set, education, and level of experience. UST provides a reasonable range of compensation for roles that may be hired in various U.S. markets as set forth below.

Role Location: Remote
Compensation Range: $75,000-$113,000