Job Title: [GFA] Azure Senior Data Engineer
Company Name: Software Mind
Job Details: Remote, Full Time
Job Url: https://hiring.cafe/viewjob/xfjv2xdphmsohpim
Location: Krakow, Lesser Poland Voivodeship, Poland

Responsibilities: Develop pipelines, automate processes, build integrations

Requirements Summary: Python/PySpark, SQL; Databricks; CI/CD; Kafka/Event Hubs/Kinesis; Azure cloud; data pipelines; ETL; Spark; Agile/Scrum.

Technical Tools Mentioned: Python, PySpark, SQL, Databricks, Kafka, Terraform, Kubernetes, Azure, CI/CD

Company Description:
Software Mind develops solutions that make an impact for companies around the globe. Tech giants & unicorns, transformative projects, emerging technologies and limitless opportunities – these are a few words that describe an average day for us. Building cross-functional engineering teams that take ownership and crave more means we’re always on the lookout for talented people who bring passion and creativity to every project. Our culture embraces openness, acts with respect, shows grit & guts and combines employment with enjoyment.

Job Description:

Project – the aim you’ll have
Our customer provides innovative solutions and insights that enable their clients to manage risk and hire the best talent. Their advanced global technology platform supports fully scalable, configurable screening programs that meet the unique needs of over 33,000 clients worldwide. Headquartered in Atlanta, GA, they have an internationally distributed workforce spanning 19 countries, with about 5,500 employees. Our partner performs over 93 million screens annually in over 200 countries and territories. We are seeking a Senior Data Engineer with solid Python/PySpark programming skills to join the Data Engineering Team and help us build the Data Analytics Platform in the Azure cloud.
Position – how you’ll contribute
- Develop reusable, metadata-driven data pipelines
- Automate and optimize any data platform-related processes
- Build integrations with data sources and data consumers
- Add data transformation methods to shared ETL libraries
- Write unit tests
- Develop solutions for monitoring the Databricks data platform
- Proactively resolve any performance or quality issues in ETL processes
- Cooperate with the infrastructure engineering team to set up cloud resources
- Contribute to the data platform wiki / documentation
- Perform code reviews and ensure code quality
- Initiate and implement improvements to the data platform architecture

Qualifications:

Expectations – the experience you need
- Programming: Python/PySpark, SQL
- Proficient in building robust data pipelines using Databricks Spark
- Experienced in dealing with large and complex datasets
- Knowledgeable about building data transformation modules organized as libraries (Python packages)
- Familiar with Databricks Delta optimization techniques (partitioning, z-ordering, compaction, etc.)
- Experienced in developing CI/CD pipelines
- Experienced in leveraging event brokers (Kafka / Event Hubs / Kinesis) to integrate with data sources and data consumers
- Understanding of basic networking concepts
- Familiar with Agile software development methodologies (Scrum)

Additional skills – the edge you have
- Understanding of stream processing challenges and familiarity with Spark Structured Streaming
- Experience with IaC (Terraform, Bicep or other)
- Experience running containerized applications (Azure Container Apps, Kubernetes)
- Experience building event sourcing solutions
- Familiarity with platforms for change data capture (e.g. Debezium)
- Knowledge of Azure cloud native solutions (e.g.
Azure Data Factory, Azure Function App, Azure Container Instances)

Additional Information:

Our offer – professional development, personal growth:
- Flexible employment and remote work
- International projects with leading global clients
- International business trips
- Non-corporate atmosphere
- Language classes
- Internal & external training
- Private healthcare and insurance
- Multisport card
- Well-being initiatives

Position at: Software Mind Poland
This role requires candidates to be based in Poland.
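To illustrate the "reusable, metadata-driven data pipelines" responsibility above: the common pattern is to describe pipeline steps as configuration (metadata) and dispatch them to reusable, unit-testable transformation functions from a shared library. Below is a minimal, hypothetical sketch of that idea, using plain Python lists in place of PySpark DataFrames; all function and step names are illustrative, not part of the actual platform.

```python
# Minimal sketch of a metadata-driven pipeline: steps are data,
# transformations live in a shared, registered library.
# Plain Python stands in for PySpark here; names are hypothetical.

from typing import Callable

# Shared "ETL library": reusable transformation functions, keyed by name.
TRANSFORMS: dict[str, Callable] = {}

def transform(name: str):
    """Register a transformation under a metadata-addressable name."""
    def register(fn: Callable) -> Callable:
        TRANSFORMS[name] = fn
        return fn
    return register

@transform("drop_nulls")
def drop_nulls(rows, column):
    # Keep only rows where the given column is present and non-null.
    return [r for r in rows if r.get(column) is not None]

@transform("rename")
def rename(rows, old, new):
    # Rename a column in every row.
    return [{**{k: v for k, v in r.items() if k != old}, new: r[old]}
            for r in rows]

def run_pipeline(rows, steps):
    """Apply each metadata-described step in order."""
    for step in steps:
        fn = TRANSFORMS[step["op"]]
        rows = fn(rows, **step.get("args", {}))
    return rows

# The pipeline itself is pure metadata (could be loaded from JSON/YAML).
pipeline = [
    {"op": "drop_nulls", "args": {"column": "email"}},
    {"op": "rename", "args": {"old": "email", "new": "contact_email"}},
]

data = [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": None}]
print(run_pipeline(data, pipeline))
# → [{'id': 1, 'contact_email': 'a@example.com'}]
```

The appeal of the pattern is that new pipelines are new configuration, not new code: the transformation library stays small and unit-tested, while pipeline definitions can be reviewed, versioned, and generated independently.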