Job URL: https://www.remoterocketship.com/company/blueorange-digital/jobs/senior-data-devops-engineer-united-states-remote

Blue Orange Digital — Website | LinkedIn | All Job Openings

Blue Orange Digital is a strategic data consultancy that specializes in using data and advanced technologies to transform its clients' operations. It provides end-to-end modern data services, including data engineering, data analytics, and machine learning, to deliver powerful insights and keep organizations competitive in the digital era. Its expertise spans industries such as financial services, real estate, insurance, and more. With a focus on implementing advanced analytics and AI solutions, Blue Orange Digital facilitates the migration and optimization of data platforms onto secure, scalable cloud architectures.

Tech: Node • web development • React/Redux • Spark • SparkML
51 - 200 employees
🤖 Artificial Intelligence
☁️ SaaS
💰 $700k Corporate Round on 2022-04

Senior Data, DevOps Engineer
Posted: 5 days ago
🌵 Arizona – Remote
🐊 Florida – Remote
+2 more states
💵 $150.5k - $170.5k / year
⏳ Contract/Temporary
🟠 Senior
⛑ DevOps & Site Reliability Engineer (SRE)

Skills: Airflow, Amazon Redshift, AWS, Azure Cloud, Docker, DynamoDB, ETL, Google Cloud Platform, JavaScript, Kafka, Microservices, Oracle, Python, Scala, Spark, SQL, Tableau, Terraform

📋 Description
•Drive innovation in data pipelines and AWS services using the AWS CDK in Python.
•Work quickly with client experts and stakeholders to learn the existing data flows, code bases, infrastructure, operations, logging methods, etc.
•Build and maintain data ingestions, data models, orchestrations, transformations, and validation tests.
•Quickly master the data flows, data sets, and code so you can begin enhancing and advancing the platform as a whole, both functionally and non-functionally.
•Quickly achieve information dominance and operational prowess across all microservices, code bases, and data flows supporting the platform in AWS.
•Evolve the data architecture in collaboration with the existing team to take on adjacent platform missions and volumes.
•Be and stay professionally and aggressively curious about the platform, its code, and its data, together with colleagues.
•Work in an extreme-delivery Agile mode to constantly deliver value for our clients.

🎯 Requirements
•Strong working command of the AWS CDK and AWS services.
•At least 7 years' experience building and supporting data platforms; exposure to data technologies, e.g. RDS, DynamoDB, Redshift, EMR, Glue, Kafka, Kinesis, MSK, Data Pipeline, Lake Formation, dbt, Airflow, Spark, etc.
•Experience with AWS and exposure to other cloud data platforms, such as ADF, Azure Fabric, Snowflake, Databricks, etc.
•Advanced-level Python, SQL, and Bash scripting.
•Experience designing and building robust CI/CD pipelines.
•Comfortable with Docker, configuration management, and monitoring tools.
•Knowledge of best practices related to security, performance, and disaster recovery.
•Excellent verbal and written English communication.
•BA/BS degree in Computer Science or a related technical field, or equivalent practical experience.
•Interacts with others using sound judgment, good humor, and consistent fairness in a fast-paced environment.
•The ability to maintain poise, efficiency, and effectiveness in fast-paced, sometimes frenetic, high-stakes environments.
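As an illustration of the "validation tests" duty mentioned in the description, here is a minimal, hypothetical sketch of a record-level validation step for an ingestion pipeline. The schema, field names, and function are invented for illustration and are not taken from the posting; a real implementation would likely live in a framework such as dbt tests or an Airflow task.

```python
from typing import Any

# Hypothetical schema for an ingested record; field names are invented.
REQUIRED_FIELDS = {"id": int, "amount": float, "currency": str}

def validate_record(record: dict[str, Any]) -> list[str]:
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"bad type for {field}: expected {expected_type.__name__}"
            )
    return errors
```

In a pipeline, a check like this would typically run after ingestion and before transformation, routing failing records to a quarantine table rather than halting the whole run.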