Job Url: https://www.linkedin.com/jobs/search/?currentJobId=4365160068&distance=25.0&f_TPR=r86400&f_WT=2&geoId=103644278&keywords=software%20engineer&origin=JOB_SEARCH_PAGE_JOB_FILTER&start=50

Senior Backend Engineer
Code Metal · United States (Remote) · Full-time
Posted 14 hours ago · 79 applicants

About the Role

We are looking for a Senior Backend Engineer who is passionate about building robust, scalable backend systems in Python. This role is for someone who thrives in an environment where most of the day is spent writing code, designing clean architectures, and solving hard problems independently.

You'll develop backend services that handle complex asynchronous workflows, state machines, and event-driven lifecycles, powering distributed, data-intensive applications. The systems you build will integrate geospatial and temporal data, enforce fine-grained permissions, and operate across multiple services with high reliability.

You'll work in a modern Python environment (FastAPI, SQLAlchemy, Redis, and Docker), collaborating with a small, high-performing engineering team that values autonomy, clean code, and pragmatic solutions. This is a 90% hands-on coding position, ideal for someone who loves to architect, implement, and own their code from design to deployment.
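The "state machines and event-driven lifecycles" mentioned above can be sketched in a few lines of plain Python. This is a hypothetical illustration, not Code Metal's actual design: `JobState`, `ALLOWED`, and `transition` are invented names showing the general pattern of a lifecycle with explicit, validated transitions.

```python
# Hypothetical sketch of an event-driven job lifecycle: a job moves through
# explicit states, and only transitions listed in ALLOWED are legal.
from enum import Enum


class JobState(str, Enum):
    PENDING = "pending"
    RUNNING = "running"
    SUCCEEDED = "succeeded"
    FAILED = "failed"


# Each state maps to the set of states it may transition into.
ALLOWED: dict[JobState, set[JobState]] = {
    JobState.PENDING: {JobState.RUNNING, JobState.FAILED},
    JobState.RUNNING: {JobState.SUCCEEDED, JobState.FAILED},
    JobState.SUCCEEDED: set(),  # terminal state
    JobState.FAILED: set(),     # terminal state
}


def transition(current: JobState, target: JobState) -> JobState:
    """Return the new state, or raise if the transition is not allowed."""
    if target not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current.value} -> {target.value}")
    return target


state = transition(JobState.PENDING, JobState.RUNNING)
state = transition(state, JobState.SUCCEEDED)
print(state.value)  # succeeded
```

In a real service the current state would live in PostgreSQL and transitions would be driven by events (e.g., messages consumed from Redis), but the validation logic stays this simple.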
Responsibilities

- Design and build FastAPI-based services and APIs with a focus on performance, clarity, and maintainability
- Implement state machines and event-driven workflows that coordinate asynchronous operations across multiple services
- Work with PostgreSQL to design and query complex relational, geospatial, and temporal data models
- Build pub/sub and task-processing patterns using Redis and modern async frameworks
- Integrate and extend authentication and authorization systems, enabling role-based and attribute-based access to data and endpoints
- Deploy and maintain containerized applications using Docker, ensuring reliability, observability, and security
- Write unit, integration, and performance tests that validate reliability and correctness
- Participate in code reviews and architecture discussions, contributing to design patterns, tooling, and continuous improvement
- Collaborate cross-functionally with other engineers and analysts to define APIs, data contracts, and workflow orchestration patterns

Requirements

- 8+ years of professional backend software engineering experience, with a strong focus on Python
- Proven ability to design and deliver production-grade APIs and services using frameworks such as FastAPI or Flask
- Solid experience with PostgreSQL, including schema design, query optimization, and migrations
- Practical knowledge of geospatial and time-series querying, data modeling, and efficient indexing
- Experience building asynchronous systems using Redis (pub/sub, streams, or queues) or similar technologies
- Strong foundation in Docker-based development and deployment workflows
- Deep understanding of software engineering best practices: version control, testing, CI/CD, documentation, and type safety
- Demonstrated ability to take full ownership of a complex backend service from concept to production, with minimal oversight

Nice to Have

- Experience with event-driven architectures (Celery, FastStream, Kafka, or similar)
- Familiarity with authentication and
authorization standards (OIDC, OAuth2, RBAC)
- Experience working with geospatial frameworks such as Shapely, GeoAlchemy2, or H3
- Understanding of observability and monitoring (e.g., Sentry, structured logging, tracing)
- Experience deploying to cloud environments (AWS, Azure, GCP)
- Prior work with async I/O in Python (asyncio, aiohttp, or similar)
- Knowledge of workflow orchestration and message-based inter-service communication
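The asynchronous task-processing pattern named in the requirements can be sketched with the standard library alone. This is a hypothetical example: an in-process `asyncio.Queue` stands in for a Redis queue or stream, and the worker/sentinel names are invented; in production, workers would consume from Redis (e.g., pub/sub or streams) rather than an in-memory queue.

```python
# Hypothetical async task-processing sketch: N workers drain a shared queue.
# asyncio.Queue stands in for a Redis queue/stream here.
import asyncio


async def worker(name: str, queue: asyncio.Queue, results: list) -> None:
    while True:
        task = await queue.get()
        if task is None:  # sentinel: shut this worker down
            queue.task_done()
            break
        results.append((name, task))  # "process" the task
        queue.task_done()


async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    workers = [asyncio.create_task(worker(f"w{i}", queue, results)) for i in range(2)]
    for task in ["ingest", "index", "notify"]:
        await queue.put(task)
    for _ in workers:
        await queue.put(None)  # one shutdown sentinel per worker
    await queue.join()         # wait until every item is marked done
    await asyncio.gather(*workers)
    return results


processed = asyncio.run(main())
print(sorted(t for _, t in processed))  # ['index', 'ingest', 'notify']
```

Swapping the queue for Redis changes only the producer/consumer calls; the worker-pool and graceful-shutdown structure stays the same.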