Job Url: https://www.remoterocketship.com/company/aq-io/jobs/software-engineer-data-platform-worldwide-remote/

Job Description: ALGOQUANT

Structured Products • Asset Management • Hedge Fund • Digital Assets • Quantitative Trading

ALGOQUANT is a multi-strategy digital asset manager that allocates capital across internal and selected external quantitative trading teams. Founded in 2018 as a proprietary trading firm, it has evolved into an institutional platform combining quantitative research, advanced trading technology, disciplined risk and compliance frameworks, and institutional-grade custody and banking partnerships to deliver transparent, diversified exposure to digital asset markets. ALGOQUANT serves institutional investors (including family offices and protocol treasuries) and provides infrastructure, capital allocation, and execution support to partnered quant teams.

11 - 50 employees • Founded 2019 • ₿ Crypto • 💸 Finance • 🤝 B2B

Software Engineer – Data Platform

October 13 • 🌏 Anywhere in the World • ⏰ Full Time • 🟠 Senior • 🔴 Lead • 🧑‍💻 Full-stack Engineer

Airflow • Apache • AWS • Azure • Cloud • Docker • Google Cloud Platform • Kafka • Kubernetes • Linux • Python • PyTorch • Spark • TensorFlow

📋 Description
• Design, build, and maintain real-time and batch data processing pipelines for market, alternative, and on-chain data sources.
• Ensure high availability and low latency across critical ingestion and transformation processes.
• Develop and evolve our internal Data Lake and the surrounding data processing ecosystem.
• Implement modern lakehouse technologies to enable scalable, queryable, and versioned data storage.
• Build validation, monitoring, and alerting systems to guarantee the accuracy, consistency, and completeness of data.
• Establish robust data quality frameworks and observability tooling across the pipeline stack.
• Develop and maintain internal Python and C++ libraries for feature calculation, data processing, backtesting, and ML inference.
• Promote code reuse, performance optimization, and reproducibility across teams.
• Work closely with quant researchers, traders, and the execution team to understand data requirements and support their workflows.
• Translate research and trading needs into reliable, production-grade data infrastructure.

🎯 Requirements
• Experience: 10+ years in software engineering or data infrastructure development.
• Languages: Expert-level Python and C++.
• Distributed Data Systems: Proven experience with Spark, Flink, Slurm, Dask, or similar frameworks.
• Data Lakehouse Technologies: Hands-on with Apache Iceberg, Delta Lake, or equivalent systems.
• Messaging & Streaming: Strong experience with Kafka or similar streaming platforms.
• Infrastructure: Proficient with Linux, Kubernetes, Docker, and workflow orchestrators like Airflow.
• Machine Learning Exposure: Familiarity with PyTorch, TensorFlow, or model inference frameworks.
• Cloud Platforms: Experience deploying and maintaining systems on AWS, GCP, or Azure.
• AI Engineering Tools: Experience using Claude Code, GitHub Copilot, Codex, or similar AI-assisted coding tools.

🏖️ Benefits
• Help build the data backbone of a cutting-edge digital asset investment platform.
• Collaborate directly with world-class engineers, researchers, and quants.
• Be part of a fully remote, high-performance culture that values innovation, autonomy, and continuous learning.
• Shape the future of data infrastructure in a company where technology drives alpha.
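To give a concrete flavor of the data-quality work the Description bullets call for (validation guaranteeing accuracy, consistency, and completeness), here is a minimal Python sketch. All names here (`Tick`, `validate_batch`, the example records) are illustrative assumptions for this posting, not part of ALGOQUANT's actual stack:

```python
from dataclasses import dataclass

@dataclass
class Tick:
    """A single market-data record (illustrative schema)."""
    symbol: str
    ts: float    # event timestamp, epoch seconds
    price: float

def validate_batch(ticks):
    """Run three basic data-quality checks over a batch of ticks:
    completeness (symbol present), accuracy (positive price),
    and consistency (timestamps non-decreasing per symbol).
    Returns a list of (index, issue) pairs for alerting."""
    issues = []
    last_ts = {}
    for i, t in enumerate(ticks):
        if not t.symbol:
            issues.append((i, "missing symbol"))          # completeness
        if t.price <= 0:
            issues.append((i, "non-positive price"))      # accuracy
        if t.symbol in last_ts and t.ts < last_ts[t.symbol]:
            issues.append((i, "out-of-order timestamp"))  # consistency
        last_ts[t.symbol] = t.ts
    return issues

batch = [
    Tick("BTC-USD", 1700000000.0, 42000.0),
    Tick("BTC-USD", 1699999999.0, 42010.0),  # arrives out of order
    Tick("", 1700000001.0, -1.0),            # missing symbol, bad price
]
print(validate_batch(batch))
# → [(1, 'out-of-order timestamp'), (2, 'missing symbol'), (2, 'non-positive price')]
```

In a production pipeline these checks would typically run inside the streaming or batch framework itself (e.g. as a Spark or Flink stage) and feed a monitoring/alerting system rather than return a list, but the per-record validation logic is the same shape.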