Job Title: AI Systems Engineer (Agent-Native Development)
Company Name: Deeter Analytics
Job Details: Remote, Full Time, United States
Job Url: https://hiring.cafe/viewjob/m7d7bnldczmxwcp2
Posted: 17h ago

Responsibilities: define specs, design tests, ship systems

Requirements Summary: Experience building AI-driven production systems; ability to define specifications, validate with acceptance tests, and ship scalable agent-based workflows.

Deeter Analytics

At Deeter Analytics, we're building something that doesn't get built twice in a generation.

Our goal is to create a fundamental trading model as capable as today's most advanced AI systems, but applied to global markets. Not incremental signals or isolated strategies, but a system that can continuously interpret, learn from, and act on the evolving state of the world.

We train on large-scale, real-time social data, capturing how narratives form, how sentiment propagates, and how collective behavior drives markets. This requires operating at the frontier of data infrastructure, model design, and compute, all tightly integrated into a single system.

You'll work alongside a small group of elite traders, engineers, and AI researchers, in an environment defined by speed and ownership. We run experiments continuously. Ideas move from concept to production in hours. And the feedback loop is immediate, measured directly in live performance.

About the role

You will build systems where AI agents do most of the implementation, and your role is to define, structure, and validate what gets built.
This is closer to operating a high-throughput production system than writing code line by line. We prefer systems that scale through better specifications and feedback loops, not manual implementation.

What you'll work on

● Defining systems through clear, executable specifications
● Designing behavioral scenarios and acceptance tests that validate outcomes
● Running tight loops: spec → agent execution → validation → iteration
● Structuring work so agents converge quickly and reliably
● Shipping systems into production with monitoring, rollback strategies, and fast iteration under real constraints

What we're looking for

We're looking for people who can turn ambiguous problems into systems that work, using AI as a primary building tool.

Strong signals:
● You have shipped production systems built largely with AI agents
● You think in terms of behavior and outcomes, not implementation details
● You can write precise specifications that machines can execute
● You design validation through scenarios, adversarial cases, and black-box testing
● You run fast iteration loops and know how to drive systems toward convergence
● You are comfortable operating at scale: large contexts, high token usage, parallel runs
● You take ownership of production outcomes, not just development

Bonus signals:
● Experience building or contributing to agent tooling or workflows
● Public work that clearly shows AI-assisted development processes
● Experience designing evaluation systems that are hard to game
● Experience operating as a high-leverage individual or in very small teams
● Strong written communication: clear, structured, and unambiguous
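To make the "spec → agent execution → validation → iteration" loop concrete, here is a minimal, hypothetical sketch in Python. It is illustrative only, not Deeter's actual tooling: the `Spec`, `run_loop`, and `toy_agent` names are invented for this example, and the agent is stubbed as a plain function standing in for an AI coding agent.

```python
# Illustrative sketch of a spec -> agent execution -> validation -> iteration
# loop. The "agent" here is a stub function; in practice it would be an AI
# agent producing an implementation from the specification.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Spec:
    """An executable specification: an instruction plus acceptance tests."""
    instruction: str
    acceptance_tests: list[Callable[[str], bool]]

def run_loop(
    spec: Spec,
    agent: Callable[[str, str], str],
    max_iters: int = 3,
) -> tuple[str, bool]:
    """Iterate until every acceptance test passes or the budget runs out."""
    feedback = ""
    output = ""
    for _ in range(max_iters):
        output = agent(spec.instruction, feedback)          # agent execution
        failures = [
            i for i, test in enumerate(spec.acceptance_tests)
            if not test(output)                              # validation
        ]
        if not failures:
            return output, True                              # converged
        feedback = f"failed acceptance tests: {failures}"    # iteration signal
    return output, False

# Toy agent that incorporates feedback on its second attempt.
def toy_agent(instruction: str, feedback: str) -> str:
    return "hello world" if feedback else "hello"

spec = Spec(
    instruction="emit a greeting containing the word 'world'",
    acceptance_tests=[lambda out: "world" in out],
)
result, ok = run_loop(spec, toy_agent)
```

The design point the role description emphasizes is visible in the sketch: the human-authored artifacts are the specification and its acceptance tests, while the implementation itself comes from the agent and is only trusted once validation converges.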