Job Title: Founding Product Engineer (Full Stack) – AI Platform
Company: WindShift Labs
Job URL: https://wellfound.com/jobs/3901096-founding-product-engineer-full-stack-ai-platform

Compensation: $150k – $230k • 0.5% – 1.5% equity
Location: Boston | Remote (Canada, United States)
Experience: 5 years
Employment Type: Full Time

Job Location: Boston
Remote Work Policy: Onsite or remote; hires remotely in Canada and the United States
Visa Sponsorship: Not available
Preferred Timezones: Pacific, Mountain, Central, Eastern, Atlantic
Collaboration Hours: 10:00 AM – 5:30 PM Eastern Time
Relocation: Allowed

Skills: Python, Distributed Systems, Redis, PostgreSQL, TypeScript, Product Design, Docker, React.js, AWS, Terraform, Next.js, FastAPI, LangGraph, RAG, ChatGPT, Hugging Face, LangChain, LlamaIndex, Transformers, Vector Databases (Pinecone Serverless, Qdrant, FAISS, ChromaDB)

About the job

Founding Product Engineer – AI-First Systems Architect

The Mission

WindShift is building the Operating System for AI-Native Investment Diligence. We are redesigning execution around AI as a core primitive, not layering it onto legacy workflows.

The platform is live with paying customers. The architecture exists. We are scaling the AI-native execution layer into a durable enterprise system.

We are hiring a founding engineer who builds systems the AI-first way. This role will help define the execution architecture that underpins a venture-scale AI-native platform.

What AI-First Means Here

We do not want manual feature builders.
We want engineers who:
- Design systems where LLMs are orchestration primitives
- Structure execution as agent graphs, toolchains, and evaluators
- Treat Claude/Codex as collaborators, not autocomplete
- Optimize for system leverage, not lines of code
- Build evaluation loops, regression harnesses, and hallucination detection
- Think in terms of compounding structured intelligence

If your default mode is writing everything manually before asking whether AI should orchestrate it, this role is not a fit.

What You'll Architect

AI-Native Execution Layer
- Multi-step agent pipelines (planning → retrieval → synthesis → verification)
- Tool and adapter abstractions that allow structured orchestration
- Canonical intermediate representations decoupled from rendering

AI-Driven Development Workflow
- Structured prompting systems
- Internal orchestration layers around Claude/Codex
- Automated evaluation and regression testing
- Cost-aware model routing and caching strategies

Production-Grade Infrastructure
- Async worker reliability (retries, idempotency, backpressure)
- Observability and traceability across agent runs
- Multi-store data architecture (Postgres, Redis, vector DB, graph DB)
- Terraform-managed AWS deployment (ECS, RDS, ALB)

The Standard

You should:
- Have 5+ years shipping production systems
- Be fluent in modern LLM tooling and orchestration frameworks
- Understand distributed systems tradeoffs
- Be comfortable designing architecture, not just implementing tickets
- Operate independently in a high-context environment

You should not need step-by-step specs.

Why This Is a Rare Opportunity

AI-native execution will reshape professional services economics. WindShift already compresses repeatable workstreams by 20–30% in live environments. The next phase is scaling that leverage structurally to >50%.

With a small, high-talent team, AI-first engineering is the multiplier. This role is foundational to that leverage.