Job Title: Senior Data Product Engineer
Company Name: Hydrolix
Job Url: https://www.linkedin.com/jobs/view/4374580172/
Location: United States (Remote, Full-time)
Job poster: Harry G., Recruiting @ Hydrolix

Job Description

Senior Data Product Engineer - Insights & Semantic Intelligence
Hydrolix | US/CAN, Remote

Today's reality:
- Solutions engineers manually build queries and dashboards for each customer
- Every customer's schema is different; even when representing similar concepts, fields carry different contextual meanings
- We rebuild similar analytics logic for each new use case (bot detection, CDN performance, security threats)
- Domain expertise lives in people's heads, not in reusable systems

Your mission: Build the data products and semantic intelligence systems that let us answer customer questions correctly and automatically.
You'll work on two interconnected efforts:

Insight Products (70% of your time)
- Design schemas, rollups, and pipelines for security, bot detection, CDN performance, and other observability domains
- Build incremental data processing systems that handle backfills, schema evolution, and multi-tenant complexity
- Optimize queries from "scans 1 TB" to "scans 10 GB" by restructuring how data is stored
- Define semantic models for each domain: what "actor," "endpoint," or "attack pattern" means in context
- Ship new features and metrics weekly based on customer feedback

Semantic Intelligence Platform (30% of your time)
- Contribute to the semantic knowledge that maps concepts across schemas and domains
- Build query planning components that translate semantic questions into optimized physical queries
- Implement validation systems that catch semantic errors before execution
- Create versioned APIs consumed by dashboards, applications, and automation
- Extract patterns from insight products and generalize them into reusable semantic infrastructure

The integration: Insight products are the laboratory where we learn what customers need. The semantic platform is how we make that knowledge reusable across all products and customers.
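To make the semantic-platform ideas above concrete, here is a minimal, purely illustrative sketch in Python (all names are hypothetical, not Hydrolix code): a per-domain "semantic contract" maps concepts like "actor" to a tenant's physical columns, a validator catches semantic errors before execution, and a planner translates the validated semantic question into a physical SQL string.

```python
# Hypothetical sketch of a semantic contract and pre-execution validation.
# None of these names come from Hydrolix; they only illustrate the pattern.

BOT_DETECTION_CONTRACT = {
    # Concepts: what "actor" and "endpoint" mean in this domain, and the
    # physical column each one maps to in this tenant's schema.
    "concepts": {
        "actor": {"column": "client_ip", "type": "string"},
        "endpoint": {"column": "request_path", "type": "string"},
    },
    # Metrics: aggregations that are meaningful for this domain.
    "metrics": {
        "request_count": "count()",
        "distinct_actors": "uniq(client_ip)",
    },
}

def validate_question(contract, metric, group_by):
    """Return a list of semantic errors for a (metric, group_by) question."""
    errors = []
    if metric not in contract["metrics"]:
        errors.append(f"unknown metric: {metric!r}")
    for concept in group_by:
        if concept not in contract["concepts"]:
            errors.append(f"unknown concept: {concept!r}")
    return errors

def plan_query(contract, metric, group_by, table):
    """Translate a validated semantic question into a physical SQL string."""
    errors = validate_question(contract, metric, group_by)
    if errors:
        raise ValueError("; ".join(errors))
    cols = [contract["concepts"][c]["column"] for c in group_by]
    select = ", ".join(cols + [contract["metrics"][metric]])
    return f"SELECT {select} FROM {table} GROUP BY {', '.join(cols)}"
```

In this sketch, asking "request_count by actor" against a table of logs plans to `SELECT client_ip, count() FROM logs GROUP BY client_ip`, while a question using an unknown metric or concept fails validation before any SQL is generated, which is the point of catching semantic errors pre-execution.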
Key Responsibilities
- Design and implement data models, rollups, and pipelines for observability use cases
- Write Python (or Rust) for data transformations, validation, and pipeline orchestration
- Optimize SQL queries and data layouts for columnar database performance
- Define semantic concepts and relationships for security, CDN, and bot detection domains
- Contribute to semantic contract schemas, query planners, and validation systems
- Work with Solutions Engineers and customers to validate that data products answer real questions
- Ship weekly: new metrics, new data sources, query optimizations, or semantic capabilities

Qualifications and Skills
- 5-7+ years building data products, pipelines, or analytics platforms (experience with log/event data strongly preferred)
- Experience with columnar databases or OLAP systems (ClickHouse, Druid, Pinot, TimescaleDB); you understand how columnar storage affects data modeling and query performance
- Strong SQL skills for complex aggregations, performance tuning, and query optimization
- Programming in Python, Rust, or Go for data transformation, validation, and tooling
- Observability, security, or log analytics domain experience (CDN logs, WAF logs, application monitoring, or similar)
- Experience with schema design, data modeling, and semantic systems
- Comfort with ambiguity and rapid iteration: you can ship weekly while building toward a bigger vision
- Track record of building data products that customers rely on

Bonus Qualifications
- Experience building semantic layers, metadata systems, or knowledge graphs
- Background in query planning, optimization, or SQL compilation
- Multi-tenant data architecture experience
- Data visualization experience (Grafana, Tableau, or similar)
- Understanding of bot detection, security analytics, or CDN performance patterns
- Experience with data validation frameworks or data quality systems

Why This Role Matters

Direct customer impact: You'll build the data products that help customers detect attacks, optimize performance,
and understand their traffic. You'll talk to customers, see how they use what you build, and iterate based on real feedback.

Foundational infrastructure: The semantic intelligence systems you contribute to will power every future insight product. Your work compounds: patterns you discover in bot detection become reusable for security analytics, CDN optimization, and beyond.

Architectural influence: We're in the design phase for the semantic platform. You'll help shape how we think about semantic contracts, query planning, and domain modeling. Your real-world experience building insight products will directly inform the platform's design.

How We Work
- AI-assisted development: We use Claude Code extensively with custom MCP servers and tooling. We value velocity and correctness.
- Weekly shipping cadence: New features, metrics, or optimizations ship regularly. You'll balance "get it working" with "get it right."
- Collaborative: You'll work with Solutions Engineering, customers, and the platform team. This isn't "build in isolation"; it's "learn from usage and iterate."
- Production mindset: Customers depend on these systems. Data quality, backward compatibility, and performance matter from day one.