Job Url: https://www.linkedin.com/jobs/search/?currentJobId=4352180006&f_TPR=r86400&f_WT=2&keywords=software%20engineer&origin=JOB_SEARCH_PAGE_JOB_FILTER&start=150

Senior Data Engineer
Bespoke Labs · United States (Remote)
Job poster: Ahmed Riaz, Global Talent Acquisition Consultant

Job Posting: Senior Data Engineer / Analytics Engineer (dbt + Snowflake Cortex CLI)
Location: Remote
Type: Contract
Experience Level: Mid–Senior

About the Role
We are seeking a skilled Data/Analytics Engineer with hands-on experience using dbt alongside Snowflake's Cortex CLI. The role involves designing, developing, and optimizing data workflows that leverage Snowflake's new AI/ML and feature engineering capabilities via Cortex, while maintaining production-grade dbt transformations and CI/CD processes. You will collaborate with data engineering, analytics, and ML teams to prototype and productionize Cortex-driven workloads, ensure scalable model development, and define best practices for using dbt in a modern Snowflake-native stack.

Responsibilities
- Design and build dbt models, macros, and tests aligned with modern data modeling practices (e.g., modular models, source freshness checks, semantic layers).
- Integrate dbt workflows with the Snowflake Cortex CLI, including:
  - Feature engineering pipelines
  - Model training and inference tasks
  - Pipeline orchestration and automation
  - Evaluation and monitoring of Cortex models
- Define and document best practices for dbt–Cortex usage patterns.
- Collaborate with data scientists and ML engineers to operationalize Cortex workloads in Snowflake.
- Implement CI/CD pipelines for dbt projects (GitHub Actions / GitLab / Azure DevOps).
- Optimize queries and Snowflake compute usage for cost and performance efficiency.
- Troubleshoot and debug dbt artifacts, Snowflake objects, lineage, and data quality issues.
- Provide guidance on dbt project structure, governance, and testing frameworks.

Required Qualifications
- 3+ years of experience with dbt Core or dbt Cloud, including macros, packages, testing, documentation, and deployments.
- Strong expertise with Snowflake (warehouses, tasks, streams, materialized views, performance tuning).
- Hands-on experience with the Snowflake Cortex CLI, or the willingness and ability to ramp up quickly on Cortex features.
- Proficiency in SQL and familiarity with Python as used in dbt and scripting.
- Experience integrating dbt with orchestration tools (Airflow, Dagster, Prefect, etc.).
- Strong understanding of modern data engineering workflows, ELT patterns, and version-controlled analytics development.

Nice-to-Have Skills
- Prior experience operationalizing ML workflows inside Snowflake.
- Familiarity with Snowpark and Python UDFs/UDTFs.
- Experience building semantic layers with dbt metrics.
- Knowledge of MLOps or DataOps best practices.
- Exposure to LLM use cases, vector search, and unstructured data pipelines.
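For candidates gauging the testing requirement above: dbt's built-in generic tests (not_null, unique) compile to simple column-level SQL assertions. A minimal Python sketch of the same checks, over hypothetical staging-model output:

```python
# Sketch of the column checks dbt's generic tests express (not_null, unique);
# the table rows and column names below are hypothetical.

def not_null(rows, column):
    """Return rows where `column` is NULL; dbt's not_null test fails on any hit."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values of `column` appearing more than once; dbt's unique test fails on any hit."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

if __name__ == "__main__":
    orders = [  # hypothetical staging-model output
        {"order_id": 1, "customer_id": "a"},
        {"order_id": 2, "customer_id": None},
        {"order_id": 2, "customer_id": "b"},
    ]
    print(not_null(orders, "customer_id"))  # one failing row
    print(unique(orders, "order_id"))       # duplicate key: 2
```

In a real project these would live as `tests:` entries in a model's YAML, not hand-written Python; the sketch only shows what passes or fails.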
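The ELT/incremental patterns listed under Required Qualifications center on the merge step behind dbt's incremental materialization: new records upsert over existing ones keyed by a unique key and ordered by a cursor column. A sketch with a hypothetical record shape:

```python
# Sketch of the upsert/merge behind dbt's incremental materialization:
# a new row replaces an existing row with the same key when it is at least
# as recent. The (id, updated_at) record shape is hypothetical.

def incremental_merge(existing, new, key="id", cursor="updated_at"):
    """Merge `new` rows into `existing`, keeping the latest row per key."""
    merged = {r[key]: r for r in existing}
    for r in new:
        cur = merged.get(r[key])
        if cur is None or r[cursor] >= cur[cursor]:
            merged[r[key]] = r
    return sorted(merged.values(), key=lambda r: r[key])
```

In Snowflake, dbt compiles this pattern to a MERGE statement; the Python version only illustrates the keep-the-latest semantics.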
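On the CI/CD responsibility: a common shape for a dbt CI job is a "Slim CI" run that builds only modified models and their downstream dependents, deferring unchanged references to production artifacts. A sketch of the command sequence, using real dbt CLI commands and flags but a hypothetical artifact path:

```python
# Sketch of a Slim-CI-style dbt job's command sequence. The dbt commands and
# flags (deps, build, --select state:modified+, --defer, --state) are real
# dbt CLI features; the "prod-artifacts" path is a hypothetical location of
# the production run's manifest.

def ci_commands(state_path="prod-artifacts"):
    """Return the dbt CLI invocations a CI job would run, in order."""
    return [
        ["dbt", "deps"],                     # install declared packages
        ["dbt", "build",                     # run models + tests together
         "--select", "state:modified+",      # only changed models and downstream
         "--defer", "--state", state_path],  # resolve unchanged refs from prod state
    ]
```

A CI runner (GitHub Actions, GitLab, Azure DevOps) would execute these in a step after checking out the repo and fetching the production manifest.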