Company Name: Confie
Job Details: $140k-$160k/yr, Remote, Full Time
Job Url: https://hiring.cafe/viewjob/3c0d76h7kmhlbny5

Job Description:

Posted 2 days ago
Senior Snowflake Data Engineer @ Confie
Addison or Huntington Beach
$140k-$160k/yr, Remote, Full Time

Responsibilities: Design pipelines, develop data models, monitor performance

Requirements Summary: 6+ years data engineering; 5+ years Snowflake; Python; ETL and data warehousing; cloud platforms; strong communication.

Technical Tools Mentioned: Snowflake, Python, Airflow, Fivetran, Airbyte, Coalesce, WhereScape, Azure Data Factory, Snowpipe, Streams, Tasks, Databricks, Google BigQuery, AWS Redshift, Snowpark, Pandas, NumPy, Requests, CSV/JSON

Pay Range: $140,000 - $160,000 / year

Our Perks:
- Generous PTO plans, sick pay, and health benefits
- Annual bonus based on employment standing*
- Work-from-home and hybrid employment models
- Confie Enablement Fund / Scholarship Program
- I-Care Recognition Program
- Corporate Social Responsibility Program
- Diversity, Equity and Inclusion Initiatives
- Confie Hub and Discount Programs (Gym Membership)

Purpose
Work under the guidance and supervision of the Director, Enterprise Architecture, to build Confie's next-generation Enterprise Data Solutions. Responsible for developing robust data models, creating efficient ELT processes, and optimizing performance to support the organization's data needs. Requires expertise in designing, implementing, and maintaining data solutions on Snowflake data cloud environments that drive critical business insights and operations.

Essential Duties & Responsibilities
- Design and develop data pipelines and ETL workflows to populate the cloud Data Lake and Data Warehouse on Snowflake using a transformation tool (e.g., Coalesce, WhereScape, Azure Data Factory) and replication tools (e.g., Fivetran, Airbyte)
- Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components, and enhancements
- Design and develop robust and scalable data pipelines to support data integrations using Snowflake, Coalesce, Python, Airflow, and Fivetran
- Design and develop Snowflake data objects (tables, views, stored procedures, UDFs, etc.)
- Implement ELT (Extract, Load, Transform) processes using Snowflake features such as Snowpipe, Streams, and Tasks
- Perform data cleaning, analysis, and integration using Python
- Work with multiple data sources and types (structured, semi-structured, and unstructured)
- Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs
- Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features
- Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency
- Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it
- Monitor data pipelines for timely and accurate completion
- Stay up to date with industry trends and advancements in data engineering, continuously improving the team's technical knowledge and skills
- Provide on-call support

Qualification and Education Requirements
- 6+ years of professional experience in data engineering, designing and implementing data pipelines, and building data infrastructure
- 5+ years of strong experience in the Snowflake data cloud and ETL development, including Snowflake procedures, UDFs in Python and SQL, streams, tasks, Snowpipe, and working with semi-structured data
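The stream-and-task pattern named in the duties above (change capture on a raw table, folded into a curated table on a schedule) can be sketched roughly as follows. This is an illustrative sketch only: the object names (RAW_ORDERS, DW_ORDERS, etc.), the warehouse, and the 15-minute schedule are invented placeholders, not anything specified by the posting, and the generated DDL would be executed through a Snowflake connection.

```python
# Hedged sketch of Snowflake's stream + task ELT pattern: a stream records
# changes on a raw landing table, and a scheduled task merges those changes
# into a target table. All identifiers below are hypothetical examples.

def change_capture_ddl(raw_table: str, stream: str) -> str:
    """DDL for a stream that tracks row changes on a raw landing table."""
    return f"CREATE OR REPLACE STREAM {stream} ON TABLE {raw_table};"

def merge_task_ddl(task: str, warehouse: str, stream: str, target: str) -> str:
    """DDL for a task that periodically folds stream rows into the target.

    SYSTEM$STREAM_HAS_DATA makes the task a no-op when the stream is empty.
    """
    return (
        f"CREATE OR REPLACE TASK {task} "
        f"WAREHOUSE = {warehouse} SCHEDULE = '15 MINUTE' "
        f"WHEN SYSTEM$STREAM_HAS_DATA('{stream}') "
        f"AS INSERT INTO {target} SELECT * FROM {stream};"
    )

if __name__ == "__main__":
    # Hypothetical pipeline: raw orders land continuously, a task loads them.
    print(change_capture_ddl("RAW_ORDERS", "RAW_ORDERS_STREAM"))
    print(merge_task_ddl("LOAD_ORDERS_TASK", "ETL_WH",
                         "RAW_ORDERS_STREAM", "DW_ORDERS"))
```

In practice the task body would typically be a MERGE rather than a plain INSERT, and the task would need to be resumed (ALTER TASK ... RESUME) before it runs; both details are omitted here for brevity.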
- 5+ years of strong experience with Python programming, with extensive use of frameworks/packages such as Snowpark, pandas, NumPy, and Requests for data analysis and integration
- Solid understanding of data warehousing concepts, dimensional modeling, and data integration techniques
- 5+ years of strong experience with data integration and transformation tools such as Coalesce, WhereScape, and Azure Data Factory
- Experience with Databricks, Google BigQuery, and AWS Redshift is a plus
- Experience with data quality and observability concepts is a plus
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data technologies is a plus

Skills:
- A Love for All Things Data - The backbone of a good data engineer is understanding the life cycle of data movements from source to final interpretation on a report
- A Passion to Learn - Strong desire and ability to learn new tools and skills and acquire knowledge
- Listening Skills - The ability to understand what people say
- Analytical Skills - The ability to critically evaluate information from multiple sources and break down high-level information into details
- Observation Skills - The ability to validate data obtained via other techniques and expose new areas for elicitation
- Organizational Skills - The ability to work with the vast array of information gathered during analysis and to cope with rapidly changing information
- Interpersonal Skills - The ability to help set priorities
- Oral and Written Skills - Excellent written and verbal communication with little to no supervision
- Critical thinking and problem-solving skills
- Confidence in communicating and translating data-driven insights and technical concepts into simple terminology for business clients at various levels
- Ability to work and communicate effectively with any level of the user community

Education Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field
- Snowflake Advanced Certification is a plus
- Additional certifications related to data platforms are a plus

Other Duties
This job description is not designed to cover or contain a comprehensive listing of the activities, duties, or responsibilities required of the employee for this job. Duties, responsibilities, and activities may change at any time, with or without notice.
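As a rough illustration of the semi-structured (JSON) data handling the qualifications above call for, here is a minimal, dependency-free sketch that flattens nested JSON records into tabular rows ready to load. The record shape and field names are invented for the example; real pipelines would more likely use Snowflake's VARIANT support or pandas.

```python
# Hedged sketch: flatten nested JSON (e.g. API responses landed in a data
# lake) into single-level rows with dotted column names, a common prelude
# to loading semi-structured data into a warehouse table.

import json

def flatten(record: dict, parent: str = "") -> dict:
    """Recursively flatten nested JSON objects into a flat dict.

    Nested keys are joined with dots, e.g. {"a": {"b": 1}} -> {"a.b": 1}.
    """
    flat = {}
    for key, value in record.items():
        name = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

# Hypothetical order record with a nested customer object.
raw = '{"id": 1, "customer": {"name": "Ada", "state": "CA"}, "total": 42.5}'
row = flatten(json.loads(raw))
# row -> {"id": 1, "customer.name": "Ada", "customer.state": "CA", "total": 42.5}
```

Lists of objects (one-to-many nesting) would need an explode step in addition to flattening, which is omitted here.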