Job Url: https://www.remoterocketship.com/company/isofttekinc/jobs/snowflake-data-engineer-united-states-remote/

Job Description: iSoftTek Solutions Inc

iSoftTek Solutions Inc is a recruitment and staffing solutions company that specializes in connecting talent with opportunities in both IT and non-IT sectors. It offers comprehensive services, including executive search, contingent staffing, and consulting solutions tailored to the workforce needs of enterprises across North America. With a focus on delivering top-tier talent and a deep understanding of client goals, iSoftTek helps organizations achieve transformative success through effective and responsive staffing strategies.

201 - 500 employees · 🤝 B2B · 🎯 Recruiter · 🏢 Enterprise

Snowflake Data Engineer
August 26
⚔️ Virginia – Remote · ⏳ Contract/Temporary · 🟡 Mid-level · 🟠 Senior · 🚰 Data Engineer · 🦅 H1B Visa Sponsor

Skills: AWS, Cloud, EC2, ETL, Java, Kafka, Python, SQL, Unix, Go

📋 Description
• Job Title: Snowflake Data Engineer
• Location: VA
• Duration: 2 Years
• Job Type: C2C
• Work Type: Remote
• Are you a Data Engineer at a large financial institution whose leadership tells you that you are too hands-on, too detail-oriented, or that you think and work like a start-up?
• We look forward to you joining our Platform Engineering Team.
• Our Platform Engineering Team is working to solve the Multiplicity Problem.
• We are trusted by some of the most reputable and established FinTech firms.
• Recently, our team spearheaded the conversion and go-live of apps that support the backbone of the financial trading industry.
• Responsibilities: design, develop, and maintain data pipelines; implement ETL using Snowflake features; design efficient data models and schemas; optimize performance; integrate Snowflake with external systems; implement data synchronization; monitor and optimize query performance; identify and resolve bottlenecks; work on Snowflake modeling; SQL performance and tuning; set up RBAC; data masking/encryption/tokenization; set up AWS S3/EC2, external stages, and SQS/SNS; perform data integration with MSK Kafka and Delta Lake.

🎯 Requirements
• Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.
• Implement ETL processes using Snowflake features such as Snowpipe, Streams, and Tasks.
• Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and BI.
• Optimize data warehouse performance and scalability using Snowflake features such as clustering, partitioning, and materialized views.
• Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs.
• Implement data synchronization processes to ensure consistency and accuracy across systems.
• Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management.
• Identify and resolve performance bottlenecks and optimize data warehouse configurations.
• Work on Snowflake modeling (roles, databases, schemas) and ETL tools with cloud-driven skills.
• Work on SQL performance measurement, query tuning, and database tuning.
• Handle the SQL language and cloud-based technologies.
• Set up the RBAC model at the infrastructure and data levels.
• Work on data masking / encryption / tokenization, and data wrangling / ECreLT / data pipeline orchestration (Tasks).
• Set up AWS S3/EC2, and configure external stages and SQS/SNS.
• Perform data integration, e.g. MSK Kafka Connect and other partners such as Delta Lake (Databricks).
• Work closely with Data Wrangling, ETL, Talend, Jasper, Java, Python, Unix, AWS, Data Warehousing, Data Modeling, Database Migration, ECreLT, the RBAC model, and Data Migration.
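The Snowpipe/Streams/Tasks ingestion pattern and the S3 external stage with SQS auto-ingest named in the requirements can be sketched as follows. This is a minimal illustration, not part of the posting: all object names (`raw_stage`, `s3_int`, `raw_events`, `events_pipe`, `etl_wh`, `curated_events`) are hypothetical, and the S3 bucket, storage integration, and warehouse would have to exist in your account.

```sql
-- Hypothetical names throughout; assumes an S3 storage integration (s3_int) is already configured.
CREATE STAGE raw_stage
  URL = 's3://example-bucket/events/'
  STORAGE_INTEGRATION = s3_int;

CREATE TABLE raw_events (
  payload   VARIANT,
  loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- Snowpipe with auto-ingest: Snowflake exposes an SQS queue ARN (visible via
-- SHOW PIPES) that the S3 bucket's event notifications are pointed at.
CREATE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events (payload)
  FROM @raw_stage
  FILE_FORMAT = (TYPE = JSON);

-- A stream captures newly loaded rows; a scheduled task moves them downstream.
CREATE STREAM raw_events_stream ON TABLE raw_events;

CREATE TASK transform_events
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
AS
  INSERT INTO curated_events
  SELECT payload:id::NUMBER, payload:type::STRING, loaded_at
  FROM raw_events_stream;

ALTER TASK transform_events RESUME;
```

Tasks are created suspended, hence the final `RESUME`; the `WHEN` clause keeps the warehouse from spinning up when the stream is empty.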
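The clustering and materialized-view tuning mentioned above might look like the sketch below, again with hypothetical table and column names; materialized views require Snowflake Enterprise Edition or higher.

```sql
-- Cluster a hypothetical fact table on the columns most queries filter by,
-- improving micro-partition pruning on large tables.
ALTER TABLE curated_events CLUSTER BY (event_date, event_type);

-- A materialized view precomputing a common aggregate for BI dashboards.
CREATE MATERIALIZED VIEW daily_event_counts AS
  SELECT event_date, event_type, COUNT(*) AS n
  FROM curated_events
  GROUP BY event_date, event_type;

-- Inspect clustering health (depth, overlap) for the chosen keys.
SELECT SYSTEM$CLUSTERING_INFORMATION('curated_events', '(event_date, event_type)');
```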
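The RBAC and dynamic data masking requirements can be sketched as below. Role, warehouse, schema, user, and column names are all illustrative; masking policies also require Enterprise Edition or higher.

```sql
-- Least-privilege functional role granted to an analyst user (names hypothetical).
CREATE ROLE analyst_role;
GRANT USAGE  ON WAREHOUSE bi_wh               TO ROLE analyst_role;
GRANT USAGE  ON DATABASE  analytics           TO ROLE analyst_role;
GRANT USAGE  ON SCHEMA    analytics.curated   TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER jane_doe;

-- Dynamic data masking: only a privileged role sees the raw value,
-- everyone else gets a redacted form.
CREATE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
       ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

ALTER TABLE analytics.curated.customers
  MODIFY COLUMN ssn SET MASKING POLICY mask_ssn;
```

The policy is applied at the column level, so every role querying the column is evaluated against the `CASE` expression at query time.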