Job Title: ETL Developer
Company Name: Tricon Solutions
Job Url: https://www.simplyhired.com/job/jHlP00SYciqzlN_BUAOSqptpQHMmUMgF90aiWNC8xTo9QHaj99ad4g

Job Details
Location: Secaucus, NJ (Fully REMOTE; reports into EST time)
Type: 12+ months contract
Pay: $55 - $65 an hour
Level: Senior

Full Job Description

Required Skills and Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 5-7 years of experience as an ETL Developer or in a similar role.
- Technical Skills:
  - Proficiency in ETL tools such as Matillion.
  - Experience with cloud platforms such as AWS (preferred) or Azure.
  - Strong experience with cloud data warehouses such as Snowflake, Databricks, and Amazon Redshift.
  - Strong SQL skills and experience with relational databases such as Oracle, SQL Server, or MySQL.
  - Strong experience with NoSQL databases.
  - Strong experience with Git integration.
  - Experience with data formats such as TXT, CSV, XML, and JSON.
  - Experience with data modeling and database design.
  - Familiarity with scripting languages such as Java or Python.
  - Knowledge of data warehousing concepts and methodologies.
- Analytical Skills: Strong analytical and problem-solving skills.
- Communication: Excellent verbal and written communication skills. Experience with Agile development methodologies.
- Attention to Detail: High attention to detail with the ability to manage multiple tasks simultaneously.
- Team Player: Ability to work effectively both independently and as part of a team.

Preferred Skills and Qualifications:
- Familiarity with big data technologies such as Hadoop, Spark, or similar.
- Familiarity with data fabric and data mesh.