Job Title: GCP Cloud Data Engineer
Company Name: Technoval
Job URL: https://www.simplyhired.com/job/pW31LJxGkJ9axrDs3cfZdsYfo7GEP8v2Zq5as_Nepf6i5qjJVO7JSw

Job Details
Contract, $60 - $70 an hour, Remote. Posted 3 hours ago.

Qualifications (keywords)
- Bachelor's degree in computer science
- 5 years of experience; senior level
- RDBMS, database management, database clustering, backup and recovery, disaster recovery
- Performance tuning and technical troubleshooting support
- Data integration, data migration, and cloud migration projects
- Cloud data warehouses (BigQuery)
- Data architecture design and requirements analysis
- Continuous Delivery (CD) implementation
- Machine learning/AI-based analysis, MLOps, and model deployment
- Java, Python, Go
- Application development
- Consulting experience and presentation creation

Full Job Description
Together with the team, you will support customer implementations of Google Cloud products through architecture guidance, best practices, data migration, database design, database management, performance tuning, CI/CD, capacity planning, implementation, troubleshooting, monitoring, Dataflow, Cloud Functions, and more.

Responsibilities
- Serve as the lead point of contact on the Google data team supporting the NOTAM data platform.
- Collaborate closely with data producers and data consumers to understand data needs, provide consultation, and align data solutions.
- Lead database administration best practices, including backup and recovery, performance tuning, scaling, data archival, and database design, and provide implementation support.
- Create and deliver best practices, recommendations, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.
- Analyze on-premises and cloud database environments, and consult on the optimal design for performance and deployment on Google Cloud Platform.
- Support the design, development, and maintenance of RDBMS, data warehouse, and data pipeline solutions.

Minimum Qualifications (MQs)
- Bachelor's degree in Computer Science or equivalent practical experience.
- 5 years of experience with relational database technologies such as PostgreSQL, MySQL, SQL Server, or Oracle.
- Experience working with business stakeholders to understand requirements, provide technical leadership, and educate teams on GCP best practices.

Preferred Qualifications (PQs)
- Experience with database management tools for backups, recovery, snapshot management, sharding, partitioning, and database performance tuning.
- Experience working with cloud databases such as AlloyDB, Cloud SQL, and BigQuery.
- Experience with MLOps, data warehousing, and data pipeline development, including ETL and ELT, Dataflow, and Cloud Functions.
- Experience with application development.
- Experience with database administration techniques, including storage, clustering, availability, disaster recovery, security, logging, performance tuning, monitoring, and auditing.
- Experience developing, deploying, and managing machine learning models, including experience writing software in one or more languages such as Java, Python, or Go.

Pay: $60.00 - $70.00 per hour
Work Location: Remote
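As a rough illustration of the ETL-style pipeline work the preferred qualifications describe, the sketch below extracts raw records, validates and normalizes them, and stubs out the load step. It is a minimal, hypothetical example, not part of the posting: the record fields, the helper names, and the stubbed load are all assumptions, and a real GCP pipeline would hand the load step to a service such as BigQuery or Dataflow rather than a local function.

```python
# Minimal ETL sketch (illustrative only). In a production GCP pipeline
# the load step would typically use a BigQuery client or a Dataflow job;
# here it is stubbed so the transform logic can run locally.
from datetime import datetime


def extract():
    """Extract: raw records as they might arrive from a source system (sample data)."""
    return [
        {"id": "1", "issued": "2024-01-15T09:30:00Z", "text": "RWY 09/27 CLSD"},
        {"id": "2", "issued": "not-a-date", "text": "TWY A CLSD"},   # bad timestamp
        {"id": "3", "issued": "2024-01-16T11:00:00Z", "text": ""},   # empty payload
    ]


def transform(rows):
    """Transform: parse timestamps to UTC ISO form and drop rows that fail validation."""
    clean = []
    for row in rows:
        if not row.get("text"):
            continue  # skip rows with no payload
        try:
            issued = datetime.fromisoformat(row["issued"].replace("Z", "+00:00"))
        except ValueError:
            continue  # skip rows with unparseable timestamps
        clean.append({"id": row["id"], "issued_utc": issued.isoformat(), "text": row["text"]})
    return clean


def load(rows):
    """Load: stub. A real implementation might stream these rows into a warehouse table."""
    return len(rows)


if __name__ == "__main__":
    loaded = load(transform(extract()))
    print(f"loaded {loaded} rows")
```

The same extract/transform/load split applies whether the load target is BigQuery, Cloud SQL, or a staging bucket; keeping the transform pure (no I/O) is what makes it easy to unit-test and to port between batch and streaming runners.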