Job Title: Software Engineer (ETL)
Company Name: Radial, Inc.
Job Url: https://www.simplyhired.com/job/ZOG8qsIfZcp6-FsHxkJge1eaMfeKzLQfBeDnxugZN-5gkDHdjajKTg

Full Job Description
Job Title: Software Engineer (ETL)
Job Number: JO-2603-11234
Location (City, State): King of Prussia, PA
Employee Group: Regular
Shift: Day
Travel:
Site Name: King of Prussia 935
Is Remote Eligible: Yes
Pay:

Radial, Inc. seeks a Software Engineer (ETL) in King of Prussia, PA.

Job Duties:
- Perform ETL activities on Manhattan (Manhattan Associates) data, loading it into a data warehouse or business intelligence tools for analysis, helping companies make data-driven decisions on inventory, marketing, and customer segmentation.
- Ensure robust monitoring is in place to track the performance of ETL pipelines, especially under the load of Black Friday/Cyber Monday traffic.
- Architect, implement, and optimize advanced machine learning and deep learning models for large-scale structured and unstructured data.
- Develop and maintain automated ML pipelines, integrating model training, validation, and deployment using tools like Airflow.
- Collaborate with data engineers to preprocess massive datasets using distributed processing tools like Spark or Hadoop.
- Apply e-commerce domain knowledge in areas such as marketing analytics, supply chain optimization, customer segmentation, fraud detection, and financial forecasting.
- Lead the full lifecycle of data science projects, from data acquisition and cleaning, through exploratory analysis and modeling, to delivery of actionable insights.
- Understand customer requirements and carry them through analysis, design, development, and implementation into the system.
- Apply experience with the Snowflake data warehouse, including a deep understanding of Snowflake architecture and processing.
- Automate, configure, and deploy instances on AWS.
- Write complex SQL scripts using statistical aggregate functions and analytical functions.
- Design and develop end-to-end (E2E) data flows from source databases to Hive tables, including denormalization on the Hadoop platform.
- Collaborate with data teams to support ML/AI workloads using custom frameworks on EC2/Batch.
- Work with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, and stored procedures.
- Lead migration efforts from legacy data platforms (e.g., Oracle, Netezza, SQL Server) to Snowflake, including schema conversion and data validation.

Telecommuting permitted.
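As an illustration of the data-validation step mentioned in the migration duty above, here is a minimal sketch of comparing row counts and a simple column checksum between a source and target table. It uses the standard-library sqlite3 module as a stand-in for the legacy and Snowflake systems; the `orders` table and `order_id` column are hypothetical, not part of the posting.

```python
# Illustrative sketch only: validates a migrated table by comparing a
# row count and a simple column checksum between source and target.
# sqlite3 stands in for the real legacy/Snowflake databases; the table
# and column names are hypothetical.
import sqlite3

def table_fingerprint(conn: sqlite3.Connection, table: str) -> tuple:
    """Return (row_count, checksum) for a table."""
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(order_id), 0) FROM {table}")
    return cur.fetchone()

def validate_migration(source: sqlite3.Connection,
                       target: sqlite3.Connection, table: str) -> bool:
    """Pass only if both fingerprints match exactly."""
    return table_fingerprint(source, table) == table_fingerprint(target, table)

# Demo: identical source and target tables should validate cleanly.
src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 9.99), (2, 19.99), (3, 4.50)])
print(validate_migration(src, dst, "orders"))  # True
```

In practice a checksum over one numeric column is only a smoke test; production validation would also hash full rows and compare schemas, but the count-plus-checksum pattern above is the usual first gate.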
Minimum Requirements: Bachelor's degree, or foreign equivalent, in Computer Science, Engineering (any), Computer Information Systems, or a closely related quantitative discipline, and six (6) years of experience in the job offered or six (6) years of progressively responsible experience in the field of Software Engineering or a related occupation.

Special Skill Requirements:
- Manhattan Active / Manhattan
- E-commerce
- Machine Learning / Data Science
- AWS
- Airflow
- Snowflake
- Hadoop, Hive, and Sqoop
- SQL/NoSQL databases, including Oracle, Netezza, Teradata, and HBase
- Python
- AWS Glue
- IBM InfoSphere DataStage (ETL)
- Autosys and Control-M
- UNIX

Any suitable combination of education, training, and experience is acceptable.

Apply through the careers website: https://radial.my.salesforce-sites.com/careers with reference to Job#: JO-2603-11234.