Job Title: Data Platform Architect
Company Name: Rayson Technologies
Job Details: $85/hr | Remote | Contract
Job URL: https://hiring.cafe/viewjob/b4po33ek46cz8ay1

Job Description:
Posted 6 months ago | Rayson Technologies | United States

Responsibilities Summary: leading design, implementing pipelines, communicating with stakeholders

Requirements Summary: 8+ years in data engineering/architecture; expertise in AWS QuickSight, Redshift, Q Topics; ETL tooling (Glue, DMS, Airflow, Data Factory); Python/SQL; data modeling (star schema); secure cloud environments; strong communication with stakeholders.

Technical Tools Mentioned: AWS QuickSight, Q Topics, Redshift, AWS Bedrock, Glue, DMS, Airflow, Data Factory, Python, SQL

This is a remote position. Rayson Technologies is seeking a Data Platform Architect to lead the design and implementation of cloud-native data solutions for our clients. This role requires deep expertise in modern data architecture, including data lakes, ETL pipelines, and analytics dashboards within AWS QuickSight. This position will be responsible for designing secure, scalable systems that leverage star schema modeling, structured and unstructured data, and real-time or batch ETL flows. The ideal candidate combines strong technical skills in AWS QuickSight with Q Topics, AWS Redshift, star schema design, and general data architecture with the ability to communicate clearly and confidently with both technical and non-technical stakeholders.
Responsibilities:
- Understanding product stakeholders' needs
- Producing specifications and determining operational feasibility
- Executing full-lifecycle software development
- Writing well-designed, testable, and efficient code as needed
- Tailoring and deploying software tools, processes, and metrics
- Documenting and maintaining software functionality
- Complying with project plans and industry standards

Requirements:
- 8+ years in data engineering, data architecture, or a related role
- Proven experience with AWS QuickSight, Q Topics, and Redshift
- Experience with AWS Bedrock and Redshift ML is a plus
- Expertise in building ETL pipelines using tools like Glue, DMS, Airflow, or Data Factory
- Strong hands-on knowledge of Python and SQL
- Deep understanding of data modeling, especially star/snowflake schema design
- Experience designing secure cloud environments and enforcing data governance policies
- Clear written and verbal communication skills, with the ability to engage clients directly and explain complex topics simply
- Ability to document requirements and specifications clearly
- Familiarity with software development methodology and release processes
- A degree in Computer Science, Engineering, or a STEM field

Benefits:
- Competitive compensation
- Remote-first flexibility and autonomy
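For candidates unfamiliar with the star-schema modeling called out in the requirements, here is a minimal illustrative sketch using Python's built-in sqlite3 module. All table and column names (dim_customer, dim_date, fact_sales, etc.) are hypothetical examples, not taken from this posting; a production design on Redshift would differ in dialect and scale.

```python
import sqlite3

# Minimal star schema sketch: one fact table referencing two dimension
# tables. All names here are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    calendar_date TEXT,
    month TEXT
);
-- Fact table: one row per sale, with foreign keys to each dimension.
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp', 'US-East')")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', '2024-01')")
cur.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
    [(1, 1, 20240101, 100.0), (2, 1, 20240101, 50.0)],
)

# Typical analytics query: join the fact table to its dimensions
# and aggregate, the shape of query a QuickSight dashboard would run.
cur.execute("""
    SELECT c.region, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY c.region, d.month
""")
result = cur.fetchall()
print(result)  # [('US-East', '2024-01', 150.0)]
```

The key design point is that measures (amount) live only in the fact table, while descriptive attributes (region, month) live in the dimensions, which keeps aggregation queries simple and fast.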