We are seeking a Snowflake Data Platform Engineer to design, implement, and govern a secure and scalable Snowflake data platform integrated with AWS environments. The role focuses on the strategic build-out of the Snowflake platform, infrastructure-as-code automation using Terraform, and establishing secure connectivity between cloud platforms and enterprise identity systems.
As a Snowflake Data Platform Engineer, you will be a fully hands-on engineer responsible for building and maintaining Snowflake-based data solutions. You will develop data pipelines, implement best practices, and collaborate with development and analytics teams to enable data-driven insights across the business.
This role sits one level below Architect and focuses on implementation, optimisation, and operation of the Snowflake platform.
Expectations: On-site at least 3 days a week.
Key Responsibilities
• Design and maintain Snowflake data models, schemas, and pipelines.
• Build and orchestrate ETL/ELT workflows using Dagster or Airflow.
• Implement infrastructure automation with Terraform for Snowflake resources.
• Integrate secure access controls using OAuth- and OIDC-based authentication.
• Develop and maintain data models in dbt, implementing both dimensional (star/snowflake) and Data Vault approaches.
• Optimise Snowflake workloads, ensuring cost-efficient and performant solutions.
• Ingest and transform data from multiple internal and external systems.
• Collaborate with engineers, analysts, and architects to deliver reliable data services.
• Monitor, troubleshoot, and continuously improve data reliability and platform stability.
Technical Expertise
• Education: Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent practical experience).
• Experience: 5+ years of experience in data platform engineering or cloud data architecture, with strong expertise in Snowflake.
• Skills:
• Extensive hands-on experience with Snowflake (development, optimisation, and administration fundamentals).
• Proficiency with dbt for model development, testing, data quality checks, and deployment.
• Understanding of Data Vault 2.0, dimensional (star/snowflake), and 3NF modelling principles.
• Terraform for infrastructure automation (especially Snowflake roles, warehouses, and integrations).
• Practical use of Dagster or Airflow for orchestration.
• Working knowledge of AWS services (particularly S3 and IAM).
• Familiarity with OAuth and secure access patterns.
• Strong programming skills in Python and/or Java.
• Strong understanding of ETL, data warehousing, and data lifecycle management concepts.
Nice-to-have skills
• Familiarity with Denodo.
• Experience with Kafka and real-time ingestion patterns.
• Knowledge of commodities markets or capital markets.
Soft Skills
• Strategic mindset with the ability to design scalable data platforms rather than only executing technical tasks.
• Proactive approach to improving platform governance, security, and automation.
• Ability to collaborate across data, engineering, and application teams.
• Attention to detail with a focus on data reliability and quality.
• Clear communication and documentation skills.