Key Responsibilities
* Build and optimise ETL/ELT pipelines using AWS services
* Work with AWS Glue, Lambda, Redshift, S3, Athena, EMR, and Kinesis
* Develop data models and ensure data quality and governance
* Collaborate with cross-functional teams to support analytics and reporting needs
* Implement best practices for performance, scalability, and cost optimisation
What You'll Bring
* Strong commercial experience as a Data Engineer in AWS environments
* Proficiency in Python, SQL, and data modelling
* Hands-on expertise with ETL/ELT development and orchestration
* Knowledge of DevOps principles, CI/CD, and Infrastructure as Code (Terraform/CloudFormation)
* Ability to work autonomously and deliver in a fast-paced project environment
Desirable Skills
* Experience with real-time/streaming data (Kafka/Kinesis)
* Familiarity with Docker or Kubernetes (ECS/EKS)
* Experience supporting ML/AI data workflows
