Ref: DataEng13/2/2023_1676299898

AWS Data Engineer - London 2x a week

England, London

  • £500 to £550 GBP
  • Engineer Role
  • Skills: AWS Data Engineer - PySpark, Python, Snowflake
  • Level: Mid-level

Job description

Good Afternoon,

Hope you are well. I am working with a client who is in the final stages of signing off a Data Engineer position. The role is urgent; however, the client is still confirming a few small details in the background, which should give us the green light to start interviewing.

The client will need someone in early March, so if you have a few weeks' notice period, don't worry, this role is still for you!



Role - AWS Data Engineer

Rate - £500-£550

IR35 - Outside IR35

Location - 3 days remote, 2 days in the London office (City, near Bank station / Liverpool Street station)

Contract Length - 6 months

Start Date - ASAP



I am working with a global business that specialises in airport and travel enhancement. They are based in London and have been achieving some great milestones.

The client is looking for someone who embraces modernisation and uses data to enhance decision-making across the business. You will bring innovation to the team and work with modern data frameworks; your hands-on experience delivering data pipelines in AWS will be crucial as they elevate their data landscape. You will also be responsible for mentoring junior team members.



Key responsibilities:

* Lead the design of data solutions with quality, automation, and performance in mind
* Own the data pipelines feeding into the Data Platform, ensuring they are reliable and scalable
* Ensure data is available in a fit-for-purpose format
* Maintain and optimise existing data pipelines to improve performance and quality, minimising impact on the business
* Effectively communicate plans and progress to both technical and non-technical stakeholders
* Be an active member of the Data & Analytics team
* Provide regular and accurate progress reports to Technical leads and the Project lead where required
* Build strong relationships with stakeholders



Skills and experience required:

* Strong experience with AWS and Snowflake cloud data platform transformation
* Experience delivering data and analytical solutions in a large-scale cloud environment
* Strong experience implementing end-to-end data pipelines on AWS, with deep knowledge of data engineering techniques across data preparation and ETL
* Strong experience developing Data Warehouses on Snowflake
* Experience managing the costs of data pipelines
* Deep knowledge of, or experience with, as many of the following as possible:
* PySpark
* Python
* Snowflake
* SQL
* Glue
* AWS
* EC2
* Kafka
* S3