Ref: JFI Data Engineer_1673533980

Lead/Senior GCP Data Engineer

London, England

  • £70,000 to £80,000 GBP
  • Engineer Role
  • Skills: GCP, Airflow, ETL, Python
  • Level: Senior

Job description

Lead/Senior GCP Data Engineer

I am working with a digital marketing agency with a unique business model, recently named Campaign's Global Performance Marketing Agency of the Year.

My client is a highly successful, fast-growing agency with ambitious plans for the future. They have a unique agency operating model, which makes things different and exciting, and a brilliant, positive culture. It's a fantastic place to work.

They are investing in data solutions as a strategic priority. The team brings together capabilities that sit within different parts of the company into one unit that will drive cutting-edge development in several areas critical to the business.



The role sits in the newly created Data Science & Tools team within this department, reporting to the Head of Data Science & Tools. The team's remit is to drive innovation and revenue in two ways: developing and delivering revenue-driving, client-facing data science and engineering solutions, and building the internal tools used within the agency.



The right hire will share the company's values:

* Passion: loving what you do and being driven to excel with each new project
* Imagination: meeting challenges creatively and ready to try new ideas
* Integrity: accountable and always excellent to one another

This client operates a hybrid working model, with a minimum of 2 days a week based in their London office and the remaining days from home if you wish.



RESPONSIBILITIES

* Own the data engineering projects and solutions delivered by the Data Science and Tools team
* Line-manage Data Engineers (2-3, including contractors)
* Champion data and software engineering best practices, helping and mentoring the team to implement and uphold them
* Design solutions for client deliverables
* Define and maintain DataOps and governance policies, processes, and standards, including automation of tasks
* Take hands-on involvement in data integrations, warehousing, and related engineering tasks
* Oversee deliveries across multiple clients, ensuring that solutions meet the highest quality achievable within project constraints
* Research new technologies and help the team and clients adopt them where appropriate
* Highlight risks and produce effective mitigation plans to address them
* Be a key owner of the GCP cloud infrastructure, ensuring appropriate governance practices are followed while supporting the company's cloud adoption
* Automate data infrastructure
* Contribute to new business and cross-sell proposals, particularly by detailing the implementation work in proposed projects and estimating the time and effort needed, feeding into the commercial element of a proposal



PERSON SPECIFICATION

Essential:

* Excellent Python, SQL, and database design skills
* Proficiency with modern cloud-based data warehousing technologies
* Fluency with GCP and related technologies, especially BigQuery
* Strong understanding of best practices and trends in data and software engineering
* Hands-on experience setting up and maintaining production data pipelines using serverless technologies and/or Airflow
* Strong experience with dbt/Dataform or similar SQL development frameworks
* Strong understanding and appreciation of unit testing and producing clean, maintainable code
* Ability to communicate clearly and effectively on technical topics with both technical and non-technical stakeholders
* Experience productionising software applications, including with Docker
* Experience working with software engineers, agile development, and software development lifecycles
* Experience with infrastructure automation, preferably Terraform
* Experience setting up CI/CD pipelines



Highly Desirable:

* Prior experience with Martech technologies and solutions

Desirable:

* Experience with data science for statistical / machine learning models
* Experience productionising ML pipelines (MLOps)
* Experience with data visualisation platforms such as Looker Studio (formerly Google Data Studio)



COMPANY BENEFITS

They offer a clear path to progression for all members of staff and are committed to providing development opportunities alongside a support system of regular performance reviews. The opportunities are endless!



My client offers competitive starting salaries alongside numerous company benefits. On completion of the three-month probation period, every employee is eligible for the benefits listed on their careers site, which include:

* 25 days holiday a year
* Annual performance bonus
* Sales commission
* Recruitment referrals bonus
* Gym membership contributions
* Ride to Work scheme
* Rail card
* Season Ticket loan
* Free fruit, breakfast cereals and tea & coffee
* Free home office chair and screen
* Enhanced maternity and paternity package
* Life and income protection
* Medical cash plan
* Agile Working Policy

Standard hours are 9.00am to 5.30pm; there is flexibility if agreed in advance with your line manager (it may also be necessary on occasion to work outside these hours).

This client is an equal opportunity employer and does not discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability, or sexual orientation. All candidates will be assessed based on merit, qualifications, and their ability to perform the requirements of the role.





Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.