Your current job search

49 search results

For permanent roles in London

Data Engineer

England, London

  • £50,000 to £60,000 GBP
  • Engineer role
  • Skills: Python, Airflow, CICD, DBT, ETL
  • Seniority: Mid-level

Job description

My client is a highly successful, continually fast-growing agency with ambitious plans for the future. They have a unique agency operating model, which makes things different and exciting, and a brilliant, positive culture. It's a fantastic place to work.



The role sits in the newly created Data Science & Tools team within this department, reporting to the Head of Data Science & Tools. The team's remit is to drive innovation and revenue in two ways: developing and delivering revenue-driving, client-facing data science and engineering solutions, and building internal tools used within the agency.



The right hire will share these values:

Passion: loving what you do and being driven to excel with each new project

Imagination: meeting challenges creatively and ready to try new ideas

Integrity: accountable and always excellent to one another

This company operates a hybrid working model, with a minimum of 2 days a week based in their London office and the remaining days from home if you wish.



RESPONSIBILITIES

The Data Engineer is accountable for the successful and efficient implementation of the data science and engineering client projects and internal tool development. Key performance indicators include the timeliness with which projects are delivered, their profitability, client satisfaction (one recurring measure of which is client Net Promoter Score), and quality of work (whether the solution is fit for purpose and to the required standard).

Data Integration, warehousing & engineering



- Solution-design the data integration architecture and ETL pipeline based on client / internal project requirements (an illustrative Airflow-style sketch follows at the end of this responsibilities section):

Typically, the requirements involve bringing together multiple data sources (usually the client's first-party data with ad platforms such as Google Ads, Google Search Ads 360, DV360, Campaign Manager, Google Analytics and Facebook Ads) in a central data warehouse with an automated ETL pipeline

Sometimes this includes additional handlers in the pipeline which deploy statistical / machine learning models. These are developed by data scientists on the team. The postholder will have exposure to such work, and even opportunities to work on it if their interests, capabilities and client workload allow

Occasionally, the postholder will be required to work with tools which have a graphical user interface and therefore require little programming knowledge

In general, they will strike a balance between such "easy" or "legacy" projects and more intellectually challenging ones (such as developing internal tools)

- Get under the skin of the data and systems required for the solution through working closely with the clients or client teams directly

- Work closely with the data engineers and data scientists in our team to ensure the solution is fit for purpose, and that they can develop relevant components accordingly

- Lead on the delivery, drawing on your own programming expertise or on that of more senior members of the team where relevant

Assure the quality of implemented data science and data engineering solutions (i.e. "QAing" them as they're rolled out)

Put in place logging & monitoring for quality KPIs and alerts for bugs

Maintain solutions on an ongoing basis, for example by debugging, bringing in additional senior help where needed

Communicate the operation and results of these solutions to internal teams and clients to get feedback and ensure work is delivered per expectations

For solutions that are deemed to have significant client potential, the postholder will also have the opportunity to productionise the solution, with the support of more senior members of the team

Contribute to new business and cross-sell proposals, especially with detail about the implementation work in proposed projects, and with estimations of the time and effort needed, which will contribute to the commercial element of a proposal
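To make the kind of automated ETL pipeline described above concrete, here is a minimal, illustrative Airflow sketch (assuming Airflow 2.x); the DAG id, task names and callables are hypothetical placeholders rather than anything specified in this role.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def fetch_ad_platform_data(**context):
    # Placeholder: call an ad platform's reporting API (e.g. Google Ads)
    # and stage the extract, for example as a file in cloud storage.
    pass


def load_to_warehouse(**context):
    # Placeholder: load the staged extract into the central data warehouse.
    pass


with DAG(
    dag_id="client_ads_etl",            # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=fetch_ad_platform_data)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load

In a real project the extract step would typically use the platform's official client library, and the load step would target the chosen warehouse (the ad lists GCP experience as highly desirable, so BigQuery would be a natural fit).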



Essential:

Good Python and SQL skills, including:

Demonstrable experience in data integration, in particular using APIs of digital advertising platforms

Developing and maintaining ETL pipelines

Curating databases or even data warehouses

Basic front-end development using open source frameworks such as Flask (N.B. sleek UX/look & feel aren't necessary; such front-end dev is typically required for internal tools. A minimal example follows after this list.)

Experience in unit testing, producing clean/maintainable code and other software engineering best practices

Experience working with cloud computing platforms, Google Cloud Platform being highly desirable

An ability to communicate clearly and effectively about technical topics with very varied types of stakeholders, from client CMOs to software engineers
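As a rough illustration of the "basic front-end development" item above, here is a minimal Flask sketch of the sort of plain internal-tool page it describes; the route and form field are invented for the example.

from flask import Flask, request

app = Flask(__name__)


@app.route("/", methods=["GET", "POST"])
def index():
    # A deliberately plain internal-tool page: a form that accepts a campaign
    # name and echoes a confirmation. No styling, as the ad notes none is needed.
    if request.method == "POST":
        campaign = request.form.get("campaign", "")
        return f"<p>Queued report for campaign: {campaign}</p>"
    return '<form method="post"><input name="campaign"><button>Run</button></form>'


if __name__ == "__main__":
    app.run(debug=True)  # local development server only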



Highly Desirable:

Productionising software applications, experience with Docker

Experience working with software engineers, agile development, and software development lifecycles



Desirable:

Experience with data science for statistical / machine learning models

Experience setting up CI/CD pipelines



COMPANY BENEFITS

The company offers a clear path to progression for all members of staff. My client is committed to offering development opportunities alongside a support system of regular performance reviews. The opportunities are endless!

The company offers competitive starting salaries alongside numerous company benefits. On completion of the three-month probation period, every employee is eligible for the benefits listed on the careers site, which include:

Standard hours are from 9.00am to 5.30pm; there's flexibility if agreed in advance with your line manager (it may also be necessary on occasion to work outside of these hours).

This is an equal opportunity employer and does not discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. All candidates will be assessed based on merit, qualifications and their ability to perform the requirements of the role.



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

Data Engineer (Range of levels)

England, London

  • £56,000 to £75,000 GBP
  • Engineer role
  • Skills: ETL, Oracle, Python, AWS
  • Seniority: Mid-level

Job description

Have you ever wanted to work with global clients on a daily basis, getting your teeth into projects that last 9-15 months and working with cutting-edge technologies such as Redshift, SQL, Oracle, S3 buckets and loads more?



The group provides a wide range of data and analytics solutions in support of our client's business priorities: maximise revenues, bear down on fraud, and respond to current Covid challenges. This role is a unique chance to design, develop, test and support data and analytics software solutions, delivering key critical systems to the public sector. You will be part of an Agile software delivery team working closely with software architects and supported by product managers, scrum masters and solutions architects.



Benefits



* You will be put through certifications and upskilled to be the best you can be
* There's a mentorship programme as well (10 days of training a year, which can be anything)
* They're looking to hire 25-35 candidates this year and to keep growing
* Flat structure within the business
* Opportunity to move internally, as they hire from within before going to external agencies
* You'll be working for a growing company, averaging around 30% growth every year



Why this role?

Work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. You will work across the full range of software engineering activities, covering requirements gathering and analysis, solution design, software coding and development, testing, implementation and operational support. You will contribute to the software engineering communities by offering your ideas for innovation and process improvement and by coaching and mentoring other team members.


Why You?

* Good experience in the following technologies is recommended:
* ETL toolset (Talend, Pentaho, SAS DI, Informatica etc)
* Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc)
* Data modelling (Data Warehouse, Marts)
* Job Scheduling toolset (Job Scheduler, TWS, etc)
* Programming and scripting languages (PL/SQL, SQL, Unix, Python, Hive, HiveQL, HDFS, Impala, etc)



Good to have experience in the following technologies:
* Data virtualisation tools (Denodo)
* Reporting (Pentaho BA, Power BI, Business Objects)
* Data Analytics toolset (SAS Viya)
* Cloud (AWS, Azure, GCP)
* ALM Tooling (Jira, Confluence, Bitbucket)
* CI/CD toolsets (Gitlab, Jenkins, Ansible)
* Test Automation (TOSCA)


Experience:
You should have experience as a software engineer delivering large-scale data analytics solutions and the ability to operate at all stages of the software engineering lifecycle, as well as some experience of the following:
* Awareness of devops culture and modern engineering practices
* Experience of Agile Scrum based delivery
* Proactive in nature, personal drive, enthusiasm, willingness to learn
* Excellent communications skills including stakeholder management



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

AWS Data Engineer

England, London

  • £50,000 to £65,000 GBP
  • Engineer role
  • Skills: AWS Data Engineer, Data Engineer, Data Engineering, ETL, Datawarehouse, Data Modelling, Redshift
  • Seniority: Mid-level

Job description

Data Engineer - Up to £70,000

Fully Remote for UK based candidates

As a Data Engineer you'll find yourself participating in customer projects where day-to-day hands-on work can range from implementing serverless/ETL infrastructure to taking a step back and giving workshops and design recommendations to a technical team.

Furthermore, you will become a trusted advisor for high-profile enterprises and jointly manage the customer relationship and roadmap working in partnership with AWS.

Ideally, you're bringing the following skills with you:

* Public cloud platform expert knowledge, especially Amazon Web Services (AWS).
* Worked as part of a cross-functional team on an AWS platform that holds large amounts of data.
* Designing and implementing data pipelines that feed user behavioural data back into production systems.
* Strong recent experience using AWS Redshift, BigQuery, QuickSight, Lambda and data pipelines (a minimal serverless ETL sketch follows after this list).
* Strong recent experience of ETL design, implementation and maintenance.
* Experience of working on data migrations
* All in all, you should have a solid engineering/technical background/hands-on experience.
* Minimum AWS certification
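As a hedged illustration of the "serverless/ETL infrastructure" mentioned above, the sketch below shows an AWS Lambda handler that reacts to a new S3 object and writes a cleaned copy to a staging prefix; the bucket layout and the cleaning rule are invented for the example.

import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    # Triggered by an S3 event notification: read each new object, drop blank
    # lines, and write the cleaned copy under a hypothetical staging/ prefix.
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        cleaned = "\n".join(line for line in body.splitlines() if line.strip())
        s3.put_object(Bucket=bucket, Key="staging/" + key, Body=cleaned.encode("utf-8"))
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}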

Why is this your next career move?

* Leading-edge projects - they're here to present their customers with the latest technologies and to push the IT industry forward!
* Highly skilled co-workers in a friendly and supportive working culture, who enjoy working together, having fun and sharing knowledge.
* The most advanced technologies: they're the overly excited techies who can't wait to read about the newest launches!
* Besides an interesting assignment, you can enjoy extra-curricular activities
* Exam fees for partner certifications (Azure, AWS, GCP) + certification bonus covered by the employer
* Access to, and the possibility to create, knowledge-sharing sessions within a community of leading cloud professionals
* Interesting projects and a chance to work with a variety of high-profile customers from several industries
* Possibility to gather valued certificates and participate in world-class courses and training; you can choose from various courses to ensure continuous learning
* Lots of opportunities to develop your cloud expertise while working together with the leading Global Cloud Professionals

Data Engineer - London (Hybrid) - up to £95,000 + bonus

England, London

  • £60,000 to £95,000 GBP
  • Engineer role
  • Skills: DataLakes, PySpark, SQL, Azure, Python, AWS, Databricks, Agile
  • Seniority: Senior

Job description

Senior Data Engineer, Hybrid, East London - DataLakes, PySpark, SQL, Azure, Python, AWS, Databricks, Agile

Role Overview

We are looking for experienced data engineers to take responsibility for the design, development, and maintenance of applications. You will be working alongside other engineers and developers on different layers of the infrastructure, so a commitment to collaborative problem-solving, sophisticated design, and the creation of quality products is essential.

Role & Responsibilities

* Collaborate with Big Data Solution Architects to design, prototype, implement, and optimize data ingestion pipelines so that data is shared effectively across various business systems.
* Build ETL/ELT and ingestion pipelines and design optimal data storage and analytics solutions using cloud and on-prem technologies (a brief PySpark-style sketch follows after this list).
* Ensure the design, code and procedural aspects of the solution are production ready, in terms of operational, security and compliance standards.
* Participate in day-to-day project and product delivery status meetings, and provide technical support for faster resolution of issues.
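As a brief sketch of the ingestion-pipeline responsibility above (under assumed paths and column names, not anything specified in the role), a minimal PySpark job might look like this:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_events").getOrCreate()

# Read raw JSON events from a hypothetical landing zone, de-duplicate on an
# assumed event_id column, stamp the ingestion time and append to the lake.
raw = spark.read.json("s3a://example-landing-zone/events/")
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("ingested_at", F.current_timestamp())
)
clean.write.mode("append").parquet("s3a://example-lake/curated/events/")

On Databricks, the same logic would more typically write to a Delta table rather than raw Parquet.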

Skills and Experience

* Demonstrable design & development experience and experience with big data technologies like Spark/Flink and Kafka
* Proficient in Python, PySpark, or Java/Scala.
Hands-on experience with some of the following technologies:

* Azure/AWS - Data Lake Projects
* SQL
* ETL/ELT
* Databricks
* Spring/Guice or any other DI framework.
* RESTful Web Services.

* Proficient in querying and manipulating data from various DB (relational and big data).
* Experience of writing effective and maintainable unit and integration tests for ingestion pipelines.
* Experience of using static analysis and code quality tools and building CI/CD pipelines.

Data Engineer

England, London, City of London

  • £50,000 to £60,000 GBP
  • Engineer role
  • Skills: Snowflake, Python, Airflow, SQL, AWS
  • Seniority: Mid-level

Job description

Snowflake Engineer

Do you like working with the latest technology and are you interested in enhancing your tech abilities? We have an exciting opportunity for a highly skilled Data Engineer with significant experience of Snowflake.

As well as being an expert in the Snowflake cloud platform, you'll have a strong background in Data Ingestion and Integration, designing and implementing ETL pipelines on various technologies, Data Modelling and a rounded understanding of data warehousing.

They believe strongly in experimentation leading to industrialisation and are searching for passionate, energetic data engineers who are focused on using their skills to drive out real business value for their customers.

A bit about the job:

The new project is a greenfield Personal Lines insurer headquartered in Hoxton (London), set up with the ambition to be the best in the UK market. It will combine the pace, focus and test-and-learn mentality of a start-up with the expertise and financial backing of the company.

Data is the lifeblood of any modern organisation, and they are no different. The Data Engineering team sits within Aviva Quantum, their global Data Science Practice (covering areas including Machine Learning, Analytics, Data Engineering, AI and many more).

You will form a vital part of the business, contributing to first-class end-to-end solutions. You will play an active role in defining practices, standards and ways of working, and apply them to your role. Be open to working across organisation and team boundaries to ensure they bring the best to their customers.

Ideal skills and experience:

* Experience of delivering end-to-end solutions with different database technologies, focusing on Snowflake but also Dynamo, Oracle, SQL Server and Postgres (a short connector sketch follows after this list).
* Experience of managing data using the Data Vault architecture and managing it through DBT.
* Strong understanding of data manipulation/wrangling techniques in SQL, along with at least one of Python, Scala, Snowpark or PySpark.
* Experience in designing structures to support reporting solutions optimised for use from tools like Qlik, Tableau etc. Good understanding of modern code development practices including DevOps/DataOps but also Agile.
* Strong interpersonal skills, with the ability to work with customers to establish requirements and then design and deliver the solution, taking the customer on the end-to-end journey with you.
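As a short, illustrative sketch of the Snowflake-plus-Python side of this list (all credentials and object names below are placeholders), a query via the Snowflake Python connector might look like this:

import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # placeholder account identifier
    user="ETL_USER",          # placeholder credentials
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # A simple wrangling query against a hypothetical quotes table.
    cur.execute(
        "SELECT policy_id, COUNT(*) AS quote_count "
        "FROM quotes GROUP BY policy_id ORDER BY quote_count DESC LIMIT 10"
    )
    for policy_id, quote_count in cur.fetchall():
        print(policy_id, quote_count)
finally:
    conn.close()

In practice, the Data Vault modelling the ad mentions would usually be expressed as DBT models rather than ad-hoc queries like this.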

What you'll get for this role:

* Salary up to £55,000 London and up to £46,000 National (depending on location, skills, experience, and qualifications)
* Generous pension (at the starting level they contribute 8% when you contribute 2%)
* Part of the Sales Bonus Scheme
* Family friendly parental and carer's leave
* 29 days holiday per year plus bank holidays and the option to buy/sell up to 5 additional days
* Up to 40% discount for products
* Brilliant flexible benefits
* Matching Share Plan and Save As You Earn scheme
* 21 volunteering hours per year

For everyone:

They want applications from people with diverse backgrounds and experiences.

Excited but not sure you tick every box? Research tells us that women, in particular, feel this way. So, regardless of gender, why not apply? And if you're in a job share, just apply as a pair. They flex locations, hours and working patterns to suit their customers, the business, and you.

Most of their people are smart-working, spending around 60% of their time in offices and 40% at home.



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

Data Engineer

England, London, City of London

  • £60,000 to £75,000 GBP
  • Engineer role
  • Skills: ETL, AWS, Snowflake, Redshift, Python, SQL
  • Seniority: Mid-level

Job description

Data Engineer



This company is a leader in Artificial Intelligence (AI) and Machine Learning (ML) and a specialist in advanced analytics, BI/MI and Data Science. They are an exciting and rapidly growing consultancy, based at Adastral Park in Suffolk, with offices in London and Rotterdam. As an Amazon Web Services (AWS) Premier Partner, and having been awarded AWS ML Partner of the Year 2020, they are extremely well positioned at the forefront of this fast-paced, cutting-edge technology space.



Following an extraordinary four years of growth and success, they were proud to be acquired in December 2020 and are now part of the global Cognizant family. With ambitious expansion plans and increased demand from customers across the world, they are looking to add Data Engineers to their dynamic team.



Role:

The Data Engineer will propose and implement solutions using a range of AWS infrastructure, including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue and Matillion.

They will liaise with clients to define requirements, refine solutions and ultimately hand over to clients' own technical teams.

The ideal candidate will have exposure to CI/CD processes, or at least be keen to learn - their clients love infrastructure as code, and they like their engineers to own the deployment of their work.

Candidates should delight in creating something from nothing on greenfield projects. We're looking for people who can't let go of interesting problems.

We need people who can work independently; but we're a close-knit, supportive team - we like to learn new things and share our ideas, so that clients get the best return on their investments.



Essential experience:

* Experience in analysing and cleansing data using a variety of tools and techniques.
* Familiarity with AWS data lake related components.
* Hands-on experience with Redshift, Glue and S3 (a skeleton Glue-style job follows after this list).
* Extensive experience of ETL and of using patterns for cloud data warehouse solutions (e.g. ELT).
* Hands-on experience with Matillion.
* Familiarity with a variety of databases, incl. structured RDBMS.
* Experience in working with a variety of data formats: JSON, XML, CSV, Parquet etc.
* Experience with building and maintaining data dictionaries / metadata.
* Experience of Linux and cloud environments.
* Data Visualisation Technologies (e.g. Amazon QuickSight, Tableau, Looker, QlikSense).
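To illustrate the Redshift/Glue/S3 item above, here is a skeleton of a typical AWS Glue PySpark job; the catalog database, table name and output path are placeholders, not details from the role.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog, drop rows with no order_id
# (an assumed key column), and write the result to S3 as Parquet.
dyf = glue_context.create_dynamic_frame.from_catalog(database="sales_db", table_name="orders")
df = dyf.toDF().dropna(subset=["order_id"])
df.write.mode("overwrite").parquet("s3://example-curated-bucket/orders/")

job.commit()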

Desirable experience:

* Familiarity with large data techniques (Hadoop, MapReduce, Spark etc.)
* Familiarity with providing data via a micro service API.
* Experience with other public cloud data lakes.
* AWS Certifications (particularly Solution Architect Associate and Big Data Speciality).
* Machine Learning.



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

On Premise Data Engineer

England, London, City of London

  • £50,000 to £75,000 GBP
  • Engineer role
  • Skills: ETL, Redshift, Python, SQL, Oracle
  • Seniority: Mid-level

Job description

Have you ever wanted to work with global clients on a daily basis, getting your teeth into projects that last 9-15 months and working with cutting-edge technologies such as Redshift, SQL, Oracle, S3 buckets and loads more?



The group provides a wide range of data and analytics solutions in support of our client's business priorities: maximise revenues, bear down on fraud, and respond to current Covid challenges. This role is a unique chance to design, develop, test and support data and analytics software solutions, delivering key critical systems to the public sector. You will be part of an Agile software delivery team working closely with software architects and supported by product managers, scrum masters and solutions architects.



Benefits



* You will be put through certifications and upskilled to be the best you can be
* There's a mentorship programme as well (10 days of training a year, which can be anything)
* They're looking to hire 25-35 candidates this year and to keep growing
* Flat structure within the business
* Opportunity to move internally, as they hire from within before going to external agencies
* You'll be working for a growing company, averaging around 30% growth every year



Why this role?

Work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. You will work across the full range of software engineering activities, covering requirements gathering and analysis, solution design, software coding and development, testing, implementation and operational support. You will contribute to the software engineering communities by offering your ideas for innovation and process improvement and by coaching and mentoring other team members.


Why You?

* Good experience in the following technologies is recommended:
* ETL toolset (Talend, Pentaho, SAS DI, Informatica etc)
* Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc)
* Data modelling (Data Warehouse, Marts)
* Job Scheduling toolset (Job Scheduler, TWS, etc)
* Programming and scripting languages (PL/SQL, SQL, Unix, Python, Hive, HiveQL, HDFS, Impala, etc)



Good to have experience in the following technologies:
* Data virtualisation tools (Denodo)
* Reporting (Pentaho BA, Power BI, Business Objects)
* Data Analytics toolset (SAS Viya)
* Cloud (AWS, Azure, GCP)
* ALM Tooling (Jira, Confluence, Bitbucket)
* CI/CD toolsets (Gitlab, Jenkins, Ansible)
* Test Automation (TOSCA)


Experience:
You should have experience as a software engineer delivering large-scale data analytics solutions and the ability to operate at all stages of the software engineering lifecycle, as well as some experience of the following:
* Awareness of devops culture and modern engineering practices
* Experience of Agile Scrum based delivery
* Proactive in nature, personal drive, enthusiasm, willingness to learn
* Excellent communications skills including stakeholder management



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

Data Engineer

England, London, City of London

  • £55,000 to £75,000 GBP
  • Engineer role
  • Skills: ETL, AWS, Snowflake, Redshift, Python, SQL
  • Seniority: Senior

Job description

Have you ever wanted to work with global clients on a daily basis, getting your teeth into projects that last 9-15 months and working with cutting-edge technologies such as Redshift, SQL, Oracle, S3 buckets and loads more?



The group provides a wide range of data and analytics solutions in support of our client's business priorities: maximise revenues, bear down on fraud, and respond to current Covid challenges. This role is a unique chance to design, develop, test and support data and analytics software solutions, delivering key critical systems to the public sector. You will be part of an Agile software delivery team working closely with software architects and supported by product managers, scrum masters and solutions architects.



Benefits

* You will be put through certifications and upskilled to be the best you can be
* There's a mentorship programme as well (10 days of training a year, which can be anything)
* They're looking to hire 25-35 candidates this year and to keep growing
* Flat structure within the business
* Opportunity to move internally, as they hire from within before going to external agencies
* You'll be working for a growing company, averaging around 30% growth every year



Why this role?

Work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. You will work across the full range of software engineering activities, covering requirements gathering and analysis, solution design, software coding and development, testing, implementation and operational support. You will contribute to the software engineering communities by offering your ideas for innovation and process improvement and by coaching and mentoring other team members.


Why You?

* Good experience in the following technologies is recommended:
* ETL toolset (Talend, Pentaho, SAS DI, Informatica etc)
* Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc)
* Data modelling (Data Warehouse, Marts)
* Job Scheduling toolset (Job Scheduler, TWS, etc)
* Programming and scripting languages (PL/SQL, SQL, Unix, Python, Hive, HiveQL, HDFS, Impala, etc)



Good to have experience in the following technologies:
* Data virtualisation tools (Denodo)
* Reporting (Pentaho BA, Power BI, Business Objects)
* Data Analytics toolset (SAS Viya)
* Cloud (AWS, Azure, GCP)
* ALM Tooling (Jira, Confluence, Bitbucket)
* CI/CD toolsets (Gitlab, Jenkins, Ansible)
* Test Automation (TOSCA)


Experience:
You should have experience as a software engineer delivering large-scale data analytics solutions and the ability to operate at all stages of the software engineering lifecycle, as well as some experience of the following:
* Awareness of devops culture and modern engineering practices
* Experience of Agile Scrum based delivery
* Proactive in nature, personal drive, enthusiasm, willingness to learn
* Excellent communications skills including stakeholder management



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

DEVOPS ENGINEER WITH KUBERNETES - London UK

England, London, City of London

  • £52,000 to £70,000 GBP
  • Engineer role
  • Skills: AWS, SoftwareEngineer, Python, AWSCloudEngineer, AWSSeniorCloudEngineer, AWSDevOps, Remote, Linux, Terraform, CICD, Serverless, Testing, ECS, Fargate, Lambda, Kubernetes, IaC
  • Seniority: Mid-level

Job description

DevOps Engineers with Kubernetes - DOE £52,000 - £70,000 + Package - London, UK

Must be Eligible for SC Clearance to apply

*This role does not provide Sponsorship*



The focus of your role

We are looking to recruit a DevOps Engineer with Kubernetes experience to join a Financial Services Digital Practice. This is a permanent, full-time position and represents a unique opportunity for someone to enhance their Digital Consulting career.



What you'll bring

Primary skills (mandatory) - Expertise in Kubernetes, with extensive experience; able to lead/guide a team on Kubernetes (a short sketch using the Kubernetes Python client follows after this list).

1 Good-to-have skills (recommended) - Experience of using MuleSoft with Kubernetes
2 Experience in DevOps methodologies & CI/CD integration
3 Working experience with containerization technologies (Docker & Kubernetes)
4 Working experience with IaC technologies - Terraform preferred
5 Good understanding of and working experience with CI/CD workflows
6 Good working knowledge of CI/CD tools: GitLab (preferred), Jenkins
7 Proficient with git and git workflows
8 Working experience with build tools like Maven
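Purely as an illustration of day-to-day Kubernetes interaction (not something the role prescribes), the official Kubernetes Python client can be used like this:

from kubernetes import client, config

# Load the local kubeconfig, exactly as kubectl would, then list every pod in
# the cluster with its namespace and current phase.
config.load_kube_config()
v1 = client.CoreV1Api()
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)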



What we'll offer you

Professional development. Accelerated career progression. An environment that encourages entrepreneurial spirit. It's all on offer. And although collaboration is at the core of the way we work, we also recognise individual needs with a flexible benefits package you can tailor to suit you.



Benefits:

Competitive salary

Certification training

Expensed travel

Hybrid working



How to Apply

Successful candidates must be willing to undergo SC level Security Clearance.

Please apply to the role by registering your interest by emailing me at j.carter@jeffersonfrank.com or reaching out via LinkedIn "Jamie Carter".

Cloud Infrastructure Engineer - up to £100,000 - Remote First

England, London

  • £90,000 to £100,000 GBP
  • Engineer role
  • Skills: devops, cloud, aws, terraform, linux, windows, engineer, infrastructure
  • Seniority: Senior

Job description

Cloud Infrastructure Engineer - up to £100,000 - Remote

I am looking for experienced AWS Infrastructure Automation Engineers to work as part of and alongside software delivery teams.

As an AWS Infrastructure Automation Engineer you will be joining a software delivery team of 5-6 cross-functional engineers. An essential part of the role is to ensure collaboration with the team when it comes to infrastructure.
The successful candidate will have proven experience in building, managing and supporting multiple highly available environments in a repeatable and sustainable manner.

This is a remote first role, but there will be the occasional need to visit Head Office in central London.

Skills & Experience Needed:
* At least 5 years of AWS infrastructure experience
* A solid understanding of the Kubernetes ecosystem (Helm etc.)
* Experience of writing sustainable Terraform
* Experience of building Jenkins pipelines
* Experience of Scrum is advantageous
* Effective communicator at all levels
* An understanding of modern git techniques would be advantageous

Responsibilities in the role:
* Build AWS infrastructure as code
* Build automated deployment pipelines
* Implement observability tools, with monitoring and alerting (a brief boto3 sketch follows after this list)
* Work with the development team to understand the software architecture
* Ensure security practices are adhered to
* Ensure code quality and scalability requirements are met
* Mentor, guide and collaborate with the development team
* Work with the DevOps mindset
* Actively participate in Scrum events
* Actively participate in design sessions with
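As a hedged sketch of the monitoring-and-alerting responsibility above (in this role the same resource would more likely be declared in Terraform), a CloudWatch alarm can be created with boto3 like this; all names and identifiers are placeholders.

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

# Alarm when average EC2 CPU stays above 80% for two consecutive 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="example-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmDescription="Average CPU above 80% for 10 minutes",
)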

Send over your CV or apply below to be considered for the role, or alternatively I welcome any recommendation or referral if you know someone looking for a similar position!