Current search

64 search results

Permanent roles in London

Data Engineer

England, London

  • £50,000 to £60,000 GBP
  • Engineer Role
  • Skills: Python, Airflow, CICD, DBT, ETL
  • Seniority: Mid-level

Job description

My client is a highly successful, consistently fast-growing agency with ambitious plans for the future. They have a unique agency operating model that makes the work different and exciting, and a brilliant, positive culture. It's a fantastic place to work.



The role sits in the newly created Data Science & Tools team, reporting to the Head of Data Science & Tools. The team's remit is to drive innovation and revenue in two ways: developing and delivering revenue-driving, client-facing data science and engineering solutions, and building internal tools used within the agency.



The right hire will share these values:

Passion: loving what you do and being driven to excel with each new project

Imagination: meeting challenges creatively and ready to try new ideas

Integrity: accountable and always excellent to one another

This company operates a hybrid working model with a minimum of 2 days a week based in our London office and the remaining days from home if you wish.



RESPONSIBILITIES

The Data Engineer is accountable for the successful and efficient implementation of the data science and engineering client projects and internal tool development. Key performance indicators include the timeliness with which projects are delivered, their profitability, client satisfaction (one recurring measure of which is client Net Promoter Score), and quality of work (whether the solution is fit for purpose and to the required standard).

Data Integration, warehousing & engineering



- Solution-design the data integration architecture and ETL pipeline based on client / internal project requirements:

  - Typically, the requirements include bringing together multiple data sources (usually the client's first-party data alongside ad platforms such as Google Ads, Google Search Ads 360, DV360, Campaign Manager, Google Analytics and Facebook Ads) in a central data warehouse with an automated ETL pipeline

  - Sometimes this includes additional handlers in the pipeline which deploy statistical / machine learning models. These are developed by data scientists on the team. The postholder will have exposure to such work, and even opportunities to work on it if their interests, capabilities and client workload allow

  - Occasionally, the postholder will be required to work with tools which have a graphical user interface, therefore requiring little programming knowledge

  - In general, they will strike a balance between such "easy" or "legacy" projects and more intellectually challenging ones (such as developing internal tools)

- Get under the skin of the data and systems required for the solution by working closely with clients or client teams directly

- Work closely with the data engineers and data scientists on the team to ensure the solution is fit for purpose, and that they can develop the relevant components accordingly

- Lead on delivery, drawing on your own programming expertise or on more senior members of the team where relevant

- Assure the quality of implemented data science and data engineering solutions (i.e. "QAing" them as they're rolled out)

- Put in place logging and monitoring for quality KPIs, and alerts for bugs

- Maintain solutions on an ongoing basis, such as debugging, bringing in additional senior help where needed

- Communicate the operation and results of these solutions to internal teams and clients to get feedback and ensure work is delivered to expectations

- For solutions deemed to have significant client potential, the postholder will also have the opportunity to productionise the solution, with the support of more senior members of the team

- Contribute to new business and cross-sell proposals, especially with detail about the implementation work in proposed projects, and with estimates of the time and effort needed, which feed into the commercial element of a proposal
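The consolidation pattern described above (multiple ad-platform sources brought together into one warehouse table) can be sketched in plain Python; the platform names and fields below are illustrative assumptions, not the client's actual schema.

```python
def merge_ad_spend(google_rows, facebook_rows):
    """Consolidate per-date spend from two (hypothetical) ad-platform
    extracts into a single warehouse-style table keyed by date."""
    totals = {}
    for source, rows in (("google_ads", google_rows), ("facebook_ads", facebook_rows)):
        for row in rows:
            rec = totals.setdefault(row["date"], {"date": row["date"], "spend": 0.0, "sources": []})
            rec["spend"] += row["spend"]
            rec["sources"].append(source)
    return sorted(totals.values(), key=lambda r: r["date"])
```

In a real pipeline each input would come from a platform API extract step, and the merged result would be loaded into the warehouse by a scheduler such as Airflow.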



Essential:

Good Python and SQL skills, including:

- Demonstrable experience in data integration, in particular using the APIs of digital advertising platforms

- Developing and maintaining ETL pipelines

- Curating databases or even data warehouses

- Basic front-end development using open-source frameworks such as Flask (N.B. a sleek UX/look & feel isn't necessary; such front-end work is typically for internal tools)

Experience in unit testing, producing clean/maintainable code and other software engineering best practices

Experience working with cloud computing platforms, Google Cloud Platform being highly desirable

An ability to communicate clearly and effectively about technical topics with a wide variety of stakeholders, from client CMOs to software engineers
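The unit-testing practice listed above can be illustrated with a small, stdlib-only example; the transform and its field names are hypothetical.

```python
def usd_to_gbp(rows, rate):
    """Add a 'spend_gbp' column derived from 'spend_usd', without
    mutating the input rows (both column names are illustrative)."""
    return [{**row, "spend_gbp": round(row["spend_usd"] * rate, 2)} for row in rows]

def test_usd_to_gbp():
    rows = [{"campaign": "brand", "spend_usd": 10.0}]
    out = usd_to_gbp(rows, rate=0.8)
    assert out[0]["spend_gbp"] == 8.0    # conversion applied
    assert "spend_gbp" not in rows[0]    # input left untouched

test_usd_to_gbp()
```

In practice such tests would live in a pytest suite and run in CI on every commit.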



Highly Desirable:

Productionising software applications, experience with Docker

Experience working with software engineers, agile development, and software development lifecycles



Desirable:

Experience with data science for statistical / machine learning models

Experience setting up CI/CD pipelines



COMPANY BENEFITS

The company offers a clear path to progression for all members of staff. My client is committed to offering development opportunities alongside a support system of regular performance reviews. The opportunities are endless!

The company offers competitive starting salaries alongside numerous benefits. On completion of the three-month probation period, every employee is eligible for the benefits listed on the careers site.

Standard hours are 9.00am to 5.30pm; there is flexibility if agreed in advance with your line manager (it may occasionally be necessary to work outside these hours).

This is an equal opportunity employer and does not discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. All candidates will be assessed based on merit, qualifications and their ability to perform the requirements of the role.



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

Data Engineer (Range of levels)

England, London

  • £56,000 to £75,000 GBP
  • Engineer Role
  • Skills: ETL, Oracle, Python, AWS
  • Seniority: Mid-level

Job description

Have you ever wanted to work with global clients on a daily basis? Getting your teeth into projects that last 9-15 months, working with cutting-edge technologies such as Redshift, SQL, Oracle, S3 buckets and loads more!



The group provides a wide range of data and analytics solutions in support of our client's business priorities: maximise revenues, bear down on fraud, and respond to current Covid challenges. This role is a unique chance to design, develop, test and support data and analytics software solutions, delivering key critical systems to the public sector. You will be part of an Agile software delivery team working closely with software architects and supported by product managers, scrum masters and solutions architects.



Benefits



* You will be put through certifications and upskilled to the best you can be
* There's a mentorship program as well (10 days of training a year, which can be anything)
* They are looking to hire 25-35 candidates this year and to keep growing
* Flat structure within the business
* Opportunity to move internally, as they hire from within before going to external agencies
* Working for a growing company averaging around 30% growth every year



Why this role?

Work as part of an Agile software delivery team; typically delivering within an Agile Scrum framework. You will work on the full range of software engineering principles; covering requirements gathering and analysis, solutions design, software coding and development, testing, implementation and operational support. You will contribute to the software engineering communities by providing your ideas for innovation and process improvement and coaching and mentoring other team members.


Why You?

* Good experience in the following technologies is recommended:
* ETL toolset (Talend, Pentaho, SAS DI, Informatica etc)
* Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc)
* Data modelling (Data Warehouse, Marts)
* Job Scheduling toolset (Job Scheduler, TWS, etc)
* Programming and scripting languages (PL/SQL, SQL, Unix, Python, Hive, HiveQL, HDFS, Impala, etc)



Good to have experience in the following technologies:
* Data virtualisation tools (Denodo)
* Reporting (Pentaho BA, Power BI, Business Objects)
* Data Analytics toolset (SAS Viya)
* Cloud (AWS, Azure, GCP)
* ALM Tooling (Jira, Confluence, Bitbucket)
* CI/CD toolsets (Gitlab, Jenkins, Ansible)
* Test Automation (TOSCA)


Experience:
You should have experience as a software engineer delivering within large scale data analytics solutions and the ability to operate at all stages of the software engineering lifecycle, as well as some experience in the following:
* Awareness of devops culture and modern engineering practices
* Experience of Agile Scrum based delivery
* Proactive in nature, personal drive, enthusiasm, willingness to learn
* Excellent communications skills including stakeholder management




AWS Data Engineer

England, London

  • £50,000 to £65,000 GBP
  • Engineer Role
  • Skills: AWS Data Engineer, Data Engineer, Data Engineering, ETL, Datawarehouse, Data Modelling, Redshift
  • Seniority: Mid-level

Job description

Data Engineer - Up to £70,000

Fully Remote for UK based candidates

As a Data Engineer you'll participate in customer projects, where day-to-day hands-on work can range from implementing serverless/ETL infrastructure to stepping back and giving workshops and design recommendations to a technical team.

Furthermore, you will become a trusted advisor for high-profile enterprises and jointly manage the customer relationship and roadmap working in partnership with AWS.

Ideally, you're bringing the following skills with you:

* Expert knowledge of public cloud platforms, especially Amazon Web Services (AWS).
* Experience working as part of a cross-functional team on an AWS platform that holds large amounts of data.
* Designing and implementing data pipelines that feed user behavioural data back into production systems.
* Strong recent experience with AWS Redshift, BigQuery, QuickSight, Lambda and data pipelines.
* Strong recent experience in ETL design, implementation and maintenance.
* Experience of working on data migrations.
* All in all, you should have a solid engineering/technical background and hands-on experience.
* A minimum of one AWS certification.
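Loading data from S3 into Redshift, as referenced in the skills above, commonly uses a COPY statement; a small helper that composes one is sketched below (the table, bucket and IAM role names are placeholders).

```python
def build_redshift_copy(table, bucket, key, iam_role):
    """Compose a Redshift COPY command for a Parquet object on S3.
    All identifiers are illustrative; in practice the returned string
    would be executed against the cluster via a database driver."""
    return (
        f"COPY {table} "
        f"FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS PARQUET;"
    )
```

A Lambda-triggered pipeline would typically call a helper like this each time a new object lands in the bucket.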

Why is this your next career move?

* Leading-edge projects - they're here to present their customers with the latest technologies and to push the IT industry forward!
* Highly skilled co-workers in a friendly and supportive working culture, enjoy working and having fun together and sharing knowledge.
* The most advanced technologies: they are the overly excited techies who can't wait to read about the newest launches!
* Besides an interesting assignment, you can enjoy extra-curricular activities
* Exam fees for partner certifications (Azure, AWS, GCP) + certification bonus covered by the employer
* Access to a community of leading cloud professionals, with the possibility to join and create knowledge-sharing sessions
* Interesting projects and a chance to work with a variety of high-profile customers from several industries
* Possibility to gather valued certificates and participate in world-class courses and training, you can choose from various courses to ensure continuous learning
* Lots of opportunities to develop your cloud expertise while working together with the leading Global Cloud Professionals

Data Engineer - London (Hybrid) - up to £95,000 + bonus

England, London

  • £60,000 to £95,000 GBP
  • Engineer Role
  • Skills: DataLakes, PySpark, SQL, Azure, Python, AWS, Databricks, Agile
  • Seniority: Senior

Job description

Senior Data Engineer, Hybrid, East London - DataLakes, PySpark, SQL, Azure, Python, AWS, Databricks, Agile

Role Overview

We are looking for an experienced data engineer to take responsibility for the design, development, and maintenance of applications. You will work alongside other engineers and developers working on different layers of the infrastructure, so a commitment to collaborative problem-solving, sophisticated design, and the creation of quality products is essential.

Role & Responsibilities

* Collaborate with Big Data Solution Architects to design, prototype, implement, and optimize data ingestion pipelines so that data is shared effectively across various business systems.
* Build ETL/ELT and ingestion pipelines, and design optimal data storage and analytics solutions using cloud and on-premises technologies.
* Ensure the design, code and procedural aspects of the solution are production ready, in terms of operational, security and compliance standards.
* Participate in day-to-day project and product delivery status meetings, and provide technical support for faster resolution of issues.

Skills and Experience

* Demonstrable design & development experience, including experience with big data technologies like Spark/Flink and Kafka
* Proficient in Python, PySpark, or Java/Scala.
* Hands-on experience with some of the following technologies:

  * Azure/AWS data lake projects
  * SQL
  * ETL/ELT
  * Databricks
  * Spring/Guice or any other DI framework
  * RESTful web services

* Proficient in querying and manipulating data from various DB (relational and big data).
* Experience of writing effective and maintainable unit and integration tests for ingestion pipelines.
* Experience of using static analysis and code quality tools and building CI/CD pipelines.
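Ingestion pipelines like those described are typically made idempotent with an upsert (merge) step, so re-runs don't duplicate records; a stdlib sketch of the merge logic, with a hypothetical key field:

```python
def upsert(target, incoming, key="id"):
    """Merge incoming records into target; the latest record wins per
    key, so re-running the same batch leaves the table unchanged."""
    merged = {row[key]: row for row in target}
    for row in incoming:
        merged[row[key]] = row
    return sorted(merged.values(), key=lambda row: row[key])
```

In a Databricks stack this role would usually be played by a Delta Lake MERGE INTO statement rather than hand-rolled Python.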

New

Data Engineer - London - Up to £70,000

England, London

  • £50,000 to £70,000 GBP
  • Engineer Role
  • Skills: AWS, Python, S3, Redshift, Kafka, Hadoop, Spark
  • Seniority: Senior

Job description

Data Engineer

Introduction

This client is a bank-owned consortium distributing data from 31 banks to 80+ buy side clients managing $55 trillion in total assets. They are a tenacious organisation, passionate about market innovation for the benefit of their dealers and clients. They welcome anyone with an entrepreneurial spirit.

The Data Engineer will be their first dedicated data engineer, focused on building data pipelines and architecture within AWS to support their growing data franchise. You will work closely with the Product Team, Sales, Management and, most importantly, their clients and dealers to deliver a stable and efficient data platform, and will report directly to the head of data products.

The primary delivery will be to maintain and enhance the data platform as a foundation for real-time data products, visualizations, and APIs.

Primary accountabilities: AWS and data pipeline expert

* Hands-on data engineering: cleaning and organising data in Python, S3, Redshift, etc.
* AWS solution guru: experience with a wide array of AWS tools, in order to recommend the right tool for the job
* Partnership with product management, sales and support to establish requirements and produce cost-effective, scalable architectures that meet business requirements

Other Responsibilities:

* Partner with the Technology organization to develop and enhance the platform
* Collaborate with development, QA, application support and infrastructure managers to improve the operational efficiency of the organization
* Self-motivated and always looking to improve quality
* Keep abreast of industry trends and ensure the company has the right technology to be market leading

Required Qualifications:

* AWS or Azure Certified Solutions Architect: 4 years' experience, specifically Redshift, Lambda, CloudFormation
* Experience with Kafka or another comparable streaming platform
* Experience with Spark, Hadoop, or another big data/distributed computing paradigm
* 3+ years' experience writing production-quality code in Python in a similar role in financial services
* Bachelor's degree in Computer Science or similar, or equivalent experience
* Formal requirements analysis and architecture modelling methodologies and tools
* Design and architecture of systems and software components
* Agile development methodologies and DevOps, CI/CD pipelines and associated tooling, Jira
* Interest in financial markets desired

Other Requirements

* A great communicator: comfortable with loose requirements, working in a small team
* Ability to establish rapport and work collaboratively: a solid team player
* Ability to work independently on projects
* Highly motivated and detail-oriented

Location

* This company operates a flexible working policy, typically 2-3 days per week in the London, Cannon Street office. They are open to profiles from other locations, for the right candidate.

New

Senior Consultant

England, London

  • £30,000 to £60,000 GBP
  • Engineer Role
  • Skills: node.js, node js, typescript, aws, python, backend development, linux, IoT
  • Seniority: Senior

Job description

Senior Consultant, £60,000, Hybrid (Cambridge), Advanced AWS Consulting Partner

Are you looking for a new, challenging role? A chance to work in a friendly, technically brilliant and award-winning environment? Then look no further!

This would be a great opportunity for a Senior Software Engineer to join a cloud native professional service company and AWS Advanced Tier Services Partner.

The company was founded in 2009 and has since grown significantly and is well established. They work across multiple sectors, from healthcare and the public sector to education.

The successful candidate will get to work on multiple projects with different clients as well as having the opportunity to use, develop and grow your experience in multiple programming languages and technology areas.

Skills and Qualifications:

* Solid AWS experience
* Solid Node.js experience
* Typescript
* Back-end development
* Knowledge of AWS IoT Services

Benefits:

* Clear career progression
* Pension
* Funded AWS Certifications
* Remote working
* Learning and development opportunities

This is a remote first role and you would be required to travel to their office in Cambridge once a month. If you are interested in this role, I'd love to hear from you. Please send me an email a.nkrumah@jeffersonfrank.com or contact me on 0203 808 7311

New

Operations/ Site Reliability Engineer

England, London

  • £60,000 to £80,000 GBP
  • Engineer Role
  • Skills: AWS, Docker, Terraform, Grafana
  • Seniority: Mid-level

Job description

We are looking for an Operations Engineer to join our talented, dynamic, and rapidly growing global team. The position is based out of our London office situated in the heart of East London's tech community in Shoreditch. We have a flexible hybrid working policy but will expect this person to be in office when required, potentially once a week.



Company Description



OpenAsset is the only Digital Asset Management solution built for the Architecture, Construction and Engineering industries. We have over 700 clients and 15 years' experience of delivering value. Our vision is to inspire people through visualization of the built world.



We are a diverse group of hard-working and entrepreneurial people dedicated to solving complex challenges, working hard on meaningful projects, and celebrating our successes. We are looking for extraordinary people to join our industry-leading and incredibly talented team! Our inspirational and fun working environment, innovation-driven, fast-growing company, and ambitious projects are just a few reasons why you will love working here.



As a company we are passionate about ensuring that diversity and inclusion are championed, and that everybody has a seat at the table. We promote a culture where everyone feels valued, and we have adopted policies to ensure we hire from a diverse pool of candidates.



Operations Engineer Description



The Operations team has a diverse set of responsibilities that require interaction and coordination across all departments throughout the business. The main areas of responsibility are Product and IT Operations, along with an overarching goal of promoting, implementing and maintaining security best practices.



Operations Engineer Responsibilities

* Work across business units to identify and implement processes and automations that will drive efficiency.
* Help manage our ever evolving SaaS infrastructure stack as we grow as a company.
* Work closely with our engineering teams to deploy new product infrastructure resources, client environments and releases.
* Ensure core product performance and reliability through monitoring and alerting.
* Analyze application issues and establish resolutions or recommendations for the development team.
* Create and maintain detailed operational documentation.
* Ensure security and compliance are maintained at all times.
* Ensure all hardware and software requirements are met out of our London office.
* Build upon and maintain our existing MDM tenant; we're primarily a Jamf Now + Mac shop!



Operations Engineer Skills and Experience

* 3+ years managing and implementing SaaS systems and applications
* Experience with cloud provider infrastructure and its management (we use AWS).
* Application and Technology tool management expertise: management tools, techniques, monitoring and integration.
* Experience with automation tooling to drive continuous improvement: Github Actions, Circle CI, Zapier, Workato.
* Experience creating scripts with Python, Bash, and other scripting languages.
* Some experience with infrastructure as code (we use Terraform).
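The monitoring and alerting responsibility described above often boils down to evaluating a metric against a threshold; a minimal Python sketch, where the p95 computation and threshold are deliberate simplifications:

```python
def latency_alert(samples_ms, threshold_ms=500):
    """Return an alert string if the (approximate) p95 latency exceeds
    the threshold, else None. In production this check would feed an
    alerting tool such as Grafana rather than return a string."""
    ordered = sorted(samples_ms)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    if p95 > threshold_ms:
        return f"ALERT: p95 latency {p95}ms exceeds {threshold_ms}ms"
    return None
```

A script like this would typically run on a schedule (e.g. via a GitHub Actions cron job) against metrics pulled from the monitoring backend.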



Benefits

* Competitive salary
* 25 paid vacation days
* 8 bank holidays
* 5 paid sick days
* SSP
* Work from home flexibility
* Paid parental leave
* Pension program
* Bike storage/shower facilities in building
* Career growth and development opportunities



This position is not eligible for visa sponsorship.



Axomic is an Equal Opportunity Employer. We base our employment decisions entirely on business needs, job requirements, and qualifications-we do not discriminate based on race, gender, religion, health, parental status, personal beliefs, veteran status, age, or any other status. We have zero tolerance for any kind of discrimination, and we are looking for candidates who share those values. Applications from women and members of underrepresented minority groups are welcomed.

Data Engineer

England, London, City of London

  • £50,000 to £60,000 GBP
  • Engineer Role
  • Skills: Snowflake, Python, Airflow, SQL, AWS
  • Seniority: Mid-level

Job description

Snowflake Engineer

Do you like working with the latest technology and want to enhance your technical abilities? We have an exciting opportunity for a highly skilled Data Engineer with significant experience of Snowflake.

As well as being an expert in the Snowflake cloud platform, you'll have a strong background in Data Ingestion and Integration, designing and implementing ETL pipelines on various technologies, Data Modelling and a rounded understanding of data warehousing.

They believe strongly in experimentation leading to industrialisation, and are searching for passionate, energetic data engineers focused on using their skills to drive out real business value for customers.

A bit about the job:

The new project is a greenfield Personal Lines insurer headquartered in Hoxton (London), set up with the ambition to be the best in the UK market. It will combine the pace, focus, and test-and-learn mentality of a start-up with the expertise and financial backing of the company.

Data is the lifeblood of any modern organisation, and they are no different. Their Data Engineering team sits within Aviva Quantum, their global Data Science practice (covering areas including Machine Learning, Analytics, Data Engineering, AI and many more).

You will form a vital part of the business, contributing to first-class end-to-end solutions. You will play an active role in defining practices, standards and ways of working, and apply them in your role, working across organisational and team boundaries to bring the best to customers.

Ideal skills and experience:

* Experience of delivering end-to-end solutions with different database technologies, focusing on Snowflake but also Dynamo, Oracle, SQL Server and Postgres.
* Experience of managing data using the Data Vault architecture and managing it through DBT.
* Strong understanding of data manipulation/wrangling techniques in SQL, along with at least one of Python, Scala, Snowpark or PySpark.
* Experience in designing structures to support reporting solutions optimised for tools like Qlik, Tableau etc. Good understanding of modern code development practices, including DevOps/DataOps as well as Agile.
* Strong interpersonal skills, with the ability to work with customers to establish requirements and then design and deliver the solution, taking the customer on the end-to-end journey with you.
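Data Vault modelling, mentioned above, conventionally keys hubs on a deterministic hash of normalised business keys; a stdlib sketch (the normalisation rules shown are one common convention, not the only one):

```python
import hashlib

def hub_hash_key(*business_keys):
    """Build a Data Vault-style hub hash key: trim and upper-case each
    business key, join with a delimiter, and MD5-hash the result so the
    same entity always yields the same surrogate key."""
    normalised = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()
```

In a DBT project this logic would usually live in a reusable macro applied consistently across hub and satellite models.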

What you'll get for this role:

* Salary up to £55,000 London and up to £46,000 National (depending on location, skills, experience, and qualifications)
* Generous pension (at starting level they contribute 8% when you contribute 2%)
* Part of the Sales Bonus Scheme
* Family friendly parental and carer's leave
* 29 days holiday per year plus bank holidays and the option to buy/sell up to 5 additional days
* Up to 40% discount for products
* Brilliant flexible benefits
* Matching Share Plan and Save As You Earn scheme
* 21 volunteering hours per year

For everyone:

They want applications from people with diverse backgrounds and experiences.

Excited but not sure you tick every box? Research tells us that women, particularly, feel this way. So, regardless of gender, why not apply? And if you're in a job share, just apply as a pair. They flex locations, hours and working patterns to suit customers, the business, and you.

Most of our people are smart working - spending around 60% of their time in our offices and 40% at home.




Data Engineer

England, London, City of London

  • £60,000 to £75,000 GBP
  • Engineer Role
  • Skills: ETL, AWS, Snowflake, Redshift, Python, SQL
  • Seniority: Mid-level

Job description

Data Engineer



This company is a leader in Artificial Intelligence (AI) and Machine Learning (ML) and a specialist in advanced analytics, BI/MI and Data Science. They are an exciting and rapidly growing consultancy, based at Adastral Park in Suffolk, with offices in London and Rotterdam. As an Amazon Web Services (AWS) Premier Partner, and having been awarded AWS ML Partner of the Year 2020, they are extremely well positioned at the forefront of this fast-paced, cutting-edge technology space.



Following an extraordinary four years of growth and success, they were proud to be acquired in December 2020 and are now part of the global Cognizant family. With ambitious expansion plans and increased demand from customers across the world, they are looking to add Data Engineers to their dynamic team.



Role:

The Data Engineer will propose and implement solutions using a range of AWS infrastructure, including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue and Matillion. They will liaise with clients to define requirements, refine solutions and ultimately hand over to clients' own technical teams.

The ideal candidate will have exposure to CI/CD processes, or at least be keen to learn - our clients love infrastructure as code; and we like our engineers to own the deployment of their work.

Candidates should delight in creating something from nothing on greenfield projects. We're looking for people who can't let go of interesting problems.

We need people who can work independently; but we're a close-knit, supportive team - we like to learn new things and share our ideas, so that clients get the best return on their investments.



Essential experience:

* Experience in analysing and cleansing data using a variety of tools and techniques.
* Familiarity with AWS data lake components.
* Hands-on experience with Redshift, Glue and S3.
* Extensive experience of ETL and of patterns for cloud data warehouse solutions (e.g. ELT).
* Hands-on experience with Matillion.
* Familiarity with a variety of databases, including structured RDBMS.
* Experience working with a variety of data formats (JSON, XML, CSV, Parquet, etc.).
* Experience building and maintaining data dictionaries / metadata.
* Experience of Linux and cloud environments.
* Data visualisation technologies (e.g. Amazon QuickSight, Tableau, Looker, QlikSense).
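Moving between the data formats listed above often comes down to small conversion utilities; a stdlib sketch converting CSV text to JSON Lines:

```python
import csv
import io
import json

def csv_to_json_lines(csv_text):
    """Parse CSV text (first row as header) and emit one JSON object
    per data row, i.e. the JSON Lines format."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [json.dumps(row, sort_keys=True) for row in reader]
```

In a Glue or Matillion pipeline this kind of reshaping would normally be handled by the tool itself; the sketch just shows the underlying transformation.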

Desirable experience:

* Familiarity with large data techniques (Hadoop, MapReduce, Spark etc.)
* Familiarity with providing data via a micro service API.
* Experience with other public cloud data lakes.
* AWS Certifications (particularly Solution Architect Associate and Big Data Speciality).
* Machine Learning.


