Current search

8 search results

For permanent (CDI) contracts in London

AI Engineer

England, London

  • £80,000 to £105,000 GBP
  • Engineer Role
  • Skills: Python, AI, AWS
  • Seniority: Senior

Job description

Unlock your potential as an AI Engineer

Location: London
Salary: £105,000

Are you an experienced engineer with hands‑on expertise in Python, LLMs, and modern cloud technologies? We're looking for a Lead Engineer to join a high-impact Generative AI feature team, building applications used by thousands of colleagues daily. This is a unique opportunity to drive AI innovation at scale within a highly regulated environment.



About the Role

As a Lead Engineer, you will design, develop, and enhance software solutions using modern engineering practices. You'll act as an SME within the Generative AI domain, shaping technical direction, guiding others, and influencing both strategy and implementation.

You will lead a small team, provide mentorship, conduct code reviews, and drive a culture of technical excellence. This role offers a blend of hands-on engineering, leadership, and strategic input - perfect for someone ready to step into a high-ownership position.



Key Responsibilities:

* Build high‑quality, scalable, maintainable Python-based applications.
* Develop and deploy AI-driven applications, including those using LLMs.
* Collaborate closely with product, design, and engineering teams.
* Contribute to solution design, architecture, and secure coding practices.
* Drive testing best practices and ensure repeatable, reliable deployments.

* Mentor and guide junior engineers; support ongoing capability development.
* Lead technical discussions, code reviews, and cross-functional collaboration.
* Influence decision‑making and contribute to policy/standards.
* Support risk management, governance, and control requirements.

* Contribute to the organisation's Generative AI strategy as a subject matter expert.
* Analyse complex, multi-source data to inform design and decision‑making.
* Communicate complex or sensitive information clearly to senior stakeholders.



Required Experience

To succeed in this role, you should have strong hands-on capability in:

* Python
* Working with Large Language Models (LLMs)
* Cloud technologies, ideally AWS (Bedrock, Lambda, S3, Lex, CloudWatch)
* Prompt optimisation and evaluation methodologies
* Strong communication skills, especially in cross-functional environments
* Mentoring, coaching, or guiding other engineers
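The prompt optimisation and evaluation methodologies listed above are usually exercised through a simple scoring harness that compares prompt variants against the same task. A minimal sketch, assuming a keyword-coverage metric; the prompt names, responses, and keywords are illustrative, not from this listing:

```python
def keyword_coverage(response: str, expected_keywords: list[str]) -> float:
    """Fraction of expected keywords present in the model response."""
    text = response.lower()
    hits = sum(1 for kw in expected_keywords if kw.lower() in text)
    return hits / len(expected_keywords) if expected_keywords else 0.0

def rank_prompts(responses: dict[str, str], expected_keywords: list[str]) -> list[tuple[str, float]]:
    """Score each prompt variant's response and rank best-first."""
    scored = [(name, keyword_coverage(resp, expected_keywords))
              for name, resp in responses.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical responses from two prompt variants on the same task.
responses = {
    "v1_terse": "Use Lambda to invoke the model.",
    "v2_detailed": "Use Lambda to invoke the Bedrock model and log to CloudWatch.",
}
ranking = rank_prompts(responses, ["bedrock", "lambda", "cloudwatch"])
```

In practice the scoring function would be richer (exact-match sets, LLM-as-judge, human review), but the harness shape stays the same: fixed tasks, varied prompts, comparable scores.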

Databricks Architect

England, London

  • £100,000 to £120,000 GBP
  • Engineer Role
  • Skills: Azure Databricks
  • Seniority: Senior

Job description

Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making.
Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse Architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and understanding of Databricks implications across each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.
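Several of the skills above (Delta Lake, scalable ETL/ELT pipelines) centre on idempotent upserts. A pure-Python sketch of the MERGE semantics that Delta Lake's `MERGE INTO` provides; the key and records are illustrative, and a real implementation would run on Databricks with PySpark:

```python
def merge_upsert(target: dict, updates: list[dict], key: str) -> dict:
    """Apply MERGE semantics: update rows whose key exists, insert the rest.

    `target` maps key -> row. Re-running the same batch yields the same
    result, which is what makes a pipeline safely re-runnable.
    """
    merged = dict(target)
    for row in updates:
        merged[row[key]] = row  # matched -> update, not matched -> insert
    return merged

# Illustrative policy rows keyed by policy_id.
target = {"P1": {"policy_id": "P1", "premium": 100}}
batch = [
    {"policy_id": "P1", "premium": 120},  # update existing
    {"policy_id": "P2", "premium": 90},   # insert new
]
result = merge_upsert(target, batch, "policy_id")
```

The idempotence property is the point: applying the same batch twice leaves the target unchanged, so failed pipeline runs can simply be retried.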

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested.


Lead Azure Databricks Engineer

England, London

  • £70,000 to £80,000 GBP
  • Engineer Role
  • Skills: Azure Databricks, insurance, reinsurance, lloyds of london
  • Seniority: Senior

Job description

Azure Databricks Engineer
Lloyd's of London Market

Hybrid 2 days in London Office

We are seeking an outstanding Lead Azure Databricks Engineer to own the design, build, optimisation and governance of our enterprise cloud data platforms. This is a senior, fully hands‑on engineering position requiring deep technical expertise, strong delivery capability and experience operating within the Lloyd's of London market and its regulatory demands.

You will shape the future of our cloud data strategy, enabling advanced analytics and building secure, scalable and highly performant data solutions across the organisation.

Key Responsibilities

* Lead the development, optimisation and governance of large‑scale data platforms using Azure Data Factory, Data Lake, Key Vault, Azure Functions, Databricks, Delta Lake, PySpark and Unity Catalog
* Partner closely with underwriting, actuarial, delegated authority, bordereaux, exposure management, reinsurance, finance, Solvency II and risk teams to deliver solutions aligned to Lloyd's of London regulatory and operational requirements
* Translate complex business needs into scalable cloud data architectures in partnership with architects, SMEs and product owners
* Resolve complex technical challenges rapidly and decisively, consistently delivering high‑quality outcomes
* Champion engineering best practices including design patterns, data lifecycle management, CI/CD automation and cloud‑native methodologies
* Provide hands‑on technical leadership, mentorship and uplift engineering capability across the team
* Drive continuous improvement across performance, cost optimisation, reliability and resilience
* Oversee end‑to‑end data engineering delivery ensuring quality, observability, robustness and compliance

Required Experience and Skills

* Expert‑level knowledge of Azure Data Services and Databricks within enterprise environments
* Deep understanding of Delta Lake, Medallion architecture, distributed compute, Lakehouse patterns, data modelling and performance optimisation
* Strong Python, PySpark and SQL experience, with hands‑on implementation of CI/CD for data solutions using Azure DevOps
* Strong understanding of data governance, lineage, access management, FinOps and secure cloud engineering
* Excellent communication skills with the ability to confidently engage senior stakeholders and simplify complex technical concepts
* Proven ability to take ownership, deliver at pace and operate effectively in high‑pressure, high‑expectation environments
* Experience working within or delivering solutions for the Lloyd's of London market or similarly regulated environments
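The Medallion architecture mentioned above layers raw (bronze), cleansed (silver), and business-ready (gold) data. A minimal pure-Python sketch of the layering idea; the field names are illustrative, and on the platform itself these would be Delta tables transformed with PySpark:

```python
def to_silver(bronze_rows: list[dict]) -> list[dict]:
    """Bronze -> silver: drop malformed records and normalise types."""
    silver = []
    for row in bronze_rows:
        if row.get("claim_id") and row.get("amount") is not None:
            silver.append({"claim_id": str(row["claim_id"]),
                           "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows: list[dict]) -> dict:
    """Silver -> gold: aggregate into a reporting-ready figure."""
    return {"total_claims": len(silver_rows),
            "total_amount": sum(r["amount"] for r in silver_rows)}

bronze = [
    {"claim_id": "C1", "amount": "1500.0"},
    {"claim_id": None, "amount": "10"},   # malformed: dropped at silver
    {"claim_id": "C2", "amount": 250},
]
gold = to_gold(to_silver(bronze))
```

Each layer only ever reads from the one below it, which keeps lineage simple and lets governance controls attach per layer.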

Please send me a copy of your CV if you're interested.

New

Test Engineer

England, London

  • £60,000 to £65,000 GBP
  • Engineer Role
  • Skills: ADF, Databricks, SQL
  • Seniority: Mid-level

Job description

Test Engineer

Location: London (Hybrid)
Type: Permanent

About:

An established global organisation is undergoing a significant digital modernisation programme and is looking to appoint a Test Engineer to strengthen quality assurance across its data and application landscape.

This role sits within a central IT & Digital function and plays a critical role in improving delivery confidence, system reliability and overall product quality. You'll work closely with developers and business analysts to embed structured, proactive testing throughout the delivery lifecycle.

The Role

As a Test Engineer, you will act as a quality gate across applications, integrations and data platforms, ensuring solutions are accurate, stable and fit for real‑world use.

Key responsibilities include:

* Partnering with development leads and business analysts on test planning, scenario creation and early validation of requirements
* Executing end‑to‑end functional and integration testing across applications, data platforms and APIs
* Validating data accuracy, business logic, workflows and cross‑system behaviour
* Running regression and business‑scenario testing to ensure enhancements are stable and predictable
* Leading impact assessments to identify dependencies, risks and downstream effects of change
* Conducting performance and load testing across applications and high‑volume data processing pipelines



Requirements:

* Strong SQL skills for data validation and defect investigation
* Experience testing applications built on Power Platform (Power Apps, Dataverse), SharePoint and APIs
* Experience validating data pipelines and analytics platforms
* Ability to design test cases, structure test cycles and support UAT activities
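The SQL data-validation work above typically boils down to reconciliation checks between source and target. A minimal sketch using an in-memory SQLite database; the table and column names are illustrative:

```python
import sqlite3

def reconcile(conn: sqlite3.Connection) -> dict:
    """Compare row counts and totals between a source and target table."""
    cur = conn.cursor()
    src_count, src_total = cur.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM source_orders").fetchone()
    tgt_count, tgt_total = cur.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM target_orders").fetchone()
    return {"rows_match": src_count == tgt_count,
            "totals_match": src_total == tgt_total}

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (id INTEGER, amount REAL);
    CREATE TABLE target_orders (id INTEGER, amount REAL);
    INSERT INTO source_orders VALUES (1, 10.0), (2, 20.0);
    INSERT INTO target_orders VALUES (1, 10.0);  -- one row lost in transit
""")
report = reconcile(conn)
```

The same pattern scales up: row counts first, then checksums or totals per partition, then row-level diffs only where an aggregate disagrees.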

Benefits:

* Join a large‑scale digital transformation with genuine complexity and impact
* Work in a collaborative, engineering‑led environment that values quality and structure
* Competitive salary and benefits package
* Flexible working model to support work-life balance
* Strong investment in learning, development and long‑term career progression
* And more.

New

Python Web Scraping Engineer - Fully Remote - £70k

England, London, City of London

  • £65,000 to £70,000 GBP
  • Engineer Role
  • Skills: Python, web scraping, data, website, engineer, programming, technology, cloud, aws, azure, google, data science, ai
  • Seniority: Senior

Job description

Python Web Scraping Engineer - Fully Remote - £70k

Please note - to be eligible for this role you must be UK-based with the unrestricted right to work in the UK. This organisation does not offer sponsorship.

My client is hiring a Senior Python Scraping Engineer to design, build, and operate high‑volume, highly resilient web scraping systems, with a specific and sustained focus on scraping Google at scale.

This is a specialist role. It is not suited to generalist data engineers who have only worked on light or opportunistic scraping. You will be working in hostile, rapidly changing environments where naïve techniques fail and deep expertise in bot detection and evasion is essential.

The role sits at the intersection of data engineering, reverse engineering, and large‑scale systems reliability, and plays a critical role in delivering accurate, timely, and trusted data.

My client is a leader in adopting AI‑assisted and agentic coding practices, and this role is ideal for engineers who actively use AI tools to improve productivity, reasoning, and system design.

Key Responsibilities:

* Design and operate large‑scale scraping systems handling 10+ million requests per day, primarily targeting Google and Google‑like platforms
* Build robust scrapers for dynamic, JavaScript‑heavy environments using browser automation and hybrid approaches
* Continuously adapt to changes in markup, request flows, ranking logic, and anti‑automation mechanisms
* Engineer extraction pipelines with a strong emphasis on correctness, consistency, and observability
* Implement and maintain advanced anti‑bot evasion strategies, including:

* Proxy and request routing strategies
* Browser and headless fingerprinting
* CAPTCHA handling and mitigation

* Monitor system health, detect anomalies early, and debug complex production issues
* Optimise performance, cost, and latency across large‑scale scraping infrastructure
* Collaborate closely with data engineers, data scientists, and product teams to ensure scraped data is reliable and usable
* Produce clear documentation and operational runbooks to support long‑term maintainability
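The asynchronous, concurrency-bounded architecture these responsibilities describe has a common core: a semaphore to cap in-flight requests plus jittered exponential backoff on failure. A minimal sketch with a stubbed fetcher standing in for a real HTTP or browser client; the retry policy and URL set are illustrative:

```python
import asyncio
import random

async def fetch_with_retry(fetch, url: str, sem: asyncio.Semaphore,
                           retries: int = 3) -> str:
    """Fetch one URL under a concurrency limit, backing off on failure."""
    async with sem:
        for attempt in range(retries):
            try:
                return await fetch(url)
            except ConnectionError:
                # Jittered exponential backoff before the next attempt.
                await asyncio.sleep((2 ** attempt) * 0.01 * random.random())
        return ""  # give up after exhausting retries

async def crawl(fetch, urls: list[str], max_concurrency: int = 10) -> list[str]:
    sem = asyncio.Semaphore(max_concurrency)
    return await asyncio.gather(*(fetch_with_retry(fetch, u, sem) for u in urls))

# Stub fetcher standing in for aiohttp/Playwright in a real system.
async def stub_fetch(url: str) -> str:
    return f"<html>{url}</html>"

pages = asyncio.run(crawl(stub_fetch, [f"https://example.com/{i}" for i in range(5)]))
```

At the scale this role describes, the fetcher behind this loop also rotates proxies and fingerprints, but the scheduling skeleton stays the same.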

Required Technical Skills:

* Expert‑level web scraping skills using Python
* Direct, hands‑on experience scraping Google at scale - this is essential
* Deep understanding of:

* Anti‑bot and bot‑detection systems
* Browser and network fingerprinting
* CAPTCHA systems and mitigation techniques
* Scaling scraping infrastructure reliably

* Strong knowledge of HTTP, TLS, cookies, headers, redirects, and browser networking behaviour
* Experience with tools such as Playwright, Selenium, Puppeteer, or equivalent frameworks
* Comfortable designing asynchronous and concurrent scraping architectures
* Proven experience running scraping systems at scale in cloud environments
* Excellent debugging skills and the ability to reason about complex failure modes
* Strong communication skills, with the ability to clearly explain complex technical behaviour

To apply for this role please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

Data Modeller

England, London

  • £85,000 to £85,001 GBP
  • Engineer Role
  • Skills: Data Model Configuration
  • Seniority: Senior

Job description

Senior Data Modeller

Location: UK (Hybrid/Flexible)
Experience: 15+ years
Employment Type: Full‑time

Role Overview

A leading organisation is seeking an experienced Senior Data Modeller to design, develop, and optimise data models that support underwriting, claims, pricing, and regulatory reporting functions. The role requires strong knowledge of complex insurance data structures, market standards, and modern data architecture principles.

Key Responsibilities

Data Modelling & Architecture

* Design and maintain conceptual, logical, and physical data models for insurance and financial systems.
* Ensure alignment with industry standards and data governance frameworks.
* Develop and maintain data dictionaries, metadata, and modelling documentation.

Data Integration & Transformation

* Collaborate with business analysts, actuaries, and technology teams to integrate data from multiple sources (policy, claims, exposure, financial).
* Optimise data flows to support reporting, analytics, and regulatory requirements.

Stakeholder Engagement

* Work closely with internal teams to gather and interpret data requirements.
* Provide subject matter expertise on data standards, data structures, and modelling principles.

Quality & Compliance

* Implement data quality rules, validation processes, and governance controls.
* Ensure compliance with relevant regulatory frameworks (e.g., data protection, financial/regulatory reporting standards).

Required Skills & Experience

Technical Expertise

* 15+ years' experience in data modelling, ideally within insurance or financial services.
* Strong experience with conceptual, logical, and physical data modelling techniques.
* Proficiency in modelling tools such as ERwin, PowerDesigner, or equivalent.
* Strong SQL capabilities and experience in relational/dimensional modelling and data warehousing.
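The relational/dimensional modelling skills above usually mean a Kimball-style star schema: a central fact table keyed into surrounding dimension tables. A minimal sketch built on an in-memory SQLite database; the insurance-flavoured table names and rows are illustrative:

```python
import sqlite3

# One fact table with foreign keys into dimensions, queried by joining out.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, line TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_claims (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Marine'), (2, 'Property');
    INSERT INTO dim_date VALUES (10, 2024), (11, 2025);
    INSERT INTO fact_claims VALUES (1, 10, 500.0), (1, 11, 250.0), (2, 11, 900.0);
""")

# Slice the facts by dimension attributes: claims total per line per year.
rows = conn.execute("""
    SELECT p.line, d.year, SUM(f.amount)
    FROM fact_claims f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY p.line, d.year
    ORDER BY p.line, d.year
""").fetchall()
```

The design choice is the usual dimensional trade-off: the fact table stays narrow and additive, while descriptive attributes live in small dimensions that are cheap to join and conform across marts.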

Domain Knowledge

* Solid understanding of insurance or financial‑market processes (e.g., placement, claims, bordereaux, regulatory reporting).
* Experience with industry data standards (e.g., ACORD, equivalent frameworks).

Analytical & Communication Skills

* Ability to translate complex business requirements into scalable data models.
* Strong communication and stakeholder‑management skills.

Preferred Qualifications

* Experience with cloud platforms (e.g., Azure, AWS, GCP).
* Knowledge of data governance, MDM, and metadata management.
* Familiarity with BI/analytics tools such as Power BI or Tableau.

Please send me a copy of your CV if you meet the requirements.

Head of Engineering

England, London

  • £80,000 to £150,000 GBP
  • Engineer Role
  • Skills: Python, DevOps, QA, ML, AWS, data science, Spark, Databricks
  • Seniority: Senior

Job description

Head of Engineering - Permanent

Location: Remote
Reports to: CTO

A fast‑growing technology company is seeking a Head of Engineering with deep experience building and scaling complex, data‑driven platforms. This role requires someone who can lead multi‑disciplinary teams, improve engineering execution, and drive technical excellence across high‑scale distributed systems and machine‑learning environments.

Required Experience

* Leadership of engineering organisations of 20 or more across multiple disciplines, including backend, web, data, ML, QA, and DevOps.
* Proven track record designing, scaling, and maintaining distributed systems and high‑volume data pipelines.
* Strong background in cloud‑native architectures and modern data stacks such as Spark or Databricks.
* Experience working closely with Data Science teams, including delivering production‑grade ML models and pipelines.
* Solid understanding of modern frontend engineering practices (e.g., TypeScript, React) to guide cross‑functional technical decisions.
* Robust DevOps knowledge, including CI/CD pipelines, container orchestration, monitoring, reliability engineering, and cloud infrastructure management (AWS preferred).
* Demonstrated ability to balance delivery speed with long‑term technical quality, reliability, and maintainability.
* Experience hiring, mentoring, and developing engineering leaders and high‑performing teams.

Role Overview

You will own engineering execution for a platform that processes large datasets, supports distributed computation, and integrates machine‑learning capabilities at scale. The engineering organisation already uses AI‑assisted development, and you will refine these practices, improve delivery predictability, and strengthen the technical foundation for long‑term growth.

Responsibilities

* Lead the engineering roadmap and technical execution across all product and platform teams.
* Guide architectural decisions in data engineering, ML, and web application development.
* Implement engineering KPIs to improve delivery speed, quality, and reliability.
* Develop a strong engineering culture centred on ownership, collaboration, and high performance.
* Partner with Product, Data Science, and DevOps to deliver high‑quality, scalable solutions.
* Manage engineering budgets, tooling, and cloud infrastructure costs.
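Engineering KPIs of the kind mentioned above are often DORA-style delivery metrics. A minimal sketch computing two of them, deployment frequency and change failure rate, from a deployment log; the records and window are illustrative:

```python
from datetime import date

def delivery_metrics(deployments: list[dict], period_days: int) -> dict:
    """Compute deployment frequency and change failure rate."""
    total = len(deployments)
    failed = sum(1 for d in deployments if d["failed"])
    return {
        "deploys_per_day": total / period_days,
        "change_failure_rate": failed / total if total else 0.0,
    }

# Illustrative deployment log over a 10-day window.
log = [
    {"day": date(2025, 1, 1), "failed": False},
    {"day": date(2025, 1, 3), "failed": True},
    {"day": date(2025, 1, 5), "failed": False},
    {"day": date(2025, 1, 9), "failed": False},
]
metrics = delivery_metrics(log, period_days=10)
```

In a real organisation these numbers would come from the CI/CD system and incident tracker rather than a hand-built list, and would be trended over time rather than read as point values.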

Preferred

* Experience in search, advertising technology, or competitive intelligence.
* Familiarity with ML lifecycle tooling and agentic coding approaches.

Please send me a copy of your CV if you meet all of the above.