Current search

24 search results

Permanent (CDI)

AI Engineer

England, London

  • £80,000 to £105,000 GBP
  • Engineer Role
  • Skills: Python, AI, AWS
  • Seniority: Senior

Job description

Unlock your potential as an AI Engineer

Location: London
Salary: £105,000

Are you an experienced engineer with hands‑on expertise in Python, LLMs, and modern cloud technologies? We're looking for a Lead Engineer to join a high-impact Generative AI feature team, building applications used by thousands of colleagues daily. This is a unique opportunity to drive AI innovation at scale within a highly regulated environment.



About the Role

As a Lead Engineer, you will design, develop, and enhance software solutions using modern engineering practices. You'll act as an SME within the Generative AI domain, shaping technical direction, guiding others, and influencing both strategy and implementation.

You will lead a small team, provide mentorship, conduct code reviews, and drive a culture of technical excellence. This role offers a blend of hands-on engineering, leadership, and strategic input - perfect for someone ready to step into a high-ownership position.



Key Responsibilities:

* Build high‑quality, scalable, maintainable Python-based applications.
* Develop and deploy AI-driven applications, including those using LLMs.
* Collaborate closely with product, design, and engineering teams.
* Contribute to solution design, architecture, and secure coding practices.
* Drive testing best practices and ensure repeatable, reliable deployments.

* Mentor and guide junior engineers; support ongoing capability development.
* Lead technical discussions, code reviews, and cross-functional collaboration.
* Influence decision‑making and contribute to policy/standards.
* Support risk management, governance, and control requirements.

* Contribute to the organisation's Generative AI strategy as a subject matter expert.
* Analyse complex, multi-source data to inform design and decision‑making.
* Communicate complex or sensitive information clearly to senior stakeholders.



Required Experience

To succeed in this role, you should have strong hands-on capability in:

* Python
* Working with Large Language Models (LLMs)
* Cloud technologies, ideally AWS (Bedrock, Lambda, S3, Lex, CloudWatch)
* Prompt optimisation and evaluation methodologies
* Strong communication skills, especially in cross-functional environments
* Mentoring, coaching, or guiding other engineers

Node.js Developer - Oslo

Norway, Fornebu

  • NOK 700,000 to NOK 900,000
  • Engineer Role
  • Skills: AWS, DevOps, Elasticsearch, Node.js, SQL, TypeScript, RabbitMQ
  • Seniority: Senior

Job description

Senior Back End Engineer

We are looking for an experienced backend or full-stack developer who wants to take technical ownership. You will have primary responsibility for a core system that sits at the heart of the organisation's data-driven processes. The goal is to strengthen stability, performance and further development, while you become a key person in a small, long-term in-house team that is building internal expertise and reducing reliance on consultants.

The system you will be responsible for is an advanced matching platform that connects large volumes of user profiles with relevant objects in real time. The platform is used internally every day and handles a very high volume of preferences, searches and data combinations. On the technical side, you will work with complex search structures, scoring logic and real-time matching of new data against thousands of profiles.



Role and responsibilities

* Full technical ownership of the platform (operations, new development, stability, performance, security)
* Maintenance and modernisation of the existing architecture
* Backend and infrastructure responsibility
* Close collaboration with the team on architecture, roadmaps and priorities

Qualifications

* 5+ years of solid backend development experience
* Very strong Node.js and TypeScript skills
* Experience with event-driven architecture (RabbitMQ or similar)
* Good experience with PostgreSQL (design, migration, optimisation)
* Experience with OpenSearch/Elasticsearch
* Confident working with complex systems in production
* Good Norwegian language skills
* Interest in AI/ML
* DevOps (CI/CD, monitoring, automation)



If you would like to work in a smaller team doing data-driven development that makes a real difference for users, and in turn customers, get in touch today for more information.

New

Lead Data Analyst

England, Tyne and Wear, Newcastle upon Tyne

  • £70,000 to £75,000 GBP
  • Engineer Role
  • Skills: SQL, AWS, Databricks
  • Seniority: Mid-level

Job description

Lead Data Analyst / Data Product Lead - Managing Consultant

The Opportunity
You'll lead the delivery of analytical outcomes that enable organisations to realise their strategic vision. Acting as the bridge between business goals, data requirements, and technical implementation, you'll guide multidisciplinary teams and help clients modernise their data platforms, analytical capabilities, and decision‑making processes.
This role is ideal for someone who thrives in complex environments, enjoys solving ambiguous problems, and is passionate about modern cloud, big data, and analytics technologies.

What You'll Do
* Own and lead analytical delivery within broader data platform or transformation programmes.
* Guide teams of analysts, data engineers and analytics engineers to deliver end‑to‑end outcomes, from data workflows to analytical services and reporting assets.
* Define and uphold standards for requirements, documentation, code quality, version control, and release management.
* Partner with stakeholders across business and technology to prioritise work, manage expectations, and drive adoption.
* Run workshops to clarify requirements, map processes, and align teams on analytical definitions and success criteria.
* Shape and maintain analytical services, ensuring clear "definition of done" for outputs and user stories.
* Promote best practices in cloud, big data, analytics engineering, and AI‑accelerated frameworks.
* Contribute to proposals, shaping analytics workstreams, estimating effort, and defining delivery approaches.
* Support the creation of reusable assets such as analytics frameworks, reconciliation packs, and migration playbooks.
* Act as a role model for consulting behaviours: curiosity, clarity, pragmatism, integrity, and client empathy.

About You
You bring a blend of analytical depth, technical understanding, and strong consulting skills. You can see the bigger picture, navigate ambiguity, and lead teams to deliver high‑quality analytical products.
Experience & capabilities include:
* Significant experience leading analytical product delivery in complex, multi‑team environments.
* Proven track record delivering analytical and technical outcomes on modern cloud platforms (e.g., AWS, Azure, Snowflake, Databricks).
* Strong experience with data migration validation, reconciliation, data controls, and go‑live readiness.
* Ability to mentor analysts and collaborate effectively with engineers and architects.
* Strong stakeholder engagement skills across business and technical teams.
* Advanced SQL and Python skills.
* Solid understanding of data modelling (dimensional; Data Vault familiarity a plus).
* Strong BI and analytics experience (dashboarding, semantic modelling, storytelling).
* Familiarity with modern data warehousing, distributed processing, streaming, and DataOps.
* Comfortable leading iterative delivery using agile principles.

Qualifications & Tools
Experience with some of the following is beneficial:
* SQL/Python, Power BI, Tableau, Qlik, Dataiku, Alteryx
* AWS, Azure, GCP, Snowflake, Databricks certifications
* SAFe, Scrum Master or similar agile qualifications
* Modern data warehousing tools (Fabric, Lake Formation, Snowflake, Databricks)
* dbt or equivalent transformation tooling
* Airflow / ADF / Dagster
* Data governance, cataloguing, lineage tools
* Agile toolsets such as JIRA, Confluence, DevOps

Working Environment
* Permanent role with flexible working options.
* Hybrid model: typically 3 days per week in office (Newcastle).
* Some UK and international travel may be required.
* Eligibility for security clearance is essential.

What's in It for You
* Competitive salary with bonus potential.
* Highly collaborative culture with strong values and a people‑first mindset.
* Flexible benefits focused on wellbeing and lifestyle.
* 25 days' holiday, with the option to flex to 30.
* Two CSR volunteering days.
* Award‑winning learning and development, including dedicated training time.
* Personal tech budget for devices and accessories.
* Rapid progression opportunities in a high‑growth environment.

Please send me a copy of your CV if you're interested

New

Databricks Architect

England, London

  • £100,000 to £120,000 GBP
  • Engineer Role
  • Skills: Azure Databricks
  • Seniority: Senior

Job description

Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making.
Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse Architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and understanding of Databricks implications across each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested

New

Databricks Architect

Scotland, Edinburgh

  • £110,000 to £120,000 GBP
  • Engineer Role
  • Skills: Azure Databricks
  • Seniority: Senior

Job description

Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making.
Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse Architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and understanding of Databricks implications across each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested

New

AI Engineer - Agentic AI & Gen AI

Australia, New South Wales, Sydney

  • A$170,000 to A$200,000 AUD
  • Engineer Role
  • Skills: AI Engineer, Generative AI, Agentic AI, AI Agents, Azure, RAG, Retrieval Augmented Generation, LangGraph, AutoGen, CrewAI, Python Developer, Machine Learning Engineer, LLM, Engineer, Artificial Intelligence, Foundry
  • Seniority: Senior

Job description

Sydney / Melbourne | Permanent OR Contract

Tenth Revolution Group is partnering with a specialist AI and data consultancy delivering advanced AI solutions into highly regulated enterprise environments.

Due to continued growth, they are looking to appoint multiple AI Engineers to play a key role in building and deploying next-generation AI solutions, with a particular focus on agentic AI systems and generative AI platforms.

This is a hands-on engineering role within a growing AI team responsible for turning emerging AI capabilities into production-grade systems used in complex, real-world environments.

Open to both permanent and contract arrangements, based in Melbourne or Sydney.



The Opportunity

You will design, prototype and deploy advanced AI systems that automate complex workflows and support intelligent decision-making.

Working closely with engineers, data specialists and business stakeholders, you will help translate ideas into scalable AI products while contributing to internal frameworks and reusable AI components.

This role is ideal for someone who enjoys building practical AI systems rather than purely researching them.



Key Responsibilities

* Design and deliver end-to-end AI solutions solving complex business challenges
* Build and deploy AI agents and automated workflows using modern frameworks
* Prototype and scale production-grade Generative AI systems
* Develop reusable AI components, orchestration frameworks and internal tooling
* Collaborate closely with engineers, data scientists and product stakeholders
* Monitor and improve deployed AI systems for performance and reliability
* Apply responsible AI practices including explainability, governance and traceability
* Ensure solutions meet security and compliance standards expected in enterprise environments



Skills & Experience

* Strong software engineering background with several years building production systems
* Hands-on experience developing AI agents, automation workflows or multi-agent systems
* Experience with agent frameworks such as LangGraph, AutoGen, CrewAI or similar
* Experience designing and deploying Retrieval-Augmented Generation (RAG) solutions
* Ability to rapidly prototype AI-powered applications and iterate towards production systems
* Experience deploying AI solutions across major cloud platforms
* Exposure to knowledge graphs, orchestration patterns or advanced retrieval architectures is advantageous
* Experience supporting or monitoring production AI systems
* Ability to collaborate with both technical and non-technical stakeholders

Experience delivering solutions within regulated or enterprise environments is highly regarded.



Why Apply?

* Join a growing AI engineering team building advanced AI capabilities
* Work on agent-based AI systems and production GenAI solutions
* Strong technical ownership with opportunity to shape AI frameworks and tooling
* Collaborative environment with strong focus on learning and development



Apply now or contact Tenth Revolution Group for a confidential discussion.

Contact:

Neros Gorges
(03) 8592 0507
n.gorges@tenthrevolution.com

#SCR-neros-gorges

Lead Azure Databricks Engineer

England, London

  • £70,000 to £80,000 GBP
  • Engineer Role
  • Skills: Azure Databricks, insurance, reinsurance, Lloyd's of London
  • Seniority: Senior

Job description

Azure Databricks Engineer
Lloyd's of London Market

Hybrid: 2 days per week in the London office

We are seeking an outstanding Lead Azure Databricks Engineer to own the design, build, optimisation and governance of our enterprise cloud data platforms. This is a senior, fully hands‑on engineering position requiring deep technical expertise, strong delivery capability and experience operating within the Lloyd's of London market and its regulatory demands.

You will shape the future of our cloud data strategy, enabling advanced analytics and building secure, scalable and highly performant data solutions across the organisation.

Key Responsibilities

* Lead the development, optimisation and governance of large‑scale data platforms using Azure Data Factory, Data Lake, Key Vault, Azure Functions, Databricks, Delta Lake, PySpark and Unity Catalog
* Partner closely with underwriting, actuarial, delegated authority, bordereaux, exposure management, reinsurance, finance, Solvency II and risk teams to deliver solutions aligned to Lloyd's of London regulatory and operational requirements
* Translate complex business needs into scalable cloud data architectures in partnership with architects, SMEs and product owners
* Resolve complex technical challenges rapidly and decisively, consistently delivering high‑quality outcomes
* Champion engineering best practices including design patterns, data lifecycle management, CI/CD automation and cloud‑native methodologies
* Provide hands‑on technical leadership, mentorship and uplift engineering capability across the team
* Drive continuous improvement across performance, cost optimisation, reliability and resilience
* Oversee end‑to‑end data engineering delivery ensuring quality, observability, robustness and compliance

Required Experience and Skills

* Expert‑level knowledge of Azure Data Services and Databricks within enterprise environments
* Deep understanding of Delta Lake, Medallion architecture, distributed compute, Lakehouse patterns, data modelling and performance optimisation
* Strong Python, PySpark and SQL experience, with hands‑on implementation of CI/CD for data solutions using Azure DevOps
* Strong understanding of data governance, lineage, access management, FinOps and secure cloud engineering
* Excellent communication skills with the ability to confidently engage senior stakeholders and simplify complex technical concepts
* Proven ability to take ownership, deliver at pace and operate effectively in high‑pressure, high‑expectation environments
* Experience working within or delivering solutions for the Lloyd's of London market or similarly regulated environments

Please send me a copy of your CV if you're interested

Python Backend Engineer

Denmark, Copenhagen Municipality, Copenhagen

  • DKK 50,000 to DKK 60,000
  • Engineer Role
  • Skills: Python, Django, AI, AWS, backend, analytics, data, ML, code
  • Seniority: Mid-level

Job description

Copenhagen - Hybrid / On-site
Full-time, Permanent

Position

We're looking for an experienced Python backend engineer to join a finance SaaS company developing data and risk-analysis tools for a mid-market financial institution.

You'll be responsible for building and maintaining backend services that support the core analytics platform. The role focuses on clean architecture, scalable data pipelines, and integrating machine learning models into production systems. You'll work closely with data scientists and the frontend team to deliver reliable and well-structured APIs.

What You'll Do

* Build and maintain backend services in Python
* Develop APIs and data pipelines that support real-time analytics
* Collaborate with data scientists to deploy and monitor machine learning models
* Improve system performance, reliability, and observability
* Work with the frontend team to design efficient API contracts
* Participate in sprint planning, stand-ups, and retrospectives
* Promote best practices in backend architecture, testing, and documentation

Your Profile

* Strong experience with Python and backend development
* Good understanding of microservices, API design, and distributed systems
* Experience with data processing, ML pipelines, or model deployment
* Familiarity with cloud environments and containerisation
* Comfortable collaborating across engineering and data teams
* A structured mindset and an interest in writing maintainable, well-tested code

Why Join Us

* Work on a platform used by financial teams to make better, faster decisions
* Be part of a product-driven company focused on clarity and reliability
* Take ownership of backend architecture and influence technical direction
* Join a small team that values thoughtful engineering and practical problem-solving
* Competitive salary, benefits, and flexible work setup

Questions?

Contact recruiter Maisy Holmes at M.Holmes@tenthrevolution.com or call +45 88 74 11 42.

Head of Engineering

England, London

  • £80,000 to £150,000 GBP
  • Engineer Role
  • Skills: Python, DevOps, QA, ML, AWS, data science, Spark, Databricks
  • Seniority: Senior

Job description

Head of Engineering - Permanent

Location: Remote
Reports to: CTO

A fast‑growing technology company is seeking a Head of Engineering with deep experience building and scaling complex, data‑driven platforms. This role requires someone who can lead multi‑disciplinary teams, improve engineering execution, and drive technical excellence across high‑scale distributed systems and machine‑learning environments.

Required Experience

* Leadership of engineering organisations of 20 or more across multiple disciplines, including backend, web, data, ML, QA, and DevOps.
* Proven track record designing, scaling, and maintaining distributed systems and high‑volume data pipelines.
* Strong background in cloud‑native architectures and modern data stacks such as Spark or Databricks.
* Experience working closely with Data Science teams, including delivering production‑grade ML models and pipelines.
* Solid understanding of modern frontend engineering practices (e.g., TypeScript, React) to guide cross‑functional technical decisions.
* Robust DevOps knowledge, including CI/CD pipelines, container orchestration, monitoring, reliability engineering, and cloud infrastructure management (AWS preferred).
* Demonstrated ability to balance delivery speed with long‑term technical quality, reliability, and maintainability.
* Experience hiring, mentoring, and developing engineering leaders and high‑performing teams.

Role Overview

You will own engineering execution for a platform that processes large datasets, supports distributed computation, and integrates machine‑learning capabilities at scale. The engineering organisation already uses AI‑assisted development, and you will refine these practices, improve delivery predictability, and strengthen the technical foundation for long‑term growth.

Responsibilities

* Lead the engineering roadmap and technical execution across all product and platform teams.
* Guide architectural decisions in data engineering, ML, and web application development.
* Implement engineering KPIs to improve delivery speed, quality, and reliability.
* Develop a strong engineering culture centred on ownership, collaboration, and high performance.
* Partner with Product, Data Science, and DevOps to deliver high‑quality, scalable solutions.
* Manage engineering budgets, tooling, and cloud infrastructure costs.

Preferred

* Experience in search, advertising technology, or competitive intelligence.
* Familiarity with ML lifecycle tooling and agentic coding approaches.

Please send me a copy of your CV if you meet all of the above