Current search

23 search results

Permanent (CDI)

AI Engineer

England, London

  • £80,000 to £105,000 GBP
  • Engineer Role
  • Skills: Python, AI, AWS
  • Seniority: Senior

Job description

Unlock your potential as an AV

AI Engineer

Location: London
Salary: £105,000

Are you an experienced engineer with hands‑on expertise in Python, LLMs, and modern cloud technologies? We're looking for a Lead Engineer to join a high-impact Generative AI feature team, building applications used by thousands of colleagues daily. This is a unique opportunity to drive AI innovation at scale within a highly regulated environment.



About the Role

As a Lead Engineer, you will design, develop, and enhance software solutions using modern engineering practices. You'll act as an SME within the Generative AI domain, shaping technical direction, guiding others, and influencing both strategy and implementation.

You will lead a small team, provide mentorship, conduct code reviews, and drive a culture of technical excellence. This role offers a blend of hands-on engineering, leadership, and strategic input - perfect for someone ready to step into a high-ownership position.



Key Responsibilities:

* Build high‑quality, scalable, maintainable Python-based applications.
* Develop and deploy AI-driven applications, including those using LLMs.
* Collaborate closely with product, design, and engineering teams.
* Contribute to solution design, architecture, and secure coding practices.
* Drive testing best practices and ensure repeatable, reliable deployments.

* Mentor and guide junior engineers; support ongoing capability development.
* Lead technical discussions, code reviews, and cross-functional collaboration.
* Influence decision‑making and contribute to policy/standards.
* Support risk management, governance, and control requirements.

* Contribute to the organisation's Generative AI strategy as a subject matter expert.
* Analyse complex, multi-source data to inform design and decision‑making.
* Communicate complex or sensitive information clearly to senior stakeholders.



Required Experience

To succeed in this role, you should have strong hands-on capability in:

* Python
* Working with Large Language Models (LLMs)
* Cloud technologies, ideally AWS (Bedrock, Lambda, S3, Lex, CloudWatch)
* Prompt optimisation and evaluation methodologies
* Strong communication skills, especially in cross-functional environments
* Mentoring, coaching, or guiding other engineers
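As a rough illustration of the prompt optimisation and evaluation work listed above, candidate prompts can be ranked against a small labelled set. This is a hedged sketch only: the stub model, the function names, and the keyword-match metric are illustrative assumptions, not the employer's actual tooling.

```python
# Toy prompt-evaluation harness: rank candidate prompts by how often a
# (stubbed) model's answers contain expected keywords. The stub model and
# the keyword metric are illustrative stand-ins for a real LLM call and a
# real evaluation methodology.
def stub_model(prompt: str, question: str) -> str:
    # Placeholder for an LLM call: "answers" only when the prompt asks it to.
    return question if "answer" in prompt.lower() else "no comment"

def keyword_score(answer: str, expected_keywords: list[str]) -> float:
    """Fraction of expected keywords present in the answer."""
    if not expected_keywords:
        return 0.0
    hits = sum(1 for kw in expected_keywords if kw.lower() in answer.lower())
    return hits / len(expected_keywords)

def evaluate_prompts(prompts: list[str], eval_set: list[dict]) -> dict:
    """Return the mean keyword score per candidate prompt."""
    results = {}
    for p in prompts:
        scores = [keyword_score(stub_model(p, case["question"]), case["keywords"])
                  for case in eval_set]
        results[p] = sum(scores) / len(scores)
    return results
```

In practice the stub would be replaced by a real model call and the metric by whatever evaluation methodology the team uses.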

Node.js Developer - Oslo

Norway, Fornebu

  • NOK 700,000 to NOK 900,000
  • Engineer Role
  • Skills: AWS, DevOps, Elasticsearch, Node.js, SQL, TypeScript, RabbitMQ
  • Seniority: Senior

Job description

Senior Back End Engineer

We are looking for an experienced backend or full-stack developer who wants to take technical ownership. You will have primary responsibility for a core system at the heart of the company's data-driven processes. The goal is to strengthen stability, performance, and ongoing development, while you become a key person in a small, long-term in-house team that is building internal expertise and reducing reliance on consultants.

The system you will own is an advanced matching platform that connects large volumes of user profiles with relevant objects in real time. The platform is used internally every day and handles a very high volume of preferences, searches, and data combinations. Technically, you will work with complex search structures, scoring logic, and real-time matching of incoming data against thousands of profiles.
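The real-time matching described above can be pictured as scoring each newly ingested object against stored preference profiles. A minimal sketch, assuming illustrative names (`Profile`, `score_object`) and a toy weighted-overlap score rather than the platform's actual logic:

```python
# Hypothetical sketch of real-time profile matching: score a new object
# against stored user preference profiles. All names and the scoring rule
# are illustrative assumptions, not the employer's actual code.
from dataclasses import dataclass, field

@dataclass
class Profile:
    user_id: str
    # preference name -> {"value": ..., "weight": ...}
    preferences: dict = field(default_factory=dict)

def score_object(obj: dict, profile: Profile) -> float:
    """Simple weighted-overlap score between an object and a profile."""
    score = 0.0
    for key, pref in profile.preferences.items():
        if obj.get(key) == pref.get("value"):
            score += pref.get("weight", 1.0)
    return score

def match_new_object(obj: dict, profiles: list[Profile], threshold: float = 1.0):
    """Yield (user_id, score) for every profile the new object matches."""
    for p in profiles:
        s = score_object(obj, p)
        if s >= threshold:
            yield p.user_id, s
```

A production system would push this scoring into the search engine (e.g. OpenSearch function scores) rather than iterating profiles in application code.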



Role and responsibilities

* Full technical ownership of the platform (operations, new development, stability, performance, security)
* Maintenance and modernisation of the existing architecture
* Backend and infrastructure responsibility
* Close collaboration with the team on architecture, roadmaps, and priorities

Qualifications

* 5+ years of solid backend development experience
* Very strong Node.js and TypeScript skills
* Experience with event-driven architecture (RabbitMQ or similar)
* Good experience with PostgreSQL (design, migration, optimisation)
* Experience with OpenSearch/Elasticsearch
* Confident working with complex systems in production
* Good Norwegian language skills
* Interest in AI/ML
* DevOps (CI/CD, monitoring, automation)



If you would like to work in a smaller team doing data-driven development that makes a real difference for users, and in turn for customers, get in touch today for more information.

New

Lead Data Analyst

England, Tyne and Wear, Newcastle upon Tyne

  • £70,000 to £75,000 GBP
  • Engineer Role
  • Skills: SQL, AWS, Databricks
  • Seniority: Mid-level

Job description

Lead Data Analyst / Data Product Lead - Managing Consultant

The Opportunity
You'll lead the delivery of analytical outcomes that enable organisations to realise their strategic vision. Acting as the bridge between business goals, data requirements, and technical implementation, you'll guide multidisciplinary teams and help clients modernise their data platforms, analytical capabilities, and decision‑making processes.
This role is ideal for someone who thrives in complex environments, enjoys solving ambiguous problems, and is passionate about modern cloud, big data, and analytics technologies.

What You'll Do
* Own and lead analytical delivery within broader data platform or transformation programmes.
* Guide teams of analysts, data engineers and analytics engineers to deliver end‑to‑end outcomes, from data workflows to analytical services and reporting assets.
* Define and uphold standards for requirements, documentation, code quality, version control, and release management.
* Partner with stakeholders across business and technology to prioritise work, manage expectations, and drive adoption.
* Run workshops to clarify requirements, map processes, and align teams on analytical definitions and success criteria.
* Shape and maintain analytical services, ensuring clear "definition of done" for outputs and user stories.
* Promote best practices in cloud, big data, analytics engineering, and AI‑accelerated frameworks.
* Contribute to proposals, shaping analytics workstreams, estimating effort, and defining delivery approaches.
* Support the creation of reusable assets such as analytics frameworks, reconciliation packs, and migration playbooks.
* Act as a role model for consulting behaviours: curiosity, clarity, pragmatism, integrity, and client empathy.

About You
You bring a blend of analytical depth, technical understanding, and strong consulting skills. You can see the bigger picture, navigate ambiguity, and lead teams to deliver high‑quality analytical products.
Experience & capabilities include:
* Significant experience leading analytical product delivery in complex, multi‑team environments.
* Proven track record delivering analytical and technical outcomes on modern cloud platforms (e.g., AWS, Azure, Snowflake, Databricks).
* Strong experience with data migration validation, reconciliation, data controls, and go‑live readiness.
* Ability to mentor analysts and collaborate effectively with engineers and architects.
* Strong stakeholder engagement skills across business and technical teams.
* Advanced SQL and Python skills.
* Solid understanding of data modelling (dimensional; Data Vault familiarity a plus).
* Strong BI and analytics experience (dashboarding, semantic modelling, storytelling).
* Familiarity with modern data warehousing, distributed processing, streaming, and DataOps.
* Comfortable leading iterative delivery using agile principles.
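Migration validation and reconciliation of the kind mentioned above typically begin with row-count, key, and summed-measure comparisons between source and target tables. A minimal sketch under those assumptions; the function and control names are illustrative, not any specific client's framework:

```python
# Toy reconciliation check between a source and target table (as lists of
# dicts): compare row counts, a summed measure, and key coverage. Names
# and checks are illustrative assumptions only.
def reconcile(source_rows: list[dict], target_rows: list[dict],
              key: str, measure: str) -> dict:
    """Return a dict of reconciliation controls plus an overall pass flag."""
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "sum_match": (sum(r[measure] for r in source_rows)
                      == sum(r[measure] for r in target_rows)),
    }
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    checks["missing_in_target"] = sorted(src_keys - tgt_keys)
    checks["unexpected_in_target"] = sorted(tgt_keys - src_keys)
    checks["passed"] = bool(checks["row_count_match"] and checks["sum_match"]
                            and not checks["missing_in_target"]
                            and not checks["unexpected_in_target"])
    return checks
```

Real reconciliation packs would run such controls in SQL against the platforms themselves, with tolerances and column-level checksums.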

Qualifications & Tools
Experience with some of the following is beneficial:
* SQL/Python, Power BI, Tableau, Qlik, Dataiku, Alteryx
* AWS, Azure, GCP, Snowflake, Databricks certifications
* SAFe, Scrum Master or similar agile qualifications
* Modern data warehousing tools (Fabric, Lake Formation, Snowflake, Databricks)
* dbt or equivalent transformation tooling
* Airflow / ADF / Dagster
* Data governance, cataloguing, lineage tools
* Agile toolsets such as JIRA, Confluence, DevOps

Working Environment
* Permanent role with flexible working options.
* Hybrid model: typically 3 days per week in office (Newcastle).
* Some UK and international travel may be required.
* Eligibility for security clearance is essential.

What's in It for You
* Competitive salary with bonus potential.
* Highly collaborative culture with strong values and a people‑first mindset.
* Flexible benefits focused on wellbeing and lifestyle.
* 25 days' holiday, with the option to flex to 30.
* Two CSR volunteering days.
* Award‑winning learning and development, including dedicated training time.
* Personal tech budget for devices and accessories.
* Rapid progression opportunities in a high‑growth environment.

Please send me a copy of your CV if you're interested

New

Databricks Architect

England, London

  • £100,000 to £120,000 GBP
  • Engineer Role
  • Skills: Azure Databricks
  • Seniority: Senior

Job description

Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making.
Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse Architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and understanding of Databricks implications across each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested

New

Databricks Architect

Scotland, Edinburgh

  • £110,000 to £120,000 GBP
  • Engineer Role
  • Skills: Azure Databricks
  • Seniority: Senior

Job description

Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making.
Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse Architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and understanding of Databricks implications across each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested

New

AI Engineer - Agentic AI & Gen AI

Australia, New South Wales, Sydney

  • A$170,000 to A$200,000 AUD
  • Engineer Role
  • Skills: AI Engineer, Generative AI, Agentic AI, AI Agents, Azure, RAG, Retrieval Augmented Generation, LangGraph, AutoGen, CrewAI, Python Developer, Machine Learning Engineer, LLM, Engineer, Artificial Intelligence, Foundry
  • Seniority: Senior

Job description

Sydney / Melbourne | Permanent OR Contract

Tenth Revolution Group is partnering with a specialist AI and data consultancy delivering advanced AI solutions into highly regulated enterprise environments.

Due to continued growth, they are looking to appoint multiple AI Engineers to play a key role in building and deploying next-generation AI solutions, with a particular focus on agentic AI systems and generative AI platforms.

This is a hands-on engineering role within a growing AI team responsible for turning emerging AI capabilities into production-grade systems used in complex, real-world environments.

Open to both permanent and contract arrangements, in Melbourne or Sydney.



The Opportunity

You will design, prototype and deploy advanced AI systems that automate complex workflows and support intelligent decision-making.

Working closely with engineers, data specialists and business stakeholders, you will help translate ideas into scalable AI products while contributing to internal frameworks and reusable AI components.

This role is ideal for someone who enjoys building practical AI systems rather than purely researching them.



Key Responsibilities

* Design and deliver end-to-end AI solutions solving complex business challenges
* Build and deploy AI agents and automated workflows using modern frameworks
* Prototype and scale production-grade Generative AI systems
* Develop reusable AI components, orchestration frameworks and internal tooling
* Collaborate closely with engineers, data scientists and product stakeholders
* Monitor and improve deployed AI systems for performance and reliability
* Apply responsible AI practices including explainability, governance and traceability
* Ensure solutions meet security and compliance standards expected in enterprise environments



Skills & Experience

* Strong software engineering background with several years building production systems
* Hands-on experience developing AI agents, automation workflows or multi-agent systems
* Experience with agent frameworks such as LangGraph, AutoGen, CrewAI or similar
* Experience designing and deploying Retrieval-Augmented Generation (RAG) solutions
* Ability to rapidly prototype AI-powered applications and iterate towards production systems
* Experience deploying AI solutions across major cloud platforms
* Exposure to knowledge graphs, orchestration patterns or advanced retrieval architectures is advantageous
* Experience supporting or monitoring production AI systems
* Ability to collaborate with both technical and non-technical stakeholders

Experience delivering solutions within regulated or enterprise environments is highly regarded.
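As a rough picture of the Retrieval-Augmented Generation pattern mentioned above: retrieve the documents most relevant to a query, then assemble them into the model prompt. This sketch substitutes a toy bag-of-words similarity for a real embedding model and stops short of the actual LLM call; all names are illustrative assumptions.

```python
# Minimal RAG sketch: toy retrieval plus prompt assembly. Real systems
# would use a vector embedding model and an LLM API in place of the
# bag-of-words similarity and the returned prompt string.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words token counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble retrieved context and the question into an LLM prompt."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```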



Why Apply?

* Join a growing AI engineering team building advanced AI capabilities
* Work on agent-based AI systems and production GenAI solutions
* Strong technical ownership with opportunity to shape AI frameworks and tooling
* Collaborative environment with strong focus on learning and development



Apply now or contact Tenth Revolution Group for a confidential discussion.

Contact:

Neros Gorges
(03) 8592 0507
n.gorges@tenthrevolution.com

#SCR-neros-gorges

Lead Azure Databricks Engineer

England, London

  • £70,000 to £80,000 GBP
  • Engineer Role
  • Skills: Azure Databricks, insurance, reinsurance, lloyds of london
  • Seniority: Senior

Job description

Azure Databricks Engineer
Lloyd's of London Market

Hybrid: 2 days in London office

We are seeking an outstanding Lead Azure Databricks Engineer to own the design, build, optimisation and governance of our enterprise cloud data platforms. This is a senior, fully hands‑on engineering position requiring deep technical expertise, strong delivery capability and experience operating within the Lloyd's of London market and its regulatory demands.

You will shape the future of our cloud data strategy, enabling advanced analytics and building secure, scalable and highly performant data solutions across the organisation.

Key Responsibilities

* Lead the development, optimisation and governance of large‑scale data platforms using Azure Data Factory, Data Lake, Key Vault, Azure Functions, Databricks, Delta Lake, PySpark and Unity Catalog
* Partner closely with underwriting, actuarial, delegated authority, bordereaux, exposure management, reinsurance, finance, Solvency II and risk teams to deliver solutions aligned to Lloyd's of London regulatory and operational requirements
* Translate complex business needs into scalable cloud data architectures in partnership with architects, SMEs and product owners
* Resolve complex technical challenges rapidly and decisively, consistently delivering high‑quality outcomes
* Champion engineering best practices including design patterns, data lifecycle management, CI/CD automation and cloud‑native methodologies
* Provide hands‑on technical leadership, mentorship and uplift engineering capability across the team
* Drive continuous improvement across performance, cost optimisation, reliability and resilience
* Oversee end‑to‑end data engineering delivery ensuring quality, observability, robustness and compliance

Required Experience and Skills

* Expert‑level knowledge of Azure Data Services and Databricks within enterprise environments
* Deep understanding of Delta Lake, Medallion architecture, distributed compute, Lakehouse patterns, data modelling and performance optimisation
* Strong Python, PySpark and SQL experience, with hands‑on implementation of CI/CD for data solutions using Azure DevOps
* Strong understanding of data governance, lineage, access management, FinOps and secure cloud engineering
* Excellent communication skills with the ability to confidently engage senior stakeholders and simplify complex technical concepts
* Proven ability to take ownership, deliver at pace and operate effectively in high‑pressure, high‑expectation environments
* Experience working within or delivering solutions for the Lloyd's of London market or similarly regulated environments

Please send me a copy of your CV if you're interested

Senior Sales Executive - Data

England, London

  • £80,000 to £100,000 GBP
  • Engineer Role
  • Skills: Data Engineering & Analytics - Cloud Platforms (Azure, AWS, Snowflake, Databricks)
  • Seniority: Senior

Job description

Senior Sales Executive - Data & Analytics

London - Hybrid - Data Engineering & Analytics - Cloud Platforms (Azure, AWS, Snowflake, Databricks) - £80k-£100k



Join a high‑growth, high‑impact data consultancy

A leading provider of data engineering, analytics, cloud data platforms, and AI‑driven solutions, this organisation partners with mid‑market and enterprise businesses across the UK to help them modernise, transform, and unlock the real value of their data.

If you're a motivated sales professional who thrives in a fast‑moving, high‑demand environment, this is the ideal place to excel. You'll benefit from the support of world‑class pre‑sales specialists, solution architects, and delivery teams, allowing you to focus on building relationships, driving revenue, and securing exciting new projects.

This is an opportunity where you can truly make your mark, develop your career, and sell services that clients actively need and value.



In this role, you will be responsible for:

* Developing new business opportunities across mid‑market and enterprise clients.
* Managing the full sales lifecycle, supported by strong pre‑sales and delivery teams.
* Building and nurturing a healthy sales pipeline aligned to revenue targets.
* Owning and expanding existing accounts through upsell and cross‑sell opportunities.
* Engaging with senior stakeholders such as Heads of Data, Analytics Managers, and IT Directors.
* Delivering presentations, solution workshops, proposals, and client discussions.
* Collaborating closely with marketing, pre‑sales, onshore and offshore delivery teams.
* Maintaining accurate forecasting, CRM activity, and reporting.




Benefits

What's on offer:

* Competitive salary with performance‑based commissions.
* Opportunity to sell high‑demand services in data engineering, analytics, and AI.
* Strong pre‑sales and delivery support so you can focus on client engagement.
* Clear pathways for career growth and continuous learning.
* Exposure to leading UK mid‑market and enterprise clients.


Key Experience

We're looking for candidates with:

* 5-7 years of B2B sales experience in Data, Analytics, Cloud, or Digital Services.
* Experience selling consulting or managed services.
* Strong understanding of modern data engineering and cloud platforms (Azure, AWS, Snowflake, Databricks - high‑level).
* Excellent communication and stakeholder management skills.
* Experience working with pre‑sales and offshore delivery models (desirable).
* Exposure to enterprise or mid‑market clients within sectors such as Insurance, Public Sector, or BFSI (desirable).


Ready to take the next step in your sales career?

This is a high‑impact role in a rapidly growing organisation. Don't miss the chance to be part of a leading force in the UK's data and analytics landscape.

Data Engineer

England, London

  • £50,000 to £65,000 GBP
  • Engineer Role
  • Skills: Azure
  • Seniority: Mid-level

Job description

Data Engineer - Azure | Databricks / Fabric - Remote - £50k-£65k

I'm currently supporting one of the UK's fastest-growing Microsoft data consultancies as they continue to scale their team. If you're a Data Engineer who wants to work on modern Azure projects, next‑gen lakehouse architectures, and large-scale transformation programmes, this one is worth a look.

This consultancy is known for investing heavily in their people, offering real progression pathways, and giving engineers the chance to take ownership of meaningful, enterprise-grade work.

If you're looking for a role where you can grow quickly and make a real impact, this could be a great fit.



What You'll Be Doing

In this role, you'll be central to designing and delivering modern data solutions for a range of clients:

* Architect and deliver scalable data solutions using Databricks, Synapse, and Microsoft Fabric
* Build and optimise ETL/ELT pipelines and high-quality data models using SQL & Python
* Develop Power BI dashboards that support insight-led decision-making
* Implement data lakes and medallion lakehouse architectures
* Apply strong standards around data quality, governance, and security
* Work collaboratively in Agile, cross-functional teams
* Support major cloud migration and modernisation initiatives
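The medallion lakehouse flow mentioned above layers raw (bronze), cleaned (silver), and business-ready (gold) data. A minimal sketch using plain Python collections in place of Delta tables; in Databricks this would be PySpark and Delta Lake, and all names here are illustrative assumptions.

```python
# Hedged sketch of a medallion (bronze/silver/gold) pipeline using plain
# Python lists and dicts in place of Delta tables.
def bronze_ingest(raw_rows: list[dict]) -> list[dict]:
    """Bronze: land raw records as-is, no transformation."""
    return list(raw_rows)

def silver_clean(bronze: list[dict]) -> list[dict]:
    """Silver: drop malformed rows and normalise keys and types."""
    out = []
    for row in bronze:
        if row.get("amount") is None or row.get("customer") is None:
            continue
        out.append({"customer": str(row["customer"]).strip().lower(),
                    "amount": float(row["amount"])})
    return out

def gold_aggregate(silver: list[dict]) -> dict:
    """Gold: business-level aggregate, e.g. total spend per customer."""
    totals: dict = {}
    for row in silver:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals
```

The layering matters more than the mechanics: each stage reads only the one before it, so data quality rules stay in one place and the gold layer is reproducible from raw landings.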



What's In It for You

* High-Growth Environment: You'll work with cutting-edge Microsoft technologies on impactful projects across multiple industries.
* Career Development: The business actively funds certifications, structured training programmes, and clear progression opportunities.
* Fully Remote: Work from anywhere in the UK; travel is only required occasionally and covered by the company.



You'll be a great fit if you have:

* Strong experience with Azure Synapse, Databricks, or Microsoft Fabric
* Solid SQL & Python skills for ETL/ELT development
* Experience working with data lakes and large datasets
* A good understanding of BI, data warehousing, and modern data architectures



Interested?

This team is one of the most in-demand in the Microsoft data space, and roles with them don't stay open for long.

If you'd like to explore the opportunity, apply now!