Current search

25 search results

Freelance roles in London

    Data Solution Architect

    England, London

    • £450 to £500 GBP
    • Consultant Role
    • Skills: gcp, data mesh, bigquery, devops
    • Seniority: Senior

    Job description

    Data Solution Architect - Group & Enterprise Services

    An organisation is seeking an experienced Data Solution Architect to help shape its data strategy and lead the design of scalable, secure and cost‑effective data solutions. The role sits within a central data and analytics function and focuses on maturing enterprise data architecture, promoting modern data practices, and supporting a Data Mesh-aligned operating model.

    Key Responsibilities

    * Engage across technical and business teams, tailoring communication for different stakeholders.
    * Lead end‑to‑end data solution design, ensuring alignment with enterprise architecture, governance and cost considerations.
    * Provide expertise in cloud data platforms, data governance, security and modern data architecture.
    * Work with domain experts to understand business data and develop usable, reusable data products.
    * Support full delivery lifecycle including design, modelling, security reviews, testing and transition to BAU.
    * Contribute to the design and build of data solutions on cloud platforms, particularly BigQuery and native transformation tooling.
    * Collaborate with vendors on assessments, RFPs and solution implementation.

    Essential Skills

    * Strong experience with cloud‑native data architectures, ideally on GCP.
    * Deep knowledge of BigQuery, modern data processing patterns, warehouses and Lakehouse design.
    * Experience with data governance, lineage, metadata and data quality tooling.
    * Practical delivery of Data Products aligned to Data Mesh principles.
    * Strong architectural modelling and documentation skills.
    * Experience designing for performance, scalability, resilience and cost optimisation.
    * Ability to influence architectural direction and work with diverse stakeholders.
    * Understanding of architectural governance processes.
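
    The "Data Products aligned to Data Mesh principles" skill above is often made concrete by publishing each product with a small contract describing its owner, schema, and guarantees. A minimal, illustrative sketch follows; the field and product names are hypothetical, not from any specific framework or the hiring organisation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataProductContract:
    """Minimal, illustrative contract for a Data Mesh data product."""
    name: str                  # e.g. "claims.open_claims_daily" (hypothetical)
    owner_domain: str          # the business domain accountable for the data
    output_port: str           # where consumers read it, e.g. a BigQuery table
    schema: dict               # column name -> type
    freshness_sla_hours: int   # how stale the data may be
    pii: bool = False          # drives access controls and masking

    def validate(self) -> list:
        """Return a list of contract problems (empty list == valid)."""
        problems = []
        if not self.schema:
            problems.append("schema must not be empty")
        if self.freshness_sla_hours <= 0:
            problems.append("freshness SLA must be positive")
        return problems

contract = DataProductContract(
    name="claims.open_claims_daily",
    owner_domain="claims",
    output_port="bq://analytics.claims.open_claims_daily",
    schema={"claim_id": "STRING", "opened_at": "TIMESTAMP"},
    freshness_sla_hours=24,
)
```

    A contract like this gives governance tooling something machine-readable to check, which is what makes data products "usable, reusable" across domains.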

    Desirable

    * Familiarity with EA frameworks such as TOGAF or Zachman.
    * Experience in financial services or insurance.
    * Knowledge of SDLC, DevOps, and data engineering delivery models.
    * Exposure to multi‑cloud environments.

    Please send me a copy of your CV if you meet the requirements.

    PBI Specialist

    England, London

    • £1 to £1 GBP
    • Engineer Role
    • Skills: pbi, sc cleared
    • Seniority: Senior

    Job description

    Contract Role: PBI Specialist (SC Cleared)
    Location: UK (on‑site or hybrid depending on project)
    Clearance: Active SC clearance required
    Day Rate: Competitive

    Overview
    We are looking for an experienced PBI (Power BI) Specialist with active SC clearance to support a high‑profile programme within a secure environment. The successful candidate will work closely with data, analytics, and delivery teams to design, build, and optimise Power BI solutions that drive effective reporting and decision‑making.

    Key Responsibilities

    * Develop, enhance, and maintain Power BI dashboards and reports
    * Convert complex data sets into clear, actionable visual insights
    * Work with stakeholders to gather requirements and translate them into technical solutions
    * Optimise data models, DAX calculations, and performance across reports
    * Ensure data integrity, governance, and compliance with secure‑environment standards
    * Support automation and process improvement initiatives
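
    The "data integrity" responsibility above usually reduces to a few mechanical checks run before a report is published: required fields present, key columns unique. A platform-agnostic sketch, assuming illustrative column names:

```python
def integrity_report(rows, key="id", required=("id", "amount")):
    """Run basic integrity checks on a list of dict records.

    Returns a dict of check name -> list of offending rows/keys.
    """
    issues = {"missing_fields": [], "duplicate_keys": []}
    seen = set()
    for row in rows:
        # Required fields must be present and non-None.
        if any(row.get(f) is None for f in required):
            issues["missing_fields"].append(row)
        # The key column must be unique across the dataset.
        k = row.get(key)
        if k in seen:
            issues["duplicate_keys"].append(k)
        seen.add(k)
    return issues

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 12.5},   # duplicate key
    {"id": 2, "amount": None},   # missing amount
]
report = integrity_report(rows)
```

    In practice the same checks would run in SQL or a dataflow against the warehouse, but the logic is the same.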

    Essential Skills

    * Valid SC clearance
    * Strong experience with Power BI (Desktop, Service, DAX, modelling)
    * Ability to work with SQL, data warehouses, and structured data sources
    * Background in BI, reporting, analytics, or data visualisation
    * Strong stakeholder engagement and communication skills
    * Experience working within secure or regulated environments

    Desirable Skills

    * Experience with Azure (Data Factory, Synapse, Databricks)
    * Knowledge of ETL processes
    * Understanding of government or defence sector reporting standards

    SC Cleared Data Warehouse Engineer

    England, London

    • £1 to £1 GBP
    • Engineer Role
    • Skills: Sc cleared, snowflake
    • Seniority: Senior

    Job description

    SC Cleared Data Warehouse Engineer

    Fully Remote

    About the Role

    We are seeking an experienced Data Warehouse Engineer with strong expertise in Kimball dimensional modelling to support secure, large‑scale data programmes. You will design and enhance modern data warehouse environments across cloud platforms such as Azure, Fabric, AWS and Snowflake, working within a high‑security setting that requires an active SC clearance.

    Key Responsibilities

    * Design and deliver Kimball‑based dimensional data models
    * Build and optimise robust ETL/ELT pipelines
    * Develop data warehouse solutions across platforms including Azure, Microsoft Fabric, AWS, Snowflake, and related technologies
    * Improve data architecture performance, governance and scalability
    * Collaborate closely with architects, analysts and senior stakeholders
    * Ensure compliance with UK government security standards

    Essential Skills & Experience

    * Active SC Clearance (mandatory)
    * Strong experience with Kimball dimensional modelling
    * Expertise in SQL and data warehousing concepts
    * Hands‑on experience with DWH platforms such as:

    * Azure Data Warehouse / Synapse
    * Microsoft Fabric
    * AWS Redshift
    * Snowflake

    * Experience with ETL/ELT tools (SSIS, ADF, Matillion, Airflow, etc.)
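
    The Kimball requirement above can be illustrated with a tiny star schema: a fact table holding additive measures, keyed to conformed dimension tables. This sketch uses sqlite3 purely so it stands alone; the table and column names are illustrative, not from any named programme.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer. (A slowly changing dimension
# would add effective-date columns; omitted here for brevity.)
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name TEXT
    )
""")

# Fact table at the grain of one order line: foreign keys to
# dimensions plus additive measures.
cur.execute("""
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER,
        amount REAL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240101, 100.0), (1, 20240102, 50.0), (2, 20240101, 75.0)])

# Typical dimensional query: slice an additive measure by a dimension attribute.
cur.execute("""
    SELECT d.customer_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d USING (customer_key)
    GROUP BY d.customer_name
    ORDER BY d.customer_name
""")
totals = cur.fetchall()
```

    The same shape carries over to Synapse, Fabric, Redshift, or Snowflake; only the DDL dialect and surrogate-key generation differ.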

    Desirable Skills

    * Experience delivering solutions in secure government or defence environments
    * Python proficiency for data engineering workflows
    * CI/CD experience and data‑focused DevOps practices

    Platform Engineer

    England, London

    • £350 to £400 GBP
    • Engineer Role
    • Skills: Terraform, GitHub Actions, CI/CD, Azure Private Link, Azure Policy, Azure Security, Azure cost management, Databricks/Unity Catalog
    • Seniority: Senior

    Job description

    Platform Engineer - 3 Month Contract
    Location: Remote (UK-based only) with occasional travel to client and team sites
    Rate: Competitive day rate

    A fast-growing data consultancy is seeking a skilled Platform Engineer to support the rollout and maintenance of a federated, multi-service Data & AI platform. This short-term contract offers the opportunity to work on cutting-edge cloud infrastructure projects, with a strong focus on scalability, governance, and cost optimisation.

    Interview Process:
    Two stages - an informal introductory chat followed by a technical interview with senior team members.

    Role Overview:
    You'll be responsible for designing and implementing secure, scalable cloud infrastructure on Azure, enabling advanced data services and ensuring compliance across environments. The role requires hands-on experience with infrastructure-as-code, CI/CD, and cloud-native security practices.

    Key Skills & Experience:

    * Terraform for Azure infrastructure automation
    * GitHub Actions and CI/CD pipeline design
    * Azure Private Link and Private Link Service configuration
    * Databricks and Unity Catalog for data governance
    * Azure Policy and compliance enforcement
    * Identity and access management (OAuth, federated credentials)
    * Azure security best practices including BCDR and high availability
    * Cost management and optimisation strategies within Azure

    Please send me a copy of your CV if you meet all of the requirements.

    Data Platform Engineer

    England, London

    • Up to £1 GBP
    • Engineer Role
    • Skills: Terraform, GitHub Actions, CI/CD, Azure Private Link, Azure Policy, Azure Security, Azure cost management, Databricks/Unity Catalog
    • Seniority: Senior

    Job description

    Azure Data Platform Engineer - Contract - Fully Remote

    We're looking for highly capable Azure Platform Engineers to support a large‑scale cloud and data engineering programme. This environment is fast‑paced, technically mature, and heavily invested in modern Azure, IaC, and Databricks ecosystems. Strong technical depth and clear, concise communication are essential.

    Outside IR35
    Remote
    6-month contract

    Core Technical Requirements
    Azure Platform Engineering

    Deep hands‑on experience building and operating cloud‑native platforms in Azure
    Strong understanding of Azure networking, identity, security baselining, and landing zone patterns
    Experience working in highly automated, scalable, enterprise-grade environments

    Terraform (IaC)

    Solid production experience delivering infrastructure via Terraform
    Strong module design, state management, secret handling, and CI/CD integration
    Ability to define, manage, and version cloud resources using best‑practice IaC patterns

    Databricks

    Practical experience supporting, deploying, or engineering Databricks workspaces
    Strong knowledge of cluster configuration, job orchestration, workspace governance, and optimisation
    Comfortable working alongside data engineers and platform squads

    Unity Catalog (Highly Advantageous)

    Experience implementing or working within Unity Catalog for centralised governance
    Understanding of cataloguing, lineage, secure data access, and workspace-level controls
    Ability to support migration from legacy Hive Metastore to Unity Catalog is a major plus

    Soft Skills & Delivery

    Clear, concise communicator (no overly long CVs - brevity and clarity matter)
    Strong problem‑solver who can operate independently in a remote environment
    Able to articulate platform decisions and engineering rationale effectively
    Comfortable working in cross-functional teams following modern DevOps practices

    Please send me a copy of your CV if you are interested.

    New

    Data Architect

    England, London

    • £510 to £637 GBP
    • Architect Role
    • Skills: Databricks, Microsoft Fabric, Microsoft Azure, Data Migration, Data Strategy, Data Architecture
    • Seniority: Senior

    Job description

    Key Responsibilities

    Data Architecture & Strategy

    * Define and evolve enterprise data architecture principles, standards, and models
    * Develop data strategy and modelling transition roadmaps, including sequencing and dependencies
    * Design canonical, domain, and analytical data models, including shared and reference data patterns
    * Support federated data operating models with clear ownership and stewardship
    * Drive architectural simplification, innovation, and evolution within governed environments

    Governance, Risk & Compliance

    * Design data architectures that meet regulatory, security, and compliance requirements
    * Embed data governance, quality, metadata, lineage, and auditability by design
    * Define data classification and control models supporting traceability and regulatory reporting

    Cloud & Platform Architecture

    * Translate logical and domain data models into physical cloud implementations
    * Design schemas, pipelines, and lakehouse/warehouse patterns on modern cloud platforms
    * Apply cloud data integration patterns with awareness of governance, lineage, and access control implications

    Migration & Delivery

    * Define and support data migration strategies, including source‑to‑target mappings and reconciliation
    * Enable phased cutover approaches alongside business‑critical delivery
    * Work independently across multiple initiatives, prioritising effectively in complex environments
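
    The migration bullets above ("source‑to‑target mappings and reconciliation") are often prototyped as a simple mapping table plus row-count and control-total checks. A minimal, platform-agnostic sketch; the column names are hypothetical:

```python
# Illustrative source-to-target column mapping for one migrated table.
MAPPING = {
    "cust_no": "customer_id",
    "cust_nm": "customer_name",
    "bal_amt": "balance",
}

def migrate_row(source_row):
    """Rename source columns to their target names, dropping unmapped ones."""
    return {target: source_row[src] for src, target in MAPPING.items()}

def reconcile(source_rows, target_rows, measure_src="bal_amt", measure_tgt="balance"):
    """Basic reconciliation: row counts and a control total must match."""
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "control_total_match": (
            sum(r[measure_src] for r in source_rows)
            == sum(r[measure_tgt] for r in target_rows)
        ),
    }

source = [{"cust_no": 1, "cust_nm": "Acme", "bal_amt": 100.0}]
target = [migrate_row(r) for r in source]
result = reconcile(source, target)
```

    During a phased cutover, checks like these run per tranche so each wave of data can be signed off independently.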

    Stakeholder Engagement

    * Partner with business and technology stakeholders to capture requirements and define data products
    * Communicate complex data concepts clearly to technical and non‑technical audiences
    * Articulate modelling trade‑offs and architectural constraints to support informed decisions

    Skills & Experience

    * Strong experience in enterprise data architecture and information architecture
    * Deep understanding of conceptual, logical, and physical data modelling
    * Proven experience shaping data strategy aligned to measurable business value
    * Knowledge of modern cloud data platforms (e.g. Databricks, Microsoft Fabric, Spark ecosystems)
    * Working knowledge of architecture frameworks (e.g. TOGAF) and relevant data standards

    Why Join

    This is an opportunity to play a pivotal role in shaping enterprise data foundations, influencing strategy, and enabling trusted, compliant, and scalable data platforms.

    Delivery Lead - London - £600/pd (Outside IR35)

    England, London, City of London

    • Up to £600 GBP
    • Other Role
    • Skills: Azure, insurance, lloyds, broker, delivery, it, data, salesforce, mulesoft, underwriter
    • Seniority: Senior

    Job description

    Delivery Lead - London - £600/pd (Outside IR35)

    Please note - this role will require you to work from the London office for a minimum of two days per week. To be a good fit for this role you should have the unrestricted right to work in the UK and must be available to start immediately.

    My client, a respected Lloyd's of London broker, is seeking an experienced Delivery Lead to oversee a key IT programme focused on enhancing their placement solution, built on Salesforce and MuleSoft technologies. This is a high-impact role within a fast-moving environment, ideal for someone who thrives on complex delivery challenges within the London Market.

    Key Responsibilities:

    * Lead end-to-end delivery of a major IT project focused on placement workflows.
    * Work closely with cross-functional teams including architects, developers, SMEs, and business stakeholders.
    * Ensure successful delivery of integrations and enhancements.
    * Drive project governance, reporting, timelines, and risk management.
    * Communicate effectively with senior leadership, providing clarity on progress and blockers.
    * Support continuous improvements across delivery processes and project disciplines.

    Required Experience:

    * Proven Delivery Lead experience across complex IT programmes.
    * Strong understanding of the Lloyd's of London / London Market and its placement processes.
    * Experience delivering projects involving Salesforce and MuleSoft (or similar integration / CRM technology).
    * Excellent communication, stakeholder management, and leadership skills.
    * Ability to work in a fast-paced environment and hit the ground running.
    * Unrestricted right to work in the UK.
    * Immediate availability is essential.

    Why Apply?

    * Opportunity to work with a leading broker at the centre of the Lloyd's market.
    * High-impact role shaping a critical placement solution.
    * Competitive day rate outside IR35.
    * Strong likelihood of contract extension.

    To apply for this role please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.

    Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

    New

    Contract GCP Data Engineer - £700/pd Outside IR35

    England, London, City of London

    • Up to £700 GBP
    • Consultant Role
    • Skills: MS Azure, MS Business Intelligence, data, data engineer, google cloud, gcp, cloud, technology, data science, ai
    • Seniority: Senior

    Job description

    Contract GCP Data Engineer - £700/pd Outside IR35

    Contract: Outside IR35
    Rate: £700 per day
    Duration: 4 weeks (strong potential for extension)
    Location: Central London (client‑based role)
    Start Date: Immediate

    Role Overview

    We are working with a central London-based insurance broker who requires an experienced GCP Data Engineer to support a critical short‑term engagement, with a strong likelihood of follow‑on work.

    This role will suit a hands‑on engineer who can quickly embed into an existing data team, take ownership of delivery, and add immediate value within a fast‑paced commercial environment.

    Key Responsibilities

    * Design, build, and maintain data pipelines on Google Cloud Platform (GCP).
    * Work with services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions.
    * Support ingestion, transformation, and optimisation of data for analytics and reporting use cases.
    * Collaborate closely with analytics, data science, and engineering stakeholders.
    * Ensure data quality, performance, and security best practices are applied.
    * Troubleshoot and resolve data platform issues efficiently.
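
    Pipelines of the kind described above (ingest, transform, load) usually centre on small, pure transformation functions that are easy to test before they are wired into Dataflow or BigQuery. A hedged sketch with illustrative field names; no GCP client libraries are used:

```python
import json
from datetime import datetime, timezone

def parse_event(raw):
    """Parse one raw JSON event (e.g. off a Pub/Sub subscription) into
    the shape expected by a downstream BigQuery table.

    Returns None for malformed events so the pipeline can route them
    to a dead-letter destination instead of failing the whole batch.
    """
    try:
        event = json.loads(raw)
        return {
            "policy_id": str(event["policy_id"]),
            "premium": float(event["premium"]),
            # Normalise timestamps to UTC ISO-8601 for the warehouse.
            "ingested_at": datetime.fromtimestamp(
                event["ts"], tz=timezone.utc
            ).isoformat(),
        }
    except (KeyError, ValueError, TypeError):
        return None

good = parse_event('{"policy_id": "P-1", "premium": "250.5", "ts": 0}')
bad = parse_event('{"policy_id": "P-2"}')  # missing fields -> dead letter
```

    Keeping transforms pure like this is what lets an engineer "deliver with minimal onboarding": the logic can be unit-tested locally before touching the platform.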

    Skills & Experience Required

    * Strong commercial experience as a GCP Data Engineer.
    * Hands‑on expertise with BigQuery and modern data pipeline architectures.
    * Solid SQL skills and experience with Python (or similar) for data engineering tasks.
    * Understanding of data modelling, ETL/ELT patterns, and cloud‑native best practices.
    * Experience working in financial services or insurance environments is highly desirable.
    * Ability to start immediately and deliver with minimal onboarding.

    Engagement Details

    * Outside IR35
    * £700 per day
    * Initial 4‑week contract, with strong potential for extension
    * Based with a Central London insurance broker
    * Immediate start required

    To apply for this role please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.

    Fabric/BI Developer

    England, London, Brentford

    • £350 to £400 GBP
    • Developer Role
    • Skills: MS Power BI, sql, power bi
    • Seniority: Senior

    Job description

    Fabric BI Developer - Modern Data Platform
    Outside IR35
    Ideally 2 days in Brentford office, but can be remote
    Start date: Immediate

    About the Role

    As a BI Developer, you will design, build, and optimise end‑to‑end BI solutions across a modern data stack. You will work closely with data architects, analysts, IT teams, and business stakeholders to develop robust data pipelines, semantic models, and reporting assets that support strategic decision‑making across the organisation.

    This is a hands‑on technical role suited to someone passionate about engineering reliable, secure, and scalable BI environments.

    Key Responsibilities

    * Design and deliver scalable Azure‑based data platform solutions, including Data Warehouses and enterprise reporting tools.
    * Build and manage data pipelines using Microsoft Fabric, OneLake, Dataflows Gen2, and Data Factory.
    * Develop optimised Power BI semantic models, integrating with DirectLake and/or Warehouse datasets.
    * Act as a hands‑on Data Engineer across the full BI stack, leveraging Azure Synapse, SSIS, SQL, and Data Lake technologies.
    * Provide technical support and maintain the organisation's BI infrastructure.
    * Lead and support migration and reconciliation of data from legacy systems or acquired businesses.
    * Ensure BI solutions follow governance, security, and compliance best practices.
    * Collaborate with cross‑functional teams, including PMO, IT Solutions, and BI Reporting teams.

    Key Relationships

    You will work closely with:

    * The BI & Data leadership team
    * Project management and solution delivery teams
    * BI Reporting teams who depend on strong data infrastructure

    Skills & Experience Required

    * Degree or postgraduate‑level education.
    * Strong technical background in SQL, Power BI, Azure cloud technologies, Data Lake, and enterprise BI environments.
    * Experience designing and developing modern BI solutions at scale.
    * Excellent communication skills, with the ability to work across both technical and non‑technical teams.
    * Experience with Microsoft Fabric (must have).
    * Experience with BI migration projects (e.g., from legacy BI to Fabric) is beneficial.
    * Knowledge of data governance and security tooling such as Microsoft Purview (desirable).

    Please send me a copy of your CV if you meet the above requirements.

    Data Platform Architect - 6 month contract - £595/pd

    England, London, City of London

    • £550 to £595 GBP
    • Architect Role
    • Skills: Snowflake, aws, azure, data, platform, devops, engineer, cloud, architect, architecture, enterprise, solution, contract, consultant, business intelligence
    • Seniority: Senior

    Job description

    Data & AI Platform Architect - Contract

    Location: London (Hybrid - 2-3 days per week on-site)
    Contract Type: Inside IR35
    Duration: 6 Months
    Day Rate: £595 per day
    Working Pattern: Hybrid
    Client: Well‑established Data & AI Consulting Partner

    About the Role

    We are working with a well‑established Data & AI consulting partner to recruit an experienced Data & AI Solutions Architect for a 6‑month inside IR35 engagement. In this role, you will lead the design and delivery of modern data, analytics, and AI solutions, working closely with stakeholders and small delivery teams to implement cloud-centric architectures that drive real business value.

    This is an excellent opportunity for a seasoned architect who thrives in fast‑paced environments and is passionate about shaping impactful, scalable data solutions.

    Key Responsibilities

    * Lead solution architecture and end‑to‑end delivery across data, analytics, and AI workstreams.
    * Support and guide small technical teams through implementation.
    * Design cloud-forward architectures using modern data platform technologies.
    * Apply strong data management principles throughout the full data lifecycle.
    * Contribute to delivery frameworks, technical standards, and solution roadmaps.
    * Communicate key concepts in AI, machine learning, and data mining to technical and non-technical stakeholders.
    * Collaborate closely with client teams to understand requirements and translate them into actionable technical designs.

    About You

    You will bring:

    * 5+ years' experience in data & analytics and AI solution design and delivery.
    * Proven experience leading small teams or workstreams within complex programmes.
    * Deep understanding of cloud-centric architecture (Azure, AWS, or GCP).
    * Strong grasp of core data management principles and practices across the data lifecycle.
    * Solid understanding of CI/CD, DevOps practices, and modern infrastructure approaches.
    * Ability to articulate AI, ML, and data mining concepts in a clear and compelling way.
    * Experience in public sector or regulated industries (government, healthcare, etc.) is highly desirable.

    What's on Offer

    * 6‑month inside IR35 contract at a competitive £595/day rate.
    * Hybrid working model with on‑site presence in London.
    * Opportunity to contribute to high-impact client projects with a respected Data & AI partner.
    * Collaborative environment where expertise and innovation are valued.

    How to Apply

    To apply for this role please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.
