A Tenth Revolution Group company

Current search

28 search results

For Permanent & Freelance, Platform

    Databricks Lead - Implementation - St Louis, MO.

    USA, Missouri, St. Louis

    • $150,000 to $200,000 USD
    • Platform Role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent adoption of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions on the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.
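    For illustration only (this is not part of the posting), the kind of bronze-to-silver ETL step these responsibilities describe might look like the following toy sketch. It uses Python's built-in sqlite3 as a stand-in for Spark SQL and Delta Lake; the table and column names are hypothetical.

```python
import sqlite3

# Toy stand-in for a bronze -> silver ETL step: on Databricks this would be
# PySpark reading a raw Delta table and merging clean rows into a curated one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bronze_orders (order_id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO bronze_orders VALUES (?, ?, ?)",
    [(1, 100.0, "ok"), (2, -5.0, "ok"), (3, 250.0, "cancelled")],
)

# "Silver" layer: drop invalid amounts and cancelled orders, keep clean records.
conn.execute(
    """CREATE TABLE silver_orders AS
       SELECT order_id, amount FROM bronze_orders
       WHERE amount > 0 AND status = 'ok'"""
)

rows = conn.execute("SELECT order_id, amount FROM silver_orders ORDER BY order_id").fetchall()
print(rows)  # [(1, 100.0)]
```

    In a real Databricks pipeline the same filter-and-promote pattern would typically be an incremental `MERGE INTO` against a Delta table rather than a full rebuild.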

    Databricks Lead - Implementation - Des Moines

    USA, Iowa, Des Moines

    • $150,000 to $200,000 USD
    • Platform Role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent adoption of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions on the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    Databricks Lead - Implementation - Minneapolis, MN

    USA, Minnesota, Minneapolis

    • $150,000 to $200,000 USD
    • Platform Role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent adoption of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions on the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    Databricks Implementation Lead

    USA, Iowa, Iowa City

    • $150,000 to $200,000 USD
    • Platform Role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent adoption of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions on the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    New

    AWS Data Engineer - Oslo

    Norway, Oslo

    • Negotiable
    • Data Science Role
    • Skills: data engineer, python, AWS, Snowflake, dbt, SQL, ETL
    • Seniority: Mid-level

    Job description

    We are looking for a Data Engineer to bring data-driven insights to the agricultural industry in Norway!

    You will work directly with our product, an open platform for gathering and analysing data to increase knowledge sharing amongst the farming and agricultural community - our goal is to create a more sustainable future for all.

    Your role will involve building data products on our core platform using technologies such as AWS, Snowflake, dbt, SQL, and Python.

    You will work alongside fellow data engineers, as well as data scientists, application developers, product owners, and our R&D team, who bring deep domain knowledge.

    Your tasks can include:

    * Building data transformations and data pipelines
    * Integrating data from different sources
    * Automating processes
    * Ensuring data quality
    * Contributing towards the expansion of our data platform
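    As an illustration only (not part of the posting), the "ensuring data quality" task above often starts as a simple completeness gate like the following sketch. It is plain Python with hypothetical field names; a production pipeline would more likely express such checks as dbt tests or similar.

```python
def check_quality(records, required_fields=("id", "yield_kg")):
    """Partition records into (passed, failed) by a completeness check.

    A toy data-quality gate: a record passes only if every required field
    is present and non-None. Field names here are illustrative.
    """
    passed, failed = [], []
    for rec in records:
        if all(rec.get(f) is not None for f in required_fields):
            passed.append(rec)
        else:
            failed.append(rec)
    return passed, failed

samples = [
    {"id": 1, "yield_kg": 520.0},
    {"id": 2, "yield_kg": None},  # missing measurement -> quarantined
    {"id": 3},                    # missing field -> quarantined
]
good, bad = check_quality(samples)
print(len(good), len(bad))  # 1 2
```

    The same pass/quarantine split scales up naturally: failed records go to a holding table for inspection rather than silently dropping out of the pipeline.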

    We are looking for you to bring:

    * Academic background in an engineering or computer science related subject
    * 3+ years' experience in a data engineering position
    * Experience using data platforms to build data products
    * Competency with SQL & Python
    * Experience working in a cloud environment (preferably AWS)
    * Experience with tools such as Snowflake, dbt, Databricks, Airflow, or Spark is beneficial

    What's in it for you?

    You will be part of a small but highly skilled working environment with 30 colleagues across engineering, data, product, and management

    Opportunities to work with new and modern technologies - we encourage exploring new tools and are always open to suggestions for improvement

    We value personal & professional development, with opportunities to learn new tools, methods, and soft skills - whether you see yourself as a future tech/team lead, architect, or even taking a step into an entirely new field!

    The value you create will directly impact farmers and the agricultural industry across Norway in a positive way

    Contributing to developing effective and sustainable agriculture for the future

    We offer enhanced pension & insurances, along with many other practical benefits

    We are a down-to-earth and committed team who share common passions - we create an inclusive & fun working environment and hold regular social activities for those who want to attend

    For any questions contact Lucy on l.whiting@tenthrevolution.com

    New

    Copilot Studio Consultant - Home-based - £60k

    England, Berkshire, Reading

    • £55,000 to £60,000 GBP
    • Consultant Role
    • Skills: Azure, Microsoft Co-pilot, copilot, studio, consultant, ai, azure, data science, data, aws
    • Seniority: Mid-level

    Job description

    Copilot Studio Consultant - Home-based - £60k

    Please note - to be eligible for this role you must be UK based with the unrestricted right to work in the UK. This organisation is not able to offer sponsorship now or at any time in the future.

    This is an exciting opportunity for a talented and technically skilled Copilot Studio Consultant to join an expanding Data & AI practice. This role is working for a well-established UK-based digital consultancy that is at the forefront of AI-driven transformation.

    As a Copilot Studio Consultant, you will play a key role in designing and delivering intelligent solutions using Microsoft Copilot Studio. You will work remotely, collaborating with clients and internal teams to build innovative, AI-powered applications that drive real business value.

    Key Responsibilities:

    * Develop and deploy solutions using Microsoft Copilot Studio, including Power Virtual Agents and generative AI plugins
    * Translate business requirements into scalable, AI-driven applications
    * Collaborate with cross-functional teams to integrate Copilot Studio with Microsoft Power Platform and Azure services
    * Provide technical leadership and guidance on best practices
    * Stay current with Microsoft's AI roadmap and emerging technologies

    Required Skills and Experience:

    * Hands-on experience with Microsoft Copilot Studio
    * Strong knowledge of Microsoft Power Platform and Azure AI services
    * Ability to communicate technical concepts clearly to both technical and non-technical audiences
    * Self-motivated, innovative, and eager to learn
    * Must be based in the UK and eligible to work in the UK

    What's on offer:

    * Competitive salary of £60,000 plus 10% performance-based bonus
    * Fully remote working with flexible hours
    * Career development opportunities within a growing AI practice
    * Access to the latest tools, technologies, and training
    * A collaborative and forward-thinking team environment

    How to Apply

    To apply for this role please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.

    Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

    New

    Senior Applications Developer

    England, Essex, Chelmsford

    • £350 to £400 GBP
    • Developer Role
    • Skills: ICT, Microsoft Power Automate Desktop and DevOps, Microsoft 365, Dataverse, and Azure Services, JSON, Power Fx, JavaScript, HTML and SQL.
    • Seniority: Senior

    Job description

    This role will lead, monitor, optimise and continuously improve the delivery of software components so that a Council can provide simple user experiences and achieve better outcomes for people and businesses. The post-holder will be responsible for leading the team in writing clean, accessible code that is open by default and easy for others to reuse, and for developing software that meets user needs and creates meaningful interactions and relationships with users.

    Knowledge, Skills & Experience



    * Educated to degree level, or equivalent through experience, in a relevant subject.
    * Hold relevant Microsoft certifications such as PL-100: Power Platform App Maker, PL-200: Power Platform Functional Consultant, PL-400: Power Platform Developer, PL-600: Power Platform Solution Architect.
    * Proven experience with Microsoft Power Automate Desktop and DevOps.
    * Solid understanding of Microsoft 365, Dataverse, and Azure Services
    * Understanding of application lifecycle management (ALM) in Power Platform
    * Knowledge of JSON, Power Fx, JavaScript, HTML and SQL.
    * Qualification or high level of demonstrable capability with relevant vendor business applications


    Desirable:



    * Able to demonstrate a clear understanding of, and capability to work within, relevant ICT related standards including HMG Security Policy Framework, ITIL V3, ISO/IEC 38500, ISO/IEC 27001, ISO/IEC 22301, ISO/IEC 20000, PRINCE2 and MSP
    * Good written and verbal communication skills with ability to present information in simple and accessible language to a wide range of audiences
    * Experience of balancing the needs of users with organisation priorities to make the right decisions and empowering teams to act upon them
    * Evidence of continual professional development, keeping pace with technical and business change and meeting defined SFIA V7 competencies


    Organisational Behaviours/Professional Competence



    * Implementing changes and continually evaluating the service to improve the area of work, while ensuring the highest possible levels of service quality are continually delivered
    * Working collaboratively within and across functions, and thinking commercially, to support the delivery of the best possible outcomes for our customers on a financially sustainable basis.
    * Deliver exemplar customer interactions to individuals and communities which support strong relationships and a reputation for achieving outcomes and resolving issues.
    * Roles at this level will be focused on delivering results in a specific functional area. They will hold expertise on the application of policy and improvement of service delivery. These roles have clear team budgets and targets set within the overall service requirements
    * Effective utilisation of digital technologies and innovation across the function.
    * Equality and diversity are celebrated and considered as part of all decisions taken.
    * Managing complex issues and resources to meet the needs of customers and deliver the best possible outcomes
    * Operational planning and performance review to maintain exceptional service delivery and ensure the political objectives and priorities of the council are met
    * Using professional expertise to translate goals and plans into ways of working that comply with relevant legislation and statutory requirements and manages a level of appropriate risk.
    * Above all, you will have the ability to develop skills and knowledge within your role.

    Hybrid working: some days from the Town Hall in Chelmsford, otherwise remote.
    4-month contract, likely to extend. Inside IR35.

    GCP Data Engineer - London - £75k +bonus

    England, London, City of London

    • £70,000 to £75,000 GBP
    • Engineer Role
    • Skills: Azure, Google Cloud Developer Tools, gcp, data engineer, data engineering, snowflake, aws, data
    • Seniority: Mid-level

    Job description

    GCP Data Engineer - London - £75k +bonus

    Please note - this role will require you to attend the London based office 2-3 days per week. To be considered for this role you must have the unrestricted right to work in the UK - this organisation cannot offer sponsorship.

    Are you a skilled Data Engineer with a passion for cloud technologies and a strong foundation in GCP, Azure, SQL, and Python? A leading Lloyd's of London reinsurance broker is seeking a talented individual to join their growing data team and help shape the future of their cloud data platform.

    Key Responsibilities:

    * Design, build, and maintain scalable data pipelines and ETL processes in Google Cloud Platform (GCP).
    * Lead and contribute to a major cloud migration project from Azure to GCP, ensuring seamless data integration and minimal disruption.
    * Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
    * Integrate data from various sources, ensuring data quality, consistency, and security.
    * Write clean, efficient, and well-documented SQL and Python code for data transformation and automation.
    * Assist in the adoption and effective use of emerging AI technologies.
    * Assist in training and mentoring more junior members of staff.

    Required Skills & Experience:

    * Proven experience as a Data Engineer in a commercial environment.
    * Strong hands-on experience with Google Cloud Platform (GCP) services (e.g., BigQuery, Dataflow, Pub/Sub).
    * Solid understanding of Azure data services and hybrid cloud environments.
    * Advanced SQL skills and proficiency in Python for data engineering tasks.
    * Experience working in or with insurance, reinsurance, or financial services is a strong advantage.

    Desirable:

    * Familiarity with data governance, security, and compliance in regulated industries.
    * Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform).
    * Knowledge of data modelling and warehousing best practices.

    Interviews for this role are beginning this week so please apply today if you'd like to be considered! To apply, please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.

    Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

    AWS Databricks Data Architect

    England, London, City of London

    • £100,000 to £120,000 GBP
    • Architect Role
    • Skills: S3, AThena, Databricks, Workflows, Asset Bundles, python, pyspark, AWS, EKS, SQS, Lambda, Terraform, Jira, Confluence
    • Seniority: Senior

    Job description

    My client, based in the London area, is currently looking to recruit an experienced AWS Databricks Data Architect to join their team. They are one of the leaders in the consulting space and are currently going through a period of growth. They are backed by a large multinational equity firm looking to bolster my client's financial position, and are expected to see year-on-year growth, which will allow them to implement and utilise the most in-demand and cutting-edge technology on the market right now. They have just implemented Gen AI within their organisation and are looking to utilise the newest technologies on the market.

    Your role will include:

    * Designing and implementing effective architectural solutions around AWS serverless technologies (S3, Lambda, Athena, Kafka) and Databricks, including Data Lake and Data Warehousing.
    * Assessing database implementation procedures to ensure they comply with GDPR and data compliance requirements.
    * Guiding, influencing and challenging the technology teams and stakeholders to understand the benefits, pros and cons of solution options.
    * Agreeing and setting the technical direction for the data platform landscape and solutions, for the short and long term, in collaboration with delivery and engineering teams.

    This is a hands-on role which requires extensive exposure to cloud technologies (AWS and Azure).

    The right candidate will have:

    * Extensive experience writing code, building data pipelines, and delivering digital transformation and ingestion within a given tech suite.
    * Extensive experience implementing solutions in the AWS cloud environment (S3, Databricks, Athena, Glue).
    * In-depth understanding of Workflows, Asset Bundles, SQS, EKS and Terraform.
    * Excellent understanding of Data Modelling and Kinesis.
    * An understanding of SQL and database management.
    * Strong hands-on experience with Data Warehouse and Data Lake technologies, preferably on AWS.

    My client is offering:

    * Hybrid working: 2 days per week in London
    * 28 days' holiday, plus bank holidays
    * Private medical cover
    * Pension scheme
    * And more...

    This role is an urgent requirement and there are limited interview slots left. If interested, send an up-to-date CV to Shoaib Khan - S.Khan@TenthRevolution.com or call 0191 338 7493 for a catch-up in complete confidence.

    Frank Group's Data Teams offer more opportunities across the UK than any other recruiter. We're the proud sponsor and supporter of SQLBits, AWS RE:Invent, Power Platform World Tour, the London Power BI User Group, Newcastle Power BI User Group, and Newcastle Data Platform and Cloud User Group.

    AWS Platform jobs

    Browse all the AWS Platform jobs posted by Jefferson Frank. With our extensive experience, we excel at matching talented, highly qualified candidates with the ideal role, and we can do the same for you. Simply upload your CV or apply to one of our AWS Platform jobs. We will liaise between you and your potential employer, whether that means promoting your talents, arranging an interview, or negotiating a better salary.
    Consultez toutes les offres d'emploi Plateforme AWS publiées gracieusement par Jefferson Frank. Grâce à notre expérience conséquente, nous avons l'art de faire correspondre des candidats talentueux et hautement qualifiés avec le poste idéal et nous pouvons en faire de même pour vous. Il vous suffit de télécharger votre curriculum vitae ou de postuler à l'une de nos offres d'emploi Plateforme AWS. Nous assurerons la liaison entre vous et votre employeur potentiel, que ce soit pour promouvoir vos talents, organiser un entretien ou négocier une meilleure rémunération.