A Tenth Revolution Group company

Your current job search

26 search results

For permanent and freelance, Platform

    Databricks Lead - Implementation - Des Moines

    USA, Iowa, Des Moines

    • $150,000 to $200,000 USD
    • Platform role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent utilization of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions using the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    Databricks Lead - Implementation - St. Louis, MO

    USA, Missouri, St. Louis

    • $150,000 to $200,000 USD
    • Platform role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent utilization of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions using the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    Databricks Lead - Implementation - Minneapolis, MN

    USA, Minnesota, Minneapolis

    • $150,000 to $200,000 USD
    • Platform role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent utilization of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions using the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    Databricks Implementation Lead

    USA, Iowa, Iowa City

    • $150,000 to $200,000 USD
    • Platform role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group is working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent utilization of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions using the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    New

    Senior Applications Developer

    England, Essex, Chelmsford

    • £350 to £400 GBP
    • Developer role
    • Skills: ICT, Microsoft Power Automate Desktop and DevOps, Microsoft 365, Dataverse, and Azure Services, JSON, Power Fx, JavaScript, HTML and SQL
    • Seniority: Senior

    Job description

    This role will lead, monitor, optimise and continuously improve the delivery of software components so that a Council can provide simple user experiences and achieve better outcomes for people and businesses. The post holder will be responsible for leading the team in writing clean, accessible code that is open by default and easy for others to reuse, and for developing software which meets user needs and creates meaningful interactions and relationships with users.

    Knowledge, Skills & Experience



    * Educated to Degree Level or equivalent by experience in a relevant subject.
    * Hold relevant Microsoft certifications such as PL-100: Power Platform App Maker, PL-200: Power Platform Functional Consultant, PL-400: Power Platform Developer, PL-600: Power Platform Solution Architect.
    * Proven experience with Microsoft Power Automate Desktop and DevOps.
    * Solid understanding of Microsoft 365, Dataverse, and Azure Services
    * Understanding of application lifecycle management (ALM) in Power Platform
    * Knowledge of JSON, Power Fx, JavaScript, HTML and SQL.
    * Qualification or high level of demonstrable capability with relevant vendor business applications


    Desirable:



    * Able to demonstrate a clear understanding of, and capability to work within, relevant ICT related standards including HMG Security Policy Framework, ITIL V3, ISO/IEC 38500, ISO/IEC 27001, ISO/IEC 22301, ISO/IEC 20000, PRINCE2 and MSP
    * Good written and verbal communication skills with ability to present information in simple and accessible language to a wide range of audiences
    * Experience of balancing the needs of users with organisation priorities to make the right decisions and empowering teams to act upon them
    * Evidence of continual professional development to keep pace with technical and business change, meeting defined SFIA V7 competencies


    Organisational Behaviours/Professional Competence



    * Implementing changes and continually evaluating the service to improve the area of work, while ensuring the highest possible levels of service quality are continually delivered
    * Working collaboratively within and across functions, and thinking commercially, to support the delivery of the best possible outcomes for our customers on a financially sustainable basis.
    * Deliver exemplar customer interactions to individuals and communities which support strong relationships and a reputation for achieving outcomes and resolving issues.
    * Roles at this level will be focused on delivering results in a specific functional area. They will hold expertise on the application of policy and improvement of service delivery. These roles have clear team budgets and targets set within the overall service requirements
    * Effective utilisation of digital technologies and innovation across the function.
    * Equality and diversity are celebrated and considered as part of all decisions taken.
    * Managing complex issues and resources to meet the needs of customers and deliver the best possible outcomes
    * Operational planning and performance review to maintain exceptional service delivery and ensure the political objectives and priorities of the council are met
    * Using professional expertise to translate goals and plans into ways of working that comply with relevant legislation and statutory requirements and manages a level of appropriate risk.
    * Above all, you will have the ability to develop skills and knowledge within your role.

    Some hybrid working from the Town Hall in Chelmsford, otherwise remote.
    4-month contract, likely to extend. Inside IR35.

    New

    AWS Data Engineer - Oslo

    Norway, Oslo

    • Negotiable
    • Data Science role
    • Skills: data engineer, python, AWS, Snowflake, dbt, SQL, ETL
    • Seniority: Mid-level

    Job description

    We are looking for a Data Engineer to bring data-driven insights into the agricultural industry in Norway!

    You will work directly with our product which is an open platform for gathering & analysing data to increase knowledge sharing amongst the farming and agricultural community - our goal is to create a more sustainable future for all

    Your role will involve building data products from our core platform using technologies such as AWS, Snowflake, DBT, SQL, and Python

    You will work alongside fellow data engineers, as well as data scientists, application developers, product owners and our R&D team who bring deep domain knowledge

    Your tasks can include:

    * Building data transformations and data pipelines
    * Integrating data from different sources
    * Automating processes
    * Ensuring data quality
    * Contributing towards the expansion of our data platform

    We are looking for you to bring:

    * Academic background in an engineering or computer science related subject
    * 3+ years' experience in a data engineering position
    * Experience using data platforms to build data products
    * Competency with SQL & Python
    * Experience working in a cloud environment (preferably AWS)
    * Experience with tools such as Snowflake, DBT, Databricks, Airflow, or Spark is beneficial

    What's in it for you?

    You will be part of a small but highly skilled working environment with 30 colleagues across engineering, data, product, and management

    Opportunities to work with new and modern technologies - we encourage exploring new tools and are always open to suggestions for improvement

    We value personal & professional development, with opportunities to learn new tools, methods, and soft skills - whether you see yourself as a future tech/team lead, architect, or even taking a step into an entirely new field!

    The value you create will directly impact farmers and the agricultural industry across Norway in a positive way

    Contributing to developing effective and sustainable agriculture for the future

    We offer enhanced pension & insurances, along with many other practical benefits

    We are a down-to-earth and committed team who share common passions - we create an inclusive & fun working environment and hold regular social activities for those who want to attend

    For any questions contact Lucy on l.whiting@tenthrevolution.com

    GCP Data Engineer - London - £75k +bonus

    England, London, City of London

    • £70,000 to £75,000 GBP
    • Engineer role
    • Skills: Azure, Google Cloud Developer Tools, gcp, data engineer, data engineering, snowflake, aws, data
    • Seniority: Mid-level

    Job description

    GCP Data Engineer - London - £75k +bonus

    Please note - this role will require you to attend the London based office 2-3 days per week. To be considered for this role you must have the unrestricted right to work in the UK - this organisation cannot offer sponsorship.

    Are you a skilled Data Engineer with a passion for cloud technologies and a strong foundation in GCP, Azure, SQL, and Python? A leading Lloyd's of London reinsurance broker is seeking a talented individual to join their growing data team and help shape the future of their cloud data platform.

    Key Responsibilities:

    * Design, build, and maintain scalable data pipelines and ETL processes in Google Cloud Platform (GCP).
    * Lead and contribute to a major cloud migration project from Azure to GCP, ensuring seamless data integration and minimal disruption.
    * Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
    * Integrate data from various sources, ensuring data quality, consistency, and security.
    * Write clean, efficient, and well-documented SQL and Python code for data transformation and automation.
    * Assist in the adoption of emerging AI technologies.
    * Assist in training and mentoring more junior members of staff.

    Required Skills & Experience:

    * Proven experience as a Data Engineer in a commercial environment.
    * Strong hands-on experience with Google Cloud Platform (GCP) services (e.g., BigQuery, Dataflow, Pub/Sub).
    * Solid understanding of Azure data services and hybrid cloud environments.
    * Advanced SQL skills and proficiency in Python for data engineering tasks.
    * Experience working in or with insurance, reinsurance, or financial services is a strong advantage.

    Desirable:

    * Familiarity with data governance, security, and compliance in regulated industries.
    * Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform).
    * Knowledge of data modelling and warehousing best practices.

    Interviews for this role are beginning this week so please apply today if you'd like to be considered! To apply, please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.

    Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

    AWS Databricks Data Architect

    England, London, City of London

    • £100,000 to £120,000 GBP
    • Architect role
    • Skills: S3, Athena, Databricks, Workflows, Asset Bundles, python, pyspark, AWS, EKS, SQS, Lambda, Terraform, Jira, Confluence
    • Seniority: Senior

    Job description

    My client, based in the London area, is currently looking to recruit an experienced AWS Databricks Data Architect to join their team. They are one of the leaders within the consulting space and are currently going through a period of growth. They are backed by a huge multinational equity firm looking to bolster my client's financial position. They are expected to see year-on-year growth, which will allow them to implement and utilise the most in-demand and cutting-edge technology on the market right now. They have just implemented Gen AI within their organisation and are looking to utilise the newest technologies on the market.

    Your role will include:

    * Responsible for designing and implementing effective architectural solutions around AWS serverless technologies (S3, Lambda, Athena, Kafka) and Databricks, including Data Lake and Data Warehousing.
    * Assess database implementation procedures to ensure they comply with GDPR and data-compliance requirements.
    * Guide, influence and challenge the technology teams and stakeholders to understand the benefits, pros and cons of solution options.
    * Agree and set the technical direction for the data platform landscape and solutions, for the short and long term, in collaboration with delivery and engineering teams.
    * This is a hands-on role which requires extensive exposure to cloud technologies (AWS and Azure).

    The right candidate will have:

    * Extensive experience writing code, building data pipelines, and delivering digital transformation and ingestion within a given tech suite
    * Extensive experience implementing solutions around the AWS cloud environment (S3, Databricks, Athena, Glue)
    * In-depth understanding of Workflows, Asset Bundles, SQS, EKS, and Terraform
    * Excellent understanding of Data Modelling & Kinesis
    * An understanding of SQL/database management
    * Strong hands-on experience with Data Warehouse and Data Lake technologies, preferably on AWS

    My client is providing access to:

    Hybrid 2 days (London),
    28 Days Holiday, Plus Bank Holiday
    Private Medical Health
    Pension Scheme
    And More...

    This role is an urgent requirement and there are limited interview slots left. If interested, send an up-to-date CV to Shoaib Khan - S.Khan@TenthRevolution.com or call 0191 338 7493 for a catch-up in complete confidence.

    Frank Group's Data Teams offer more opportunities across the UK than any other recruiter. We're the proud sponsor and supporter of SQLBits, AWS RE:Invent, Power Platform World Tour, the London Power BI User Group, Newcastle Power BI User Group and Newcastle Data Platform and Cloud User Group.

    Technical SQL Developer

    England, London

    • £90,000 to £100,000 GBP
    • Consultant role
    • Skills: SQL, SSIS, ETL, SSAS, SSRS, Power BI
    • Seniority: Senior

    Job description

    My client, based in the London area, is currently looking to recruit an experienced Technical SQL Developer to join a giant within the insurance industry. They are one of the leaders within the consulting space, currently going through a period of growth, and are looking for an experienced SQL Developer to join their team. They are backed by a huge multinational equity firm looking to bolster my client's financial position. They are expected to see year-on-year growth, which will allow them to implement and utilise the most in-demand and cutting-edge technology on the market right now.

    Role Overview
    Design, develop, and maintain efficient SQL queries, stored procedures, views, and functions to access and process data from diverse systems.

    Build and manage reliable ETL pipelines using SQL and SSIS to support enterprise-wide data integration and transformation.

    Optimise SQL code and ETL workflows to enhance performance and scalability.

    Develop and maintain SSIS packages to ensure accurate and consistent data processing.

    Support operational reporting systems by resolving issues raised by users and stakeholders.

    (Preferred) Assist in the development of .NET/Windows services for data processing tasks.

    Carry out data profiling, source-to-target mapping, and quality checks to meet technical and business requirements.

    Work closely with analytics teams to translate business needs into technical solutions.

    Follow best practices in database design and coding; contribute to code reviews and collaborative development.

    Create clear technical documentation for data processes and workflows.



    Skills & Experience Required
    Degree (Bachelor's or Master's) in a quantitative discipline such as Computer Science, Engineering, Mathematics, or Economics.

    At least 5 years' experience in SQL development.

    Advanced skills in writing complex SQL, stored procedures, and performance tuning.

    Solid hands-on experience with SSIS and ETL development.

    Knowledge of BI/reporting tools (e.g. Power BI, Tableau, SSRS, SSAS) is advantageous.

    Strong analytical thinking and problem-solving abilities.

    Experience in fast-paced, agile environments with a proactive and solution-focused approach.

    Excellent written and verbal communication skills, with the ability to engage both technical and non-technical stakeholders.

    Able to collaborate effectively within global and culturally diverse teams.



    This role is an urgent requirement and there are limited interview slots left. If interested, send an up-to-date CV to Shoaib Khan - S.Khan@Tenthrevolution.com or call 0191 338 7493 for a catch-up in complete confidence.

    Tenth Rev Data Teams offer more opportunities across the UK than any other recruiter. We're the proud sponsor and supporter of SQLBits, AWS RE:Invent, Power Platform World Tour, the London Power BI User Group, Newcastle Power BI User Group and Newcastle Data Platform and Cloud User Group.

    New

    Data Engineer - Leeds - ADF - DWH - Up to £80k

    England, West Yorkshire, Leeds

    • £60,000 to £80,000 GBP
    • Engineer role
    • Skills: Data Engineering, Data Engineer, Snowflake, ETL, ELT, ADF, Data Factory, Synapse Analytics, SSIS, Migration, Pipeline, Python, Spark, DBT, Azure, SQL, Leeds
    • Seniority: Senior

    Job description

    Data Engineer - Leeds - ADF - DWH - Up to £80k

    I'm working with a multinational, award-winning organisation that is looking to scale out their entire IT team. Over the last decade or so, my client has been recognised as a leader within their respective field. As a result of their successes, they've grown exponentially and ventured into new countries, where they've followed their blueprint for success.
    This has meant that the business has looked to modernise their systems to allow for such growth, and one of the key modernisations comes within their Data Platform.

    The vacancy is for an experienced Data Engineer who will be setting foot into a greenfield site where they will be using the latest and greatest technologies. This involves the use of the Azure Data stack as well as a data platform build which includes Snowflake.
    You will be migrating old data processes into the cloud and helping to build the new platform from scratch, which means that you'll be working very closely with an expert team of Data professionals, ultimately developing a first-class data platform.

    This is a salaried position which ranges from between £60k-£80k. You can expect a great benefits package but also, an environment that boasts collaborative working as well as the opportunity to develop yourself. You will be given the opportunity to attend industry events, Training programmes but also, paid for, relevant certifications.

    Requirements

    * Strong background in Data Engineering / Data Warehousing
    * Strong experience with the Azure Data Stack; ADF, Synapse
    * Strong Python / Spark
    * Strong SQL experience
    * Willingness to be in the office, ideally 4-5 days a week.
    * Additional experience of Snowflake is highly advantageous but not essential. Must show willingness to learn

    This is a great opportunity to join an outstanding organisation who pride themselves on being one of the best companies to work for. Interviews are already taking place, so don't miss out and apply now!

    If this is of interest then get in touch ASAP. Send across your CV to t.shahid@tenthrevolution.com or, alternatively, give me a call on 0191 3387551.

    Keywords: Data Engineering, Data Engineer, Snowflake, ETL, ELT, ADF, Data Factory, Synapse Analytics, SSIS, Migration, Pipeline, Python, Spark, DBT, Snowflake, Azure, SQL, Leeds

    AWS Platform Jobs

    Search our AWS job and project database and find your next role or project. We have extensive experience connecting talented, qualified candidates with the right role or IT project. Simply upload your CV or apply for one of our AWS Platform roles or projects, and we will take care of putting you in touch with your potential employer. Whether it's promoting your talents, arranging an interview, or negotiating a higher salary, we'll support you so you can take the next step in your career.