Current search

16 search results

Permanent (CDI)

AI Engineer

England, London

  • £80,000 to £105,000 GBP
  • Engineer Role
  • Skills: Python, AI, AWS
  • Seniority: Senior

Job description

Unlock your potential as an AV

AI Engineer

Location: London
Salary: £105,000

Are you an experienced engineer with hands‑on expertise in Python, LLMs, and modern cloud technologies? We're looking for a Lead Engineer to join a high-impact Generative AI feature team, building applications used by thousands of colleagues daily. This is a unique opportunity to drive AI innovation at scale within a highly regulated environment.



About the Role

As a Lead Engineer, you will design, develop, and enhance software solutions using modern engineering practices. You'll act as an SME within the Generative AI domain, shaping technical direction, guiding others, and influencing both strategy and implementation.

You will lead a small team, provide mentorship, conduct code reviews, and drive a culture of technical excellence. This role offers a blend of hands-on engineering, leadership, and strategic input - perfect for someone ready to step into a high-ownership position.



Key Responsibilities:

* Build high‑quality, scalable, maintainable Python-based applications.
* Develop and deploy AI-driven applications, including those using LLMs.
* Collaborate closely with product, design, and engineering teams.
* Contribute to solution design, architecture, and secure coding practices.
* Drive testing best practices and ensure repeatable, reliable deployments.

* Mentor and guide junior engineers; support ongoing capability development.
* Lead technical discussions, code reviews, and cross-functional collaboration.
* Influence decision‑making and contribute to policy/standards.
* Support risk management, governance, and control requirements.

* Contribute to the organisation's Generative AI strategy as a subject matter expert.
* Analyse complex, multi-source data to inform design and decision‑making.
* Communicate complex or sensitive information clearly to senior stakeholders.



Required Experience

To succeed in this role, you should have strong hands-on capability in:

* Python
* Working with Large Language Models (LLMs)
* Cloud technologies, ideally AWS (Bedrock, Lambda, S3, Lex, CloudWatch)
* Prompt optimisation and evaluation methodologies
* Strong communication skills, especially in cross-functional environments
* Mentoring, coaching, or guiding other engineers
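
The "prompt optimisation and evaluation methodologies" point above can be illustrated with a minimal sketch: score several candidate prompt templates against a set of expected outputs and keep the best one. The stubbed `model` function, the templates, and the test cases are invented for illustration; in a real setup `model` would wrap an LLM API call.

```python
# Minimal prompt-evaluation sketch: rank candidate prompt templates by how
# often a (stubbed) model answer exactly matches the expected output.

def model(prompt: str) -> str:
    # Stand-in for an LLM call so the example is self-contained: echo the
    # last word of the prompt, upper-cased when the prompt is "chatty".
    word = prompt.split()[-1]
    return word.upper() if prompt.startswith("Answer") else word

def evaluate(template: str, cases: list[tuple[str, str]]) -> float:
    """Fraction of cases where the model output matches the expectation."""
    hits = sum(model(template.format(q=q)) == expected for q, expected in cases)
    return hits / len(cases)

def best_prompt(templates: list[str], cases: list[tuple[str, str]]) -> str:
    # Pick the template with the highest exact-match rate.
    return max(templates, key=lambda t: evaluate(t, cases))

cases = [("capital of France? Paris", "Paris"), ("2+2? 4", "4")]
print(best_prompt(["Answer briefly: {q}", "{q}"], cases))  # → {q}
```

Real evaluation harnesses would use richer metrics (semantic similarity, rubric grading) rather than exact match, but the loop of template → score → select is the same.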

Node.js Developer - Oslo

Norway, Fornebu

  • NOK 700,000 to NOK 900,000
  • Engineer Role
  • Skills: AWS, DevOps, Elasticsearch, Node.js, SQL, TypeScript, RabbitMQ
  • Seniority: Senior

Job description

Senior Back End Engineer

We are looking for an experienced backend or full-stack developer who wants to take technical ownership. You will have primary responsibility for a core system that sits at the centre of the organisation's data-driven processes. The goal is to strengthen stability, performance and further development, while you become a key person in a small, long-term in-house team that is building internal expertise and reducing dependence on consultants.

The system you will be responsible for is an advanced matching platform that connects large volumes of user profiles with relevant objects in real time. The platform is used internally every day and handles a very high volume of preferences, searches and data combinations. Technically, you will work with complex search structures, scoring logic and real-time matching of new data against thousands of profiles.
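
The matching described above (scoring a newly arrived object against thousands of stored preference profiles) can be sketched roughly as follows. The profile fields and the overlap-based score are illustrative assumptions only; the production system would use the search and scoring infrastructure the ad mentions (OpenSearch/Elasticsearch).

```python
# Rough sketch of preference matching: score a new object against stored
# user profiles and return the best matches, highest score first.

def score(profile: dict, obj: dict) -> float:
    """Fraction of the profile's preferences satisfied by the object."""
    prefs = profile["preferences"]  # e.g. {"color": "red", "size": "M"}
    matched = sum(1 for k, v in prefs.items() if obj.get(k) == v)
    return matched / len(prefs) if prefs else 0.0

def match_new_object(obj: dict, profiles: list[dict], threshold: float = 0.5):
    # Keep only matches at or above the threshold, best first.
    hits = [(p["id"], score(p, obj)) for p in profiles]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

profiles = [
    {"id": 1, "preferences": {"color": "red", "size": "M"}},
    {"id": 2, "preferences": {"color": "blue", "size": "M"}},
]
print(match_new_object({"color": "red", "size": "M"}, profiles))
# → [(1, 1.0), (2, 0.5)]
```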



Role and responsibilities

* Full technical ownership of the platform (operations, new development, stability, performance, security)
* Maintenance and modernisation of the existing architecture
* Responsibility for backend and infrastructure
* Close collaboration with the team on architecture, roadmaps and priorities

Qualifications

* 5+ years of solid backend development experience
* Very strong Node.js and TypeScript skills
* Experience with event-driven architecture (RabbitMQ or similar)
* Good experience with PostgreSQL (design, migration, optimisation)
* Experience with OpenSearch/Elasticsearch
* Confident working with complex systems in production
* Good Norwegian language skills
* Interest in AI/ML
* DevOps (CI/CD, monitoring, automation)



If you want to work in a smaller team doing data-driven development that makes a real difference for users and, in turn, customers, get in touch today for more information.

Lead Data Analyst

England, Tyne and Wear, Newcastle upon Tyne

  • £70,000 to £75,000 GBP
  • Engineer Role
  • Skills: SQL, AWS, Databricks
  • Seniority: Mid-level

Job description

Lead Data Analyst / Data Product Lead - Managing Consultant

The Opportunity
You'll lead the delivery of analytical outcomes that enable organisations to realise their strategic vision. Acting as the bridge between business goals, data requirements, and technical implementation, you'll guide multidisciplinary teams and help clients modernise their data platforms, analytical capabilities, and decision‑making processes.
This role is ideal for someone who thrives in complex environments, enjoys solving ambiguous problems, and is passionate about modern cloud, big data, and analytics technologies.

What You'll Do
* Own and lead analytical delivery within broader data platform or transformation programmes.
* Guide teams of analysts, data engineers and analytics engineers to deliver end‑to‑end outcomes, from data workflows to analytical services and reporting assets.
* Define and uphold standards for requirements, documentation, code quality, version control, and release management.
* Partner with stakeholders across business and technology to prioritise work, manage expectations, and drive adoption.
* Run workshops to clarify requirements, map processes, and align teams on analytical definitions and success criteria.
* Shape and maintain analytical services, ensuring clear "definition of done" for outputs and user stories.
* Promote best practices in cloud, big data, analytics engineering, and AI‑accelerated frameworks.
* Contribute to proposals, shaping analytics workstreams, estimating effort, and defining delivery approaches.
* Support the creation of reusable assets such as analytics frameworks, reconciliation packs, and migration playbooks.
* Act as a role model for consulting behaviours: curiosity, clarity, pragmatism, integrity, and client empathy.

About You
You bring a blend of analytical depth, technical understanding, and strong consulting skills. You can see the bigger picture, navigate ambiguity, and lead teams to deliver high‑quality analytical products.
Experience & capabilities include:
* Significant experience leading analytical product delivery in complex, multi‑team environments.
* Proven track record delivering analytical and technical outcomes on modern cloud platforms (e.g., AWS, Azure, Snowflake, Databricks).
* Strong experience with data migration validation, reconciliation, data controls, and go‑live readiness.
* Ability to mentor analysts and collaborate effectively with engineers and architects.
* Strong stakeholder engagement skills across business and technical teams.
* Advanced SQL and Python skills.
* Solid understanding of data modelling (dimensional; Data Vault familiarity a plus).
* Strong BI and analytics experience (dashboarding, semantic modelling, storytelling).
* Familiarity with modern data warehousing, distributed processing, streaming, and DataOps.
* Comfortable leading iterative delivery using agile principles.
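
The dimensional-modelling point above can be made concrete with a minimal star-schema query: one fact table joined to a dimension and aggregated per dimension attribute. The table and column names are invented for illustration, and SQLite stands in for the warehouse engine so the sketch is self-contained.

```python
# Minimal star-schema sketch: fact_sales joins to dim_product, then the
# query aggregates measure values per dimension attribute.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # → [('books', 15.0), ('games', 7.5)]
```

The same pattern scales to many dimensions (date, customer, channel) around a central fact table, which is the Kimball model the role description refers to.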

Qualifications & Tools
Experience with some of the following is beneficial:
* SQL/Python, Power BI, Tableau, Qlik, Dataiku, Alteryx
* AWS, Azure, GCP, Snowflake, Databricks certifications
* SAFe, Scrum Master or similar agile qualifications
* Modern data warehousing tools (Fabric, Lake Formation, Snowflake, Databricks)
* dbt or equivalent transformation tooling
* Airflow / ADF / Dagster
* Data governance, cataloguing, lineage tools
* Agile toolsets such as JIRA, Confluence, DevOps

Working Environment
* Permanent role with flexible working options.
* Hybrid model: typically 3 days per week in office (Newcastle).
* Some UK and international travel may be required.
* Eligibility for security clearance is essential.

What's in It for You
* Competitive salary with bonus potential.
* Highly collaborative culture with strong values and a people‑first mindset.
* Flexible benefits focused on wellbeing and lifestyle.
* 25 days' holiday, with the option to flex to 30.
* Two CSR volunteering days.
* Award‑winning learning and development, including dedicated training time.
* Personal tech budget for devices and accessories.
* Rapid progression opportunities in a high‑growth environment.

Please send me a copy of your CV if you're interested.

Databricks Architect

England, London

  • £100,000 to £120,000 GBP
  • Engineer Role
  • Skills: Azure Databricks
  • Seniority: Senior

Job description

Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making.
Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse Architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and understanding of Databricks implications across each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested.

Databricks Architect

Scotland, Edinburgh

  • £110,000 to £120,000 GBP
  • Engineer Role
  • Skills: Azure Databricks
  • Seniority: Senior

Job description

Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making.
Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse Architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and understanding of Databricks implications across each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested.

AI Engineer - Agentic AI & Gen AI

Australia, New South Wales, Sydney

  • A$170,000 to A$200,000 AUD
  • Engineer Role
  • Skills: AI Engineer, Generative AI, Agentic AI, AI Agents, Azure, RAG, Retrieval Augmented Generation, LangGraph, AutoGen, CrewAI, Python Developer, Machine Learning Engineer, LLM, Engineer, Artificial Intelligence, Foundry
  • Seniority: Senior

Job description

Sydney / Melbourne | Permanent OR Contract

Tenth Revolution Group is partnering with a specialist AI and data consultancy delivering advanced AI solutions into highly regulated enterprise environments.

Due to continued growth, they are looking to appoint multiple AI Engineers to play a key role in building and deploying next-generation AI solutions, with a particular focus on agentic AI systems and generative AI platforms.

This is a hands-on engineering role within a growing AI team responsible for turning emerging AI capabilities into production-grade systems used in complex, real-world environments.

Open to both permanent and contract arrangements, based in Melbourne or Sydney.



The Opportunity

You will design, prototype and deploy advanced AI systems that automate complex workflows and support intelligent decision-making.

Working closely with engineers, data specialists and business stakeholders, you will help translate ideas into scalable AI products while contributing to internal frameworks and reusable AI components.

This role is ideal for someone who enjoys building practical AI systems rather than purely researching them.



Key Responsibilities

* Design and deliver end-to-end AI solutions solving complex business challenges
* Build and deploy AI agents and automated workflows using modern frameworks
* Prototype and scale production-grade Generative AI systems
* Develop reusable AI components, orchestration frameworks and internal tooling
* Collaborate closely with engineers, data scientists and product stakeholders
* Monitor and improve deployed AI systems for performance and reliability
* Apply responsible AI practices including explainability, governance and traceability
* Ensure solutions meet security and compliance standards expected in enterprise environments



Skills & Experience

* Strong software engineering background with several years building production systems
* Hands-on experience developing AI agents, automation workflows or multi-agent systems
* Experience with agent frameworks such as LangGraph, AutoGen, CrewAI or similar
* Experience designing and deploying Retrieval-Augmented Generation (RAG) solutions
* Ability to rapidly prototype AI-powered applications and iterate towards production systems
* Experience deploying AI solutions across major cloud platforms
* Exposure to knowledge graphs, orchestration patterns or advanced retrieval architectures is advantageous
* Experience supporting or monitoring production AI systems
* Ability to collaborate with both technical and non-technical stakeholders

Experience delivering solutions within regulated or enterprise environments is highly regarded.
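
The Retrieval-Augmented Generation requirement above can be sketched in a few lines: retrieve the most relevant document for a query, then assemble an augmented prompt for the model. The toy corpus and the bag-of-words retriever are illustrative assumptions; a real system would use embeddings, a vector store, and an LLM call for the generation step.

```python
# Minimal RAG sketch: word-overlap retrieval plus prompt assembly.
from collections import Counter

def overlap(a: str, b: str) -> int:
    """Count word occurrences shared between two texts."""
    return sum((Counter(a.lower().split()) & Counter(b.lower().split())).values())

def retrieve(query: str, corpus: list[str]) -> str:
    # Pick the document sharing the most words with the query.
    return max(corpus, key=lambda doc: overlap(query, doc))

def build_prompt(query: str, corpus: list[str]) -> str:
    # Ground the question in the retrieved context before generation.
    context = retrieve(query, corpus)
    return f"Context: {context}\nQuestion: {query}\nAnswer:"

corpus = [
    "Databricks runs Spark workloads in the cloud.",
    "RabbitMQ is a message broker for event-driven systems.",
]
print(build_prompt("What is RabbitMQ used for?", corpus))
```

Swapping the overlap score for cosine similarity over embeddings, and the corpus for a vector index, gives the production shape of the same pipeline.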



Why Apply?

* Join a growing AI engineering team building advanced AI capabilities
* Work on agent-based AI systems and production GenAI solutions
* Strong technical ownership with the opportunity to shape AI frameworks and tooling
* Collaborative environment with a strong focus on learning and development



Apply now or contact Tenth Revolution Group for a confidential discussion.

Contact:

Neros Gorges
(03) 8592 0507
n.gorges@tenthrevolution.com

#SCR-neros-gorges

Lead Azure Databricks Engineer

England, London

  • £70,000 to £80,000 GBP
  • Engineer Role
  • Skills: Azure Databricks, insurance, reinsurance, lloyds of london
  • Seniority: Senior

Job description

Azure Databricks Engineer
Lloyd's of London Market

Hybrid 2 days in London Office

We are seeking an outstanding Lead Azure Databricks Engineer to own the design, build, optimisation and governance of our enterprise cloud data platforms. This is a senior, fully hands‑on engineering position requiring deep technical expertise, strong delivery capability and experience operating within the Lloyd's of London market and its regulatory demands.

You will shape the future of our cloud data strategy, enabling advanced analytics and building secure, scalable and highly performant data solutions across the organisation.

Key Responsibilities

* Lead the development, optimisation and governance of large‑scale data platforms using Azure Data Factory, Data Lake, Key Vault, Azure Functions, Databricks, Delta Lake, PySpark and Unity Catalog
* Partner closely with underwriting, actuarial, delegated authority, bordereaux, exposure management, reinsurance, finance, Solvency II and risk teams to deliver solutions aligned to Lloyd's of London regulatory and operational requirements
* Translate complex business needs into scalable cloud data architectures in partnership with architects, SMEs and product owners
* Resolve complex technical challenges rapidly and decisively, consistently delivering high‑quality outcomes
* Champion engineering best practices including design patterns, data lifecycle management, CI/CD automation and cloud‑native methodologies
* Provide hands‑on technical leadership, mentorship and uplift engineering capability across the team
* Drive continuous improvement across performance, cost optimisation, reliability and resilience
* Oversee end‑to‑end data engineering delivery ensuring quality, observability, robustness and compliance

Required Experience and Skills

* Expert‑level knowledge of Azure Data Services and Databricks within enterprise environments
* Deep understanding of Delta Lake, Medallion architecture, distributed compute, Lakehouse patterns, data modelling and performance optimisation
* Strong Python, PySpark and SQL experience, with hands‑on implementation of CI/CD for data solutions using Azure DevOps
* Strong understanding of data governance, lineage, access management, FinOps and secure cloud engineering
* Excellent communication skills with the ability to confidently engage senior stakeholders and simplify complex technical concepts
* Proven ability to take ownership, deliver at pace and operate effectively in high‑pressure, high‑expectation environments
* Experience working within or delivering solutions for the Lloyd's of London market or similarly regulated environments
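
The Medallion (bronze/silver/gold) pattern listed above can be sketched without a Spark cluster: raw records land in bronze, are cleaned and deduplicated into silver, and aggregated into gold. In a real Databricks platform each layer would be a Delta table and the transforms PySpark jobs; the record fields here are invented for illustration.

```python
# Medallion-architecture sketch in plain Python, one function per layer hop.

bronze = [  # raw ingested records, duplicates and bad rows included
    {"policy": "P1", "premium": "100"},
    {"policy": "P1", "premium": "100"},  # duplicate ingest
    {"policy": "P2", "premium": "250"},
    {"policy": None, "premium": "50"},   # unusable row, no policy id
]

def to_silver(rows):
    """Clean: drop rows without a policy id, cast types, deduplicate."""
    seen, out = set(), []
    for r in rows:
        if r["policy"] is None:
            continue
        key = (r["policy"], r["premium"])
        if key not in seen:
            seen.add(key)
            out.append({"policy": r["policy"], "premium": float(r["premium"])})
    return out

def to_gold(rows):
    """Aggregate: total premium per policy, ready for reporting."""
    totals = {}
    for r in rows:
        totals[r["policy"]] = totals.get(r["policy"], 0.0) + r["premium"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # → {'P1': 100.0, 'P2': 250.0}
```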

Please send me a copy of your CV if you're interested.

Data Engineer

England, Oxfordshire, Banbury

  • £60,000 to £70,000 GBP
  • Engineer Role
  • Skills: Databricks, Python, Spark, SQL
  • Seniority: Senior

Job description

Data Engineer - Hybrid (Oxfordshire) - Databricks, Python, Spark, SQL - up to £70k + Benefits



Why this company?

You'll be joining a growing organisation where data is at the heart of every strategic decision. This isn't a place where you're just maintaining pipelines; you'll help shape its entire data landscape.
They're investing heavily in modern data platforms, encouraging innovation, and giving engineers real ownership over architecture, tooling, and best practices. If you want autonomy, a voice in technical direction, and the chance to build things properly, you'll feel at home here.



What you'll be doing:

* Designing and delivering scalable data solutions on Databricks
* Building and maintaining robust data pipelines and ETL workflows
* Working closely with senior data leaders on architecture and strategy
* Translating architectural designs into high‑quality build plans
* Optimising large‑scale data workflows for performance and reliability
* Implementing strong data quality, validation, and governance processes
* Providing technical guidance and mentoring a small team of Data Engineers



Benefits:

* Highly competitive salary
* Strong benefits package
* Hybrid working model with flexibility
* Career progression opportunities within a growing data function
* Support for training, certifications, and professional development



Key Experience:

* Strong commercial experience with Databricks
* Solid knowledge of Python, Spark, and SQL
* Experience working with major cloud platforms
* Familiarity with modern pipeline tools and best practices
* Strong problem‑solving abilities and proven leadership/mentoring skills



Ready for a role where you can make real impact?

If you're passionate about data engineering and want your work to genuinely shape a modern data platform, apply today or send your CV directly and I'll be in touch right away!

New

Data Engineer (m/f/d) - Microsoft Fabric

Germany, Hamburg

  • €70,000 to €85,000 EUR
  • Engineer Role
  • Skills: Data Engineering, Microsoft Fabric, MS Power BI, Python, SQL, Databricks, SQL, Lakehouse, Data Warehouse, Data Architecture, ETL, CI/CD, Agentic AI, Business Intelligence, Data Science, SSIS, SSRS, SSAS, Delta Lake, Spark, DWH, Power BI, Data Factory
  • Seniority: Mid-level

Job description

For a growing company in northern Germany, I am looking for a Data Engineer (m/f/d) focused on Microsoft Fabric. The role offers plenty of scope to shape the build-out of a modern data platform.

Data Engineer (m/f/d) - Microsoft Fabric

Responsibilities

* Building and evolving a new Microsoft Fabric data platform
* Designing and implementing ETL/ELT processes (Data Factory, Dataflows, Notebooks)
* Developing Lakehouse structures, Delta tables and Medallion architectures
* Migrating an existing data-warehouse landscape to Microsoft Fabric
* Performance optimisation of pipelines, Spark jobs and warehouse queries
* Coordinating with business units on data models and reporting requirements
* Documentation and support at second-level

Requirements

* Several years of experience with Microsoft Fabric (Data Factory, Lakehouse, Spark, Warehouse)
* Very good SQL skills and experience in data modelling
* Experience building or evolving data-warehouse/data platforms
* Knowledge of common ETL/ELT design patterns and best practices
* A structured, solution-oriented way of working
* Working proficiency in German and English in a project environment

Nice to have

* Experience with SSIS or classic MSSQL data-warehouse environments
* Power BI / SSRS skills

What's on offer

* A modern, growing company environment with room to shape your role
* Flexible working hours and a home-office option
* Individual training and development opportunities
* Attractive overall package including additional benefits
* Location: Germany (major city), hybrid working possible

How to apply:

To apply for this position, send your full CV, including your salary expectations, to f.sarwar@tenthrevolution.com. If you have any further questions, you can also reach us by phone on 0221 6503 3904.

Frank Recruitment Group is one of the global market leaders in placing IT specialists. Since 2006 we have been placing qualified professionals in jobs and projects with companies of every kind and size, from the start-up that needs a developer to international corporations building an entire team at a new location. We are the right partner to talk to.

We are part of the Tenth Revolution Group and affiliated with TPG Growth, whose portfolio includes brands such as Uber, Airbnb, SurveyMonkey and Spotify.

Impressum - https://www.frankgroup.com/de/impressum/