Current search

21 search results

For permanent (CDI) & freelance roles in Sweden

    AI Engineer - Freelance

    Sweden, Stockholm

    • Negotiable
    • Engineer Role
    • Skills: Generative AI
    • Seniority: Mid-level

    Job description

    AI Engineer - Freelance

    Location: Stockholm (some onsite required)
    Engagement: 100% allocation, 6 months initial contract



    Our client is looking for a talented AI Engineer to help build and deliver new AI‑driven features and prototypes. This is a great fit for freelancers who enjoy working with modern tools, moving quickly, and contributing to meaningful product decisions without heavy processes or red tape.

    What you'll be doing

    - Building and fine‑tuning LLMs

    - Developing lightweight agent workflows and RAG pipelines

    - Turning ideas into working prototypes and production‑ready solutions

    - Helping shape AI architecture and best practices

    - Working closely with product and engineering teams to deliver fast, high‑quality outcomes

    Tech they use

    * Python
    * PyTorch, Hugging Face
    * FastAPI
    * Chroma or Pinecone
    * Docker
    * AWS or GCP
    * LangChain / LlamaIndex
    * (Nice to have) Weights & Biases
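    To illustrate the retrieval step behind the RAG pipelines mentioned above, here is a toy sketch in plain Python. A real pipeline would use an embedding model and a vector store such as Chroma or Pinecone from the stack listed; the bag-of-words scoring and sample documents here are purely illustrative.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real pipeline would call an embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by similarity to the query: the 'R' in RAG."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Delta Lake stores tables as Parquet files plus a transaction log.",
    "FastAPI serves Python web APIs with automatic validation.",
    "Retrieval-augmented generation grounds LLM answers in fetched context.",
]

# Retrieved context is then pasted into the LLM prompt (the 'AG' part).
context = retrieve("How does RAG ground LLM answers?", docs, k=1)
prompt = f"Answer using this context:\n{context[0]}\n\nQuestion: ..."
```

    The same shape scales up directly: swap `embed` for a model call and `retrieve` for a vector-store query, and the prompt assembly stays the same.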



    What they're looking for

    Strong practical experience with AI/ML

    Comfortable working independently and iterating quickly

    Ability to select the right approach for each project

    Good communication and a collaborative mindset

    Why freelancers enjoy working with this client

    Greenfield AI work with real impact

    Flexible working hours

    Modern, interesting projects

    Freedom to experiment with new tools



    How to apply


    Send your CV and contact details to t.roach@tenthrevolution.com along with a couple of lines about a production AI system you built that you're proud of - and what you'd want to build next. No cover letter required, just a small insight into how you operate.

    Head Of Data

    Sweden, Stockholm

    • Negotiable
    • Other Role
    • Skills: Data Architecture & Platform Design, Cloud & Tooling Expertise, Data Migration, Technical Leadership, Stakeholder Communication
    • Seniority: Senior

    Job description

    About the Role

    We're at a point where our data setup needs to evolve.

    Over time, different systems and solutions have been built to support the business. While they've served us well, they're no longer enough for where we're going next. We want to create a modern, scalable data platform that supports better decision-making, advanced analytics, and future AI initiatives.

    As Head of Data, you'll take ownership of that journey.

    This is a role for someone who enjoys building, not just managing. You'll define the direction, make key technical decisions, and work closely with a strong team to bring that vision to life.

    What You'll Be Responsible For

    Defining the Data Strategy
    You'll set the long-term vision for how data is structured, stored, and used across the organisation. This includes selecting the right architecture and ensuring it supports both current and future needs.

    Designing the Platform
    You'll evaluate and implement a modern cloud-based data platform (warehouse/lakehouse), balancing scalability, performance, and cost.

    Leading the Migration Journey
    You'll plan and execute the transition from existing systems to a unified environment, ensuring stability, data integrity, and minimal disruption along the way.

    Leading the Team
    You'll work with a team of experienced, Sweden-based data engineers. Your role is to guide, support, and challenge them, creating an environment where people take ownership and do their best work.

    Working Across the Business
    You'll collaborate with stakeholders from different parts of the organisation, helping them understand what's possible with data and how it can create real value.

    Driving Best Practices
    You'll establish standards for data quality, governance, security, and documentation, ensuring the platform is reliable and trusted.

    Technology & Environment

    You'll have a strong voice in shaping the tech stack. We expect you to bring experience and perspective rather than follow a fixed setup.

    You've likely worked with:

    * Cloud data platforms such as Snowflake, Databricks, or BigQuery
    * Python for data engineering, automation, or ML-related workflows
    * BI tools like Power BI, Tableau, or Looker
    * Modern cloud environments and distributed data systems

    Who We're Looking For

    We're looking for someone who combines technical depth with leadership and pragmatism.

    * You've built or significantly evolved a data platform before
    * You have experience with large-scale data migrations
    * You're comfortable making architectural decisions and explaining them clearly
    * You've led engineering teams, ideally in a Swedish or similar work culture
    * You value collaboration, ownership, and low hierarchy
    * You're still hands-on enough to challenge technical decisions when needed

    Data Platform Engineer

    Sweden, Stockholm

    • Negotiable
    • Engineer Role
    • Skills: Azure Databricks, DBT, Python, SQL, Hands-on Platform experience
    • Seniority: Mid-level

    Job description

    The Role

    We are hiring a Senior Data Platform Engineer to help build and scale a cloud-based data platform in a highly regulated environment.

    The organisation is partway through its migration to Azure, with production workloads already in place. The focus now is not strategy or architecture; it's execution: building reliable, production-grade data platforms that will underpin future data and AI capabilities.

    This role is for engineers who have actually built platforms within Azure Databricks, not just contributed to them or designed them at a high level.

    You will be working in a risk-averse environment, where data security, governance, and cost control are critical. The expectation is that you can deliver within those constraints, not work around them.

    Responsibilities

    * Building and operating Azure Databricks-based data platforms used in production
    * Developing end-to-end data pipelines using Python and SQL
    * Configuring and optimising Databricks environments, including:
      * Cluster setup, tuning, and troubleshooting
      * Job orchestration and scheduling
      * Managing performance vs cost trade-offs

    * Implementing data transformation and modelling workflows (dbt or similar)
    * Owning platform components from initial build through to production support
    * Ensuring data platforms meet strict security, governance, and risk requirements
    * Contributing to ongoing cloud migration and platform modernisation work
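    As a rough illustration of the staged transformation-and-modelling workflows listed above ("dbt or similar"), here is a minimal raw-to-mart example using SQLite views. The table names and data are invented; in the actual role this pattern would run as dbt models on Azure Databricks, not as SQLite views.

```python
import sqlite3

# Hypothetical raw ingest table; each downstream step is a SELECT, dbt-style.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_events (user_id TEXT, amount TEXT, ts TEXT);
INSERT INTO raw_events VALUES
  ('u1', '10.50', '2024-01-01'),
  ('u1', 'oops',  '2024-01-02'),   -- bad record to filter out
  ('u2', '3.25',  '2024-01-02');

-- "Staging" model: cast types and drop unparseable rows.
CREATE VIEW stg_events AS
SELECT user_id, CAST(amount AS REAL) AS amount, ts
FROM raw_events
WHERE amount GLOB '[0-9]*';

-- "Mart" model: one aggregated, business-facing row per user.
CREATE VIEW mart_user_totals AS
SELECT user_id, SUM(amount) AS total
FROM stg_events
GROUP BY user_id;
""")

rows = conn.execute(
    "SELECT user_id, total FROM mart_user_totals ORDER BY user_id"
).fetchall()
```

    The point of the layering is that each model is a small, testable SELECT over the one before it, which is what makes the pipeline auditable in a regulated environment.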

    Required Experience

    * Strong, hands-on experience building Azure Databricks data platforms from scratch
    * Proven track record delivering and supporting production-grade data pipelines
    * Solid experience with:
      * Azure
      * Azure Databricks
      * Python
      * SQL

    * Experience working with data at scale (not just small or isolated datasets)
    * Background in regulated or enterprise environments with governance constraints
    * Comfortable working in cost-sensitive environments where efficiency matters

    Nice to Have

    * Hands-on experience with dbt in production
    * Experience in financial services or similar regulated industries
    * Exposure to cloud migration programmes
    * Experience optimising Databricks for performance and cost

    Why This Role

    * Opportunity to build real, production data platforms, not just prototypes
    * Direct impact on a platform that will enable future AI and data capabilities
    * Work in an environment where engineering quality, security, and reliability actually matter
    * Join a team that values delivery over theory

    New

    Data Engineer

    Sweden, Gothenburg

    • Negotiable
    • Technician Role
    • Skills: MS Power BI, SQL
    • Seniority: Senior

    Job description

    Your Role

    As a Senior Databricks Lakehouse Architect (Swedish-Speaking), you will lead the design and evolution of enterprise-grade data platforms built on the modern lakehouse paradigm. This role is tailored for engineers who go beyond standard pipeline development, focusing on performance tuning, governance at scale, and advanced Databricks capabilities.

    You will act as a technical authority, shaping best practices across data engineering, platform architecture, and machine learning integration within the Databricks ecosystem.



    In This Role, You Will

    Own end-to-end Lakehouse architecture
    Design and implement scalable lakehouse solutions using Databricks, leveraging advanced features such as Delta Live Tables, Unity Catalog, and Photon for high-performance workloads.

    Build highly optimized Spark workloads
    Develop and fine-tune complex distributed data pipelines using Apache Spark, with deep focus on query optimisation, partitioning strategies, and cost efficiency.

    Implement advanced data governance and security
    Drive enterprise-grade governance using Unity Catalog, data lineage, fine-grained access control, and compliance frameworks across Azure environments.

    Engineer real-time and batch data solutions
    Design streaming architectures (Structured Streaming, Auto Loader) alongside batch processing pipelines for large-scale, mission-critical data sets.

    Lead ML and MLOps integration
    Operationalise machine learning models using Databricks ML, MLflow, and CI/CD pipelines, ensuring reproducibility, monitoring, and lifecycle management.

    Act as a technical mentor and stakeholder partner
    Collaborate with senior stakeholders, data scientists, and platform teams while mentoring engineers and setting engineering standards.



    Your Profile

    * Extensive hands-on experience in Data Engineering, with a strong focus on Databricks-based platforms
    * Deep expertise in:
      * Advanced Databricks Lakehouse architecture
      * Delta Lake internals (transaction logs, optimisation, Z-ordering, vacuum strategies)
      * Performance tuning in Spark (Catalyst optimiser, execution plans, memory management)

    * Proven experience with Azure Data Lake Storage, Azure Data Factory, and broader Azure ecosystem
    * Strong programming skills in Python and SQL, with production-grade pipeline development experience
    * Hands-on experience with MLflow, feature stores, and MLOps practices
    * Experience implementing data governance frameworks and working with regulated data environments
    * Fluent Swedish (spoken and written) - essential for collaboration with local stakeholders



    What Sets You Apart (Essential)

    * Experience with Unity Catalog at scale across multiple workspaces
    * Deep understanding of cost optimization in Databricks (cluster policies, spot instances, workload isolation)
    * Experience with Delta Sharing and cross-platform data collaboration
    * Building medallion architecture (Bronze/Silver/Gold) at enterprise scale
    * Familiarity with infrastructure-as-code (Terraform) for Databricks deployments
    * Exposure to real-time analytics and event-driven architectures
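    For readers unfamiliar with the medallion architecture mentioned above, here is a toy Bronze/Silver/Gold flow in plain Python. At enterprise scale each layer would be a set of Delta tables rather than Python lists, but the layering idea is the same; all records and field names below are invented.

```python
# Bronze: ingest raw records as-is (append-only, schema-on-read).
bronze = [
    {"order_id": "1", "amount": "99.90", "country": "SE"},
    {"order_id": "2", "amount": "N/A",   "country": "SE"},  # malformed row
    {"order_id": "3", "amount": "40.00", "country": "NO"},
]

# Silver: enforce schema, cast types, quarantine bad rows for inspection.
def to_silver(rows):
    clean, rejected = [], []
    for r in rows:
        try:
            clean.append({**r, "amount": float(r["amount"])})
        except ValueError:
            rejected.append(r)
    return clean, rejected

# Gold: business-level aggregate, here revenue per country.
def to_gold(rows):
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver, rejected = to_silver(bronze)
gold = to_gold(silver)
```

    Quarantining rather than silently dropping bad records at the Silver layer is what keeps the Gold numbers trustworthy, which is the whole point of the pattern in a governed environment.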