Current search

10 search results

For permanent (CDI) & freelance roles in Stockholm

    Head of Data

    Sweden, Stockholm

    • Negotiable
    • Other Role
    • Skills: Data Architecture & Platform Design, Cloud & Tooling Expertise, Data Migration, Technical Leadership, Stakeholder Communication
    • Seniority: Senior

    Job description

    About the Role

    We're at a point where our data setup needs to evolve.

    Over time, different systems and solutions have been built to support the business. While they've served us well, they're no longer enough for where we're going next. We want to create a modern, scalable data platform that supports better decision-making, advanced analytics, and future AI initiatives.

    As Head of Data, you'll take ownership of that journey.

    This is a role for someone who enjoys building, not just managing. You'll define the direction, make key technical decisions, and work closely with a strong team to bring that vision to life.

    What You'll Be Responsible For

    Defining the Data Strategy
    You'll set the long-term vision for how data is structured, stored, and used across the organisation. This includes selecting the right architecture and ensuring it supports both current and future needs.

    Designing the Platform
    You'll evaluate and implement a modern cloud-based data platform (warehouse/lakehouse), balancing scalability, performance, and cost.

    Leading the Migration Journey
    You'll plan and execute the transition from existing systems to a unified environment, ensuring stability, data integrity, and minimal disruption along the way.

    Leading the Team
    You'll work with a team of experienced, Sweden-based data engineers. Your role is to guide, support, and challenge them, creating an environment where people take ownership and do their best work.

    Working Across the Business
    You'll collaborate with stakeholders from different parts of the organisation, helping them understand what's possible with data and how it can create real value.

    Driving Best Practices
    You'll establish standards for data quality, governance, security, and documentation, ensuring the platform is reliable and trusted.

    Technology & Environment

    You'll have a strong voice in shaping the tech stack. We expect you to bring experience and perspective rather than follow a fixed setup.

    You've likely worked with:

    * Cloud data platforms such as Snowflake, Databricks, or BigQuery
    * Python for data engineering, automation, or ML-related workflows
    * BI tools like Power BI, Tableau, or Looker
    * Modern cloud environments and distributed data systems

    Who We're Looking For

    We're looking for someone who combines technical depth with leadership and pragmatism.

    * You've built or significantly evolved a data platform before
    * You have experience with large-scale data migrations
    * You're comfortable making architectural decisions and explaining them clearly
    * You've led engineering teams, ideally in a Swedish or similar work culture
    * You value collaboration, ownership, and low hierarchy
    * You're still hands-on enough to challenge technical decisions when needed

    Data Platform Engineer

    Sweden, Stockholm

    • Negotiable
    • Engineer Role
    • Skills: Azure Databricks, dbt, Python, SQL, hands-on platform experience
    • Seniority: Mid-level

    Job description

    The Role

    We are hiring a Senior Data Platform Engineer to help build and scale a cloud-based data platform in a highly regulated environment.

    The organisation is partway through its migration to Azure, with production workloads already in place. The focus now is not strategy or architecture; it's execution: building reliable, production-grade data platforms that will underpin future data and AI capabilities.

    This role is for engineers who have actually built platforms in Azure Databricks, not just contributed to them or designed them at a high level.

    You will be working in a risk-averse environment where data security, governance, and cost control are critical. The expectation is that you can deliver within those constraints, not work around them.

    Responsibilities

    * Building and operating Azure Databricks-based data platforms used in production
    * Developing end-to-end data pipelines using Python and SQL (see the sketch after this list)
    * Configuring and optimising Databricks environments, including:
        * Cluster setup, tuning, and troubleshooting
        * Job orchestration and scheduling
        * Managing performance vs cost trade-offs
    * Implementing data transformation and modelling workflows (dbt or similar)
    * Owning platform components from initial build through to production support
    * Ensuring data platforms meet strict security, governance, and risk requirements
    * Contributing to ongoing cloud migration and platform modernisation work
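
    To make the hands-on expectation concrete, here is a minimal sketch of the kind of pipeline step described above, assuming a Delta Lake layout on Azure Databricks. The storage path, table name, and columns are hypothetical, not details of the actual platform.

    ```python
    # Minimal sketch of a batch pipeline step on Azure Databricks (hypothetical
    # paths, table names, and columns; not the client's actual platform).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # already available on Databricks

    RAW_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"  # hypothetical
    SILVER_TABLE = "silver.orders"                                        # hypothetical

    def run_orders_batch() -> None:
        # Read the latest raw files (for example, landed by Auto Loader or ADF).
        raw = spark.read.format("json").load(RAW_PATH)

        # Basic cleansing: de-duplicate, normalise types, add load metadata.
        cleaned = (
            raw.dropDuplicates(["order_id"])
               .withColumn("order_ts", F.to_timestamp("order_ts"))
               .withColumn("order_date", F.to_date("order_ts"))
               .withColumn("_loaded_at", F.current_timestamp())
        )

        # Write to a governed Delta table, partitioned for downstream query performance.
        (cleaned.write.format("delta")
            .mode("append")
            .partitionBy("order_date")
            .saveAsTable(SILVER_TABLE))

    if __name__ == "__main__":
        run_orders_batch()
    ```

    In a real setup this step would run as a Databricks job or workflow, with cluster sizing and scheduling tuned for the performance and cost trade-offs mentioned above.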

    Required Experience

    * Strong, hands-on experience building Azure Databricks data platforms from scratch
    * Proven track record delivering and supporting production-grade data pipelines
    * Solid experience with:
        * Azure
        * Azure Databricks
        * Python
        * SQL

    * Experience working with data at scale (not just small or isolated datasets)
    * Background in regulated or enterprise environments with governance constraints
    * Comfortable working in cost-sensitive environments where efficiency matters

    Nice to Have

    * Hands-on experience with dbt in production
    * Experience in financial services or similar regulated industries
    * Exposure to cloud migration programmes
    * Experience optimising Databricks for performance and cost

    Why This Role

    * Opportunity to build real, production data platforms, not just prototypes
    * Direct impact on a platform that will enable future AI and data capabilities
    * Work in an environment where engineering quality, security, and reliability actually matter
    * Join a team that values delivery over theory

    AI Engineer - Freelance

    Sweden, Stockholm

    • Negotiable
    • Engineer Role
    • Skills: Generative AI
    • Seniority: Mid-level

    Job description

    AI Engineer - Freelance

    Location: Stockholm - some onsite required
    Engagement: 100% allocation, 6-month initial contract



    Our client is looking for a talented AI Engineer to help build and deliver new AI‑driven features and prototypes. This is a great fit for freelancers who enjoy working with modern tools, moving quickly, and contributing to meaningful product decisions without heavy processes or red tape.

    What you'll be doing

    - Building and fine‑tuning LLMs

    - Developing lightweight agent workflows and RAG pipelines (a minimal retrieval sketch follows the tech list below)

    - Turning ideas into working prototypes and production‑ready solutions

    - Helping shape AI architecture and best practices

    - Working closely with product and engineering teams to deliver fast, high‑quality outcomes

    Tech they use

    * Python
    * PyTorch, Hugging Face
    * FastAPI
    * Chroma or Pinecone
    * Docker
    * AWS or GCP
    * LangChain / LlamaIndex
    * (Nice to have) Weights & Biases
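
    As a rough illustration of the RAG work mentioned above, here is a minimal retrieval loop using Chroma with its default embedding function. The documents, collection name, and the call_llm() stub are hypothetical placeholders rather than the client's actual setup.

    ```python
    # Minimal RAG sketch with Chroma: index a few documents, retrieve the most
    # relevant ones for a question, and ground an LLM prompt in them.
    # Collection name, documents, and call_llm() are hypothetical placeholders.
    import chromadb

    client = chromadb.Client()
    collection = client.create_collection("product_docs")  # hypothetical name

    # Index documents; Chroma embeds them with its default embedding function.
    collection.add(
        ids=["doc-1", "doc-2"],
        documents=[
            "Invoices are generated on the first day of each month.",
            "Refunds are processed within five business days.",
        ],
    )

    def call_llm(prompt: str) -> str:
        # Placeholder: swap in the model of choice (hosted API or local model).
        return f"[model answer based on a {len(prompt)}-character prompt]"

    def answer(question: str) -> str:
        # Retrieve the most relevant chunks for the question.
        hits = collection.query(query_texts=[question], n_results=2)
        context = "\n".join(hits["documents"][0])

        # Ground the model's answer in the retrieved context.
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
        return call_llm(prompt)

    print(answer("How long do refunds take?"))
    ```

    A production version would add chunking, metadata filtering, and evaluation, and could be served behind FastAPI as part of the stack listed above.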



    What they're looking for

    - Strong practical experience with AI/ML

    - Comfortable working independently and iterating quickly

    - Ability to select the right approach for each project

    - Good communication and a collaborative mindset

    Why freelancers enjoy working with this client

    - Greenfield AI work with real impact

    - Flexible working hours

    - Modern, interesting projects

    - Freedom to experiment with new tools



    How to apply


    Send your CV and contact details to t.roach@tenthrevolution.com, along with a couple of lines about a production AI system you built that you're proud of - and what you'd want to build next. We don't require a cover letter, just a short insight into how you operate.

    Chief Technology Officer (CTO)

    Sweden, Stockholm

    • Negotiable
    • Other Role
    • Skills: Technology Strategy, Software Architecture, Cloud Computing (AWS/Azure/GCP), Engineering Leadership, Data & Analytics
    • Seniority: Senior

    Job description

    Overview
    We're looking for a CTO who wants to do more than just lead a tech team - someone who wants to shape how the company thinks, builds, and scales through technology.

    You'll be a key part of the leadership team, setting the technical direction and making sure everything we build actually moves the business forward. This role is about combining big-picture thinking with real execution - turning ideas into systems that work, scale, and last.



    What You'll Be Doing

    You'll define and drive our technology strategy, making sure it aligns with where the company is headed. A big part of your role will be building and developing a high‑performing engineering function, creating a culture people actually want to be part of.

    You'll ensure our systems are secure, scalable, and ready for growth, while staying on top of emerging tools and technologies - including knowing when to adopt them and when to avoid unnecessary complexity. You'll collaborate closely with product, commercial, and operational teams, translating ideas into real outcomes and maintaining smooth, reliable, and secure delivery.

    You'll introduce the right processes - things like Agile, DevOps, and CI/CD - but in a way that's pragmatic and suited to how we work. You'll also manage budgets, resources, and key technology partnerships, and act as a trusted technical voice for leadership, investors, and other stakeholders.





    What We're Looking For

    * You've operated at a senior level before (CTO, VP Engineering, or similar)
    * You have a strong technical foundation - you understand how things are built, not just how to manage them
    * You've helped scale products, platforms, or teams in a growing company
    * You're comfortable with modern tech stacks and cloud environments (AWS, Azure, GCP)
    * You understand security, data privacy, and what "good" looks like in production systems
    * You're a strong communicator who can bridge technical and non-technical worlds
    * You think commercially - you understand how tech decisions impact the business



    Nice to Have

    * Experience in our industry (e.g. fintech, SaaS, healthtech)
    * Background in startups or scale-ups
    * Exposure to fundraising, investors, or M&A
    * Interest or experience in data, AI, or advanced analytics



    What Success Looks Like

    * We have a successful platform that scales without constant issues
    * The engineering team is strong, motivated, and delivering consistently
    * Technology decisions clearly support business growth
    * We're continuously improving - not stagnating



    What You'll Get

    * Competitive salary, plus bonus and/or equity
    * Flexibility around how and where you work
    * Real ownership and the ability to shape the company's future
    * A fast-moving, collaborative environment
    * Opportunities to grow as the company grows



    Interested?

    Send us your CV and a short note about why this role caught your attention.
    We're less interested in perfect applications - more interested in people who want to build something meaningful.

    Freelance Data Engineer

    Sweden, Stockholm

    • SEK 800 to SEK 950
    • Consultant Role
    • Skills: Data Engineering, Data Migration, GCP, Databricks, BigQuery
    • Seniority: Mid-level

    Job description

    Job Description: GCP Data Engineer

    About the Role

    We're looking for a highly skilled GCP Data Engineer to join our growing data function. In this role, you'll design, build, and optimise scalable data pipelines and cloud-based solutions using Google Cloud Platform. You'll work closely with product, analytics, and engineering teams to deliver clean, reliable, high‑quality data that powers decision‑making across the organisation.

    This is an exciting opportunity for someone who enjoys end‑to‑end ownership, solving complex data challenges, and shaping modern cloud architectures.

    Key Responsibilities

    Data Engineering & Pipeline Development

    * Design, build, and maintain scalable ETL/ELT pipelines using GCP-native services.
    * Develop batch and real‑time data processing solutions with Dataflow and Pub/Sub (see the sketch after this list).
    * Implement efficient, reliable workflows via Cloud Composer (Airflow).
    * Build and optimise data transformations using Dataform or dbt (optional).
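
    As an illustration of the Dataflow and Pub/Sub work above, here is a minimal Apache Beam streaming sketch. The project, subscription, target table, and schema are hypothetical.

    ```python
    # Minimal streaming Dataflow (Apache Beam) sketch: Pub/Sub -> BigQuery.
    # Project, subscription, table, and schema are hypothetical placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_event(message: bytes) -> dict:
        # Pub/Sub delivers raw bytes; decode and keep only the fields we load.
        event = json.loads(message.decode("utf-8"))
        return {
            "event_id": event["id"],
            "event_ts": event["ts"],
            "payload": json.dumps(event),
        }

    def run() -> None:
        # Runner, project, and region flags are supplied on the command line.
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    subscription="projects/example-project/subscriptions/events-sub")
                | "ParseJson" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )

    if __name__ == "__main__":
        run()
    ```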

    Data Architecture & Modelling

    * Design logical and physical data models for analytics and application use‑cases.
    * Implement best practices across data warehousing, partitioning, and performance tuning in BigQuery.
    * Ensure data quality, observability, and governance across systems.

    Cloud Engineering & Infrastructure

    * Deploy scalable and cost‑efficient solutions using GCP services (GCS, Dataproc, Cloud Functions, Cloud Run).
    * Use Terraform to manage cloud resources as code.
    * Collaborate with DevOps/Platform teams on CI/CD and containerisation (Docker, GKE).

    Collaboration & Stakeholder Engagement

    * Partner with data analysts, scientists, and cross‑functional teams to understand requirements.
    * Translate business needs into robust technical solutions.
    * Contribute to ongoing improvements in engineering standards, documentation, and best practices.

    Required Skills & Experience

    * Strong hands‑on experience with Google Cloud Platform, including:
        * BigQuery
        * Dataflow (Apache Beam)
        * Pub/Sub
        * Cloud Composer
        * Cloud Storage (GCS)
        * Dataproc

    * Strong programming skills in Python and SQL.
    * Experience building large‑scale ETL/ELT pipelines.
    * Solid understanding of data modelling, data warehousing principles, and analytics workflows.
    * Experience with Terraform or equivalent IaC tooling.
    * Familiarity with CI/CD, Git, Docker, and containerised workloads.
    * Strong problem-solving skills and ability to work with cross-functional stakeholders.

    BI Developer

    Sweden, Stockholm

    • SEK 1,000 to SEK 1,200
    • Consultant Role
    • Skills: MS Power BI
    • Seniority: Senior

    Job description

    Key Responsibilities

    * Develop and maintain semantic data models and Power BI dashboards
    * Ensure data quality, consistency, and well-defined KPI logic
    * Translate business requirements into scalable and user-friendly BI solutions
    * Work with Power BI (DAX, Power Query/M) and Microsoft Fabric
    * Manage data refresh processes, monitoring, and solution stability
    * Document data models, logic, and data sources

    Additionally, you will contribute to the development of the evolving data architecture, including:

    * Supporting proof-of-concept initiatives for modern data pipelines
    * Working with data ingestion from systems such as Oracle and other operational sources
    * Prototyping data transformation and processing using Python
    * Contributing to workflow orchestration using tools such as Apache Airflow (a minimal DAG sketch follows this list)
    * Supporting SQL Server (on-premises) as part of a centralized data layer
    * Assisting in the development of dimensional models aligned with business requirements
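
    As a rough illustration of the proof-of-concept orchestration work above, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+ for the schedule parameter). The DAG name, tasks, and schedule are hypothetical.

    ```python
    # Minimal Airflow DAG sketch: extract from an operational source, then build
    # the dimensional layer. Names, schedule, and logic are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_from_oracle(**context) -> None:
        # Placeholder: pull the latest rows from the Oracle source into staging.
        print("extracting source rows")

    def build_dimensional_model(**context) -> None:
        # Placeholder: reshape staged data into star-schema dimensions and facts.
        print("building dimension and fact tables")

    with DAG(
        dag_id="bi_poc_daily_load",        # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(
            task_id="extract_from_oracle",
            python_callable=extract_from_oracle,
        )
        transform = PythonOperator(
            task_id="build_dimensional_model",
            python_callable=build_dimensional_model,
        )

        extract >> transform
    ```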

    Technology Environment

    Current Core Stack:

    * Power BI (data modeling, DAX, Power Query/M)
    * SQL
    * Microsoft Fabric
    * Data sourced from Oracle and other operational systems

    Emerging / Proof-of-Concept:

    * Python for data processing and prototyping
    * Apache Airflow for orchestration
    * SQL Server (on-premises)
    * Dimensional data modelling

    Future Direction:

    * Potential transition toward a cloud-based architecture (e.g., Microsoft Azure)
    * Increased focus on scalable data engineering and advanced analytics

    Qualifications

    * Strong proficiency in Power BI (data modelling, DAX, M)
    * Solid SQL skills
    * Understanding of modern data modelling concepts (e.g., star schema, dimensions)
    * Familiarity with Microsoft Fabric or similar platforms
    * Ability to translate business needs into technical solutions
    * Strong communication and collaboration skills
    * Fluency in English and Swedish

    Preferred:

    * Experience with Python and/or Apache Airflow
    * Experience working with Oracle or similar data sources
    * Familiarity with ETL/ELT processes
    * Experience with version control (e.g., Git)
    * Interest in cloud platforms such as Microsoft Azure

    Personal Attributes

    * Structured and detail-oriented
    * Strong analytical and problem-solving skills
    * Comfortable working both independently and in cross-functional teams
    * Able to communicate effectively with both technical and non-technical stakeholders

    Data Engineer

    Sweden, Stockholm

    • Negotiable
    • Consultant Role
    • Skills: Databricks, Azure, Python, SQL, Data Pipelines, Data Migration
    • Seniority: Mid-level

    Job description

    About the Role

    We're looking for a Data Engineer to join a collaborative, high-performing team in Stockholm.

    This role is very hands-on. You'll work closely with product, analytics, and engineering teams to build and improve the data platform, ensuring data is clean, reliable, and genuinely useful for decision-making.

    It's a great fit for someone who enjoys solving real problems, working with modern tools, and being part of a team that values quality and simplicity.

    What You'll Be Doing

    * Designing, building, and maintaining scalable data pipelines
    * Developing ETL/ELT workflows for both batch and streaming data (see the streaming sketch after this list)
    * Working with Databricks and Spark to process large datasets efficiently
    * Collaborating with cross-functional teams to deliver data solutions that scale
    * Writing clean, production-level code in Python and SQL
    * Improving data quality, reliability, and performance across the platform
    * Supporting best practices in data governance and security
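
    As an illustration of the streaming side of that work, here is a minimal Spark Structured Streaming sketch on Databricks with Delta Lake. The paths, checkpoint location, and table names are hypothetical.

    ```python
    # Minimal streaming ETL sketch: read a Delta stream, clean it, and append it
    # to a downstream Delta table. Paths and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # already available on Databricks

    events = (
        spark.readStream.format("delta")
             .load("/mnt/bronze/events")  # hypothetical landing location
    )

    cleaned = (
        events.filter(F.col("event_type").isNotNull())
              .withColumn("ingested_at", F.current_timestamp())
    )

    query = (
        cleaned.writeStream.format("delta")
               .option("checkpointLocation", "/mnt/checkpoints/events_silver")  # hypothetical
               .outputMode("append")
               .toTable("silver.events")  # hypothetical target table
    )

    # query.awaitTermination()  # block here when running as a standalone job
    ```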

    Tech Environment

    You'll be working in a modern, cloud-based setup:

    * Core: Databricks, Azure, Python, SQL
    * Cloud: AWS (preferred), with Azure or GCP as a plus
    * Data Engineering: ETL/ELT pipelines, Delta Lake, Airflow (or similar orchestration tools)
    * Version Control & CI/CD: Git, Jenkins (or similar)
    * Other: APIs, streaming data, and large-scale datasets

    What We're Looking For

    * 3-5 years' experience as a Data Engineer
    * Strong hands-on skills in Python and SQL
    * Solid experience with Databricks and Spark
    * Proven track record building and optimising data pipelines
    * Comfortable working in a Linux or cloud environment
    * A proactive, curious mindset - you take ownership and get things done
    * Comfortable working onsite in a collaborative team environment

    Why This Role

    * Immediate impact - you'll be contributing from day one
    * Modern tech stack - work with Databricks, AWS, and scalable data systems
    * Strong team environment - collaborative, low hierarchy, and delivery-focused
    * Interesting challenges - large datasets, real-time data, and performance optimisation