Your current job search

555 search results

For Permanent

AWS Data Engineer - Edinburgh - up to £75k

Scotland, Edinburgh

  • Engineer Role
  • Skills: Python, AWS, Spark, Hive
  • Seniority: Senior

Job description

Data Engineer

EdTech

Edinburgh/Remote (Candidates must be willing to travel to Edinburgh)

£60,000 - £75,000 DOE



Jefferson Frank have partnered with one of the most exciting and widely recognised leaders in modern EdTech solutions worldwide, already working with the likes of the Universities of Oxford and Cambridge.

They are looking to expand their Data Engineering and Data Science team in Edinburgh with people who are passionate about, or looking to move into, the exciting and rapidly growing EdTech industry. They work in a serverless AWS environment and leverage modern machine learning algorithms to provide predictive insights and patterns on e-learners and educational institutions worldwide.



The following experience is necessary:



* 3+ years of work experience
* Professional experience developing in both Python and Java
* Experienced with SQL and relational databases
* Experience with AWS, in particular S3, SQS, Lambda
* Professional experience with Apache Spark and/or Apache Hive
* Familiar with Elasticsearch
* An eye for clean, testable, maintainable, and performant code
* Passion, drive, energy, a sense of humor and a great attitude!

The following is considered a plus:

* Experience with Tensorflow or PyTorch
* Some knowledge in machine learning
* Comfortable with mathematics and statistics
* Some experience with NoSQL databases



Candidates must be within commuting distance of Edinburgh and able to travel there several times a month. UK citizens/ILR holders only; no sponsorship available.



If you are interested, please apply below or reach out to Steven Mckay from Jefferson Frank at s.mckay@jeffersonfrank.com for more information.

AWS Data Engineer - Informatica / SAS DI - £80,000 - Remote

England, Greater Manchester, Manchester

  • Engineer Role
  • Skills: AWS, Talend, Pentaho, SAS DI, Informatica, Jenkins, Gitlab, Oracle
  • Seniority: Senior

Job description

Up to £80,000

UK - Remote First

My client is one of the biggest names in technology and modern cloud consulting. They boast a world-class data and analytics centre and are looking to add to their team. They provide a wide range of cutting-edge data and analytics solutions to the Public Sector, working to revolutionise the UK Government's data platform and to maximise insight, accessibility and revenue while reducing fraud - the key lifeblood of the Public Sector.

You will work in an Agile team collaborating closely with Architects, Product Managers and Solution Architects.

Experience Required:

* Strong ETL skills using Talend or Pentaho.
* Database background in one of the following: Oracle, RDS, Redshift, Hadoop, MySQL, Postgres, etc.
* Experience with various data architecture techniques, including data modelling and data warehousing
* Job Scheduling
* Programming skills in any of: PL/SQL, SQL, Unix, Java, Python, Hive, HiveQL, HDFS, Impala
* Must be eligible for Security Clearance

The Role

* You will be working in an agile delivery team within a Scrum framework.
* Work across various software engineering activities, including requirements gathering, analysis, design, coding, development, testing, implementation and support.
* Contribute ideas and innovation/process recommendations to various teams, help mentor other team members and support a strong information-sharing culture.
* Client and stakeholder engagement

Benefits

* Annual training budget, 10% of your annual working hours dedicated to this
* Internal technology university to obtain qualifications
* Excellent pension
* Flexibility around working from home

Data Engineer (w/m/d) - in Hannover

Germany, Niedersachsen, Hannover

  • Engineer Role
  • Seniority: Mid-level

Job description

On behalf of a client, we are hiring for the Data & Analytics department, anywhere in Germany:

Data Engineer (w/m/d)

Responsibilities:

- Implementing data pipelines, data ingestion and data processing
- Engineering container-based analytics platforms and efficient CI/CD pipelines
- Project-based implementation of business requirements using analytics services from AWS, Azure and GCP, or established on-premises big data technologies
- Continuous professional development in technologies from the cloud and software engineering space

Requirements:

- Experience with cloud analytics (Azure, AWS, GCP) or big data (Hadoop, Kafka)
- Know-how in software development with Java, Scala or Python
- A plus: experience with build tools and CI/CD pipelines
- A plus: good knowledge of different cloud strategies and IaC technologies
- Business-fluent German and good English skills

About the role:

The company is one of the leading IT service providers in the German-speaking region. With around 6,800 employees, it supports customers from a wide range of industries in their digital transformation.

The Data Engineering competence centre is responsible for delivering complex big data and cloud analytics projects for well-known, large customers. As a Data Engineer, you will be part of an innovative and steadily growing software engineering team. You will benefit from experienced colleagues and complete a dedicated training programme to expand and deepen your knowledge.

What is offered in return:

The company offers a modern working environment designed around flexibility and work-family balance. You can expect mobile working and flexible working hours, so you can organise your day flexibly.

Various training and development opportunities allow you to drive your own development and shape your career path. You can benefit from numerous courses as well as brown bag sessions and regular knowledge exchange with colleagues and experts on current IT topics.

Does this role appeal to you? Please send your CV to h.chang@frankgroup.com

If you have any further questions about this vacancy, I (Han Chang) am also available on +49 (0)30 3080 8845.

Legal notice (Impressum) - https://www.frankgroup.com/de/impressum/

Data Engineer (w/m/d) - in Nürnberg

Germany, Bayern, Nürnberg

  • Engineer Role
  • Seniority: Mid-level

Job description

On behalf of a client, we are hiring for the Data & Analytics department, anywhere in Germany:

Data Engineer (w/m/d)

Responsibilities:

- Implementing data pipelines, data ingestion and data processing
- Engineering container-based analytics platforms and efficient CI/CD pipelines
- Project-based implementation of business requirements using analytics services from AWS, Azure and GCP, or established on-premises big data technologies
- Continuous professional development in technologies from the cloud and software engineering space

Requirements:

- Experience with cloud analytics (Azure, AWS, GCP) or big data (Hadoop, Kafka)
- Know-how in software development with Java, Scala or Python
- A plus: experience with build tools and CI/CD pipelines
- A plus: good knowledge of different cloud strategies and IaC technologies
- Business-fluent German and good English skills

About the role:

The company is one of the leading IT service providers in the German-speaking region. With around 6,800 employees, it supports customers from a wide range of industries in their digital transformation.

The Data Engineering competence centre is responsible for delivering complex big data and cloud analytics projects for well-known, large customers. As a Data Engineer, you will be part of an innovative and steadily growing software engineering team. You will benefit from experienced colleagues and complete a dedicated training programme to expand and deepen your knowledge.

What is offered in return:

The company offers a modern working environment designed around flexibility and work-family balance. You can expect mobile working and flexible working hours, so you can organise your day flexibly.

Various training and development opportunities allow you to drive your own development and shape your career path. You can benefit from numerous courses as well as brown bag sessions and regular knowledge exchange with colleagues and experts on current IT topics.

Does this role appeal to you? Please send your CV to h.chang@frankgroup.com

If you have any further questions about this vacancy, I (Han Chang) am also available on +49 (0)30 3080 8845.

Legal notice (Impressum) - https://www.frankgroup.com/de/impressum/

Senior Python Engineer - Stockholm

Sweden, Stockholm

  • Engineer Role
  • Skills: Product Development, SaaS, Software, Engineer, Developer, Stockholm, Uppsala, Solna, Python, Django, Flask, Fast API, React, Angular, Vue, TypeScript, AWS, GCP, Azure, Cloud, Hybrid-solutions, Docker, Kubernetes
  • Seniority: Senior

Job description

This exclusive role is only available through Jefferson Frank, an international agency that has grown to be one of the trusted recruitment partners to businesses across Sweden in different industries and domains. To learn more about this opportunity, reach out to George Paxton, who is responsible for this advert, on LinkedIn. You can also reach him at g.paxton@jeffersonfrank.com.

The Company:

I'm working with a SaaS product development company located in central Stockholm who are hiring for their core development team. The company was set up by two founders who were frustrated with the unnecessarily complex structure of modern software, which made it difficult to manage, unresponsive and misaligned with the brand promises of the wider organisation. Today, they run two core product lines with similar goals: to keep software straightforward and to allow optimal performance through simpler messaging routes. Their patented approach improves performance (speed), adds redundancy and keeps data secure, all while utilising multiple cloud service providers.

In early 2023 they will be hiring for the core software development team, which is extremely product focused and will set them apart in the market going forward. The majority of the work is greenfield development, with plenty of team meetings to agree collectively on the best way forward.

They operate in a selection of small teams, so you'll get the chance to work in a growing and innovative start-up that takes a technology-first approach to its day-to-day operations, meaning you aren't a complementary add-on to the organisation's value proposition. Due to the nature of the work, they passionately promote creativity, alongside a culture of knowledge sharing in which engineers are encouraged to be vocal about what they have learnt recently so that it can be implemented into the main product lines. You'd be working in a small but senior team of tech geeks who often debate the best approach to take, giving you a great opportunity to progress technically while working on a range of individual tasks.

The Role:

You'll be joining a small team of talented and senior engineers who work end to end to design, develop and deploy Python-based solutions within customer environments. You'll work cross-functionally with technologies including, but not limited to, Python, Golang, microservices, Kafka, React, TypeScript and hybrid cloud solutions (mostly AWS and GCP). You'll work in a mature and intuitive team of consultants and permanent employees led by a CTO. The majority of the solutions you'll be building will be greenfield, giving you a great opportunity to flex your creative muscles and bring your own ideas into the code base. You'll be responsible for designing, building and integrating solutions into the customer environment, in addition to deploying and managing applications that use multiple public cloud providers.

Requirements:

* 3+ years' software development experience building solutions from scratch with a modern Python framework (Django, Flask or FastAPI).
* 2+ years' experience with cloud services, having worked with at least one of the three main providers (AWS, Azure or GCP).
* Permanent work permit and able to commute to Stockholm at least once a week.

Beneficial if you have:

* Experience with infrastructure based tools such as Kubernetes, Docker, Terraform or Ansible

To apply for this position, submit your CV through this portal or reach out to me on LinkedIn so I can provide more details and answer any questions that you have regarding the company or role!

Golang Engineer - Stockholm

Sweden, Stockholm

  • Engineer Role
  • Skills: Product Development, SaaS, Software, Stockholm, Uppsala, Solna, Golang, Backend, back end, back-end, full-stack, fullstack, full stack, AWS, GCP, Azure, API's, Docker, Kubernetes, Microservices
  • Seniority: Senior

Job description

This exclusive role is only available through Jefferson Frank, an international agency that has grown to be one of the trusted recruitment partners to businesses across Sweden in different industries and domains. To learn more about this opportunity, reach out to George Paxton, who is responsible for this advert, on LinkedIn. You can also reach him at g.paxton@jeffersonfrank.com.

The Company:

I'm working with a SaaS product development company located in central Stockholm who are hiring for their core development team. The company was set up by two founders who were frustrated with the unnecessarily complex structure of modern software, which made it difficult to manage, unresponsive and misaligned with the brand promises of the wider organisation. Today, they run two core product lines with similar goals: to keep software straightforward and to allow optimal performance through simpler messaging routes. Their patented approach improves performance (speed), adds redundancy and keeps data secure, all while utilising multiple cloud service providers.

In early 2023 they will be hiring for the core software development team, which is extremely product focused and will set them apart in the market going forward. The majority of the work is greenfield development, with plenty of team meetings to agree collectively on the best way forward.

They operate in a selection of small teams, so you'll get the chance to work in a growing and innovative start-up that takes a technology-first approach to its day-to-day operations, meaning you aren't a complementary add-on to the organisation's value proposition. Due to the nature of the work, they passionately promote creativity, alongside a culture of knowledge sharing in which engineers are encouraged to be vocal about what they have learnt recently so that it can be implemented into the main product lines.



The Role:

This hire is for the core software team. You'll be an important member of a small yet efficient product development team building hybrid-cloud data management solutions that serve both internal and external systems and platforms. You'll be responsible for building next-generation data management tools for a go-to platform that solves challenges around system performance and around safeguarding data in a portable and secure manner, while meeting regulatory and company compliance policies in line with industry governance standards.

You'll be joining a senior and highly collaborative team that works to agile principles, continuing to build the core products at every level of the software development life cycle. You'll be expected to add value from architecture planning through to deployment and post-release, working full-stack at times but mostly on the backend systems, which are written in Golang.

Requirements:

* 5+ years' experience in software development, with at least 2 years building solutions in Golang, ideally working end to end across the product development life cycle.
* 3+ years' cloud services experience, having worked with AWS, GCP or Azure.
* Understanding of DevOps processes, with a strong conceptual grasp of containerised solutions or of structuring CI/CD and data pipelines.

Beneficial if you have:

* Experience working in start-ups or in small, specialised development teams.
* Previous greenfield development experience.

To apply for this position, send your CV via this portal or email it to g.paxton@jeffersonfrank.com. Alternatively, you can reach out to George Paxton on LinkedIn to learn more regarding this position or others which are available at Jefferson Frank.

Senior BI Engineer - Up to 95,000 - Fully Remote

England, London

  • Engineer Role
  • Skills: AWS, QuickSight, Power BI, Python, Golang, Glue, Athena
  • Seniority: Senior

Job description

The role

I am looking for a BI Data Engineer to join my client on a full-time basis. As BI Data Engineer, you will coordinate, and contribute hands-on to, the extraction of domain-specific data from transactional databases into data lakes to power the BI analytics pipelines.



What's in it for you?

* Work with the latest cutting-edge technologies
* Disrupt a centuries-old industry in a start-up environment
* Great team spirit with established structure and quick decision-making
* Deliver projects as green field applications
* Solve complex problems, never run out of challenging projects to tackle
* Great culture where each contributor can make a significant change and bring added value
* Work-life balance with a remote-first culture
* Flexibility to organise your work the way you like using our remote-first setup
* No approval loops, quick decision-making and full ownership of your function
* Variety of knowledge-sharing and self-development opportunities
* Competitive salary

Responsibilities

General:

* Ensure effectiveness and efficiency at scale of data pipelines
* Help product teams with data extraction
* Help domain expert with data analysis
* Ensure data security
* Monitor and ensure reliability for all data pipelines
* Ensure Company best practices deliver production-ready, industry-quality code
* Participate in technical interviews

From time to time:

* Mentor, coach and develop engineers
* Help teams plan and deliver complex releases
* Help colleagues with hard technical challenges

Requirements
* Deep understanding of database technologies and of designing data-intensive applications
* Understanding of statistics
* Full hands-on technical experience
* Database experience:
  * SQL (Postgres)
  * Kafka and high-frequency data
* Proficiency in:
  * AWS data lake technologies: DMS, S3, Parquet, Lake Formation, Glue, Athena, IAM
  * Spark/PySpark
  * AWS QuickSight
  * Golang or Python
  * Data pipeline design
* Ability to perform basic DevOps tasks
* Use of Unix system commands
* Experience with monitoring services
* Advanced use of Git

Experience

Demonstrated track record and proficiency in:

* Delivering features autonomously with a high degree of team coordination
* Delivering features based on precise architecture spec
* Delivering features without relying on precise architecture spec
* Working with CI/CD practices
* Delivering features into production
* Maintaining production-ready code
* Collaborating in small, fast-paced teams
* Event-driven architecture and message passing

Nice to have experience in:
* Improving systems' performance and security
* Project management
* High-throughput data processing
* Experience with MongoDB and Cassandra

More about you
* Proficient level of English, spoken and written
* Excellent all-round communication skills
* Willingness to learn and an open mind about new technologies
* Confident in operating in a fast-paced environment
* A collaborative approach and willingness to engage in an environment of active idea sharing
* Ability to learn autonomously



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

BI Data Engineer

England, London, City of London

  • Engineer Role
  • Skills: AWS, S3, Python, Golang, AWS QuickSight
  • Seniority: Senior

Job description

The role

I am looking for a BI Data Engineer to join my client on a full-time basis. As BI Data Engineer, you will coordinate, and contribute hands-on to, the extraction of domain-specific data from transactional databases into data lakes to power the BI analytics pipelines.



What's in it for you?

* Work with the latest cutting-edge technologies
* Disrupt a centuries-old industry in a start-up environment
* Great team spirit with established structure and quick decision-making
* Deliver projects as green field applications
* Solve complex problems, never run out of challenging projects to tackle
* Great culture where each contributor can make a significant change and bring added value
* Work-life balance with a remote-first culture
* Flexibility to organise your work the way you like using our remote-first setup
* No approval loops, quick decision-making and full ownership of your function
* Variety of knowledge-sharing and self-development opportunities
* Competitive salary

Responsibilities

General:

* Ensure effectiveness and efficiency at scale of data pipelines
* Help product teams with data extraction
* Help domain expert with data analysis
* Ensure data security
* Monitor and ensure reliability for all data pipelines
* Ensure Company best practices deliver production-ready, industry-quality code
* Participate in technical interviews

From time to time:

* Mentor, coach and develop engineers
* Help teams plan and deliver complex releases
* Help colleagues with hard technical challenges

Requirements
* Deep understanding of database technologies and of designing data-intensive applications
* Understanding of statistics
* Full hands-on technical experience
* Database experience:
  * SQL (Postgres)
  * Kafka and high-frequency data
* Proficiency in:
  * AWS data lake technologies: DMS, S3, Parquet, Lake Formation, Glue, Athena, IAM
  * Spark/PySpark
  * AWS QuickSight
  * Golang or Python
  * Data pipeline design
* Ability to perform basic DevOps tasks
* Use of Unix system commands
* Experience with monitoring services
* Advanced use of Git

Experience

Demonstrated track record and proficiency in:

* Delivering features autonomously with a high degree of team coordination
* Delivering features based on precise architecture spec
* Delivering features without relying on precise architecture spec
* Working with CI/CD practices
* Delivering features into production
* Maintaining production-ready code
* Collaborating in small, fast-paced teams
* Event-driven architecture and message passing

Nice to have experience in:
* Improving systems' performance and security
* Project management
* High-throughput data processing
* Experience with MongoDB and Cassandra

More about you
* Proficient level of English, spoken and written
* Excellent all-round communication skills
* Willingness to learn and an open mind about new technologies
* Confident in operating in a fast-paced environment
* A collaborative approach and willingness to engage in an environment of active idea sharing
* Ability to learn autonomously



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

Multi Cloud Platform / HPC (High Performance Compute) Engineer

Northern Ireland, County Antrim, Belfast

  • Engineer Role
  • Seniority: Senior

Job description

Multi Cloud Platform/HPC (High Performance Compute) Engineer

Dublin (2-3 days in office per week)

Overall Packages between €150,000 - €250,000+ dependent on experience.

Jefferson Frank have been enlisted to support a well-esteemed hedge fund who are building a state-of-the-art high performance compute (HPC) capability for their internal teams. This will in turn enable teams including Portfolio Management, Data Science and Development to scale their computing capabilities on premises and across cloud platforms.



Primary Responsibilities:

* You will operate in a customer-focused HPC team, working with the business and aligned teams to design, develop, test and deploy high performance compute infrastructure.
* You will be comfortable building relationships with multiple internal stakeholders and delivering platform solutions to their needs (including Quantitative, Engineering and Data Science teams).
* Help the business drive adoption of the HPC platform and its subsequent offerings.



Experience Required:

* Experience working on multi-tenant compute environments/grid systems (consulting with business units to develop HPC solutions, etc.)
* Strong Python (or other language) programming experience
* Strong systems experience (Linux)
* Configuration Management and CI/CD
* Strong knowledge of public cloud platforms, with deep expertise in either AWS or GCP (ideally both)
* Expertise in building multi-tenant HPC environments and familiarity with relevant frameworks such as Slurm
* Terraform
* Strong understanding of networking and security



Package

* Base Salary between €120,000 to €220,000 dependent on your level of experience.
* Annual Cash Bonus on top of above base.
* Investment Portfolio managed free of charge by the firm.
* Health Insurance.