Your current job search

212 search results

For permanent and freelance roles

Data Engineer

England, London, City of London

  • £55,000 to £75,000 GBP
  • Engineer role
  • Skills: ETL, AWS, Snowflake, Redshift, Python, SQL
  • Seniority: Senior

Job description

Have you ever wanted to work with global clients on a daily basis, getting your teeth into projects that last 9-15 months and working with cutting-edge technologies such as Redshift, SQL, Oracle, S3 buckets and plenty more?



The group provides a wide range of data and analytics solutions in support of our client's business priorities: maximise revenues, bear down on fraud, and respond to current Covid challenges. This role is a unique chance to design, develop, test and support data and analytics software solutions, delivering key critical systems to the public sector. You will be part of an Agile software delivery team working closely with software architects and supported by product managers, scrum masters and solutions architects.



Benefits

* You will be put through certifications and upskilled as far as you can go
* There's a mentorship programme as well (10 days of training a year, which can be anything)
* The company is looking to hire 25-35 candidates this year as it continues to grow
* Flat structure within the business
* Opportunity to move internally, as they hire from within before going to external agencies
* You will be working for a growing company averaging around 30% growth every year



Why this role?

Work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. You will work across the full range of software engineering disciplines, covering requirements gathering and analysis, solution design, software coding and development, testing, implementation and operational support. You will contribute to the software engineering communities by bringing your ideas for innovation and process improvement, and by coaching and mentoring other team members.


Why You?

Good experience in the following technologies is recommended:
* ETL toolset (Talend, Pentaho, SAS DI, Informatica etc)
* Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc)
* Data modelling (Data Warehouse, Marts)
* Job Scheduling toolset (Job Scheduler, TWS, etc)
* Programming and scripting languages (PL/SQL, SQL, Unix, Python, Hive, HiveQL, HDFS, Impala, etc)



Good to have experience in the following technologies:
* Data virtualisation tools (Denodo)
* Reporting (Pentaho BA, Power BI, Business Objects)
* Data Analytics toolset (SAS Viya)
* Cloud (AWS, Azure, GCP)
* ALM Tooling (Jira, Confluence, Bitbucket)
* CI/CD toolsets (Gitlab, Jenkins, Ansible)
* Test Automation (TOSCA)


Experience:
You should have experience as a software engineer delivering large-scale data analytics solutions and the ability to operate at all stages of the software engineering lifecycle, as well as some experience in the following:
* Awareness of devops culture and modern engineering practices
* Experience of Agile Scrum based delivery
* Proactive in nature, personal drive, enthusiasm, willingness to learn
* Excellent communications skills including stakeholder management



Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.

Data Engineer (w/m/d) - Big Data & IoT

Germany, Baden-Württemberg, Stuttgart

  • Up to €75,000 EUR
  • Engineer role
  • Skills: Data Engineering, Data Architecture, Data Lakes, Data Warehouse, DWH, Datenbanken, Cluster, Open Source, Hadoop, Spark, Kafka, Cloudera, Snowflake, AWS, Redshift, Azure, Data Factory, BigQuery, GCP, Cloud, DevOps, Ansible, Data Security
  • Seniority: Mid-level

Job description

For a client with locations across Germany, we are looking to fill the following role (hybrid or remote):

Data Engineer (w/m/d)

Technologies in use:

Languages: Python, SQL, Ansible
Data processing: Apache Hadoop, Apache Spark
Data streaming: Apache Kafka, Confluent
Cloud: Cloudera, AWS Redshift, Azure Data Factory, GCP BigQuery, Snowflake
Databases: Couchbase, MariaDB

Requirements:
- Experience building data lakes and DWH solutions
- Experience with open-source solutions, especially Apache technologies and Linux
- Ideally experience with Cloudera, Snowflake or public cloud DWH services
- Business-fluent German language skills

What it's about:

The company is a mid-sized IT services provider that has expanded across all of Germany over the last 25 years and currently covers all the major IT areas, from cloud transformation through software development to big data and IoT, for both the private sector and the public sector.

As a Data Engineer, you will work in a team building data lakes and distributed DWH clusters for storing big data packages and preparing them for later processing by the analytics team. These are generally built from open-source software, above all Apache Hadoop and Apache Spark, partly in combination with public cloud services or with Cloudera. You will also manage the data streams into the data lake, mostly with Apache Kafka, and write automation for scaling the databases.
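The scaling automation mentioned here can, at its simplest, come down to a capacity rule that decides when to add nodes to a cluster. The following Python sketch uses a hypothetical 70% utilisation target and made-up function names; it is an illustration of the idea, not the client's actual tooling:

```python
import math

# Hypothetical scale-out check for a distributed storage cluster.
# The 70% utilisation target and all names are illustrative assumptions.

def nodes_needed(used_tb: float, capacity_per_node_tb: float,
                 target_utilisation: float = 0.70) -> int:
    """Smallest node count that keeps utilisation at or below the target."""
    if used_tb <= 0:
        return 1
    return max(1, math.ceil(used_tb / (capacity_per_node_tb * target_utilisation)))

def scale_decision(current_nodes: int, used_tb: float,
                   capacity_per_node_tb: float) -> int:
    """How many nodes to add (0 if the cluster is already large enough)."""
    return max(0, nodes_needed(used_tb, capacity_per_node_tb) - current_nodes)
```

In practice such a rule would be fed by cluster metrics and executed by a configuration-management tool such as Ansible, which the listing names among its technologies.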

This work is mainly for customers in industry, but also in other sectors such as healthcare. One example is a project for a large German health institute, for which the team set up a data storage and analysis solution for consolidated research data. This was done in cooperation with the vendor Cloudera, on whose platform the team built a secured and scalable data lake, along with the associated data pipelines using Apache Spark and Apache Kafka, all in an agile environment following a DevOps mentality.

What you will be offered:

The company places strong emphasis on individual diversity and the development of all employees: it provides unlimited working time and funds for attending courses and further training and for certification in new technologies. It also incentivises internal knowledge sharing through technology-specific communities, where new tools and their possible uses in projects are openly discussed.

The salary consists of a fixed and a variable component, with the variable share tied to project utilisation; the full payout is reached at just 120 project days per year. At times you may also work on the customer's site, during requirements gathering or over the course of a project, and this naturally comes with corresponding compensation.

On top of that, there is an annual trip to Phantasialand with the entire department.

Impressum - https://www.frankgroup.com/de/impressum/

Software Engineering Manager - Northern Ireland

Republic of Ireland, Dublin

  • €85,000 to €100,000 EUR
  • Engineer role
  • Seniority: Senior

Job description

Software Engineering Manager

Location: Northern Ireland (hybrid model, very flexible for Ireland-based candidates or relocators from the UK)

Salary DOE

You will be responsible for the technical direction of the flagship software solutions for each of the business units, identifying areas of leverage and synergies across the teams and platforms.

RESPONSIBILITIES

* Develop software technology strategy and architectures to align to business objectives and manage technical and knowledge risks
* Collaborate across teams to ensure the strategy is delivered and technology stack is managed
* Stay abreast of latest technologies, frameworks, platforms and processes
* Collaborate with engineering leadership, product, QA, UX to ensure the technical and business strategies and roadmaps are aligned
* Work across teams to enhance a culture that encourages communication, collaboration, integration, automation and continuous improvement
* Contribute to project/tender proposals to understand high level scope and provide resource estimations
* Introduce policies and processes to comply with quality management and information security management
* Identify opportunities for improving efficiency, e.g. reusable microservices



WHAT YOU WILL NEED TO SUCCEED

* Degree in computer science, engineering or related field
* Experience leading high performing software engineering teams
* Experience building complex digital technology solutions
* Broad experience with modern web/cloud-native technologies, frameworks and platforms, e.g. Java, JavaScript, RESTful APIs, AWS, Azure
* Proven record in developing and delivering on a software technology strategy
* Knowledge of large scale distributed system architectures
* Appreciation of big data and data engineering

AWS Senior Data Engineer - Apache Hudi / Apache Iceberg

England, London, City of London

  • £600 to £700 GBP
  • Engineer role
  • Skills: AWS Senior Data Engineer - Apache Hudi / Apache Iceberg
  • Seniority: Senior

Job description

Hi,

Thanks for your time a moment ago. As discussed, if you are interested in the role, could you please send me your updated CV and candidate representation.



Role - Senior AWS Data Engineer

Rate - £600-£700

Contract Length - 6-12 Months

Location - Remote (Must be able to work to US / Canadian Time Zones)

Start Date - ASAP



I am working for a large investment company based in London that provides financial and technology solutions to the global investment industry. My client is looking for an AWS Data Engineer to work on an initial 6-month contract, fully remote. This is a fantastic opportunity to be involved in end-to-end data management for cutting-edge Advanced Analytics and Data Science solutions. I am looking for someone with project experience in handling vast amounts and varied types of data: working on database design and development, data integration and ingestion, and designing data and ELT architectures using a variety of tools and techniques.



Key Responsibilities

* Ingest from multiple sources and develop data processing pipelines
* Automate test and deployment of your infrastructure
* Work closely with business analysts, data engineers and product owners to gain a deeper understanding of business processes and data requirements
* Define and manage meta-data standards, data mappings and the data dictionary
* Define data quality rules, validation checks and automated cleansing methods for data pipelines
* Use data engineering and DevOps techniques on large datasets to gain business insight.
* Implement and test business logic, transformations and calculations to support data pipelines
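The data-quality responsibilities above (validation rules plus automated cleansing) often boil down to small rule functions applied per record. The following is a minimal Python sketch with made-up rule names and fields, not the client's actual framework:

```python
# Minimal sketch of data-quality rules with an automated cleansing pass.
# The rule names ("missing_trade_id", "negative_amount") and the fields
# are hypothetical examples, not the client's actual data dictionary.

def validate_row(row: dict) -> list:
    """Return the names of the quality rules this row violates."""
    failures = []
    if not row.get("trade_id"):
        failures.append("missing_trade_id")
    if row.get("amount") is not None and row["amount"] < 0:
        failures.append("negative_amount")
    return failures

def cleanse(rows: list) -> tuple:
    """Trim string fields, then split rows into (clean, rejected)."""
    clean, rejected = [], []
    for row in rows:
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        (rejected if validate_row(row) else clean).append(row)
    return clean, rejected
```

At scale the same pattern is usually expressed in Spark (and libraries such as Deequ, which the listing names, provide declarative versions of these checks), but the rule/cleanse/reject structure is the same.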



Qualifications and Skills

* 5+ years of experience with Python and Spark
* 5+ years of experience working with AWS and serverless technologies
* Proficiency with data lake-related technologies like Apache Hudi, Apache Iceberg or Databricks Delta Lake
* Experience using DevOps and automation techniques (CI/CD, infrastructure-as-code, monitoring tools)
* Proficiency in test automation (TDD, unit tests, Deequ)
* 3+ years of experience with data-related process controls, data mapping, and ETL processes and tools
* 3+ years of experience working with relational databases (e.g. PostgreSQL) and NoSQL technologies
* Well-developed analytical, quantitative, and problem-solving skills
* Experience in the Fund Administration business is a strong plus
* Financial services technical experience is a plus

Cloud Engineer

USA, Illinois, Chicago

  • $130,000 to $155,000 USD
  • Engineer role
  • Skills: AWS, Python, CloudFormation, CDK
  • Seniority: Mid-level

Job description

Job Description:

The Cloud Computing Engineer designs, creates, configures and delivers cloud infrastructure environments for a variety of environments across the company, using best practices and business acumen. The role covers all technological aspects of the cloud computing stack, including architecture, security, design, planning, management, maintenance and support, and requires extensive knowledge of on-premise (private), off-premise (public) and hybrid cloud models along with cloud service models (IaaS, PaaS and SaaS). The major purpose of this role is the mapping of our on-premise databases to cloud database services. You will lead and oversee the construction and implementation of cloud infrastructure environments throughout the life cycle of the solutions, evaluate the success of architecture, network, security and environment designs as they are implemented, and assess their strengths and flaws for continuous improvement. You will communicate with management and technical personnel on cloud adoption and usage topics on a continuous basis, and may be required to represent EIT as a technical lead on a project or to coordinate with other EIT personnel.
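Since the stated major purpose of the role is mapping on-premise databases to cloud database services, a first pass at that exercise is often a simple engine-to-service lookup before each system is assessed individually. The sketch below uses common AWS equivalents as placeholder targets, not this employer's actual migration plan:

```python
# Illustrative first-pass mapping from on-premise database engines to
# managed AWS services. The targets are common defaults used as
# placeholders, not this employer's actual migration decisions.
ENGINE_TO_AWS = {
    "oracle": "Amazon RDS for Oracle",
    "sql-server": "Amazon RDS for SQL Server",
    "mysql": "Amazon Aurora MySQL",
    "postgresql": "Amazon Aurora PostgreSQL",
    "mongodb": "Amazon DocumentDB",
}

def suggest_target(engine: str) -> str:
    """Suggest a managed cloud service for an on-premise engine name."""
    return ENGINE_TO_AWS.get(engine.lower(), "needs manual assessment")
```

A real mapping exercise would also weigh licensing, feature parity and migration tooling per database, which is why unknown engines fall through to manual assessment here.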

Role & Responsibilities:

Design cloud infrastructure environments with best practice configurations for a wide variety of systems including databases, web services and messaging systems, and other application support environments. Efficiently translate project requirements from architecture/environment diagrams to formulate appropriate questions to identify and remediate design gaps. Review and analyze architecture level security solutions at the domain or product level to transform them into cloud infrastructure designs and implementations. Establish strategic relations with key technology vendors in order to influence changes in future product releases. Must have experience with cloud technologies including but not limited to compute, storage, network, databases.

Skills & Qualifications:

* 5+ years of enterprise-level infrastructure consulting or implementation experience
* Experience with Chef, Puppet, Ansible, PowerShell or other automation environments and scripting technologies
* Experience migrating workloads from on premise to cloud
* Networking fundamentals, including VPN configuration
* Familiarity with Containerization
* Experience with network technologies and with system, security, and network monitoring tools
* Experience using AWS CDK for IaC automation
* Good working knowledge of the technical aspects of:
  * Application protocols, such as HTTP(S), SMTP, SSL, and DNS
  * Common applications of cryptography, such as X.509 PKI, PGP, etc.
* Knowledge of Active Directory
* Security event and log management
* Identity and access management
* Encryption in the cloud
* Thorough understanding of the latest security principles, techniques, and protocols