A Tenth Revolution Group company

Current search

20 search results

For permanent (CDI) contracts in the United States

    New

    DevOps Engineer - AI

    USA, Pennsylvania, Bala Cynwyd

    • $130,000 to $150,000 USD
    • DevOps Role
    • Skills: AI (artificial intelligence), DevOps, Machine learning, Python
    • Seniority: Mid-level

    Job description

    DevOps Engineer (AI Focused)

    Location: Hybrid in Philadelphia

    Compensation: $130,000 - $150,000

    * Relocation support available

    Overview:
    This role offers an exciting opportunity to join a collaborative, learning-focused team. The organization is forming a brand-new team dedicated to a cutting-edge chatbot project aimed at housing proprietary customer information. This is an impactful role, as you will help shape the team and make key decisions.

    Key Responsibilities:

    * Work on the development of a proprietary chatbot project.
    * Play a pivotal role in forming a new team and contributing to technical decision-making.
    * Collaborate with cross-functional teams in a fast-paced, innovative environment.

    Qualifications:

    * Experience: Around 5 years in relevant fields.
    * DevOps Expertise: Strong understanding required.
    * Programming: Proficiency in Python is mandatory.
    * AI Expertise: At least 2 years of hands-on experience with AI technologies.
    * Cloud Platforms: Experience with Azure, AWS, or GCP.
    * Containerization: Familiarity with Docker and Kubernetes (nice to have).
    * Tech Stacks: Exposure to the ELK stack and RAG stack is a plus (see the sketch below).
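    As a rough illustration of what a RAG-style chatbot stack involves, here is a minimal Python sketch of the retrieval step, assuming document embeddings have already been computed; the corpus, embeddings, and top-k value are illustrative only and are not this employer's implementation.

    ```python
    # Minimal sketch of the retrieval step in a RAG (retrieval-augmented generation)
    # pipeline. Document embeddings are assumed to be precomputed elsewhere; the
    # corpus, embeddings, and top-k value are illustrative only.
    import numpy as np

    def top_k_documents(query_vec: np.ndarray, doc_vecs: np.ndarray,
                        docs: list[str], k: int = 3) -> list[str]:
        """Return the k documents whose embeddings are most similar to the query."""
        # Cosine similarity between the query and every document embedding.
        sims = doc_vecs @ query_vec / (
            np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9
        )
        best = np.argsort(sims)[::-1][:k]
        return [docs[i] for i in best]

    def build_prompt(question: str, context_docs: list[str]) -> str:
        """Assemble retrieved context and the question into a prompt for the chat model."""
        context = "\n".join(f"- {d}" for d in context_docs)
        return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    ```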

    Culture:

    * Collaborative and innovative environment.
    * Commitment to continuous learning and professional growth.
    * Fun and dynamic office atmosphere.

    Note: This is an opportunity to work on a groundbreaking project while contributing to a growing team within an established organization.

    Sr. Data Engineer

    USA, Texas, Stamford

    • Negotiable
    • Engineer Role
    • Skills: SQL, Python, Snowflake, Snaplogic, Data Ingestion, ETL
    • Seniority: Senior

    Job description

    Overview:

    We are seeking a highly skilled and experienced Senior Data Engineer to join our team. The ideal candidate will possess a strong background in data architecture, management, and integration, particularly within the energy commodities or financial services sectors. This role will involve designing robust data solutions, ensuring data quality, and implementing ETL processes to support our business objectives.



    Key Responsibilities:

    * Data Architecture Development: Design, implement, and maintain scalable and reliable data architecture frameworks that support the organization's strategic goals and objectives.
    * ETL/ELT Design and Implementation: Architect and develop ETL and ELT processes utilizing SnapLogic and other integration tools to extract, transform, and load data from various sources into centralized data stores.
    * Data Modeling: Collaborate with business stakeholders to create comprehensive data models, optimizing logical and physical data design and ensuring alignment with business requirements.
    * Data Management and Governance: Establish and enforce data management best practices, including data governance frameworks to ensure data accuracy, consistency, and availability across the organization.
    * Quality Assurance: Conduct comprehensive data quality assessments and audits, identifying and resolving data issues to maintain high standards of data integrity.
    * Data Integration: Facilitate seamless data integration across various systems and platforms by developing and maintaining data ingestion pipelines that support real-time and batch processing.
    * Collaboration with Teams: Work closely with data scientists, business analysts, and software engineers to understand data needs and provide suitable data solutions and architecture for analytics and reporting.
    * Cloud Database Management: Manage relational and non-relational databases using Snowflake and Oracle, including performance tuning, optimization, and effective indexing strategies.
    * Data Analysis and Visualization: Utilize Python and business intelligence tools (Power BI, Tableau) to analyze complex datasets and create meaningful visualizations and reports that facilitate data-driven decision-making.
    * Machine Learning Implementation: Collaborate with machine learning practitioners to implement machine learning solutions, ensuring the proper infrastructure and data pipelines are in place to support model development and deployment.
    * Training and Knowledge Transfer: Mentor and train junior staff, providing knowledge transfer on data architecture and engineering best practices, tools, and technologies.
    * Vendor Management: Evaluate, select, and manage relationships with vendors and technology partners to integrate new data tools and solutions, staying abreast of vendor trends that impact data strategies.
    * Research and Innovation: Stay current with emerging trends, technologies, and best practices in data architecture, big data analytics, and AI/ML, recommending innovative solutions that enhance the organization's data capabilities.



    Relevant Skills and Qualifications:

    * Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field; Master's degree preferred.
    * Proven experience in data architecture, data engineering, or a related field.
    * Proficiency in SQL, T-SQL, and data modeling techniques.
    * Experience with ETL/ELT tools, specifically SnapLogic.
    * Familiarity with cloud data warehousing solutions, particularly Snowflake or Oracle.
    * Strong programming skills in Python, with experience in data libraries such as Pandas and NumPy (see the sketch after this list).
    * Familiarity with modern data integration and data pipeline architectures.
    * Knowledge of machine learning and artificial intelligence concepts and tools.
    * Experience using BI tools (Power BI, Tableau) for data visualization and reporting.
    * Excellent analytical skills with a focus on data quality and management.
    * Strong communication and interpersonal skills, with experience in vendor management.
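    As a small illustration of the Python and ETL/ELT skills listed above, here is a minimal sketch of an extract-transform-load step using pandas and SQLAlchemy; the file path, table and column names, and connection URL are hypothetical, and production pipelines would more likely run through SnapLogic into Snowflake.

    ```python
    # Minimal ETL sketch: extract a CSV, standardize a few fields, and load the
    # result into a warehouse staging table. The file path, column names, table
    # name, and connection URL are hypothetical placeholders.
    import pandas as pd
    from sqlalchemy import create_engine

    def run_etl(csv_path: str, conn_url: str) -> int:
        # Extract: read the raw source file.
        df = pd.read_csv(csv_path)

        # Transform: normalize column names, parse dates, drop bad rows.
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df["trade_date"] = pd.to_datetime(df["trade_date"], errors="coerce")
        df = df.dropna(subset=["trade_date"]).drop_duplicates()

        # Load: append the cleaned rows into the target table.
        engine = create_engine(conn_url)
        df.to_sql("stg_trades", engine, if_exists="append", index=False)
        return len(df)
    ```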

    Sr Data Engineer - Houston

    USA, Texas, Houston

    • $170,000 to $180,000 USD
    • Engineer Role
    • Skills: ETL, Python, Snowflake, SQL
    • Seniority: Senior

    Job description

    Our TOP CLIENT has retained us for a Sr Data Engineer role. They are highly recognized for their stability and commitment to a collaborative teamwork environment. The client is building a best-in-class data science platform, as being at the forefront of data management and analytics is core to their investment platform. Overall, this position will play an integral role as the team implements new data management platforms, creates new data ingestion pipelines, and sources new data sets. The role will assist with all aspects of data, from data architecture design to ongoing data management, and will have significant exposure to the Risk and commercial investing teams globally.



    REQUIRED:

    Must be onsite 3 days a week in the Houston, TX area

    Extensive work experience with ETL/ELT frameworks to write pipelines.

    Advanced skills in writing highly optimized SQL code.

    Experience with Snowflake.



    TOP RESPONSIBILITIES:

    Execute data architecture and data management projects for both new and existing data sources.

    Help transition existing data sets, databases, and code to a new technology stack.

    Over time, lead analysis of data sets using a variety of techniques including machine learning.

    Manage the end-to-end data ingestion process and publishing to investing teams.

    Own the process of mapping, standardizing, and normalizing data.

    Conduct ad hoc research on project topics such as vendor trends, usage best practices, big data, and artificial intelligence.

    Assess data loads for tactical errors and build out appropriate workflows, and create data quality analyses that identify larger issues in the data (see the sketch after this list).

    Actively manage vendors and proactively capture changes in data inputs.

    Properly prioritize and resolve data issues based on business usage.

    Assist with managing strategic initiatives around big data projects for the commercial (trading) business.

    Partner with commercial teams to gain an understanding of current data flows, data architecture, and the investment process, as well as to gather functional requirements.

    Assess gaps in current datasets and remediate.
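    To give a flavor of the data quality analysis mentioned above, here is a minimal pandas sketch that flags common load issues; the column names and value bounds are hypothetical.

    ```python
    # Minimal data quality sketch: summarize common issues in a freshly loaded
    # dataset. The column names and price bounds are hypothetical.
    import pandas as pd

    def quality_report(df: pd.DataFrame) -> dict:
        return {
            "total_rows": len(df),
            # Rows missing a value in the business key column.
            "null_keys": int(df["trade_id"].isna().sum()),
            # Duplicate keys that would double-count records downstream.
            "duplicate_keys": int(df["trade_id"].duplicated().sum()),
            # Values outside a plausible range, often a unit or mapping error.
            "out_of_range_prices": int(((df["price"] <= 0) | (df["price"] > 1e6)).sum()),
        }
    ```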



    PREFERRED SKILLS:

    Experience in SQL programming, data architecture, and dimensional modeling.

    Interest and passion for data architecture, analytics, management, and programming.

    Experience in energy commodities or financial services required.

    Experience in mapping, standardizing, and normalizing data.

    Experience with data integration platforms is preferred, and SnapLogic experience is highly preferred.

    Extensive work experience with ETL/ELT frameworks to write pipelines to load millions of records.

    Advanced skills in writing highly optimized SQL code.

    Experience with relational databases such as Snowflake or Oracle is preferred.

    Python work experience with Pandas, NumPy, and Scikit.

    Ability to communicate and interact with a wide range of data users - from very technical to non-technical.

    Team player who is execution focused, with the ability to handle a rapidly changing set of projects and priorities, while maintaining a strong professional presence.

    Strong analytical skills with demonstrated attention to detail.

    Familiarity with business intelligence tools such as Power BI and Tableau.

    Interest or experience in machine learning/Artificial Intelligence is a plus.



    BENEFITS:

    Competitive comprehensive medical, dental, retirement and life insurance benefits

    Employee assistance & wellness programs

    Parental and family leave policies

    CCI in the Community: Each office has a Charity Committee, and as part of this program employees are allocated 2 days annually to volunteer at selected charities.

    Charitable contribution match program

    Tuition assistance & reimbursement

    Quarterly Innovation & Collaboration Awards

    Employee discount program, including access to fitness facilities

    Competitive paid time off

    Continued learning opportunities



    Reach out to me directly if interested:

    E: s.murray@jeffersonfrank.com

    New

    Data Quality Analyst

    USA, Illinois, Champaign

    • $80,000 to $90,000 USD
    • Analyst Role
    • Skills: AWS, PostgreSQL, SQL, Python, pgAdmin, Toad, DataGrip
    • Seniority: Mid-level

    Job description

    Our TOP CLIENT has retained us for a Data Quality Analyst role. They are highly recognized for their stability and commitment to a collaborative teamwork environment. The Data Quality Analyst will be responsible for acquiring data from primary or secondary sources. This position involves obtaining legacy data from the former billing system and mapping it to fit the billing system data model.



    Location: Remote

    Salary: $80K to $90K



    Must Have skills:

    Strong SQL skills

    Python scripting

    Database tools like pgAdmin, Toad, or DataGrip

    Relational Databases experience (any)



    Data collection: Acquiring data from primary or secondary sources. This position involves obtaining legacy data from the former billing system and mapping it to fit our billing system data model. We sometimes do this ETL work internally and/or work with ETL partners. Analytical skills are an important tool, as is being comfortable working with data files.
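    As a rough illustration of this legacy-to-current mapping work, here is a minimal pandas sketch that renames legacy billing columns to a target model and checks that required fields survived the mapping; all column names are hypothetical.

    ```python
    # Minimal sketch of mapping legacy billing records onto a target data model.
    # The legacy/target column names and required-field list are hypothetical.
    import pandas as pd

    LEGACY_TO_TARGET = {
        "cust_no": "customer_id",
        "inv_dt": "invoice_date",
        "amt_due": "amount_due",
    }
    REQUIRED = ["customer_id", "invoice_date", "amount_due"]

    def map_legacy(df: pd.DataFrame) -> pd.DataFrame:
        out = df.rename(columns=LEGACY_TO_TARGET)
        out["invoice_date"] = pd.to_datetime(out["invoice_date"], errors="coerce")
        missing = [c for c in REQUIRED if c not in out.columns]
        if missing:
            raise ValueError(f"Mapping is missing required columns: {missing}")
        return out[REQUIRED]
    ```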



    * Data interpretation: Using statistical techniques to analyze data and draw logical conclusions.
    * Data visualization: Using tools to identify patterns and present findings to stakeholders.
    * Report and presentation development: Creating reports and presentations to explain data to stakeholders, IT representatives, and other data analysts.
    * Data quality control: Validating and linking data, and providing quality assurance for imported data.
    * Data protection: Understanding data protection issues and processing confidential data according to guidelines.
    * Confirm project requirements by studying client requirements and collaborating with others on development and project teams.
    * Work closely with project managers to understand and maintain focus on their analytics needs, including critical metrics and KPIs, and deliver actionable insights to relevant decision-makers.
    * Proactively analyze data to answer key questions for stakeholders or yourself, with an eye on what drives business performance, and investigate and communicate which areas need improvement in efficiency and productivity.
    * Create and maintain rich interactive visualizations through data interpretation and analysis, with reporting components from multiple data sources.
    * Gather, interpret, and present data to help others understand and solve problems.
    * Turn project requirements into custom-formatted reports, and analyze business procedures to recommend specific types of data that can be used to improve them.
    * Develop, implement, and maintain leading-edge analytics systems, taking complicated problems and building simple frameworks.

    New

    AWS Cloud Infrastructure Engineer

    USA, New Jersey, Whippany

    • $130,000 to $135,000 USD
    • Engineer Role
    • Skills: Amazon Cloudformation, Amazon EC2, Amazon ECS, Amazon Lambda, Amazon Route53, Amazon S3, AWS VPC
    • Seniority: Mid-level

    Job description

    Job Summary: We are seeking a skilled AWS Cloud Infrastructure Engineer with expertise in serverless architecture to join our team. In this role, you will play a crucial part in managing our cloud-based infrastructure, which relies extensively on AWS services such as Lambda and API Gateway. You should possess strong troubleshooting skills and a profound understanding of serverless technology. Familiarity with Infrastructure as a Service (IaaS) tools, including S3, Route 53, and CloudWatch, is essential.

    Key Responsibilities:

    * Provision Infrastructure as Code (IaC) using scripts to implement and maintain a highly scalable and redundant global AWS environment (see the sketch after this list).
    * Manage the transition of on-premises infrastructure to cloud-based solutions, ensuring optimal performance of ConnectiveRx applications through ongoing operational cloud management.
    * Conduct routine maintenance, upgrades, and optimization of cloud infrastructure and services.
    * Provide day-to-day support and management for infrastructure, which includes overseeing third-party hosted data centers and cloud-based systems.
    * Assist in creating and executing the migration process for legacy systems to our cloud environments, providing regular management updates on progress.
    * Take ownership of AWS environment management, ensuring best practices are followed in building and maintaining applications within AWS.
    * Design and implement resilient web infrastructure for patient assistance programs, ensuring website and database availability and performance through the use of web application firewalls, load balancers, Route 53, EC2, web servers, and VPC configurations.
    * Articulate and describe strategies for setting up and optimizing cloud environments to ensure resilience and efficiency.
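    As a small illustration of script-driven provisioning, here is a hedged boto3 sketch that creates, versions, and tags an S3 bucket; the bucket name and tag values are hypothetical, and in practice this role would likely express such resources in CloudFormation templates instead.

    ```python
    # Minimal provisioning sketch with boto3: create an S3 bucket, enable
    # versioning, and tag it. The bucket name and tag values are hypothetical;
    # a CloudFormation or Terraform template would normally own these resources.
    import boto3

    def provision_bucket(name: str) -> None:
        s3 = boto3.client("s3")
        # Note: outside us-east-1 a CreateBucketConfiguration with a
        # LocationConstraint must also be supplied.
        s3.create_bucket(Bucket=name)
        s3.put_bucket_versioning(
            Bucket=name, VersioningConfiguration={"Status": "Enabled"}
        )
        s3.put_bucket_tagging(
            Bucket=name,
            Tagging={"TagSet": [{"Key": "environment", "Value": "dev"}]},
        )
    ```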

    Preferred Qualifications:

    * Proven experience in a large enterprise environment with a focus on AWS cloud management.
    * In-depth knowledge of various AWS services, including but not limited to VPC, Elastic Load Balancers, Auto Scaling Groups, EC2, ECS, DNS, as well as AWS storage and notification services.
    * Strong understanding of security best practices related to cloud infrastructure.
    * Excellent problem-solving abilities and a proactive approach to infrastructure management.

    AWS Container Platform Engineer

    USA, Maryland, Owings Mills

    • $150,000 to $200,000 USD
    • Engineer Role
    • Skills: Amazon EKS, AWS, Kubernetes, Terraform, Migration
    • Seniority: Senior

    Job description

    Job Title: AWS Container Platform Engineer

    Location: Hybrid (1 day per week on-site in Reston, VA). Remote candidates considered if located outside the DMV area (working EST hours).

    Company Overview: A well-established health insurance organization is seeking an experienced AWS Container Platform Engineer to lead a major cloud migration effort. This company, with deep roots and a commitment to innovative solutions, provides an opportunity to make significant technical contributions.

    Key Responsibilities:

    * Lead and act as the subject matter expert for a large-scale migration project from on-premises to AWS EKS, involving the transfer of 9 million lines of code for core applications.
    * Drive the successful containerization process and contribute to an active, high-impact project.
    * Collaborate closely with teams to ensure the migration is seamless and efficient from start to finish.
    * Focus on execution rather than consulting, with hands-on involvement throughout the project.

    Qualifications:

    * Strong expertise in containerization, particularly with EKS, ECS, Docker, and Kubernetes (K8s).
    * Advanced AWS experience, covering EC2, S3, EBS, EFS, IAM, VPC, Lambda, and related services.
    * Proficiency in Terraform for Infrastructure as Code (IaC).
    * In-depth understanding of CI/CD pipelines and DevOps practices.
    * Scripting skills in Python, Bash, or similar languages (see the sketch after this list).
    * Certifications in AWS or Kubernetes are preferred but not required.
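    As a sketch of the kind of Python scripting referenced above, the snippet below checks an EKS cluster's control-plane status with boto3 and counts running pods with the official Kubernetes client; the cluster name is hypothetical, and configured AWS credentials and a local kubeconfig are assumed.

    ```python
    # Minimal post-migration check: confirm the EKS control plane is ACTIVE and
    # count running pods. The cluster name is hypothetical; configured AWS
    # credentials and a local kubeconfig are assumed.
    import boto3
    from kubernetes import client, config

    def cluster_health(cluster_name: str) -> dict:
        eks = boto3.client("eks")
        status = eks.describe_cluster(name=cluster_name)["cluster"]["status"]

        config.load_kube_config()  # uses the current kubeconfig context
        pods = client.CoreV1Api().list_pod_for_all_namespaces().items
        running = sum(1 for p in pods if p.status.phase == "Running")

        return {"eks_status": status, "pods_total": len(pods), "pods_running": running}
    ```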

    Ideal Candidate:

    * A proven track record of leading complete containerization projects (end-to-end) and driving results.
    * Hands-on experience is essential; candidates with purely consulting backgrounds may not be the right fit.

    This role is perfect for those looking to take ownership of a significant migration project and contribute expertise in a high-impact environment.