A Tenth Revolution Group company

Your current job search

28 search results

For permanent and freelance roles, Architect

    New

    Senior Data Solutions Architect - London - GCP - £125,000

    England, London, City of London

    • £100,000 to £125,000 GBP
    • Architect role
    • Skills: Solutions Architect, Data Architect, Technical Architect, Solution Design, Google Cloud, GCP, Banking, Pre-Sales, Consultancy
    • Seniority: Senior

    Job description

    Senior Data Solutions Architect - London - GCP - £125,000



    Company Overview:

    My client is a trusted partner for major clients across a range of data-driven industries such as banking, insurance, healthcare, retail, and more. With a global presence spanning continents and tens of thousands of professionals, there is no better place to grow and advance your career. Committed to staying at the top of their industry, my client invests constantly in research and development and works with cutting-edge technology to stay ahead of trends. They believe their success stems from the calibre of people they employ and will do everything to support you and your development.

    Role Overview:

    As a Data Solutions Architect, you will be a senior member of a dynamic team of architects, leveraging deep architectural expertise, up to the enterprise level, to design and manage solutions of substantial scope.

    In this client-facing role, you will act as the strategic lead within a Data Management practice, leading fascinating data transformation projects within the financial sphere. Your ability to handle diverse data conversations and solve data-related issues will be key to driving success. This is an exciting opportunity to work with a global team, stay ahead of industry trends, and make a significant impact through your technical leadership and innovative solutions.

    Requirements:

    * Strong Data Solutions Architectural experience
    * Strong experience with GCP
    * Data Modelling Expertise
    * Pre-Sales Experience
    * Enterprise Level Architecture





    Interviews are ongoing, so don't miss your chance to secure the future of your career!



    Contact me at j.shaw-bollands@tenthrevolution.com or on 0191 338 6641.








    Principal Solutions Architect

    USA, Minnesota, Minneapolis

    • $210,000 to $225,000 USD
    • Architect role
    • Skills: AWS, Python, Data, Snowflake, ETL, AI/ML
    • Seniority: Senior

    Job description

    Job Description:
    We are seeking a highly skilled Principal Solutions Architect with deep expertise in AWS, cloud-native applications, and software engineering. This role is pivotal to our success and expansion within customer accounts, focusing on delivering high-quality, data-driven solutions that enable clients to become "AI Ready."

    The ideal candidate will have experience as a Chief Architect or in similar roles within end-user organizations, demonstrating a successful history of designing and delivering cloud-native solutions. We value individuals who possess a blend of end-user insights and consulting expertise.



    Key Responsibilities:

    * Cloud-Native Application Expertise: Utilize your in-depth knowledge of AWS to design, implement, and enhance cloud-native applications that align with customer needs.
    * Technical Leadership: Provide consulting, delivery oversight, and support in revenue generation activities to drive customer satisfaction and project success.
    * Stakeholder Engagement: Collaborate closely with key stakeholders, delivering tailored solutions while fostering strong relationships built on trust and transparency.
    * Delivery Oversight: Lead a team of engineers and architects or work independently, coordinating with delivery teams to ensure project milestones are achieved effectively.
    * Strategic Consulting: Guide clients through the intricacies of data modeling, AI integration, and other advanced analytics frameworks, enhancing their capabilities to prepare for future AI implementations.



    Qualifications:

    * Experience:

    * Extensive expertise in software development and architecture, particularly within cloud-native environments (AWS).
    * Strong background in data solutions, data modeling, and AI applications, with hands-on experience in Snowflake being a significant advantage.
    * Experienced in consulting and delivery oversight, with a focus on revenue generation and high-impact project delivery.



    * Adaptable Leadership: Ability to manage projects effectively, with a focus on mentoring and guiding team members



    * Technical Skills (an illustrative sketch follows this list):

    * Proficiency in AWS services such as EC2, Lambda, S3, RDS, and Redshift.
    * Experience with containerization and orchestration platforms, such as Docker and Kubernetes.
    * Familiarity with CI/CD pipelines and DevOps practices, using tools like Jenkins, GitLab CI, or AWS CodePipeline.
    * Strong knowledge of databases, including SQL and NoSQL technologies (e.g., DynamoDB, MongoDB).
    * Experience with data ingestion and processing frameworks (e.g., Apache Kafka, Apache Spark, or AWS Glue).
    * Understanding of microservices architecture and RESTful API design principles.
    * Familiarity with monitoring and logging tools (e.g., CloudWatch, ELK Stack) for operational excellence.
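
    For orientation only, here is a minimal sketch of the kind of cloud-native building block this list points at: a Lambda-style Python handler that records an order in DynamoDB and archives the raw payload to S3 with boto3. The table, bucket, and field names are invented for illustration and are not taken from the employer's stack.

        # Hypothetical sketch only - resource names are invented, not this employer's.
        import json
        import boto3

        s3 = boto3.client("s3")
        table = boto3.resource("dynamodb").Table("orders")  # hypothetical table name

        def handler(event, context):
            """Lambda-style entry point: persist the order, then archive the raw payload."""
            order = json.loads(event["body"])
            table.put_item(Item={"order_id": order["id"], "status": "received"})
            s3.put_object(
                Bucket="orders-raw-archive",               # hypothetical bucket name
                Key=f"raw/{order['id']}.json",
                Body=json.dumps(order).encode("utf-8"),
            )
            return {"statusCode": 200, "body": json.dumps({"ok": True})}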

    Databricks Lead - Implementation - St. Louis, MO

    USA, Missouri, St. Louis

    • $150,000 to $200,000 USD
    • Platform role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group are working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent utilization of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions on the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL (see the sketch after this list)
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery
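
    As a rough illustration of the pipeline work described above (not this employer's code), the following PySpark sketch reads raw files, applies a simple cleaning step, and writes the result to a Delta table. The source path and table name are assumptions.

        # Illustrative only - paths and table names are invented.
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

        raw = (spark.read
               .option("header", "true")
               .csv("/Volumes/main/raw/sales/"))    # hypothetical source location

        cleaned = (raw
                   .withColumn("amount", F.col("amount").cast("double"))
                   .dropDuplicates(["order_id"]))

        (cleaned.write
         .format("delta")
         .mode("overwrite")
         .saveAsTable("main.analytics.sales"))      # hypothetical Unity Catalog table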



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate a deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    Freelance Azure Cloud Architect

    Finland

    • Negotiable
    • Architect role
    • Skills: Azure, Cloud
    • Seniority: Senior

    Job description

    Freelance Azure Cloud Architect - Finnish Speaking



    Key Responsibilities



    * Design scalable, secure, and cost-optimised cloud architectures on Microsoft Azure.
    * Define target state cloud architecture and migration roadmaps in collaboration with business and IT stakeholders.
    * Drive adoption of Infrastructure as Code (IaC) using tools like Bicep and Terraform (see the sketch after this list).
    * Guide DevOps teams in implementing CI/CD pipelines and automated deployment strategies.
    * Conduct cloud readiness assessments and lead architecture reviews.
    * Ensure governance, security best practices, and compliance standards are built into solutions.
    * Act as a trusted advisor to senior stakeholders on cloud innovation and transformation strategies.
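
    The role itself names Bicep and Terraform for IaC; to keep the examples on this page in one language, the sketch below shows the same declarative, idempotent idea with the Azure SDK for Python instead. The subscription ID, resource group name, and tags are placeholders.

        # Sketch only - Bicep/Terraform are the tools the role asks for; this just
        # illustrates the provisioning concept in Python. All names are invented.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.resource import ResourceManagementClient

        credential = DefaultAzureCredential()
        client = ResourceManagementClient(credential, subscription_id="<subscription-id>")

        # create_or_update is idempotent - rerunning converges on the declared state,
        # the same principle IaC tooling builds on.
        client.resource_groups.create_or_update(
            "rg-platform-dev",   # hypothetical resource group
            {"location": "westeurope", "tags": {"env": "dev", "owner": "platform"}},
        )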



    Requirements



    * 5+ years in cloud architecture roles, with a strong Azure focus.
    * Deep knowledge of Azure services including Azure Functions, App Services, Key Vault, Azure AD, API Management, and more.
    * Strong understanding of enterprise networking, security, and hybrid cloud architecture.
    * Hands-on experience with IaC (Bicep, Terraform), DevOps (Azure DevOps, GitHub Actions), and containerisation (AKS, Docker).
    * Proven ability to lead architecture discussions and influence stakeholders across business and IT.
    * Excellent communication in Finnish and English.



    Preferred

    * Microsoft Certified: Azure Solutions Architect Expert (AZ-305).



    If you are interested, please reach out to me directly.

    T: +358 75 3252529

    E: j.leach@tenthrevolution.com

    Principal Solutions Architect

    USA, Minnesota, Minneapolis

    • $225,000 to $280,000 USD
    • Architect role
    • Skills: AWS, Python, Architect, Cloud Native, Java, Management, Technical Leader
    • Seniority: Senior

    Job description

    Role: Principal Solutions Architect

    Location: Remote, anywhere in the US

    * 25-30% travel

    Salary: Ideally $225k base, $275-280k OTE

    Company Overview:

    * IT Services and Consulting
    * 600 employees
    * Founded in 2014
    * Key benefits:

    * Remote-first work environment
    * Casual, award-winning small business work culture
    * Collaborative environment that values autonomy, creativity, and transparency
    * Competitive compensation, excellent benefits, 4 weeks of PTO, 10 holidays, and more
    * Accelerated learning and professional development through advanced training and certifications

    Role Requirements:

    * 10+ years as a hands-on Solutions Architect designing and implementing data solutions
    * 2+ years of Consulting leadership experience working with external customers, with the ability to multitask, prioritize, frequently change focus, and manage multiple projects
    * Experience in Technical Account Leadership, including developing a technology strategy, leading delivery teams, and collaborating with sales and other internal teams
    * Ability to develop end-to-end technical solutions into production, ensuring performance, security, scalability, and robust data integration
    * AWS Expert

    * Software Development background, architecture experience
    * Cloud Native Application expertise

    * Highest level of technical engineering
    * Can be either a hands-on individual contributor or a people manager overseeing delivery accounts
    * AI-ready mindset: many customers are leveraging AI
    * Experience in consulting, delivery oversight, and revenue generation
    * Strong stakeholder engagement experience; consulting experience preferred but not required
    * Programming expertise in Java, Python and/or Scala, SQL, and cloud data platforms such as Snowflake, AWS, Azure, Databricks, and GCP
    * Proven ability to lead and manage teams comprising Solution Architects and Data Engineers, fostering growth through coaching, mentoring, and performance management
    * Track record of collaborating with client stakeholders, technology partners, and cross-functional teams for seamless project delivery
    * Strong cross-practice relationships to drive customer success
    * Ownership mindset: committed to ensuring exceptional project execution
    * Client-facing written and verbal communication skills
    * Experience creating detailed solution documentation, including POCs, roadmaps, sequence diagrams, class hierarchies, logical system views, and client presentations

    Technical Background & Expertise in any of the following:

    * Production experience with core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
    * Cloud & Distributed Data Storage: S3, ADLS, HDFS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems
    * Data integration technologies: Spark, Kafka, event/streaming, Streamsets, Matillion, Fivetran, NiFi, AWS Data Migration Services, Azure DataFactory, Informatica IICS, Google DataProc
    * Experience working with multiple data sources: Queues, relational databases, files, search, API
    * Software development life cycle experience: Design, documentation, implementation, testing, deployment
    * Automated data transformation & curation: dbt, Spark, Spark Streaming, automated pipelines
    * Workflow Management & Orchestration: Airflow, AWS Managed Airflow, Luigi, NiFi (see the sketch after this list)
    * Methodologies: Agile Project Management, Data Modeling (e.g., Kimball, Data Vault)
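
    As a small illustration of the orchestration tooling named above (not a prescribed solution), here is a minimal Airflow DAG with two Python tasks; the DAG name, schedule, and task bodies are invented. In practice the callables would trigger Spark jobs, dbt runs, or warehouse loads rather than print statements.

        # Hypothetical sketch - DAG and task names are invented.
        from datetime import datetime
        from airflow import DAG
        from airflow.operators.python import PythonOperator

        def extract():
            print("pull from source")    # placeholder for a real extract step

        def load():
            print("write to warehouse")  # placeholder for a real load step

        with DAG(
            dag_id="daily_sales_pipeline",
            start_date=datetime(2024, 1, 1),
            schedule="@daily",           # Airflow 2.4+ argument; older releases use schedule_interval
            catchup=False,
        ) as dag:
            extract_task = PythonOperator(task_id="extract", python_callable=extract)
            load_task = PythonOperator(task_id="load", python_callable=load)
            extract_task >> load_task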

    Principal Solutions Architect (MLOps) - US Remote

    USA, Texas

    • $200,000 to $225,000 USD
    • Architect role
    • Skills: AWS, Snowflake, GCP, AI, ML, Machine Learning, MLOps, Principal Solutions Architect, PSA, Senior Solutions Architect, Remote, Data Engineer, ML Engineer, Machine Learning Engineer, USA
    • Seniority: Senior

    Job description

    Remote Role

    A leading organization at the forefront of modern data solutions is seeking a Principal Solutions Architect specializing in Machine Learning. Partnering with major cloud providers such as AWS, Azure, GCP, and Snowflake, their team helps enterprises navigate their most complex data challenges with innovative and scalable solutions.

    The company fosters a vibrant culture rooted in curiosity, ownership, and mutual trust. Despite their rapid growth, they maintain an engaging, flexible environment focused on delivering excellence while supporting personal and professional development.



    Key Highlights:

    * Multiple-time Partner of the Year with top cloud platforms
    * 600+ advanced cloud certifications (AWS, Azure, Snowflake, and more)
    * Recognized as a best place to work across multiple regions



    Role Overview: In this role, you will lead the architecture and delivery of advanced AI/ML solutions, while simultaneously cultivating client relationships and uncovering new growth opportunities. This position demands a rare blend of technical leadership, client advocacy, and strategic thinking.



    Primary Responsibilities:

    Technical Leadership & Delivery:

    * Design end-to-end AI/ML solutions ensuring scalability, robustness, and business alignment
    * Lead teams of engineers and data scientists to build sophisticated ML platforms, pipelines, and products
    * Articulate the business impact and ROI of machine learning initiatives to diverse audiences
    * Mentor team members on emerging best practices and industry standards
    * Stay current on evolving AI/ML tools and methodologies to shape future solutions

    Client Engagement & Account Development:

    * Forge strong client relationships to understand their AI/ML vision and translate it into actionable roadmaps
    * Present compelling technical strategies that drive client adoption and success
    * Identify opportunities for additional client value through expanded AI/ML solutions
    * Lead responses to RFIs and RFPs, emphasizing innovative and results-driven approaches



    Required Qualifications:

    * Extensive experience in AI/ML architecture, software engineering, or data science
    * Significant leadership experience in a consulting environment
    * Bachelor's degree in Computer Science, Engineering, or related field
    * Strong expertise deploying machine learning models in production environments
    * Demonstrated success in account growth, solution pre-sales, and proposal development
    * Deep knowledge of AI/ML ecosystems (AWS, GCP, Azure, Databricks, Vertex AI, SageMaker)
    * Proficient in Python, Scala, Java, or other modern programming languages
    * Experience with building scalable data pipelines and API/web server development
    * Full software development lifecycle experience



    Preferred Skills:

    * Master's degree or other advanced qualifications
    * Contributions to open-source AI/ML projects
    * Familiarity with ML libraries (TensorFlow, Keras, scikit-learn, h2o, etc.)
    * Experience with Docker, Kubernetes, MLflow, and Sagemaker/Azure ML



    Benefits:

    * Remote-first work environment
    * Award-winning, collaborative culture emphasizing creativity and autonomy
    * Competitive compensation and benefits, including 4 weeks PTO plus holidays
    * Ongoing investment in advanced training and professional certifications



    Those interested should apply to this advertisement with their resume & contact details. Candidates are being shortlisted this week.

    US Remote - Solution Architect, MLOps

    USA, Texas

    • $175,000 to $190,000 USD
    • Architect role
    • Skills: AWS, Python, Snowflake, Databricks, MLOps, Machine Learning, PySpark, Architect, Engineer, Texas, Oklahoma, USA, Remote, Data Engineering
    • Seniority: Senior

    Job description

    A leading global data consultancy is searching for an experienced Machine Learning Solutions Architect to join its growing, remote-first team. Partnering with major cloud platforms (Primarily Snowflake, AWS & Databricks), this company delivers cutting-edge services designed to solve complex data challenges for large enterprises.

    What You'll Do:

    * Architect and implement data science deployment solutions tailored to enterprise client needs, focusing on model inference, monitoring, retraining, and infrastructure setup (see the sketch after this list).
    * Collaborate with data scientists to prepare and transform data into production-ready formats.
    * Build environments for model development and testing, ensuring operational readiness and maintainability.
    * Design integration strategies with existing business systems to ensure scalability and long-term success.
    * Lead efforts in model QA, testing, validation, and production rollout.
    * Champion best practices for software development, model deployment, and infrastructure.
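
    To make the first bullet more concrete, here is a hedged sketch of the kind of model tracking step an MLOps architect might put in place, using MLflow with scikit-learn. The experiment and metric names are assumptions, not part of this client's stack.

        # Hypothetical sketch - experiment and model names are invented.
        import mlflow
        import mlflow.sklearn
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=500, random_state=42)

        mlflow.set_experiment("churn-demo")              # hypothetical experiment
        with mlflow.start_run():
            model = LogisticRegression(max_iter=200).fit(X, y)
            mlflow.log_metric("train_accuracy", model.score(X, y))
            mlflow.sklearn.log_model(model, "model")     # logged artifact, ready for registration/serving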

    What You Bring:

    * 6+ years of experience in roles such as ML Engineer, Software Engineer, or Data Engineer; however, a background in data engineering is crucial for the role.
    * Strong background in Python, Scala, Java, or comparable languages.
    * Proficient in deploying models in real-world production environments.
    * Expertise in SQL and distributed computing tools (e.g., Spark, Snowflake, Databricks).
    * Familiarity with various data sources and systems (Kafka, RDBMS, cloud platforms).
    * Solid grasp of cloud infrastructure and architecture (e.g., AWS, Azure, GCP).
    * Experience in developing APIs or backend applications using modern frameworks.

    Preferred Qualifications:

    * Advanced degree in a relevant technical field.
    * Hands-on experience with ML tools and frameworks such as TensorFlow, Keras, scikit-learn, or H2O.
    * Experience with containerization tools like Docker or Kubernetes.
    * Familiarity with model management tools (MLflow, Sagemaker, Azure ML).
    * Contributions to open-source projects or relevant personal initiatives.

    Benefits:

    * Flexible, remote-first working model.
    * Culture that values creativity, autonomy, and team collaboration.
    * Competitive compensation, strong benefits, and generous PTO.
    * Ongoing professional development including certification opportunities.
    * Inclusive environment where diverse perspectives are welcomed and celebrated.



    Equal Opportunity:
    This company is committed to diversity and inclusion, offering a workplace that supports all employees regardless of background, identity, or experience. Accommodations are available for applicants with disabilities.

    Databricks Lead - Implementation - Minneapolis, MN

    USA, Minnesota, Minneapolis

    • $150,000 to $200,000 USD
    • Platform role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group are working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent utilization of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions on the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks (see the sketch after this list)
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery
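
    As an illustrative sketch of the governance point above (assumed catalog, schema, and group names, not this organization's setup), access control is often scripted from a notebook with Unity Catalog GRANT statements:

        # Hypothetical sketch - catalog, schema, table, and group names are invented.
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.getOrCreate()  # supplied automatically on Databricks

        # Grant an analyst group read-only access to a curated schema.
        spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_analysts`")
        spark.sql("GRANT USE SCHEMA ON SCHEMA main.analytics TO `data_analysts`")
        spark.sql("GRANT SELECT ON TABLE main.analytics.sales TO `data_analysts`")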



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate a deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    Databricks Implementation Lead

    USA, Iowa, Iowa City

    • $150,000 to $200,000 USD
    • Platform role
    • Skills: Databricks, Delta Lake Tables
    • Seniority: Senior

    Job description

    Tenth Revolution Group are working with a leading manufacturing organization that is seeking a technically strong professional to lead and manage the implementation, rollout, and subsequent utilization of Databricks across the organization. The successful candidate will be responsible for designing and deploying scalable data pipelines and advanced analytics solutions on the Databricks platform.



    Technical Responsibilities:

    * Architect and implement data solutions using Databricks, including Delta Lake and Unity Catalog
    * Build and optimize ETL/ELT pipelines using PySpark and SQL
    * Configure and manage Databricks Workspaces, clusters, and job workflows
    * Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data sources
    * Implement CI/CD processes for Databricks using tools like Azure DevOps or GitHub Actions
    * Set up data governance, access control, and monitoring within Databricks
    * Collaborate with data engineers, platform teams, and stakeholders on technical delivery



    Required Technical Expertise:

    * Databricks (core platform, Delta Lake, Unity Catalog)
    * PySpark and Spark SQL
    * Cloud data services (AWS/GCP/Azure)
    * CI/CD automation for data pipelines
    * Git-based version control and release processes
    * Data lake architecture and performance tuning



    Candidates for this role must be able to demonstrate a deep technical fluency in Databricks and a strong track record of delivering complex data infrastructure projects.

    AWS Architect Jobs

    Durchstöbern Sie bei Jefferson Frank eine große Auswahl an AWS Architect-Jobs. Bewerben Sie sich auf eine Stelle, um mit einem erfahrenen Recruitment Consultant in Kontakt gebracht zu werden. Ihr Consultant wird sicherstellen, dass Sie eine passende Stelle oder ein passendes Projekt finden. Bewerben Sie sich jetzt.