Role & Responsibilities
Provide technical leadership in architecting and designing end-to-end solutions for data lake initiatives as our client expands the services it offers consumers across multiple digital channels. Ensure the current implementation of big data solutions is in line with industry best practices and standards. Drive improvements in the current design, processes, and implementation to improve operational management, scalability, and extensibility.
* Gather business and data requirements from business analysts across the various functional and business areas, then design and manage enterprise big data architecture, data governance, and automated data reconciliation.
* Hands-on technical data architect with extensive experience in designing big data solutions using open-source and cloud-native technologies.
* Technical line management of a remote big data engineering team (10-15 engineers). Responsible for engineering excellence, resource management, hiring, coaching engineers, and timely delivery of committed work.
* Create and maintain big data architectural and design artefacts to ensure up-to-date documentation for the relevant programs and projects is readily available and knowledge is disseminated.
* Contribute to the evolution and planning of the enterprise architecture from a data perspective by ensuring solutions are developed in line with the roadmap.
* Collaborate with a wide range of stakeholders to identify and define customer needs
* Participate in partner and partner-resource selection to ensure the right partners and resources are chosen to assist the company in development and engineering.
* Be the voice of the customer in the context and value of the business requirements
* Manage the Technical Product Backlog of epics and features and drive prioritisation
* Support data lake design, development, and release scheduling, and maintain a development roadmap.
* Clearly articulate the pros and cons of various technologies, platforms, and architectural options, and document use cases, solutions, and recommendations.
* Advocate and champion DataOps within the business; responsible for ensuring the existing data lake solution runs smoothly on a day-to-day basis.
* Maintain your own skills and knowledge in the field of enterprise architecture and your functional domain specialisation. Actively contribute to the definition of the overall standards for design, development/configuration (including tools), and related documentation.
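As an illustration of the automated data reconciliation responsibility above, a minimal sketch in Python might compare row counts and a per-column checksum between a source and a target dataset. The function names, the in-memory data, and the sum-based checksum approach are all assumptions for illustration, not part of this role description:

```python
# Illustrative data-reconciliation check: compare row counts and a simple
# per-column checksum between a "source" and a "target" dataset.
# All data, names, and the checksum scheme here are hypothetical examples.

from typing import Dict, List


def column_checksum(rows: List[Dict[str, float]], column: str) -> float:
    """Sum a numeric column as a cheap reconciliation checksum."""
    return sum(row[column] for row in rows)


def reconcile(source: List[Dict[str, float]],
              target: List[Dict[str, float]],
              columns: List[str]) -> Dict[str, bool]:
    """Return per-check pass/fail results for row counts and column checksums."""
    results = {"row_count": len(source) == len(target)}
    for col in columns:
        results[f"checksum:{col}"] = (
            column_checksum(source, col) == column_checksum(target, col)
        )
    return results


if __name__ == "__main__":
    source = [{"amount": 10.0}, {"amount": 5.5}]
    target = [{"amount": 10.0}, {"amount": 5.5}]
    print(reconcile(source, target, ["amount"]))  # all checks pass
```

In practice the same pattern would run against warehouse tables (e.g. via Spark or SQL aggregates) rather than in-memory lists, with failed checks raising alerts in the pipeline.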
Skills & Qualifications
* 5+ years of demonstrable experience owning and developing big data solutions using Hadoop, Hive/HBase, Spark, Databricks, and ETL/ELT.
* Experience designing data solutions following Agile practices (SAFe methodology): designing for testability, deployability, and releasability; rapid prototyping; data modelling; and decentralised innovation.
* A DataOps mindset: allowing the architecture of a system to evolve continuously over time while simultaneously supporting the needs of current users.
* Create and maintain the Architectural Runway and Non-Functional Requirements.
* Design for a Continuous Delivery Pipeline (CI/CD for data pipelines) that enables built-in quality and security from the start.
* Demonstrable understanding, and ideally use, of at least one recognised architecture framework or standard, e.g. TOGAF or the Zachman Framework.
* Ability to apply data, research, professional judgment, and experience to ensure our products make the biggest difference to consumers.
* Excellent written, verbal, and social skills - you will interact with all types of people (user experience designers, developers, managers, marketers, etc.)
* Ability to work independently with minimal supervision in a fast-paced, multi-project environment.
Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, on-premises SQL data warehouse; BSS, OSS & Enterprise Support Systems