- Design, build and maintain the bank's semantic data assets that will drive insight activities for the organisation, integrating new data and supporting prototype activity.
- Build safe, automated, and robust Data Flows and Assets. Ensure that fit-for-purpose validations and controls exist in our Data Flows and Assets to evidence the accuracy and timeliness of our data.
- Maintain and deliver best-in-class data assets, striving to continuously improve data assets and processing. Ensure data is well structured, efficient, and extensible, building data assets that deliver business value time and again.
- Identify and leverage new and existing data domains, ensuring integration with the semantic data layer adds value.
- Creation and maintenance of sustainable and accurate data solutions, tailored to the needs of the business, that comply with governance, architecture, and risk frameworks.
- Ensure builds align with Data Governance, Architecture and Modelling Principles. Work to build, monitor and maintain fit-for-purpose documentation, such as Data Dictionaries and Entity Relationship Diagrams, for our databases and data assets.
- Maintenance of existing Data Flows and Assets. Make changes to Data Flows as necessary to ensure they run smoothly or to enhance them as required by the business.
- Work closely with Data Insight Developers and Data Engineers to deliver new data requirements or changes and assist with the integration of data sources. Follow well-structured version control methods to ensure that we can roll back to previous versions and can maintain an audit trail of changes made.
- Strong communication and presentation skills
- Proven ability and initiative to learn and research new concepts, ideas, and technologies quickly
- Strong systems/process orientation with demonstrated analytical thinking, organisational skills and problem-solving skills
- Thrives on building things from the ground up and enjoys reaching for the highest performance levels
- Ability to work in a team-oriented, collaborative environment
- Strong ETL skills, balancing short- and long-term needs
- Proven track record of delivering integrated data to drive meaningful self-serve analysis and insight
- Experience in semantic-layer data model creation and maintenance, including real-time and batch-based data loads
- Implementing Data Management controls
- Data Modelling/Data Model Interpretation and Database/Data Solution Design
- Data Analysis/Profiling
- Data processing and model optimisation, including performance tuning
- Experience programming with cloud-based tools (Python, Spark, Airflow, etc.) is essential
- Experience with SAS, SAS EG, DI Studio, SAS SQL, R and Shiny (preferred)
- JIRA, Confluence, GitHub, Jenkins knowledge (preferred)
Apply with your CV below or reach out to Steven Mckay at Jefferson Frank directly: email@example.com.