Newcastle, England, United Kingdom · Full-time
We are one of the UK's largest and fastest-growing online print shops, filled to the brim with a passionate team who are mad about printing, designing, building, and growing the business. In the office you'll find we are forward-thinking, with a mission to be creative, innovative, and helpful to our customers.
We are constantly investing in new products and using technology to grow our market share and take advantage of new opportunities. Catch us in our production space and you'll find a fast-paced, dedicated team who strive for perfection for our customers.
We're a growing company with offices in London and Newcastle, looking for people who are equal parts dynamic and enthusiastic: people brimming with new, original ideas that they just can't wait to share with their team and bring to life.
*Candidates must be able to commute to Newcastle upon Tyne*
As a Data Engineer, you'll be central to our efforts to build and maintain robust data pipelines within the AWS ecosystem, focusing primarily on our Delta Lakehouse platform on Databricks. Your responsibilities will include:
· Data Ingestion: Build and optimize data ingestion pipelines using AWS services such as Lambda, Kinesis, and Glue.
· Data Transformation: Develop, maintain, and optimize transformation pipelines within our Delta Lake on Databricks.
· Data Modelling: Design and implement efficient and reliable data models to store and retrieve data.
· Collaboration: Work closely with data scientists, analysts, and stakeholders to understand data requirements and deliver optimal solutions.
· Data Quality: Ensure the reliability, integrity, and timeliness of data by implementing best practices and quality checks.
· Performance Optimization: Optimize queries and improve performance of data retrieval operations.
· Technology Stewardship: Stay updated with the latest advancements in AWS, Databricks, and the data engineering field in general. Provide recommendations on the adoption of new tools and technologies.
What we're looking for:
· AWS Services: Proficiency in AWS services such as Lambda, Kinesis, and Glue.
· Databricks & Delta Lake: Hands-on experience with Databricks and Delta Lakehouse architectures.
· Programming: Proficient in Python, Scala, and SQL for data engineering tasks.
· ETL Processes: Demonstrated expertise in designing, building, and maintaining ETL processes.
· Data Modelling: Experience in logical and physical data modelling with large datasets.
· Problem Solving: Strong analytical and problem-solving skills, with a focus on troubleshooting data issues.
Nice to have:
· Certification in AWS and/or Databricks.
· Experience in a similar role in the online retail industry.
· Knowledge of best practices in data governance and data security.
Benefits:
· 22 days' holiday, rising to 25
· Staff discounts & Friends and Family discounts
· Cycle to work scheme and Tech Scheme
· Breakfast and drinks provided
· One supported charity day per annum
· Summer and Christmas Parties
· Street food days
· Access to Perkbox portal