• Location: USA, California, San Francisco
  • Salary: US$170000 - US$190000 per annum
  • Technology: AWS Jobs
  • Job Type: Permanent
  • Date Posted: 7th Oct, 2019
  • Reference: 10/07/19-jm_1570485756
Job Overview:

We are currently looking for a Team Leader within Data Engineering who has a passion for working with data and for building solutions that support this company's Analytics Systems Solution stack, which includes the Hortonworks Hadoop distribution, SAS (Linux environment), Tableau Server/Desktop, Google Cloud Platform, Looker, etc.

Business Requirements

This company works in a highly collaborative environment, working closely with cross-functional team members: Business Analysts, Product Managers, Data Analysts and Report Developers.

1 The Lead Engineer will be responsible for managing 4-5 data engineers.
2 He/she will be the primary point of contact for all stakeholders of the marketing analytics team.
3 At any given point, he/she will be responsible for managing 5-10 external stakeholders and 2-3 internal stakeholders.
4 The candidate will act as a liaison between Marketing Systems leadership and all end consumers of the solutions built by the team.
5 Be able to work in a global delivery model.

Technical Requirements

1 The successful candidate would have extensive experience as a data engineer or ETL developer building and automating data transformation and loading procedures.
2 Strong knowledge of and experience using Hive, SQL, SAS, Hadoop and GCP to conduct data profiling/discovery, data modeling and process automation are required.
3 The candidate must be comfortable working with data from multiple sources: Hadoop, DB2, Oracle and flat files.
4 The projects are detail intensive, requiring the accurate capture and translation of data requirements (both tactical and analytical needs) and validation of the working solution.

Essential Functions:

* Design, develop and implement end-to-end solutions on Hortonworks Hadoop distribution; strong ability to translate business requirements into technical design plan.
* Automate, deploy and support solutions scheduled on Crontab or Control-M. Deployment includes proper error handling, dependency controls and necessary alerts. Triage and resolve production issues and identify preventive controls.
* Build rapid prototypes or proof of concepts for project feasibility.
* Document technical design specifications explaining how business and functional requirements are met. Document operations run book procedures with each solution deployment.
* Identify and propose improvements for analytics eco-system solution design and architecture.
* Participate in Hadoop, GCP and SAS product support such as patches and release upgrades. Provide validation support for Hadoop, GCP and SAS products, including any changes to other infrastructure, systems or processes that impact the Analytics infrastructure.
* Participate in full SDLC framework using Agile/Lean methodology.
* Support non-production environments with the Operations and IT teams.
* Regular, dependable attendance & punctuality.
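The error-handling, dependency-control and alerting duties described above for Crontab-scheduled jobs can be sketched as a minimal wrapper script. This is an illustrative sketch only: the job name, paths, ready-flag and alert address are assumptions, not details from this role.

```shell
#!/bin/sh
# Minimal sketch of a cron-scheduled job wrapper with error handling.
# All names and paths below are illustrative assumptions.

JOB_NAME="daily_load"
LOG_DIR="${LOG_DIR:-/tmp/etl_logs}"
LOG_FILE="$LOG_DIR/${JOB_NAME}_$(date +%Y%m%d).log"
JOB_CMD="${JOB_CMD:-true}"       # in a real deployment: the ETL driver script
READY_FLAG="${READY_FLAG:-}"     # dependency control: upstream ready flag

mkdir -p "$LOG_DIR"

# Dependency control: skip the run until the upstream extract has landed.
if [ -n "$READY_FLAG" ] && [ ! -f "$READY_FLAG" ]; then
    echo "$(date) upstream not ready, skipping run" >> "$LOG_FILE"
    exit 0
fi

# Run the job, capturing stdout/stderr for the operations run book.
if $JOB_CMD >> "$LOG_FILE" 2>&1; then
    echo "$(date) $JOB_NAME succeeded" >> "$LOG_FILE"
else
    rc=$?
    echo "$(date) $JOB_NAME failed (rc=$rc)" >> "$LOG_FILE"
    # Alerting: in production this would mail or page on-call, e.g.
    #   tail -n 50 "$LOG_FILE" | mail -s "$JOB_NAME FAILED" oncall@example.com
    exit "$rc"
fi
```

A Crontab entry (e.g. `0 2 * * * /opt/etl/wrapper.sh`) would invoke such a wrapper nightly; Control-M deployments express the dependency controls and alerts natively instead.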

Qualifications:

Education/Experience:

* Degree in Computer Science/Engineering, Analytics, Statistics or equivalent work experience.
* 8+ years of work experience in Data Engineering, ETL Development and Data Analytics.
* 4+ years of hands-on experience using SQL and a scripting language such as Unix Shell or Python.
* 5+ years of hands-on experience developing on a Linux platform.
* 5+ years of hands-on experience working in a traditional RDBMS such as Oracle or DB2.
* 4+ years of hands-on experience working in Hadoop using Hive, HDFS, Tez, MapReduce and Sqoop.
* 4+ years of hands-on experience working with a scripting language such as Python or SAS.
* Strong knowledge of Hadoop / Big Data architecture and operational workings.
