• Location: USA, Texas, Houston
  • Salary: US$100 - US$110 per hour
  • Technology: AWS Jobs
  • Job Type: Contract
  • Date Posted: 8th Oct, 2019
  • Reference: Data1_1570540415
We are looking for a savvy Data Engineer with heavy cloud experience to join our growing team. The hire will be responsible for expanding and optimizing data and data pipeline architecture for our clients, as well as optimizing data flow and collection. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support clients on data initiatives and ensure that optimal data delivery architecture is applied consistently across ongoing projects. They must be self-directed and comfortable supporting the data needs of the clients.

*Create and maintain optimal data pipeline architecture.
*Assemble large, complex data sets that meet functional and non-functional business requirements.
*Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
*Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
*Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
*Work with clients to assist with data-related technical issues and support their data infrastructure needs.
*Work with data and analytics experts to strive for greater functionality across data systems.

Customer Environment: Current customer cloud is AWS. This project will be built in AWS and connect to the Starsteer SOLO web APIs.
Required Skills
The data engineer will need to have the following skills:
1. Experience building data pipelines in AWS (Lambda, S3, Redshift, EC2, etc.).
2. Experience with cloud data warehousing using Snowflake and Snowpipe (strong Redshift experience is an acceptable alternative).
3. Significant experience using SQL.
4. Significant experience using Python and Java.
5. Experience using Databricks with Spark and PySpark.
6. Experience building APIs.
7. Good written and verbal communication skills.
Optional Skills
A good candidate will also have experience with the following:
1. Some experience in the oil and gas domain.
2. Some experience with cloud security and data warehouse security.
3. Some experience with data modelling.
Roles & Responsibilities
The data engineer will work with the enterprise architect to validate the design and provide the primary build-out effort for the solution.