We are working with a well-established organization seeking an experienced Senior Data Engineer to support the design, development, and optimization of large-scale data platforms. The role is highly hands-on and suited to someone who has built end-to-end data pipelines across both batch and real-time workloads in cloud-native environments.
Hours per week: 40
Length of Contract: Minimum 6 months with likely extension
Onsite requirements: 2 days per week in Charlotte, 3 days remote
Required / Core Skills
* Strong experience as a Data Engineer in cloud-based environments
* Solid hands-on experience with Google Cloud Platform (GCP)
* Strong experience with BigQuery, including data modeling and performance optimization
* Proficiency in Python for building and maintaining data pipelines
* Experience building streaming and batch data pipelines (for example, using Pub/Sub, Kafka, or similar technologies)
* Strong SQL skills for data transformation and analytics use cases
* Experience orchestrating workflows using Apache Airflow (or Cloud Composer)
* Experience working with modern data architectures and end-to-end pipelines
Preferred Skills
* Exposure to Dataflow, Dataproc, or Apache Beam
* Experience with Databricks or Spark/PySpark
* Infrastructure-as-Code experience using Terraform
* CI/CD exposure for data engineering workflows
* Containerization experience with Docker and Kubernetes
* Monitoring and observability tools such as Prometheus or Grafana
* Exposure to other cloud platforms (AWS or Azure)
Those interested should apply to this advert with an up-to-date resume. We will be in touch shortly.
