Job Description: You will be responsible for modeling, implementing, testing, deploying, and supporting our data products and client-facing interfaces (APIs). You will also lead the setup and ongoing support of AWS services.
Role & Responsibilities
* Hands-on development of deep learning models
* Ensure modules are pushed into data stores such as Apache for REST API endpoints
* Design and build reliable, scalable data infrastructure to safeguard data using AWS
* Design and build scalable cost-effective solutions for predictive analytics
* Design and Architect frameworks through serverless architecture
* Take ownership of and scale data models (Tableau, DynamoDB, Kibana)
* Communicate data and backend findings to external stakeholders
* Build frameworks for data ingestion pipelines and ETL/ELT processes
Skills & Qualifications
* 5+ years' experience working with enterprise data solutions
* Hands-on experience with public cloud environments and on-prem infrastructure
* Serverless and streaming technologies (Docker, Spark, Kafka, Airflow, Kubernetes)
* SQL skills and Python coding
* AWS Certified Big Data – Specialty certification
* Big data stack (Spark, Redshift Spectrum, Flume, Kafka, Kinesis, etc.)
* Data streaming (Kafka, SQS/SNS queuing, etc.)
* AWS services (S3, Lambda, Redshift, Glue, EC2, etc.)
* Experience with BI tools (Tableau, Domo, MicroStrategy) is a plus
Benefits
* 401(k)
* Medical/Dental
* PTO
* And more
