Ref: DSA21423_1676384331

Data Solutions Architect

USA, Maryland

Job description

Looking to make a difference? Our Center of Excellence, focused on the Data Lake Initiative, is seeking an experienced Data Solution Architect to spearhead crucial data-related projects. The Data Solution Architect is responsible for delivering new AWS, Snowflake, and integration capabilities to our commercial teams. Working closely with product stakeholders, you will prioritize, develop, and implement exciting new solutions on the AWS/Snowflake platform.

Responsibilities:
* Serve as the bridge between the Product Owner and the technical developers, transforming new requirements into designs, development approaches, and implementations
* Design, build, and implement the Data Lake, including ingesting commercial data from heterogeneous sources and augmenting it to serve both descriptive and predictive analytics.
* Lead highly skilled AWS developers, Snowflake developers, UI developers, and testers working with technologies such as Snowflake PL/SQL, StreamSets, AWS Glue, Lambda, Airflow, shell scripts, and React apps on EBS.
* Perform code reviews across applications, including Python Lambda code and Snowflake stored procedures, and tune the code to meet industry best practices (a minimal sketch follows this list).
* Design and develop solutions to meet both business and central technology requirements
* Lead estimation efforts, working with architects to scope work and with the business to prioritize it
* Approve the technical path and implementation as part of the formal validation process
* Provide support on issues before, during, and after deployment
* Balance the needs of both individual product teams and the broader Salesforce program
* Represent "The Team" in a fast-paced Agile environment
* Support and strengthen technical knowledge base across the BSC Salesforce community
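
For illustration only, a minimal sketch of the kind of Python Lambda handler the code reviews above would cover; the bucket name, key prefix, and event shape are hypothetical.

import json
import boto3

# Create the S3 client once, outside the handler, so warm Lambda
# invocations reuse the connection (a common review finding).
s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Validate the incoming payload before touching downstream systems.
    records = event.get("records", [])
    if not records:
        return {"statusCode": 400, "body": "no records supplied"}

    # Land the raw payload in the data lake's ingestion bucket
    # (hypothetical bucket and key naming).
    s3.put_object(
        Bucket="commercial-data-lake-raw",
        Key=f"ingest/{context.aws_request_id}.json",
        Body=json.dumps(records),
    )
    return {"statusCode": 200, "body": f"ingested {len(records)} records"}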

Basic Qualifications:
* Bachelor's degree plus 8-10 years of experience leading Data Engineering teams
* 10+ years of proficiency in writing advanced SQL and PL/SQL in Snowflake or Oracle.
* 8+ years of extensive experience with multiple AWS services, particularly Lambda and CloudFormation
* Should be proficient in core Python, specifically slicing and dictionaries.
* Expertise in Python libraries such as PySpark, NumPy, and pandas; scikit-learn is a plus
* Should have worked on at least three (3) big data projects implemented using PySpark or Scala Spark.
* Experience in modeling data warehouses and data marts.
* Proven track record of designing intelligent data models that optimize data storage and retrieval, and of securely sourcing external data from multiple sources.
* Experience addressing data, scaling, and product challenges, and designing, building, and launching reliable data pipelines both large and small.
* Experience with ETL tools such as IICS, StreamSets, Data Factory, etc.
* Familiarity with major applications such as Salesforce, SAP, ServiceNow, HANA, and Zoom
* Expertise in release management; should have worked with Git, Jenkins, or other CI/CD tools
* Experience consuming data from REST APIs using Python or Java, with the ability to parse JSON/XML data (see the sketch after this list)
* Hands-on experience with reporting tools such as Tableau, Power BI, SSRS, etc.
* Strong and effective written, verbal, and presentation skills, with the ability to collaborate with team members
* Familiar with Jira and other Atlassian tools for agile lifecycle management
* Familiar with quality assurance and documentation practices within software development
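
For illustration only, a minimal sketch of consuming a paginated REST API and parsing its JSON with Python; the endpoint path, pagination fields, and token handling are hypothetical.

import requests

def fetch_accounts(base_url: str, token: str) -> list[dict]:
    """Pull account records from a hypothetical paginated REST endpoint."""
    accounts: list[dict] = []
    page = 1
    while True:
        resp = requests.get(
            f"{base_url}/accounts",
            params={"page": page},
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()      # surface HTTP errors early
        payload = resp.json()        # parse the JSON response body
        accounts.extend(payload["results"])
        if not payload.get("next"):  # stop when no further pages remain
            break
        page += 1
    return accounts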

Preferred Qualifications:
* AWS Platform Certifications (Developer, Solution Architect)
* SnowPro Core Certification
* Knowledge of ML and AI
* Experience with Airflow, AWS Glue, and StreamSets
* Expertise in Jupyter notebook
* Experience with front-end web development frameworks, including hands-on development experience with React
* Expertise with Agile software development methods and processes