Have you ever wanted to work with global clients on a daily basis? Getting your teeth into projects lasting 9-15 months, working with cutting-edge technologies such as Redshift, SQL, Oracle, S3 buckets and more. My client has recently won a major project and is looking to grow the company by 25-30 new data engineers before the end of 2022.
The group provides a wide range of data and analytics solutions in support of our client's business priorities: maximise revenues, bear down on fraud, and respond to current Covid challenges. This role is a unique chance to design, develop, test and support data and analytics software solutions, delivering key critical systems to the public sector. You will be part of an Agile software delivery team working closely with software architects and supported by product managers, scrum masters and solutions architects.
* You will be put through certifications and upskilled to the best of your ability
* There is also a mentorship programme (10 days of training a year, on any topic you choose)
* Looking to hire 25-35 candidates this year as the business continues to grow
* Flat structure within the business
* Opportunity to move internally, as they hire from within before going to external agencies
* Working for a growing company, averaging around 30% growth every year
Why this role?
Work as part of an Agile software delivery team; typically delivering within an Agile Scrum framework. You will work on the full range of software engineering principles; covering requirements gathering and analysis, solutions design, software coding and development, testing, implementation and operational support. You will contribute to the software engineering communities by providing your ideas for innovation and process improvement and coaching and mentoring other team members.
Good experience in the following technologies is recommended:
* ETL toolset (Talend, Pentaho, SAS DI, Informatica, etc.)
* Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc.)
* Data modelling (Data Warehouse, Marts)
* Job scheduling toolset (Job Scheduler, TWS, etc.)
* Programming and scripting (PL/SQL, SQL, Unix shell, Python, Hive/HiveQL, HDFS, Impala, etc.)
Good to have experience in the following technologies:
* Data virtualisation tools (Denodo)
* Reporting (Pentaho BA, Power BI, Business Objects)
* Data Analytics toolset (SAS Viya)
* Cloud (AWS, Azure, GCP)
* ALM Tooling (Jira, Confluence, Bitbucket)
* CI/CD toolsets (Gitlab, Jenkins, Ansible)
* Test Automation (TOSCA)
You should have experience as a software engineer delivering large-scale data analytics solutions and the ability to operate at all stages of the software engineering lifecycle, as well as some experience in the following:
* Awareness of DevOps culture and modern engineering practices
* Experience of Agile Scrum-based delivery
* Proactive in nature, personal drive, enthusiasm, willingness to learn
* Excellent communication skills, including stakeholder management
Jefferson Frank is the Amazon Web Services (AWS) recruiter of choice. We work with organizations worldwide to find and deliver the best AWS professionals on the planet. Backed by private equity firm TPG Growth, we have a proven track record servicing the AWS recruitment market and, to date, have worked with over 30,000 organizations globally.