To develop and maintain a complete data architecture across several application platforms, providing capability across those platforms. To design, build, operationalise, secure, and monitor data pipelines and data stores in line with the applicable architecture solution designs, standards, policies, and governance requirements, making data accessible for evaluation and optimisation by downstream use cases. To execute data engineering duties according to standards, frameworks, and roadmaps.
Qualifications :
- Information Studies or Information Technology
- Basic cloud certificates - DP-900, DP-203, or DP-700
Experience :
- Data modelling & warehousing experience.
- Data pipelines experience - Extract, Transform, and Load (ETL).
- Knowledge of and experience with the following tools: SSIS, SSDT, Power BI, MS SQL, Python.
- Understanding of data quality issues and reconciliation frameworks.
- Performance optimisation or code reverse-engineering experience.
- Experience setting up and using DevOps for deployments, or at least a solid understanding of it.
- Azure cloud experience.
- Azure data engineering tools (optional), e.g. ADF, Synapse Analytics Studio, or MS Fabric.

Additional Information :
Behavioral Competencies :
- Adopting Practical Approaches
- Articulating Information
- Checking Details
- Developing Expertise
- Documenting Facts
- Embracing Change
- Examining Information
- Interpreting Data
- Managing Tasks
- Producing Output
- Taking Action
- Team Working

Technical Competencies :
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- Data Quality
- IT Knowledge
- Stakeholder Management (IT)

Remote Work : No
Employment Type : Full-time
Key Skills
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Experience : years
Vacancy : 1