Talent.com
Data Engineer

Creative Leadership Solutions, Pretoria, South Africa
19 days ago
Job description

REQUIREMENTS

Minimum education (essential):

  • BSc in Computer Science, Engineering or relevant field

Minimum applicable experience (years):

  • 2-4 years

Required nature of experience:

  • Experience with SQL Server and Azure Synapse Analytics / Microsoft Fabric for query writing, indexing, performance tuning and schema design.
  • Hands-on experience developing ETL pipelines, including data extraction from REST / SOAP APIs, databases and flat files.
  • Proficiency in data transformation using Python and Azure-native tools.
  • Experience with data warehousing.
  • Background in data modelling, including dimensional modelling, schema evolution and versioning.
  • Practical knowledge of cloud-based data storage and processing using Azure Blob Storage.
  • Familiarity with pipeline optimisation, fault tolerance, monitoring and security best practices.
  • Experience developing web applications using C# and the .NET platform.
  • Experience with front-end development using Blazor, React.js, JavaScript / TypeScript, HTML, CSS / SCSS.
Skills and Knowledge (essential):

  • SQL Server, Azure Synapse Analytics, Azure Blob Storage, Microsoft Fabric
  • Python
  • REST / SOAP APIs, Data Extraction, Transformation, Loading (ETL)
  • Azure Data Factory, Pipeline Orchestration
  • Dimensional Modelling, Schema Evolution, Data Warehousing
  • Power BI
  • Performance Optimisation, Indexing, Query Tuning
  • Cloud Data Processing, Backups
  • C#, .NET, Blazor
  • JavaScript / TypeScript, HTML, CSS / SCSS
Other:

  • Proficient in Afrikaans and English
  • Own transport and license
KEY PERFORMANCE AREAS AND OBJECTIVES

    ETL and Pipeline Development

  • Design, build, and orchestrate efficient ETL pipelines using Azure Synapse for both batch and near-real-time data ingestion.
  • Extract data from a variety of structured and unstructured sources including REST APIs, SOAP APIs, databases, and flat files.
  • Apply robust data transformation logic using Python and native Azure Synapse transformation tools.
  • Optimise data flows for performance, scalability, and cost-effectiveness.
  • Implement retry mechanisms, logging and monitoring within pipelines to ensure data integrity and fault tolerance.
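The retry, logging and fault-tolerance duties above can be sketched in Python. This is an illustrative sketch only, not part of the role description: the function names and the paged-source shape are assumptions.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def with_retries(fetch, attempts=3, base_delay=0.2):
    """Run fetch(), retrying with exponential backoff and logging each
    failure -- the retry/logging mechanism described above."""
    for attempt in range(1, attempts + 1):
        try:
            return fetch()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))


def extract_all_pages(fetch_page):
    """Drain a paged source (e.g. a REST API) until it returns an empty
    batch, retrying each page independently."""
    page, rows = 0, []
    while True:
        batch = with_retries(lambda p=page: fetch_page(p))
        if not batch:
            return rows
        rows.extend(batch)
        page += 1
```

In practice `fetch_page` would wrap an HTTP call or database query; keeping the retry policy separate from the extraction loop makes both easier to test and monitor.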
    Data Architecture and Management

  • Design and manage scalable and efficient data architectures using Microsoft SQL Server and Azure services, including Synapse Analytics / Microsoft Fabric and Blob Storage.
  • Develop robust schema designs, indexes and query strategies to support analytical and operational workloads.
  • Support schema evolution and version control, ensuring long-term maintainability and consistency across datasets.
  • Implement and maintain metadata repositories and data dictionaries for improved data governance and transparency.
  • Define and maintain role-based access control to ensure data security and compliance.
    Data Warehousing and BI Integration

  • Architect and manage enterprise data warehouses using Azure Synapse Analytics.
  • Apply best practices for data loading, partitioning strategies, and storage optimisation.
  • Integrate warehousing solutions with Power BI and other analytics platforms for seamless business intelligence consumption.
    Data Modelling & Standards

  • Develop and maintain conceptual, logical and physical data models.
  • Implement dimensional modelling techniques (e.g., star / snowflake schemas) to support advanced analytics and reporting.
  • Apply normalisation standards and relational modelling techniques to support OLTP and OLAP workloads.
  • Ensure consistency of data models across systems and support schema versioning and evolution.
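The dimensional-modelling duty above (e.g. a star schema with surrogate keys) can be sketched in plain Python; the column names are hypothetical and the logic is a minimal illustration, not the employer's method.

```python
def build_star(rows, dim_cols, measure_cols):
    """Split flat extract rows into one dimension table (with surrogate
    keys) and a fact table referencing it -- a minimal star-schema build."""
    dim, dim_keys, facts = [], {}, []
    for row in rows:
        dim_values = tuple(row[c] for c in dim_cols)
        if dim_values not in dim_keys:
            dim_keys[dim_values] = len(dim) + 1          # surrogate key
            dim.append({"dim_key": dim_keys[dim_values],
                        **dict(zip(dim_cols, dim_values))})
        facts.append({"dim_key": dim_keys[dim_values],
                      **{c: row[c] for c in measure_cols}})
    return dim, facts


# Illustrative usage with made-up sales columns:
sales = [
    {"product": "widget", "region": "GP", "qty": 3, "amount": 90.0},
    {"product": "widget", "region": "GP", "qty": 1, "amount": 30.0},
]
dim, facts = build_star(sales, ["product", "region"], ["qty", "amount"])
```

Both facts share one dimension row, which is the point of the star shape: descriptive attributes live once in the dimension, while the fact table stays narrow for analytical queries.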
    Reporting and Communication

  • Provide clear, timely updates on task status and progress to senior developers / management.
  • Contribute to reports, manuals, and other documentation related to software status, operation, and maintenance.
  • Collaborate effectively with team members and stakeholders using the appropriate communication channels.
  • Maintain system and product change logs and release notes according to company standards.
    Automation, Monitoring and Optimisation

  • Automate recurring data engineering tasks and deploy solutions with CI / CD best practices.
  • Implement monitoring and alerting mechanisms to detect data quality issues and pipeline failures.
  • Analyse and optimise query performance across platforms (SQL Server, Azure Synapse).
  • Support scalability planning and cost control by monitoring pipeline execution and resource usage.
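The monitoring-and-alerting duty above can be sketched as a simple post-run health check; the thresholds, names and alert channel are hypothetical.

```python
def check_pipeline_run(row_count, expected_min, failures, alert):
    """Fire a single alert message when a run looks unhealthy: too few
    rows loaded, or any failed tasks. Returns True when healthy."""
    problems = []
    if row_count < expected_min:
        problems.append(f"row count {row_count} below minimum {expected_min}")
    if failures:
        problems.append(f"{len(failures)} failed task(s): {', '.join(failures)}")
    if problems:
        alert("; ".join(problems))       # e.g. post to email/Teams/webhook
    return not problems
```

In a real pipeline `alert` would be wired to whatever channel the team uses, and the expected-minimum threshold would come from historical run statistics rather than a constant.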
    Security and Best Practices

  • Enforce security best practices for data access, including encryption and secure authentication.
  • Ensure compliance with data governance policies and applicable regulatory standards.
  • Document processes, architectural decisions and technical implementations in alignment with organisational standards.
    Contribution to the Team

  • Collaborate with developers, data analysts, data scientists and business teams to understand data requirements and deliver scalable solutions.
  • Work with the team to integrate pipelines with source control and deployment workflows.
    Quality Management and Compliance

  • Document data processes, transformations and architectural decisions.
  • Maintain high standards of software quality within the team by adhering to good processes, practices and habits.
  • Ensure compliance with the established processes and standards for the development lifecycle, including but not limited to data archival.
  • Safeguard confidential information and data.