Join a leading finance and insurance company that's modernizing its data landscape. We're seeking a skilled Data Engineer to help us transition from legacy SAS environments to a scalable, cloud-native platform powered by Databricks.
Responsibilities:
- Migrate and modernize legacy SAS data pipelines to scalable, cloud-based solutions in Databricks
- Design and implement ETL workflows using PySpark, Delta Lake, and Azure/AWS
- Collaborate with data scientists, analysts, and other engineers to ensure smooth data delivery
- Optimize data performance and drive automation in our data infrastructure
- Partner with actuarial, underwriting, finance, and analytics teams to deliver trusted data solutions
- Apply best practices in data modeling, quality, and governance aligned with financial regulatory standards
- Support the buildout of a Data Lakehouse on AWS, leveraging Delta Lake and Unity Catalog
Experience and Qualifications:
- 3+ years of hands-on experience with SAS (Base, Macros, EG)
- Proven experience with Databricks, PySpark, and SQL
- Familiarity with cloud platforms like Azure, AWS, or GCP
- Strong understanding of data warehousing, ETL/ELT, and modern data lake architectures
- Passion for data modernization and continuous learning
- Experience with MLflow, Airflow, or dbt
- Knowledge of CI/CD and DevOps for data engineering

The Reference Number for this position is NG60438. This is a Permanent Hybrid position based in Johannesburg, offering a cost to company of R800k to R1mil per annum, negotiable on experience and ability. Contact Nokuthula at