Job & Company Description :
I'm looking for Data Engineers with solid hands-on experience designing and building scalable data solutions. You will work with clients in Financial Services, Telecommunications, Technology, and Consulting, contributing to the development of enterprise data architectures and engineering best practices.
You will play a key role in designing robust pipelines, implementing ETL / ELT solutions, managing data quality, and supporting cloud-driven data initiatives.
Key Responsibilities :
- Design and maintain ETL / ELT pipelines and scalable data ingestion solutions.
- Build and optimize data warehouses, data lakes, and lakehouse environments.
- Develop batch and real-time data processing solutions.
- Implement data integration solutions (ADF, SSIS, Talend, Informatica, AWS Glue).
- Develop CI / CD pipelines for data workflows.
- Collaborate with BI, Cloud, Analytics, and Development teams.
Job Experience and Skills Required :
Education :
- Degree / Diploma in Computer Science, Data Engineering, IT, Software Engineering, or similar.
- Certifications such as Azure / AWS / GCP Data Engineer, Databricks, Snowflake, or Hadoop certifications.
Experience :
- Minimum 3 years' experience as a Data Engineer
- Strong experience with data pipelines and cloud-based solutions
- Advanced SQL & data modelling
- ETL / ELT experience (ADF, SSIS, Informatica, Talend, Glue, Databricks)
- Cloud experience (Azure, AWS, or GCP)
- Big data tools : Spark, Hadoop, Kafka
- Python or Scala for data engineering
- Data warehouses / lakes (Snowflake, Synapse, Redshift, BigQuery)
- CI / CD & version control (Git, Azure DevOps, Jenkins)
Nice to Have :
- Streaming technologies (Kafka, Kinesis, EventHub)
- Lakehouse architectures (Delta Lake, Databricks)
- Exposure to ML data pipelines
Apply now!