Overview
We are seeking an AWS Data Engineer (Intermediate to Senior), available immediately or on short notice, to support an international client in managing and optimising their data infrastructure. The role focuses on building and maintaining scalable data pipelines, optimising cloud-based data solutions, and ensuring high performance and reliability across systems. You will support the data operations roadmap, leveraging AWS technologies to deliver robust, efficient, and secure solutions.
Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes.
- Optimise data storage, transformation, and retrieval for performance and cost efficiency.
- Implement best practices in data modelling and architecture.
- Develop and manage data solutions using AWS services (S3, Glue, Redshift) and related tooling (dbt, Spark, Terraform).
- Collaborate with cloud architects to ensure smooth integrations and deployments.
- Lead or contribute to migrations and modernisation projects within AWS environments.
- Conduct performance tuning and implement monitoring solutions to ensure system stability.
- Troubleshoot data pipeline failures, ensuring rapid resolution and minimal downtime.
- Build dashboards and reporting tools to monitor data flows and usage.
- Apply role-based access controls and enforce data governance policies.
- Ensure compliance with international data protection and security standards.
- Support audit and compliance initiatives as required.
- Work closely with cross-functional teams (data analysts, product managers, application teams).
- Document processes, pipelines, and architectures for knowledge transfer.
- Mentor junior engineers and contribute to continuous improvement initiatives.
Required Expertise
- Proven experience as a Data Engineer (5+ years, with Intermediate to Senior-level capability).
- Strong proficiency with AWS data services and related tooling (S3, Glue, Redshift, dbt, Spark, Terraform).
- Hands-on experience building and managing ETL/ELT pipelines.
- Strong knowledge of SQL, data modelling, and performance tuning.
- Familiarity with CI/CD, version control (Git), and infrastructure-as-code.
- Excellent problem-solving skills and the ability to work in fast-paced environments.
- Strong communication skills for collaboration with international teams.
- Experience with multi-region or global data deployments (nice to have).
- Knowledge of Python or other scripting languages for automation (nice to have).
- Exposure to data governance frameworks and observability tools (nice to have).
Rewards
- Competitive contract compensation.
- Exposure to cutting-edge AWS technologies and data practices.
- Collaborative environment with global teams.