Role Objectives
The primary objective of this role is to design, develop, and maintain robust and scalable Big Data pipelines and data architectures, ensuring optimal extraction, transformation, and loading of data across multiple application platforms. The Data Engineer will act as a custodian of data, ensuring compliance with information classification requirements and enabling data consumers to build and optimize data consumption effectively. This role demands deep technical expertise in Data Engineering, Data Warehouse Design, and advanced analytics, applying modern software engineering concepts and BI tools. The Data Engineer will leverage Azure data solutions and various data-related programming languages and frameworks to analyze data elements, perform root cause analysis, and collaborate with technology colleagues and data teams to deliver viable data solutions within architectural guidelines.
Key Responsibilities
- Build and maintain Big Data pipelines
- Act as custodian of data, ensuring that data is shared in line with information classification requirements on a need-to-know basis
- Experience with data-related programming languages and frameworks, such as Python, Spark, and SQL
- Experience with Azure data solutions: Azure Data Factory, Azure Data Explorer, Azure Databricks
- Deep technical understanding of Data Engineering and Data Warehouse Design
- Familiarity with modern software engineering concepts
- Know-how in advanced analytics and BI tools
- Develop and maintain the complete data architecture across several application platforms
- Analyze data elements and systems
- Build the infrastructure required for optimal extraction, transformation, and loading of data
- Build, manage, and optimize data pipelines
- Create data tooling that enables data consumers to build and optimize data consumption
- Execute on the design, definition, and development of APIs
- Develop across several application platforms
- Experience performing root cause analysis on internal and external data and processes
- Knowledge of integration patterns, styles, protocols, and systems
- Liaise and collaborate with technology colleagues and data teams to understand viable data solutions within architectural guidelines
- Update technical documentation on data extracts and report functionality to the extent required for ongoing support. Respond to user queries, error logs, and enhancement requests to ensure reports are used and serve their intended purpose.
- Provide OLAP support and end-user training on the various cubes used for downstream reporting
- Data Modelling - emphasizes what data is needed and how it should be organized, rather than what operations will be performed on the data. A data model is like an architect's building plan: it helps build a conceptual model and set the relationships between data items. This data capability is required in preparation for the Data Platform implementation, and it aligns with Group Data's adoption of ERWIN as the tool of choice for data modelling
- Data Architecture - ensures we have a set of models, policies, rules, and standards that govern which data is collected and how it is stored, arranged, integrated, and put to use in data systems and organizations. This data capability is also aligned with the direction Group Data is taking.
Seniority Level
Entry level
Employment Type
Full-time
Job Function
Information Technology
Industries
Insurance