Data Engineer

Date: 15 May 2023

Location: KW

Company: Alghanim Industries

Job Summary

Data integration, unification, cleansing, and data quality management

Job Responsibilities

Management of data inflow

  • Create and maintain optimal data pipeline architecture; adopt new technologies to improve existing frameworks of data flow and monitoring
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies; implement required changes to the IT infrastructure, such as MDM tool acquisition, data lake design, and cloud solution implementation, in coordination with IT and the technology coordinator
  • Create data tools that assist analytics and data science team members in building models by automating and simplifying data preparation
  • Translate customer data strategy into actionable data integration plans and execute these plans
  • Maintain a 360-degree view of the customer; enhance customer data marts by continuously integrating new data sources

Data cleansing and unification

  • Create automated data anomaly detection systems and continuously track their performance
  • Process, cleanse, and verify the integrity of data used for analysis; actively use the built-in data quality dashboard on the CDP and coordinate corrective actions
  • Develop algorithms to de-duplicate and export customer data from multiple business units (BUs) to ensure data unification
  • Ensure continuous unification of customer records and associated profile and transactional data

Candidate Requirements

  • Bachelor’s degree in Computer Science, Computer Engineering, Mathematics, or a related field
  • Understanding of data modeling principles
  • 5+ years of significant configuration and data management experience; comfortable handling and manipulating data, with demonstrated experience in a data-intensive setting
  • Experience building highly scalable, high-performing databases using modern NoSQL or cloud-based technologies and integrating external APIs for data acquisition
  • Experience identifying data anomalies and imperfections
  • Experience with data structures and schemas, data preparation, cleansing, and unification
  • Ability to train others and pass on knowledge within the organization