
Data Engineer

Integra Tower, Jalan Tun Razak, Kampung Datuk Keramat, Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia

Job Type

Full Time

Workspace

On-site

About the Role

As a Data Engineer, you will play a crucial role in designing, building, and maintaining the data infrastructure and pipelines that empower our organisation to make data-driven decisions. You will collaborate closely with data analysts and other cross-functional teams to ensure data availability, reliability, and quality. This role requires a strong understanding of data architecture, ETL (Extract, Transform, Load) processes, and database technologies.

Requirements

  • Bachelor's or higher degree in Computer Science, Engineering, or a related field.

  • Proven experience in data engineering roles, with a focus on designing and building ETL pipelines and data architectures.

  • Proficiency in programming languages such as Python, Java, or Scala.

  • Strong experience with data integration tools, ETL frameworks, and workflow management tools (e.g., Apache Spark, Apache Airflow, Talend).

  • Solid understanding of relational databases, data modeling, and query optimization.

  • Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud) and related services (e.g., S3, Redshift, BigQuery) is a plus.

  • Familiarity with version control systems and agile development methodologies.

  • Excellent problem-solving skills and attention to detail.

  • Effective communication skills to collaborate with technical and non-technical stakeholders.

 

Join us and contribute to our data-driven culture by shaping the way we collect, manage, and leverage data to drive innovation and business growth. 

Key Responsibilities:

1. Data Pipeline Development:
• Design, develop, and maintain scalable and efficient ETL pipelines that move data from various sources into our data warehouse.
• Implement data integration solutions that ensure the availability and accessibility of data for different business units and analytical purposes.
• Optimize and monitor the performance of data pipelines to ensure timely and accurate data processing.

2. Data Architecture:
• Design and implement a robust data architecture that aligns with the organisation's data strategy and supports the company's goals.
• Evaluate and select appropriate technologies for data storage, processing, and retrieval based on performance, scalability, and cost considerations.

3. Data Quality and Governance:
• Implement data quality checks and validation processes to ensure accuracy and consistency of data throughout the pipeline.
• Work to establish and enforce data governance policies, including data lineage, metadata management, and access controls.

4. Database Management:
• Administer and maintain the data warehouse and other data storage solutions, ensuring optimal performance, security, and availability.
• Monitor and troubleshoot data-related issues, ensuring quick resolution and minimal downtime.

5. Collaboration and Documentation:
• Collaborate with cross-functional teams, including data analysts, software engineers, and business stakeholders, to understand data requirements and deliver effective solutions.
• Document data processes, workflows, and data dictionaries to facilitate knowledge sharing and onboarding of team members.

6. Continuous Improvement:
• Stay up to date with emerging data engineering technologies, best practices, and industry trends, and integrate them into the existing data infrastructure where applicable.
• Identify opportunities to optimize and enhance existing data pipelines and processes to improve efficiency and scalability.
