
Data Engineer
- Kuala Lumpur
- Permanent
- Full-time
- Develop and maintain robust data pipelines for ingesting and transforming factory and IoT data.
- Contribute to platform improvements including automation, CI/CD processes, data governance, and monitoring.
- Address technical challenges and implement scalable and efficient features.
- Support upcoming migrations to new technologies such as Databricks, Azure Event Hubs, and Kafka.
- Collaborate across distributed teams and communicate effectively with global stakeholders.
- Document solutions and promote knowledge sharing across teams.
- 3-5+ years of experience in Data Engineering, with a focus on ETL, data integration, and data warehousing.
- Proficiency in Databricks and Apache Spark.
- Solid experience working with Azure cloud services.
- Strong programming skills in Python.
- Familiarity with DevOps/DataOps practices.
- Strong communication skills in English (C1 level).
- Self-motivated and capable of independently driving tasks to completion.
- Proactive in identifying and proposing effective solutions.
- Strong collaboration skills with both technical and non-technical stakeholders.
- Experience with MongoDB or other NoSQL databases.
- Knowledge of Airflow or similar workflow orchestration tools.
- Proficiency in Docker and containerization practices.
- Familiarity with advanced Databricks features like Unity Catalog or Delta Live Tables (DLT).
- Strong SQL expertise and experience with both on-prem and cloud database management.
- Understanding of programming fundamentals, algorithms, and data structures.
- Experience with Git (preferably GitHub) and version control best practices.
- Solid grasp of modern data architecture concepts: data lakes, data vaults, and warehouses.
- Working knowledge of Infrastructure-as-Code (IaC) tools, preferably Terraform.
- Familiarity with productivity-enhancing tools like ChatGPT and GitHub Copilot.
- Experience with data modeling.
- Proficiency in additional languages such as C#, Java, or Scala.
- Familiarity with DBT, Snowflake, or Kafka.
- Experience working with Scrum/Kanban methodologies and tools like Jira.
- Ability to gather requirements independently and work closely with business users.
- Proven experience working in international, cross-functional teams.