
Fabric Data Engineer
- Melaka
- Permanent
- Full-time
Specialization: IT or Computer Software

Job Description:

Job Responsibilities:
- Utilize Microsoft Fabric for data integration and processing, including Notebooks, Azure Data Factory, and Dataflow Gen2.
- Evaluate and select the appropriate Fabric storage service - Lakehouse or Data Warehouse - based on project requirements and data architecture needs.
- Design, develop, and optimize semantic models for data analysis and reporting.
- Analyse and assess data modelling approaches, such as Star Schema and Data Vault, and data storage strategies, such as Data Lake vs. DWH, to build robust and efficient data-driven solutions.
- Write clean, efficient, and reusable code using SQL, Python, DAX, and M language.
- Support machine learning initiatives by performing feature engineering and assisting in model deployment.
- Manage the lifecycle of data applications using Azure DevOps, and create insightful visualizations with Power BI.
- Share knowledge and contribute to team growth through cross-project collaboration, peer reviews, and internal learning sessions.
- Report directly to HQ.

Job Requirements:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Minimum 5 years of working experience in a related field.
- Experience with Microsoft Fabric and its ecosystem (Dataflows Gen2, OneLake, Lakehouse, Synapse-style DWH).
- Familiarity with Azure services, Power BI, and DevOps practices.
- Certifications such as DP-600 (Microsoft Fabric Analytics) are an advantage.
- May be required to travel overseas to HQ for training or to support SEA countries.
- Willing to work in Melaka (hybrid working mode is available).

Interested candidates, please apply online or reach out to gan@asiarecruit.com.my.

Only shortlisted candidates will be notified.