Data Engineer

PropertyGuru

  • Kuala Lumpur
  • Permanent
  • Full-time
  • 2 days ago
PropertyGuru is Southeast Asia’s leading PropTech company and the preferred destination for over 32 million property seekers monthly to connect with more than 50,000 agents to find their dream home. PropertyGuru empowers property seekers with more than 2.1 million real estate listings, in-depth insights, and solutions that enable them to make confident property decisions across Singapore, Malaysia, Thailand, and Vietnam. PropertyGuru was launched in Singapore in 2007, and since then the PropertyGuru Group has made the property journey a transparent one for property seekers in Southeast Asia. In the last 18 years, PropertyGuru has grown into a high-growth PropTech company with a robust portfolio that includes leading property marketplaces and award-winning mobile apps across its markets in Singapore, Malaysia, Vietnam, and Thailand, as well as the region’s biggest and most respected industry recognition platform, the PropertyGuru Asia Property Awards, along with events and publications across Asia.

🧾 Job Title: Data Engineer (Mid-Level)
Department: Technology – Data COE
Locations: Malaysia
Teams: Data Platform & Solution

🔍 About the Role

We are looking for a skilled and motivated Data Engineer (Mid-Level) to join our Data COE, contributing to both the Data Platform and Data Solution teams. In this role, you’ll help build and maintain modern, scalable data infrastructure to meet PropertyGuru’s evolving needs for data-driven insights and innovation. Our team is advancing toward a Medallion architecture and adopting a real-time-first mindset, with batch processing serving as a secondary option.

You’ll also contribute to a “shift-left” data processing philosophy, where cleansing, validation, and transformation are done as early as possible, near the source, to improve data trust, reduce rework, and simplify downstream logic.

🛠️ Key Responsibilities
  • Design, develop, and maintain real-time (e.g., Kafka, Debezium, Apache Flink, Apache Beam, Kinesis) and batch (e.g., Cloud Composer/Airflow, Apache Spark, AWS Glue) data pipelines.
  • Implement and maintain Medallion architecture to support scalable and well-governed data layers.
  • Build and optimize data models, datamarts, and schemas for reporting and ML use cases.
  • Apply shift-left practices by performing early-stage data cleansing, validation, and transformation close to ingestion.
  • Ensure data quality, integrity, and availability, with proactive monitoring and alerting (e.g., Telm.ai or similar tooling).
  • Handle large structured and semi-structured datasets using GCP (BigQuery) and AWS.
  • Optimize storage and queries for performance and cost-efficiency.
  • Contribute to data architecture, design discussions, and evolving platform standards.
  • Translate business requirements into technical implementation.
  • Troubleshoot and resolve pipeline and data issues, perform root cause analysis, and continuously improve reliability.
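As a minimal illustration of the shift-left practice described above, the sketch below validates and normalizes records at ingestion so that downstream layers receive only trusted inputs. The record fields and validation rules are hypothetical examples, not PropertyGuru’s actual schema.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical listing record; field names are illustrative only.
@dataclass
class Listing:
    listing_id: str
    price: float
    country: str

VALID_COUNTRIES = {"SG", "MY", "TH", "VN"}

def validate_early(raw: dict) -> Optional[Listing]:
    """Shift-left check: reject or normalize bad records near the source,
    so downstream transformation layers can assume clean inputs."""
    listing_id = str(raw.get("listing_id", "")).strip()
    country = str(raw.get("country", "")).strip().upper()
    try:
        price = float(raw["price"])
    except (KeyError, TypeError, ValueError):
        return None  # unparseable price: drop (or route to a dead-letter queue)
    if not listing_id or country not in VALID_COUNTRIES or price <= 0:
        return None  # fails basic integrity rules
    return Listing(listing_id, price, country)
```

In a streaming pipeline, a function like this would typically run inside the ingestion job itself (e.g., as a Flink or Beam transform), with rejected records routed to a dead-letter topic for inspection rather than silently discarded.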
👥 Functional & Team Collaboration
  • Work closely with marketplace, analytics, product, and engineering teams to deliver end-to-end data solutions.
  • Take ownership of assigned tasks and communicate progress, risks, and blockers in a timely manner.
  • Ensure adherence to data governance, security, and compliance requirements.
  • Contribute to technical documentation and participate in peer code reviews.
  • Stay informed about modern data engineering trends and help introduce new technologies or practices (e.g., data contracts, Iceberg/Delta Lake, dbt, event-driven architecture).
  • Collaborate with peers to foster technical growth and knowledge sharing.
🧠 Required Skills & Qualifications
  • 2–4 years of hands-on experience in data engineering or similar roles.
  • Proficient in Python and SQL; knowledge of Java/Scala is a plus.
  • Experience with modern data processing frameworks (e.g., Kafka, Spark, Hadoop).
  • Experience with cloud data platforms (GCP and AWS).
  • Familiarity with a variety of database types, including relational (e.g., PostgreSQL), key-value stores (e.g., Redis), and document databases (e.g., MongoDB).
  • Experience working with search or analytics engines like Elasticsearch.
  • Familiar with CI/CD, infrastructure as code, and DevOps tools.
  • Strong grasp of data warehousing and data modeling principles.
  • Excellent problem-solving, analytical, and communication skills.
  • Self-driven, with the ability to work independently and collaboratively.
🌟 Nice-to-Have Skills
  • Experience with containerization (e.g., Docker, Kubernetes).
  • Familiarity with metadata management, data cataloging, or observability platforms.
  • Exposure to data visualization tools (e.g., Looker, Looker Studio, Tableau, Power BI).
  • Understanding of how data engineering supports ML and data science workflows.
🌏 Locations

We are hiring across Malaysia (Kuala Lumpur, hybrid).

🚀 Why Join Us
  • Be part of a modern data engineering team driving data platform capabilities.
  • Focus on building, not just implementing: your input will shape the future architecture.
  • Work across platform and data product domains in a high-impact role.
  • Join a bottom-up engineering culture that values innovation, ownership, and learning.
  • Enjoy a flexible working environment with career growth opportunities into senior or individual contributor specialist tracks.
Our commitment to you:
  • Hybrid flexible working that focuses on outcomes over hours.
  • Holistic rewards package covering your financial, physical, and mental health.
  • Multi-directional career development across all levels.
  • Inclusive benefits like equal paternity leave, supporting all employees in work-life balance.
