Be wary of WhatsApp messages impersonating Jobline Resources' staff offering job opportunities. Those who encounter suspicious messages can contact Jobline at +65 6339 7198.

Responsibilities

Data Pipeline Development
  • Design, develop, and maintain end-to-end data pipelines for ingesting, transforming, and delivering data from multiple source systems (databases, files, APIs, streaming platforms).
  • Build and optimize ETL / ELT workflows using SQL, Python, and enterprise data integration tools.
  • Ensure data pipelines are scalable, resilient, and performant to meet operational and analytical requirements.
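For candidates unfamiliar with the term, the extract-transform-load (ETL) pattern referred to above can be sketched in a few lines. This is a minimal, self-contained illustration using Python's built-in sqlite3 module; all table and column names are hypothetical, and a production pipeline would use an enterprise integration tool as described in the role.

```python
import sqlite3

def extract(conn):
    # Extract: read raw records from a (hypothetical) source table.
    return conn.execute("SELECT id, amount FROM raw_orders").fetchall()

def transform(rows):
    # Transform: drop invalid records and round amounts to 2 decimal places.
    return [(i, round(amt, 2)) for i, amt in rows
            if amt is not None and amt >= 0]

def load(conn, rows):
    # Load: write cleaned records to the analytics-facing table.
    conn.executemany("INSERT INTO clean_orders (id, amount) VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE clean_orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                     [(1, 10.5), (2, -3.0), (3, None)])
    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT * FROM clean_orders").fetchall())
```

In practice each stage would also emit metrics and handle retries so the pipeline stays resilient under load.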

Database & Data Platform Management
  • Work hands-on with RDBMS platforms such as Oracle, DB2, SQL Server, or PostgreSQL for data extraction, transformation, and performance tuning.
  • Develop and optimize SQL queries, views, and stored procedures to support reporting and analytics use cases.
  • Support data modelling activities (logical and physical) for analytics and reporting layers.
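As an illustration of the views-and-tuning work described above, the sketch below creates an index on a grouping column and a reporting view over it. It uses Python's built-in sqlite3 for self-containment, and the `sales` table and its columns are hypothetical; in this role the equivalent objects would live in Oracle, DB2, SQL Server, or PostgreSQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("APAC", 100.0), ("APAC", 50.0), ("EMEA", 80.0)])

# Index the grouping column so the reporting query can avoid a full table scan.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

# A reporting view that aggregates totals per region for downstream analytics.
conn.execute("""CREATE VIEW sales_by_region AS
                SELECT region, SUM(amount) AS total
                FROM sales GROUP BY region""")

print(conn.execute("SELECT * FROM sales_by_region ORDER BY region").fetchall())
# [('APAC', 150.0), ('EMEA', 80.0)]
```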

Data Quality, Governance & Operations
  • Implement data validation, reconciliation, and monitoring to ensure data accuracy, completeness, and consistency.
  • Support operational data activities, including incident investigation, root cause analysis, and remediation.
  • Maintain clear documentation for data pipelines, schemas, and operational processes to support audits and knowledge transfer.
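The reconciliation work described above usually amounts to comparing simple aggregates (row counts, column sums) between a source and a target dataset. A minimal sketch, with a hypothetical `reconcile` helper and made-up figures:

```python
def reconcile(source_rows, target_rows, amount_index=1):
    """Compare row counts and an amount-column checksum between two datasets."""
    src_count, tgt_count = len(source_rows), len(target_rows)
    src_sum = sum(r[amount_index] for r in source_rows)
    tgt_sum = sum(r[amount_index] for r in target_rows)
    return {
        "count_match": src_count == tgt_count,
        "count_diff": tgt_count - src_count,
        "sum_match": abs(src_sum - tgt_sum) < 1e-9,
        "sum_diff": tgt_sum - src_sum,
    }

if __name__ == "__main__":
    source = [(1, 100.0), (2, 250.0), (3, 75.0)]
    target = [(1, 100.0), (2, 250.0)]  # one record lost in transit
    print(reconcile(source, target))
```

A mismatch in either check would typically open an incident for root cause analysis, as per the operational duties above.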

Collaboration & Stakeholder Engagement
  • Collaborate with business users, product owners, and downstream teams to gather requirements and translate them into technical solutions.
  • Work closely with Data Analysts, BI developers, and Data Scientists to enable dashboards, reports, and advanced analytics.
  • Participate in Agile ceremonies and contribute to sprint planning, estimation, and delivery.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent practical experience.
  • Strong hands-on experience with SQL and relational databases (Oracle, DB2, SQL Server, PostgreSQL).
  • Experience building and supporting ETL / data pipelines in enterprise environments.
  • Solid understanding of data modelling, data quality, and data lifecycle management.
  • Ability to troubleshoot data issues and work in production / operational environments.
  • Experience with Python for data processing or automation.
  • Experience with data streaming technologies (e.g., Kafka, Spark, NiFi).
  • Experience with BI and visualization tools such as Tableau, Qlik, or Power BI.
  • Knowledge of cloud or hybrid data platforms and orchestration tools.
  • Knowledge of Agile / DevOps practices and CI/CD for data pipelines.

Shortlisted candidates will be offered employment on a 1-year agency contract.