Requirements
- Master's or Bachelor's degree in Computer Science, Mathematics, Statistics, or an equivalent field.
- Minimum of 3 years of relevant work experience in data engineering, including in-depth technical knowledge of databases, BI tools, SQL, OLAP, ETL, and RAG/agentic data pipelines.
- Proficient in RDBMS: Oracle and PL/SQL.
- Extensive hands-on experience in conceptualising, designing, and implementing data pipelines. Proficiency in handling unstructured data formats (e.g., PPT, PDF, DOCX) and databases (RDBMS; NoSQL such as Elasticsearch, MongoDB, Neo4j, Ceph), and familiarity with big data platforms (HDFS, Spark, Impala).
- Experience working with AWS technologies, with a focus on building scalable data pipelines.
- Front-end reporting, dashboarding, and data exploration tools: Tableau.
- Strong background in software engineering and development cycles (CI/CD), with proficiency in scripting languages, particularly Python.
- Good understanding of and experience with the Kubernetes/OpenShift platform.
Shortlisted candidates will be offered a 1-year agency contract.