
Responsibilities

  • Collaborate with stakeholders to understand the requirements for data structure, availability, scalability and accessibility.
  • Identify current-state limitations and define the goal state of coordinated data flow across systems and the organization, including the data-flow channels and processing systems that ingest, stream, stage, extract, transform, load, integrate and consume data from various sources.
  • Provide thought leadership for stakeholders in determining which data management and analysis techniques and solutions will enable the enterprise to achieve defined business goals, taking ownership of the future roadmap on how to best address the gaps.
  • Evaluate different tools and techniques, engaging vendors where needed, to support estimation of effort, costs and bill of materials, and to create a common ground for the solution to be developed, adopting new technologies where necessary to provide enhanced solutions.
  • Serve as a technical expert on solution design, development and implementation requirements to address business needs, while balancing financial and technical constraints such as scalability, security, performance and reliability.
  • Create hands-on Proof of Concept and/or prototypes to validate a solution approach and to concretely show how business requirements can be fulfilled.
  • Provide guidance for data engineers on data model and application design for cleaning and transforming structured and unstructured data on ETL, ESB, SQL, NoSQL and Data Lake platforms, through reviews, planning and hands-on development.
  • Collaborate with infrastructure and operations teams to architect and implement performance tuning solutions for data storage and analytics.
  • Work hands-on where needed with data engineers and analysts on projects that deliver data as a strategic asset.
  • Communicate fluently, adapting the message to the audience: deep-dive with vendors, developers and testers, and present technical concepts to business stakeholders in a structured manner.
  • Build strong partnerships with business and IT stakeholders to shift priorities, resources and budgets into alignment with the solution architecture roadmap, and to garner buy-in and support for solution architecture design.
  • Foster continuous improvement by seeking ways to improve the design practices and technical approaches within the department.

Requirements

  • Degree in Computer Science, Information Technology, Engineering or equivalent
  • Certification in relevant IT areas
  • 5+ years of relevant experience in designing and delivering data management and advanced analytics solutions, especially in areas of data warehousing, Big Data storage and analytics.
  • Working knowledge of one or more programming languages such as Python, R, Java or C#
  • In-depth knowledge of:
      • Big Data storage and analytics products, frameworks and methodologies (e.g. Apache HDFS, Kafka, Spark, Hive)
      • Relational databases (e.g. Oracle, MS SQL, PostgreSQL, Teradata)
      • Data modelling and repository design (e.g. operational data stores, dimensional data stores, data marts)
      • Data orchestration and ETL (e.g. Informatica PowerCenter)