Data Analyst (Steward)

Role Summary

You will be part of a young and dynamic data team working in a cross-functional environment to build and support our investment data warehouse. The warehouse will hold investment and market data used by investment managers and risk professionals for investment reporting and analysis. This role reports directly to the Head of Data Strategy & Management.

The warehouse’s data assets will be built up incrementally, starting with a set of risk data used in the investment risk dashboards. The data analyst will be part of the team overseeing day-to-day data operations, working with cross-functional teams to build new datasets, and ensuring data quality and compliance with relevant standards and best practices.

Join us if you have a passion for building quality data assets and capabilities that improve process efficiency, drive investment insights and reduce operational risk. The ideal candidate will have a passion for data, a good working knowledge of investment data, and data management skills such as designing and implementing data models and quality rules and analyzing complex data. You will work with SQL, Excel and relational databases on data analysis, pipelines and operations. Additional experience with Snowflake, Python for data pipelines and/or BI tools for implementing dashboards and reporting will be advantageous.

Key Duties and Responsibilities

As a data analyst/steward in the data team for our investment data warehouse, you are expected to:

  • Work closely with business teams to drive, implement and deliver changes and/or new investment data assets and solutions.
  • Leverage a good understanding of investment asset-class domain knowledge, business processes and data usage when delivering data model designs, quality rules and solutions.
  • Drive a scalable data model that leverages design best practices, with clarity on data definitions and usage.
  • Drive synergy across business units by developing common solutions for similar data needs.
  • Influence business processes and workflows to align with data quality management, data definitions and usage.
  • Work closely with data engineers in the delivery of data pipelines and ensure system performance meets business SLAs.
  • Enable data quality management through data quality rules, operational processes and monitoring.
  • Support BAU (business-as-usual) operational activities, including data SLA and quality management, issue resolution and user data queries, and guide the effective and efficient use of investment datasets in dashboards and reporting.

Education Qualification

  • Bachelor’s degree in Data Analytics, Mathematics, Data Science, Business Analytics, Information Technology or equivalent discipline.

Experience and Skills

  • 5-8 years of relevant experience in data management, implementing data pipelines and supporting data operations.
  • Strong domain knowledge of investment data and its usage in both the trade life cycle and investment analysis/reporting.
  • Good working knowledge of data management concepts, data quality rules, and data model design & implementation.
  • Hands-on experience using SQL, stored procedures, databases (Snowflake, Oracle) in building data quality rules and data transformations/ aggregations.
  • Knowledge of Amazon S3, Python and AWS Lambda for implementing data pipelines, and/or data visualization tools (MS Power BI, Excel), will be advantageous.
  • Strong team player who can also work independently to drive and deliver business outcomes.
  • Structured in thought process, with good attention to detail.
  • Fast and independent learner.
  • Strong written and verbal communication skills.

How to Apply

Register your interest in the position by writing to careers@fullerton.com.sg with your CV/resume and transcript (for internship applicants only).