
FNZ Group

Senior Data Architect

Posted 2 Days Ago
In-Office
Pune, Maharashtra
Senior level

Job Title: Data Architect — Analytical Warehouse (FNZ)

About FNZ:

FNZ is a global fintech firm transforming the way financial institutions serve their clients. By combining cutting-edge technology, infrastructure, and investment operations, FNZ enables wealth management firms to deliver personalized investment solutions at scale. Operating across multiple regions and supporting over $1.5 trillion in assets under administration, FNZ partners with leading banks, insurers, and asset managers to create seamless and innovative wealth platforms that empower millions of investors worldwide.

Job Summary:

We are seeking a senior Data Architect to design and own the architecture of the Analytical Warehouse built on Microsoft Fabric. This role is responsible for defining the data models, storage strategies, ingestion patterns, semantic layer, and governance framework that transform the NRT-ODS Gold-layer streaming data into a structured, performant, and governed analytical platform. You will architect the bridge between real-time streaming and historical analytics, serving both operational BI and client-facing reporting workloads.

Key Responsibilities:

• Analytical Warehouse Architecture: Design the end-to-end architecture for the Analytical Warehouse on Microsoft Fabric — ingestion from ODS Gold topics, Bronze/Silver/Gold layering within OneLake, transformation pipelines, semantic layer, and consumption endpoints.

• Data Modelling: Define dimensional models, star schemas, and wide denormalized tables optimized for analytical query patterns. Design fact and dimension tables for wealth management domains — accounts, portfolios, transactions, positions, fees, NAV, AUM.

• Ingestion Architecture: Architect the Kafka-to-Fabric ingestion pipeline — Kafka Connect sink configuration, Avro-to-Delta schema mapping, partitioning strategy (date, entity type, client), exactly-once delivery semantics, and error handling.

• Lakehouse Strategy: Define the OneLake storage architecture, including namespace design, table format strategy (Delta Lake near-term, Apache Iceberg long-term), partition evolution, file compaction policies, and retention management.

• Semantic Layer Design: Architect the semantic layer that provides business-friendly metrics (AUM, NAV, trade volumes, fee breakdowns) with consistent definitions across dashboards, reports, APIs, and client portals.

• Data Sharing Architecture: Design the architecture for Fabric Data Sharing — OneLake shortcuts and Delta Sharing protocols that enable clients to consume analytics in their own Fabric tenants with governed, client-scoped access.

• Data Governance & Contracts: Extend the ODS data contracts framework into the Analytical Warehouse. Define governance policies for the analytical layer, including data classification, access controls (Purview), lineage tracking, and audit trails.

• Batch Extract Migration: Architect the migration of batch extracts from SQL-driven CSV to Kafka-sourced Parquet/Delta via Fabric pipelines. Design the metadata-driven configuration that preserves CentralHub flexibility.

• Performance Architecture: Design for query performance — Z-ordering strategies, partition pruning, materialized views, caching layers, and compute resource allocation across Fabric workspaces.

• Apache Iceberg Roadmap: Plan the long-term migration to Apache Iceberg on OneLake for time-travel queries, partition evolution, and multi-engine access (Fabric, Spark, Trino, Flink). Evaluate Confluent Tableflow or a custom sink for the Kafka-to-Iceberg pipeline.

• Standards & Governance: Establish naming conventions, modelling standards, documentation requirements, and code review processes for all Analytical Warehouse development. Conduct architecture reviews of Data Engineer deliverables.
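The partitioning strategy named in the Ingestion Architecture responsibility (date, entity type, client) can be sketched in plain Python. This is an illustrative assumption about how Gold-topic events might be routed to OneLake partition paths — the entity types, field names, and path layout are hypothetical, not FNZ's actual schema:

```python
from datetime import date

# Illustrative entity types for a wealth-management ODS; assumed, not actual.
KNOWN_ENTITY_TYPES = {"account", "portfolio", "transaction", "position", "fee"}

def partition_path(event: dict, lake_root: str = "onelake/analytical/bronze") -> str:
    """Derive the lakehouse partition path for an incoming Gold-topic event,
    partitioning by entity type, event date, and client for partition pruning."""
    entity = event["entity_type"]
    if entity not in KNOWN_ENTITY_TYPES:
        raise ValueError(f"unknown entity type: {entity!r}")
    # Validate the date eagerly so malformed events fail before landing.
    event_date = date.fromisoformat(event["event_date"])
    return (
        f"{lake_root}/{entity}"
        f"/event_date={event_date.isoformat()}"
        f"/client_id={event['client_id']}"
    )

path = partition_path(
    {"entity_type": "position", "event_date": "2025-06-30", "client_id": "C042"}
)
# → "onelake/analytical/bronze/position/event_date=2025-06-30/client_id=C042"
```

Partitioning by date and client first keeps the most selective filters (reporting date, tenant) prunable, which matters for the client-scoped access patterns the role also covers.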

Qualifications:

• Education: Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related technical field.

• Experience: 8+ years of experience in data architecture or data engineering, with at least 3 years in a data architect role on analytical/warehouse platforms.

• Microsoft Fabric / Azure: Deep experience with Microsoft Fabric, Azure Synapse Analytics, or equivalent cloud analytical platforms. Strong understanding of OneLake, Fabric lakehouse, and Fabric SQL endpoints.

• Data Modelling: Expert-level skills in dimensional modelling (Kimball), data vault, and denormalized modelling for analytical workloads. Experience modelling financial services data domains.

• Delta Lake / Iceberg: Strong understanding of modern table formats — Delta Lake (ACID transactions, time travel, schema evolution) and Apache Iceberg (partition evolution, multi-engine support).

• SQL Expertise: Advanced SQL skills for analytical queries, performance tuning, and query plan analysis.

• Streaming-to-Analytical Bridge: Experience architecting data pipelines that bridge real-time streaming platforms (Kafka) with analytical warehouses/lakehouses.

• Semantic Layers: Experience with semantic layer and data transformation tools for defining governed business metrics.

• Data Governance: Experience with data governance frameworks, data catalogs (Purview, Atlan), and access control policies in multi-tenant environments.
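One common pattern behind the streaming-to-analytical bridge and the exactly-once semantics mentioned above is an idempotent sink that deduplicates on the Kafka (topic, partition, offset) coordinate before committing rows. This is a minimal illustrative sketch of the pattern, not Kafka Connect's actual internals:

```python
class IdempotentSink:
    """Toy lakehouse sink: at-least-once delivery from Kafka becomes
    effectively exactly-once by skipping already-committed offsets."""

    def __init__(self) -> None:
        # In a real sink this offset ledger would be durable (e.g. stored
        # alongside the table in the same transaction as the data write).
        self.committed: set[tuple[str, int, int]] = set()
        self.rows: list[dict] = []

    def write(self, topic: str, partition: int, offset: int, row: dict) -> bool:
        key = (topic, partition, offset)
        if key in self.committed:
            return False  # redelivered record: drop silently
        self.committed.add(key)
        self.rows.append(row)
        return True

sink = IdempotentSink()
first = sink.write("positions.gold", 0, 42, {"account_id": "A1"})      # True
duplicate = sink.write("positions.gold", 0, 42, {"account_id": "A1"})  # False
```

The key design point is that the offset ledger and the data write must be committed atomically; otherwise a crash between the two reintroduces duplicates or data loss.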

Preferred Qualifications:

• Experience working in the Wealth Management or Financial Services industry with deep understanding of investment operations data models.

• Experience with Apache Kafka — consumer architecture, Kafka Connect, Avro schema evolution, and schema registries.

• Familiarity with SQL-based transformation frameworks for managing transformation layers (models, tests, documentation, CI/CD).

• Experience with data quality frameworks (Great Expectations, Soda) integrated into analytical pipelines.

• Experience architecting multi-tenant analytical platforms with client-scoped data isolation.

• Knowledge of privacy-preserving analytics — differential privacy, confidential compute, or federated analytics patterns.

• Microsoft Fabric certifications, Azure Data Engineer (DP-203), or Azure Solutions Architect certifications are a plus.
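The data-quality frameworks listed above (Great Expectations, Soda) share a common shape: declarative expectations evaluated against a batch before it is promoted to the next layer. A minimal sketch of that gate in plain Python, with column names and thresholds as illustrative assumptions:

```python
# Toy quality gate in the spirit of Great Expectations / Soda:
# a batch is promoted (e.g. Silver -> Gold) only if every expectation holds.

def expect_not_null(rows: list[dict], column: str) -> bool:
    """Every row must carry a non-null value for the column."""
    return all(r.get(column) is not None for r in rows)

def expect_between(rows: list[dict], column: str, lo: float, hi: float) -> bool:
    """Every value in the column must fall inside [lo, hi]."""
    return all(lo <= r[column] <= hi for r in rows)

batch = [
    {"account_id": "A1", "nav": 101.2},
    {"account_id": "A2", "nav": 99.8},
]

checks = [
    expect_not_null(batch, "account_id"),
    expect_between(batch, "nav", 0.0, 10_000.0),  # NAV sanity range, assumed
]
promote = all(checks)  # True: batch may advance to the next layer
```

Real frameworks add profiling, result persistence, and alerting, but the promote/quarantine decision at each medallion boundary is the core of integrating them into analytical pipelines.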

About FNZ

FNZ is committed to opening up wealth so that everyone, everywhere can invest in their future on their terms. We know the foundation to do that already exists in the wealth management industry, but complexity holds firms back. 

We created wealth’s growth platform to help. We provide a global, end-to-end wealth management platform that integrates modern technology with business and investment operations. All in a regulated financial institution. 

We partner with the world’s leading financial institutions, with over US$2.4 trillion in assets on platform (AoP).
Together with our clients, we empower nearly 30 million people across all wealth segments to invest in their future.

Top Skills

Apache Iceberg
Azure
Delta Lake
Kafka
Microsoft Fabric
SQL


