The Data Engineer role involves data analysis using T-SQL, ETL development with Python in Databricks, and building data pipelines with Azure data services.
- Excellent data analysis and exploration using T-SQL
- Proficiency in Python for ETL development and data wrangling, especially in Databricks
- Experience writing automated tests for data pipelines
- Strong SQL programming (stored procedures, functions)
- Knowledge and experience of data warehouse modelling methodologies (Kimball, dimensional modelling, Data Vault 2.0)
- Experience in Azure – one or more of the following: Data Factory, Databricks, Synapse Analytics, ADLS Gen2
- Experience in building robust and performant ETL processes
- Experience with Azure data products and Microsoft Fabric
- Experience using source control and Azure DevOps (ADO)
- Awareness of data governance tools and practices (e.g., Azure Purview)
- Understanding and experience of deployment pipelines
- Excellent analytical and problem-solving skills, with the ability to think critically and strategically.
- Strong communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels.
- Acts with integrity at all times and embraces the philosophy of treating customers fairly
- Analytical, with the ability to arrive at solutions that fit current and future business processes
- Effective writing and verbal communication
- Organisational skills: able to manage and coordinate their own workload effectively
- Ownership and self-motivation
- Delivery focus
- Assertive, resilient and persistent
- Team oriented
- Deals well with pressure; highly effective at multitasking and juggling priorities
Part of the $4.8 billion RPG Group, we’re a community of 10,000+ innovators across 30+ global locations, including Milpitas, Seattle, Princeton, Cape Town, London, Zurich, Singapore, and Mexico City. Explore Life at Zensar and join us to Grow. Own. Achieve. Learn. to be the best version of yourself.
We believe the best work happens when individuality is celebrated, growth is encouraged, and well-being is prioritized. We are an equal employment opportunity (EEO) and affirmative action employer, committed to creating an inclusive workplace. All qualified applicants will be considered without regard to race, creed, color, ancestry, religion, sex, national origin, citizenship, age, sexual orientation, gender identity, disability, marital status, family medical leave status, or protected veteran status.