
Capco

Big Data Engineer

Hybrid
Pune, Maharashtra
Senior level
The Big Data Engineer will develop data pipelines, improve data processes, and ensure standards in data engineering while working on large-scale projects in a collaborative, Agile environment.

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were named Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.


Location: Pune
Experience Range: 5–9 years

 
Job Description: Senior Data Engineer
 
Role:
 
Develop high-quality, secure, and scalable data pipelines using Spark with Scala/Python/Java on Hadoop or object storage such as MinIO.
Leverage technologies and solutions to innovate with increasingly large data sets.
Drive automation and efficiency in data ingestion, movement, and access workflows through innovation and collaboration.
Understand, implement, and enforce software development standards and engineering principles in the Big Data space.
Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency.
Perform assigned tasks and handle production incidents independently.
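The pipeline work described above generally follows a read–validate–transform–write shape. As a minimal, framework-free sketch of the validate/transform stage (the record schema and field names here are hypothetical, chosen only for illustration; a real implementation would express the same logic over Spark DataFrames rather than Python dicts):

```python
# Sketch of a validate-then-transform stage in an ingestion pipeline.
# Schema and field names are hypothetical, for illustration only; a
# production pipeline would apply equivalent logic via Spark DataFrames.

REQUIRED_FIELDS = {"id", "amount", "currency"}

def validate(record: dict) -> bool:
    """Reject records missing required fields or with a non-numeric amount."""
    if not REQUIRED_FIELDS <= record.keys():
        return False
    return isinstance(record["amount"], (int, float))

def transform(record: dict) -> dict:
    """Normalize currency codes and round amounts, leaving other fields intact."""
    out = dict(record)
    out["currency"] = record["currency"].upper()
    out["amount"] = round(float(record["amount"]), 2)
    return out

def run_pipeline(records):
    """Filter out invalid records, then apply the transform to the rest."""
    return [transform(r) for r in records if validate(r)]
```

Given `[{"id": 1, "amount": 10.456, "currency": "usd"}, {"id": 2}]`, only the first record survives, normalized to `{"id": 1, "amount": 10.46, "currency": "USD"}`.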
Base Skill Requirements:
 
MUST Technical:
 
5–9 years of experience on Data Warehouse/Data Lake/Lakehouse projects in a product- or service-based organization.
Expertise in data engineering, having implemented multiple end-to-end DW projects in a Big Data environment handling petabyte-scale data.
Solid experience building complex data pipelines with Spark using Scala/Python/Java on Hadoop or object storage.
Experience working with databases such as Oracle and Netezza, with strong SQL knowledge.
Proficiency working within an Agile/Scrum framework, including creating user stories with well-defined acceptance criteria and participating in sprint planning and reviews.
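The "strong SQL knowledge" expectation typically covers warehouse-style aggregation queries. A small self-contained example using Python's stdlib sqlite3 (table and column names are invented for this sketch, not tied to any actual project):

```python
import sqlite3

# Illustrative warehouse-style aggregation: total order amount per customer.
# The table and its contents are invented for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)],
)

# GROUP BY collapses rows per customer; ORDER BY ranks by the aggregate.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()
# rows -> [("acme", 150.0), ("globex", 75.0)]
```

The same GROUP BY/ORDER BY pattern carries over directly to Spark SQL, Oracle, or Netezza.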
 
Optional Technical:
 
Experience building NiFi pipelines (preferred)
Strong analytical skills for debugging production issues, providing root-cause analysis, and implementing mitigation plans
Strong communication skills, both verbal and written
Ability to multi-task across multiple projects and interface with external and internal resources
Proactive, detail-oriented, and able to work independently under pressure, with a high degree of initiative and self-motivation to drive results
Willingness to quickly learn and implement new technologies, and to participate in POCs to explore the best solution for a problem statement
Experience working with diverse and geographically distributed project teams

Capco Mumbai, Maharashtra, IND Office

Capco Technologies Pvt Ltd, C/O Wipro Limited, 4 Lands End, B. J. Road, Opp. Sister's Bungalow, Bandstand, Bandra West, Mumbai, India, 400050


