
Nexaminds

Senior Databricks Engineer (Exp: 8-12 yrs | Databricks, PySpark, Apache Spark, Python, SQL, Azure, DevOps)

In-Office or Remote
4 Locations
Senior level

Unlock Your Future with Nexaminds!

At Nexaminds, we're on a mission to redefine industries with AI. We're passionate about the limitless potential of artificial intelligence to transform businesses, streamline processes, and drive growth.

Join us on our visionary journey. We're leading the way in AI solutions, and we're committed to innovation, collaboration, and ethical practices. Become a part of our team and shape the future powered by intelligent machines. If you're driven by ambition, success, fun, and learning, Nexaminds is where you belong.

 

🚀 Are you a PRO at developing Databricks pipelines and at enhancing or supporting existing ones?

Are you strong in Databricks, PySpark, Apache Spark, SQL, and Python, with sharp debugging skills for business-critical data and extensive exposure to Azure?

Have you worked hands-on in E-Commerce or Retail domains?

Then don't wait any longer: your next big opportunity is here! 🌟

Join us at Nexaminds and be part of an exciting journey where innovation meets impact. The benefits are unbelievable, and so is the experience you'll gain!


Role Summary

We are seeking a Data Engineer with strong Databricks expertise to design, build, and maintain scalable, high-performance data pipelines on cloud platforms. The role focuses on developing production-grade ETL/ELT pipelines, enabling data modernization initiatives, and ensuring data quality, governance, and security across enterprise data platforms.

You will work closely with data engineers, analysts, and business stakeholders to deliver reliable, cost-efficient, and scalable data solutions, primarily on Azure.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines using Databricks (PySpark, Apache Spark, Delta Lake, SQL, Python); see the sketch after this list.
  • Develop and optimize ETL/ELT workflows for performance, reliability, and cost efficiency.
  • Implement data quality, data profiling, governance, and security best practices.
  • Design and maintain data models to support analytics, reporting, and downstream consumption.
  • Collaborate with data engineers, analysts, and business stakeholders to define and implement data requirements.
  • Troubleshoot and resolve issues across data workflows, Spark jobs, and distributed systems.
  • Support cloud data platform modernization and migration initiatives.
  • Automate workflows using Databricks Workflows / Jobs and scheduling tools.
  • Participate in code reviews and contribute to engineering best practices.
  • Work within Agile/Scrum teams to deliver data solutions iteratively.
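
The pipeline work described above typically combines PySpark transformations with Delta Lake storage. The sketch below is a minimal illustration of that pattern only; the table names (bronze.raw_orders, silver.orders) and columns (order_id, order_ts, order_amount) are hypothetical placeholders, not part of any actual Nexaminds codebase.

    # Minimal sketch only. Table names, columns, and the transformation rules
    # below are hypothetical placeholders used purely for illustration.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically inside Databricks

    # Extract: read a raw (bronze) Delta table
    raw = spark.read.table("bronze.raw_orders")

    # Transform: deduplicate, normalize types, and apply a simple data-quality rule
    orders = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("order_amount") > 0)  # reject non-positive amounts
    )

    # Load: write a curated (silver) Delta table for analytics and reporting
    (orders.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .saveAsTable("silver.orders"))

In a production setting, a step like this would usually run as a scheduled Databricks Workflows task with monitoring and alerting around it.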

Must-Have Skills & Experience

  • 10+ years of experience in Data Engineering
  • Solid understanding of the Azure cloud platform.
  • Strong hands-on expertise with Databricks, PySpark, Apache Spark, Delta Lake, and Databricks SQL.
  • Excellent programming skills in Python and SQL.
  • Experience building production-grade ETL/ELT pipelines.
  • Experience in Data Modeling, Data Profiling, Data Warehousing, and distributed computing concepts (a small profiling sketch follows this list).
  • Working knowledge of Shell Scripting for automation.
  • Experience with Azure Event Hub, GitHub, and Terraform.
  • Experience using JFrog Artifactory or a similar repository for artifact management.
  • Understanding of cloud security and access controls.
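
As a small illustration of the data-profiling skills listed above, the sketch below counts nulls per column of a Delta table. It is a hedged example only: "silver.orders" is a placeholder table name, and a real platform would layer governance, lineage, and alerting on top of such checks.

    # Minimal data-profiling sketch; "silver.orders" is a placeholder table name.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.read.table("silver.orders")

    # Count nulls per column in a single pass over the data
    null_counts = df.select(
        [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in df.columns]
    )
    null_counts.show(truncate=False)
    print(f"total rows: {df.count()}")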

Nice-to-Have Skills

  • Exposure to CI/CD pipelines for data engineering workloads.
  • Knowledge of streaming data processing.
  • Familiarity with Azure DevOps or similar tools.
  • Experience supporting large-scale analytics or enterprise data platforms.

Soft Skills

  • Strong analytical and problem-solving skills.
  • Ability to work independently and in cross-functional teams.
  • Excellent communication skills to interact with technical and non-technical stakeholders.
  • Proactive mindset with attention to data accuracy and reliability.

 

What you can expect from us

Here at Nexaminds, we're not your typical workplace. We're all about creating a friendly and trusting environment where you can thrive. Why does this matter? Well, trust and openness lead to better quality, innovation, commitment to getting the job done, efficiency, and cost-effectiveness.

  • Stock options 📈
  • Remote work options 🏠
  • Flexible working hours 🕜
  • Benefits above the legal minimum

But it's not just about the work; it's about the people too. You'll be collaborating with some seriously awesome IT pros, and you'll have access to mentorship and tons of opportunities to learn and level up.

Ready to embark on this journey with us? 🚀🎉 If you're feeling the excitement, go ahead and apply!

Top Skills

Spark
Azure
Azure DevOps
Azure Event Hub
Databricks
Delta Lake
Git
JFrog Artifactory
PySpark
Python
Shell Scripting
SQL
Terraform
