
NEC Software Solutions

Data Engineer - Python, Spark, SQL - 6-Month Contract

Remote
Hiring Remotely in Mumbai, Maharashtra
Senior level
As a Data Engineer, you will architect, create, and maintain data pipelines and ETL processes within AWS. You'll support the optimization of existing tools, collaborate with data science and analytics teams, and ensure compliance and governance in data usage while promoting a DevOps culture.

Company Description

Our philosophy is to understand our customers’ business first, before we get to the technology. This approach leads to clever software: streamlining old processes, saving money and delivering positive change.

Our technology has helped the NHS screen millions of babies for hearing loss, ensures hundreds of housing providers are managing their homes efficiently and helps officers in over a dozen different police forces to make better decisions at the frontline.

Based in the UK but working around the world, our 2,000 employees help improve the services that matter most.
We are now part of the NEC corporation, a leader in the integration of IT and network technologies that benefit businesses and people worldwide – this brings in new opportunities without limits for growth and innovation.

Job Description

Role: Data Engineer

Experience: 7-10 years

Location: Mumbai preferred; open to PAN India

Skills:

  • Experience with programming in Python, Spark and SQL
  • Prior experience with AWS services (such as AWS Lambda, Glue, Step Functions, CloudFormation and the CDK)
  • Knowledge of building bespoke ETL solutions
  • Data modelling and T-SQL for managing business data and reporting
  • Capable of technical deep-dives into code and architecture
  • Ability to design, build and manage data pipelines encompassing data transformation, data models, schemas, metadata and workload management
  • Experience working with data science teams to refine and optimize data science and machine learning models and algorithms
  • Effective communication skills
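
As a purely illustrative sketch of the Python/Spark/SQL combination this role asks for, the snippet below runs a miniature extract-transform-load flow. All table and column names are hypothetical, and the stdlib `sqlite3` module stands in for a Spark SQL or warehouse engine so the example stays self-contained and runnable; a real pipeline would use PySpark DataFrames and AWS Glue instead.

```python
# Minimal ETL sketch: extract rows, transform in Python, load and report via SQL.
# Hypothetical schema; sqlite3 stands in for a Spark SQL / warehouse engine.
import sqlite3

def run_etl(raw_rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    # Transform step: drop non-positive amounts, normalise region names.
    cleaned = [(region.strip().lower(), amount)
               for region, amount in raw_rows if amount > 0]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    # Reporting step expressed in SQL, as an analyst-facing model might be.
    return dict(conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"))

totals = run_etl([(" EMEA ", 100.0), ("emea", 50.0), ("apac", -5.0), ("APAC", 30.0)])
# totals == {"emea": 150.0, "apac": 30.0}
```

The same shape (extract, validate/normalise, load, aggregate) carries over directly to PySpark, where `cleaned` would be a DataFrame transformation and the final query a `spark.sql(...)` call.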

Responsibilities:

  • Build data pipelines: Architecting, creating and maintaining data pipelines and ETL processes in AWS
  • Support and transition: Support and optimize our current desktop data tool set and Excel analysis pipeline while transitioning them to a highly scalable, cloud-based architecture
  • Work in an agile environment: within a collaborative agile cross-functional product team using Scrum and Kanban
  • Collaborate across departments: Work closely with data science teams and business (economist/data) analysts to refine their data requirements for various initiatives and data consumption needs
  • Educate and train: Train colleagues such as data scientists, analysts, and stakeholders in data pipelining and preparation techniques that make it easier for them to integrate and consume the data they need for their own use cases
  • Ensure compliance and governance in data use: Ensure that data users and consumers use the data provisioned to them responsibly, through data governance and compliance initiatives
  • Work within, and encourage, a DevOps culture and Continuous Delivery process
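
To give a flavour of the "build data pipelines in AWS" responsibility, here is a hedged sketch of a single pipeline step written as an AWS Lambda-style handler. The event shape and field names are hypothetical, and no AWS SDK calls are made (a deployed version would write its output to S3 or hand it to the next Step Functions state), so the snippet runs locally as-is.

```python
# Sketch of one pipeline stage as an AWS Lambda-style handler.
# Event shape and field names are hypothetical; no boto3 calls, so it runs locally.
def handler(event, context=None):
    records, rejected = [], 0
    for item in event.get("records", []):
        # Validate: every record needs an id and a numeric value.
        if "id" not in item or not isinstance(item.get("value"), (int, float)):
            rejected += 1
            continue
        # Transform: normalise keys for the downstream stage.
        records.append({"record_id": str(item["id"]),
                        "value": float(item["value"])})
    # A real deployment would emit `records` to S3/Glue; here we just return them.
    return {"ok": records, "rejected": rejected}

result = handler({"records": [{"id": 1, "value": 2}, {"value": "bad"}]})
# result["rejected"] == 1
```

Keeping each stage a small, pure function like this is what makes Step Functions orchestration and local unit testing straightforward.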


