This role involves designing, building, and maintaining data systems for analysis and reporting, developing data products, managing data pipelines, and collaborating with stakeholders to optimize data solutions.
Job Purpose and Impact
- The Professional, Data Engineering role designs, builds, and maintains moderately complex data systems that enable data analysis and reporting. With limited supervision, this role collaborates across teams to ensure that large data sets are processed efficiently and made accessible for decision making.
Key Accountabilities
- DATA & ANALYTICAL SOLUTIONS: Develops moderately complex data products and solutions using advanced data engineering and cloud based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
- DATA PIPELINES: Maintains and supports the development of streaming and batch data pipelines that ingest data from various sources, transform it into usable information, and load it into data stores such as data lakes and data warehouses.
- DATA SYSTEMS: Reviews existing data systems and architectures and implements identified improvements and optimizations.
- DATA INFRASTRUCTURE: Helps prepare data infrastructure to support the efficient storage and retrieval of data.
- DATA FORMATS: Implements appropriate data formats to improve data usability and accessibility across the organization.
- STAKEHOLDER MANAGEMENT: Partners with cross-functional data and advanced analytics teams to gather requirements and ensure that data solutions meet the functional and non-functional needs of various stakeholders.
- DATA FRAMEWORKS: Builds moderately complex prototypes to test new concepts and implements data engineering frameworks and architectures to support the improvement of data processing capabilities and advanced analytics initiatives.
- AUTOMATED DEPLOYMENT PIPELINES: Implements automated deployment pipelines to improve the efficiency of code deployments, with fit-for-purpose governance.
- DATA MODELING: Performs moderately complex data modeling aligned with the datastore technology to ensure sustainable performance and accessibility.
Qualifications
- Bachelor's or Master's degree in Computer Science or a relevant field
- Minimum of 2 years of relevant work experience with Python and SQL
- At least 1 year of experience with Snowflake
- Good to have: knowledge of AWS services such as Lambda and Glue
Top Skills
AWS
Python
Snowflake
SQL