Over the last 20 years, Ares’ success has been driven by our people and our culture. Today, our team is guided by our core values – Collaborative, Responsible, Entrepreneurial, Self-Aware, Trustworthy – and our purpose to be a catalyst for shared prosperity and a better future. Through our recruitment, career development and employee-focused programming, we are committed to fostering a welcoming and inclusive work environment where high-performance talent of diverse backgrounds, experiences, and perspectives can build careers within this exciting and growing industry.
Job Description
Primary Functions and Essential Responsibilities
- Analyze, design, develop, and implement full-cycle data warehouse/data lake solutions, including the integration of LLM (Large Language Model) and AI/ML (Artificial Intelligence/Machine Learning) technologies.
- Collaborate on the advancement and normalization of data models for data warehouse builds, incorporating AI/ML algorithms to enhance data processing and analytics.
- Design and develop data models, data warehouses, and reports for end users, and drive the administration of enterprise-wide reporting/business intelligence/analytics platforms.
- Perform technical development and code-based delivery of business rules, scripts, data pipelines, and data visualization/reporting analytics.
- Prepare all necessary documentation of technical solutions and other related deliverables.
- Assist end users with application/system issues.
- Be available to work across US Pacific (PST) and India working hours in order to interface with US-based teams.
- Independently lead solution delivery of end-to-end data projects.
- Liaise with the application architecture, business analyst, and PMO teams to ensure guidelines and processes are aligned across all technology initiatives.
Qualifications
Education:
- Bachelor’s degree in Engineering, Computing, Finance, Information Systems, or a related field
Experience Required:
- 12-15 years of relevant technical experience developing/implementing Data Lake/Data Warehouse technologies.
- Experience in building data solutions with ERP/financial accounting systems such as Oracle Fusion ERP, Coupa, OneStream, Concur or similar systems/domain is required.
- Strong data modeling skills (normalization/denormalization, dimensional modeling, data warehouse schema types) for data related to general accounting, procurement, expense management, HR, accounts payable, accounts receivable, cash management, products, and project accounting.
- Experience with Azure and cloud technologies (Azure Data Factory, Azure Synapse, and Azure Databricks).
- Expertise and hands-on software development experience in one or more languages/frameworks such as Python or Scala/Spark.
- Strong technical skills across data warehousing and integration technologies (data structures and tools such as SQL, RDBMS, BI, and ETL).
- Experience implementing AI/ML models and integrating LLMs into data projects is a strong plus.
- Experience with Power BI or other reporting platforms.
- Knowledge of Agile project management methodology using tools such as JIRA and Confluence.
General Requirements:
- Attention to detail is a must
- Adaptable, organized, and independent, with the ability to manage multiple priorities
- Strong, concise communication skills and experience providing support to end users
- Strong skills in documenting processes, steps, and methodology
- Strong team player
- High energy level
- Familiarity with the alternative asset management or investment management industry, or with Corporate Finance business functions, is a plus
Reporting Relationships
There is no set deadline to apply for this job opportunity. Applications will be accepted on an ongoing basis until the search is no longer active.