Senior Data Research Engineer - Database Engineer

Hiring Remotely in Mumbai, Maharashtra
Remote
Mid level
Insurance • Software • Energy • Financial Services
The Role
The Senior Data Research Engineer will design, develop, and maintain a secure and efficient database infrastructure for managing company data. Responsibilities include data acquisition, data migration, optimizing database performance, ensuring data integrity and compliance, and collaborating with cross-functional teams to meet data needs. Continuous learning and problem-solving are essential for improving data engineering processes.
Summary Generated by Built In

Company Description

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

The Data Research Engineering Team is a brand-new team responsible for managing data from acquisition to presentation, collaborating with other teams while also operating independently. Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.


A typical day in the life of a Senior Data Research Engineer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous-learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.
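To give a concrete flavor of the spreadsheet-to-database migration work described above, here is a minimal, hypothetical sketch using pandas and SQLAlchemy. The file, table, and column names are invented, and SQLite stands in for PostgreSQL or MySQL so the snippet runs anywhere; a real migration would point the engine URL at the production database instead.

```python
# Hypothetical sketch: load a spreadsheet export and write it to a
# relational table. SQLite is used here only as a stand-in backend.
import pandas as pd
from sqlalchemy import create_engine


def migrate_spreadsheet(csv_path: str, table: str, engine) -> int:
    """Load a CSV export into a relational table; return rows written."""
    df = pd.read_csv(csv_path)
    # Normalize column headers so they become valid SQL identifiers.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df.to_sql(table, engine, if_exists="replace", index=False)
    return len(df)


# Swap the URL for e.g. "postgresql://user:pass@host/db" in real use.
engine = create_engine("sqlite:///:memory:")
```

The header normalization step matters in practice: spreadsheet columns like "Product Name" would otherwise need quoting in every downstream query.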

Job Description

Responsibilities

  • Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.

  • Work with databases of varying scales, from small-scale databases to systems involving big-data processing.

  • Work on data security and compliance by implementing access controls, encryption, and compliance standards (e.g., GDPR).

  • Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.

  • Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.

  • Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.

  • Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.

  • Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.

  • Monitor database health and identify and resolve issues.

  • Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms.

  • Implement data security measures to protect sensitive information and comply with relevant regulations.

  • Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.

  • Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.

  • Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.

  • Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.

  • Use Python for tasks such as data manipulation, automation, and scripting.

  • Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.

  • Assume accountability for achieving development milestones.

  • Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.

  • Collaborate with and assist fellow members of the Data Research Engineering Team as required.

  • Perform tasks with precision and build reliable systems.

  • Leverage online resources such as Stack Overflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.
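The data-quality bullets above (validation rules, constraints, referential integrity) can be sketched as a pre-import check in Python. The rule set, column names, and error format below are purely illustrative assumptions, not a prescribed implementation:

```python
# Hypothetical pre-import validation: required columns present, no null
# keys, and every foreign key matches a known parent key.
import pandas as pd


def validate_rows(df: pd.DataFrame, required: list[str],
                  key: str, parent_keys: set) -> list[str]:
    """Return human-readable validation errors (empty list = clean)."""
    errors = []
    for col in required:
        if col not in df.columns:
            errors.append(f"missing required column: {col}")
    if key in df.columns:
        if df[key].isna().any():
            errors.append(f"null values in key column: {key}")
        orphans = set(df[key].dropna()) - parent_keys
        if orphans:
            errors.append(f"unknown foreign keys: {sorted(orphans)}")
    return errors
```

Running checks like these before an automated import keeps bad rows out of the database, where constraint violations would otherwise abort the load midway.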

Skills and Experience

  • Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.

  • Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL or MySQL.

  • Knowledge of SQL and understanding of database design principles, normalization, and indexing.

  • Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.

  • Knowledge of cloud-based databases, such as Google BigQuery and AWS RDS.

  • Eagerness to develop import workflows and scripts to automate data import processes.

  • Ideally, familiarity with Knime or similar tools for data integration and analysis.

  • Knowledge of data security best practices, including access controls, encryption, and compliance standards (e.g., GDPR, HIPAA).

  • Strong problem-solving and analytical skills with attention to detail.

  • Creative and critical thinking.

  • Strong willingness to learn and expand knowledge in data engineering.

  • Knowledge of Python programming, including data manipulation and automation, using modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.

  • Familiarity with Agile development methodologies is a plus.

  • Familiarity with Docker containers or similar technologies is a plus.

  • Experience with version control systems, such as Git, for collaborative development.

  • Ability to thrive in a fast-paced environment with rapidly changing priorities.

  • Ability to work collaboratively in a team environment.

  • Effective communication skills.

  • Comfortable with autonomy and ability to work independently.
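As a small illustration of the indexing and query-plan skills listed above, here is a hedged sketch using SQLite's EXPLAIN QUERY PLAN as a stand-in for PostgreSQL's EXPLAIN. The table, index name, and data are invented for the example:

```python
# Hypothetical sketch: observe how adding an index changes a query plan.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)


def plan(sql: str) -> str:
    """Return the query-plan detail text for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)


before = plan("SELECT * FROM orders WHERE customer_id = 42")
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan("SELECT * FROM orders WHERE customer_id = 42")
# Before the index the plan is a full table SCAN; afterwards it is a
# SEARCH using idx_orders_customer.
```

The same workflow applies to PostgreSQL or MySQL with their respective EXPLAIN output, which additionally reports cost and row estimates.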


Additional Information

All your information will be kept confidential according to EEO guidelines.


Top Skills

Python
SQL
The Company
HQ: Jersey City, New Jersey
563 Employees
On-site Workplace
Year Founded: 1917

What We Do

Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives.

We support your pursuit of success by making smart financial decisions simple, to help you get back to doing the things you care about most.

We do this by helping turn your aspirations into reality. By arming you with trusted advice and guidance, you can make informed financial decisions you feel confident in and achieve your financial goals.

Visit Forbes Advisor for unbiased personal finance advice, news and reviews, plus a comparison marketplace that helps you find the financial products that best fit your life and goals.
