Sr Big Data Engineer (GCP) - Airflow and Oozie

Posted 19 Hours Ago
Remote
Senior level
Cloud • Information Technology • Software
The Role
The Senior Big Data Engineer will develop and scale stream and batch processing systems using various technologies, manage data workflows with Oozie and Airflow, leverage GCP for big data solutions, and implement DevOps best practices like CI/CD and Infrastructure as Code.

About the Role:

We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team. The ideal candidate will have a strong background in developing and scaling both stream and batch processing systems, extensive experience with Oozie, Airflow, and the Apache Hadoop ecosystem, and a solid understanding of public cloud technologies, especially GCP. This role is fully remote and requires excellent communication skills and the ability to solve complex problems independently and creatively.


What you will be doing

  • Build reusable, reliable, and scalable code for stream and batch processing systems at scale, working with technologies such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
  • Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem (a minimal Airflow sketch follows this list).
  • Leverage GCP for scalable big data processing and storage solutions.
  • Implement automation/DevOps best practices for CI/CD, Infrastructure as Code (IaC), containerization, etc.
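
For orientation only (not part of the posting): a minimal sketch of the kind of Airflow orchestration described above, assuming Airflow 2.x. The DAG id, schedule, HDFS paths, and job scripts are hypothetical placeholders.

# Illustrative sketch only: all ids, paths, and commands are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_batch",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",             # run once per day
    catchup=False,
) as dag:
    # Verify that the day's raw partition landed on HDFS (placeholder path).
    check_raw = BashOperator(
        task_id="check_raw_partition",
        bash_command="hdfs dfs -test -d /data/raw/events/{{ ds }}",
    )

    # Submit the Spark aggregation job for that date (placeholder script).
    aggregate = BashOperator(
        task_id="aggregate_events",
        bash_command="spark-submit /jobs/aggregate_events.py --date {{ ds }}",
    )

    # Load the aggregated output into Hive (placeholder HQL script).
    load_hive = BashOperator(
        task_id="load_into_hive",
        bash_command="hive -f /jobs/load_daily_aggregates.hql -hivevar run_date={{ ds }}",
    )

    # Task dependencies: check, then aggregate, then load.
    check_raw >> aggregate >> load_hive

In an Oozie-based stack, the same chain would typically be expressed as a coordinator plus workflow XML; Airflow keeps the pipeline definition in Python, which is part of what makes CI/CD and code review of workflows straightforward.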

Requirements:
  • Critical: experience with GCP managed services and a solid understanding of cloud-based batch processing systems.
  • Proficiency in Oozie, Airflow, MapReduce, and Java.
  • Strong programming skills in Java (particularly for Spark), Python, Pig, and SQL (a small PySpark sketch follows this list).
  • Expertise in public cloud services, particularly GCP.
  • Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
  • Familiarity with Bigtable and Redis.
  • Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC) such as Terraform to automate and improve development and release processes.
  • Ability to tackle complex challenges and devise effective solutions, approaching problems from various angles and proposing innovative solutions.
  • Experience working effectively in a remote setting, with strong written and verbal communication skills; able to collaborate with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.
  • Proven experience engineering batch processing systems at scale.
  • Hands-on experience with public cloud platforms, particularly GCP; additional experience with other cloud technologies is advantageous.
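
For orientation only (not part of the posting): a minimal PySpark sketch of the cloud-based batch processing the requirements describe, reading a Hive table, aggregating, and writing partitioned Parquet to Cloud Storage. All table, bucket, and column names are hypothetical, and the gs:// write assumes the Hadoop GCS connector is available on the cluster.

# Illustrative sketch only: table, bucket, and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily_event_aggregates")   # hypothetical application name
    .enableHiveSupport()                 # allow reading managed Hive tables
    .getOrCreate()
)

# Read one day's partition from a Hive table (placeholder names).
events = spark.table("analytics.raw_events").where(F.col("event_date") == "2024-01-01")

# Aggregate event counts per user and event type.
daily_counts = (
    events.groupBy("user_id", "event_type")
    .agg(F.count("*").alias("event_count"))
    .withColumn("event_date", F.lit("2024-01-01"))
)

# Write partitioned Parquet to a GCS bucket (placeholder bucket).
(
    daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("gs://example-curated-data/daily_event_counts")
)

spark.stop()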

Top Skills

GCP
Java
Python
The Company
HQ: San Antonio, TX
7,509 Employees
On-site Workplace
Year Founded: 1998

What We Do

At Rackspace Technology, we accelerate the value of the cloud during every phase of digital transformation. By managing apps, data, security and multiple clouds, we are the best choice to help customers get to the cloud, innovate with new technologies and maximize their IT investments. As a recognized Gartner Magic Quadrant leader, we are uniquely positioned to close the gap between the complex reality of today and the promise of tomorrow. Passionate about customer success, we provide unbiased expertise, based on proven results, across all the leading technologies. And across every interaction worldwide, we deliver Fanatical Experience™ — the best customer service experience in the industry. Rackspace has been honored by Fortune, Forbes, Glassdoor and others as one of the best places to work.
