Sr Big Data Engineer - Airflow and Oozie (GCP)
United States - Remote
Work Location: US-Remote
Key Responsibilities:
- Develop scalable and robust code for batch processing systems, working with technologies such as Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
- Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem (a minimal Airflow sketch follows this list).
- Leverage GCP for scalable big data processing and storage solutions.
- Implement automation and DevOps best practices for CI/CD, Infrastructure as Code (IaC), and related processes.
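For illustration only (not part of the formal requirements): the sketch below shows the kind of batch workflow orchestration described above, as a minimal Apache Airflow 2.x DAG. The DAG id, schedule, commands, jar path, and class name are hypothetical placeholders, not details from this posting.

```python
# Illustrative only: a minimal Airflow 2.x DAG orchestrating a daily batch step.
# The DAG id, schedule, and commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_batch_example",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",          # run once per day
    catchup=False,                       # skip backfilling past runs
) as dag:
    # Stage raw input data (placeholder command).
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="echo 'copy raw files into the staging area'",
    )

    # Launch a Spark (Java) batch job via spark-submit (placeholder jar/class).
    transform = BashOperator(
        task_id="spark_transform",
        bash_command="spark-submit --class com.example.BatchJob /opt/jobs/batch-job.jar",
    )

    ingest >> transform  # transform runs only after ingest succeeds
```

In practice the same dependency pattern applies whether the downstream step is a Spark job, a Hive query, or an Oozie-coordinated action; this sketch only illustrates the orchestration idea.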
Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field of study.
- Experience with managed cloud services and understanding of cloud-based batch processing systems are critical.
- Proficiency in Oozie, Airflow, MapReduce, and Java.
- Strong programming skills in Java (particularly with Spark), Python, Pig, and SQL.
- Expertise in public cloud services, particularly in GCP.
- Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
- Familiarity with Bigtable and Redis.
- Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.
- Proven experience in engineering batch processing systems at scale.
Must Have:
- 5+ years of experience in customer-facing software/technology or consulting.
- 5+ years of experience with “on-premises to cloud” migrations or IT transformations.
- 5+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure; a minimal GCP storage sketch follows this list.
- Proficiency in Oozie, Airflow, MapReduce, and Java.
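As a purely illustrative sketch (not a requirement from this posting): the snippet below shows a minimal Python interaction with Google Cloud Storage of the kind a GCP batch pipeline might use to land its output. The bucket, paths, and function name are hypothetical, and it assumes the google-cloud-storage client library is installed and application default credentials are configured.

```python
# Illustrative only: uploading a batch job's output file to Google Cloud Storage.
# Bucket, paths, and function name are hypothetical placeholders.
from google.cloud import storage


def upload_batch_output(bucket_name: str, local_path: str, dest_blob: str) -> None:
    """Copy a local output file into the given GCS bucket."""
    client = storage.Client()          # uses application default credentials
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(dest_blob)
    blob.upload_from_filename(local_path)


if __name__ == "__main__":
    upload_batch_output("example-batch-bucket", "/tmp/daily_output.csv", "daily/output.csv")
```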
Unless already included in the posted pay range and based on eligibility, the role may include variable compensation in the form of bonus, commissions, or other discretionary payments. These discretionary payments are based on company and/or individual performance and may change at any time. Actual compensation is influenced by a wide array of factors including but not limited to skill set, level of experience, licenses and certifications, and specific work location. Information on benefits offered is here. #LI-MF1 #LI-Remote
About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Job Profile
Work location limited to specific states
Benefits/Perks: Discretionary payments, remote work flexibility, variable compensation
Tasks:
- Develop batch processing systems
- Implement automation and DevOps best practices
- Optimize data workflows
- Security
Skills: Airflow, Applications, Automation, AWS, Azure, Big Data, CI/CD, Cloud Services, Cloud Technologies, Communication, DevOps, GCP, Hadoop, HBase, Hive, Infrastructure, Infrastructure as Code (IaC), Java, MapReduce, Oozie, Pig, Public Cloud, Python, Redis, Spark, SQL, Terraform
Experience: 5 years
Education: Bachelor's degree in Computer Science, Software Engineering, or a related field of study
Timezones: America/Anchorage, America/Chicago, America/Denver, America/Los_Angeles, America/New_York, Pacific/Honolulu (UTC-5 to UTC-10)