Info Services
Info Services - Senior Data Engineer - Apache Airflow
Job Location
Hyderabad, India
Job Description
Title: Senior Data Engineer (Airflow Operations Specialist)
Location: Financial District, Hyderabad
Mode: Hybrid

Job Summary:
- We are looking for a Software Engineer to help us grow the data platform.
- You will be embedded on the Orchestration Platform team, which runs the workflow scheduling applications that power the analytics, operations, and data science behind our flagship streaming applications.

Responsibilities:
- Be a key player in a cross-functional team of engineers who build software for a large-scale data processing ecosystem, supporting real-time and batch data pipelines for analytics, data science, and operations.
- Build tools and services to support data discovery, lineage, resiliency, and privacy compliance across the data platform.
- Design and develop CI/CD processes for infrastructure components that ensure high availability and agility.
- Maintain software engineering and architecture best practices and standards within the team and the wider organization, along with a culture of quality, innovation, and experimentation.
- Evangelize the platform, best practices, and data-driven decisions; identify new use cases and features and drive adoption.

Basic Qualifications:
- Experience with Python.
- Experience operating Airflow.
- Experience deploying and running cloud-based applications (preferably on AWS), using infrastructure-as-code frameworks such as Terraform.
- Knowledge of data engineering practices and tools, such as Spark, Snowflake, and ETL concepts.
- Ability to dive deep into any technical component as well as understand the overall systems architecture.
- You care deeply about craftsmanship in your software and can work backwards from the customer experience.
- You have excellent written and verbal communication skills.

Manager's Pointers:
- Experience operating Airflow: the candidate should understand how the whole cycle works and have hands-on experience operating it. Many people know Airflow, but not everyone knows the operations side; this person must.
- Work in this team revolves around Airflow.
- AWS, Python, and data tools; knows and understands the entire data ecosystem (EMS, Spark).
- Nice to have: Terraform.

(ref:hirist.tech)
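To give candidates a flavor of the workflow-scheduling work described above, here is a minimal sketch of ordering pipeline tasks by their dependencies, the core idea behind an Airflow DAG. It uses only the Python standard library; the task names (`extract`, `transform`, `load`, `report`) are illustrative, not part of this role's actual pipelines, and a real Airflow DAG would instead be declared with `airflow.DAG` and operators.

```python
from graphlib import TopologicalSorter

# Hypothetical daily pipeline: each task lists the tasks it depends on.
tasks = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A scheduler must run tasks in an order that respects every dependency;
# static_order() yields one such topological ordering.
order = list(TopologicalSorter(tasks).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

Operating Airflow in production adds the concerns this role focuses on: retries, backfills, scheduler health, and worker capacity, beyond simply defining the graph.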
Location: Hyderabad, IN
Posted Date: 11/23/2024
Contact Information
Contact: Human Resources, Info Services