Prodigy hunters

AWS Data Engineer - Python/ETL/Spark


Job Location

India

Job Description

We are seeking a skilled AWS Data Engineer with expertise in Airflow, Python, ETL, Spark, SQL databases, and AWS Cloud to join our team. The role involves designing and implementing robust data pipelines, managing ETL workflows, and optimizing data solutions in a hybrid cloud environment.

Roles and Responsibilities:

- Data Pipeline Development: Design and maintain scalable data pipelines using AWS, Airflow, Spark, and Python to process and integrate data from various sources.
- ETL Management: Develop and optimize ETL workflows to ensure accurate and efficient data processing.
- Cloud Infrastructure: Utilize AWS services to deploy and manage data solutions, ensuring security and compliance.
- Scheduling and Orchestration: Manage and troubleshoot Airflow workflows to automate and streamline ETL processes.
- Data Quality and Documentation: Implement data validation, ensure data quality, and maintain documentation for all pipelines and processes.

Required Skills:

- Proficiency in Airflow, Python, ETL, Spark, and SQL databases
- Strong experience with AWS Cloud solutions

This is an immediate-joiner position. If you meet the qualifications, we encourage you to apply. (ref:hirist.tech)
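For context, the orchestration pattern described above (Airflow scheduling a Spark ETL job) typically looks something like the minimal sketch below, assuming Airflow 2.x with the apache-airflow-providers-apache-spark package installed and a configured spark_default connection; the DAG name, schedule, and script path are illustrative placeholders, not details taken from this posting.

# Illustrative sketch only: a daily Airflow DAG that submits one PySpark ETL job.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_etl",                  # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit a PySpark script that extracts raw data, validates it, and loads it downstream.
    run_spark_etl = SparkSubmitOperator(
        task_id="run_spark_etl",
        application="/opt/jobs/daily_etl.py",  # placeholder script path
        conn_id="spark_default",
    )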

Location: India

Posted Date: 11/22/2024

Contact Information

Contact Human Resources
Prodigy hunters

Posted

November 22, 2024
UID: 4923980848
