GENPACT India Private Limited
Data Engineer - Snowflake DB
Job Location
India
Job Description
Job Description: Inviting applications for the role of Principal Consultant - Data Engineer (Azure, Snowflake, Python). In this role, you will be responsible for designing and building modern data pipelines and data streams.

Responsibilities:
- Ability to implement Snowflake as a Tech Lead; experience in building a data warehouse on Snowflake.
- Experience with Azure Data Lake and Azure Data Factory (ADF).
- Good knowledge of ETL/ELT processes and the ability to provide direction to the team.
- Must have knowledge of Snowflake architecture and implementation patterns.
- Must have an understanding of Snowflake roles and deployment of virtual warehouses.
- Good knowledge of data modelling, integration, and design techniques.
- Must be hands-on with programming (Python, PySpark, etc.), Snowflake, SQL queries, and standard DWH/ETL concepts.
- Must be able to write complex SQL queries, stored procedures, and UDFs.
- Creates and updates technical requirement documentation for all systems and solutions.
- Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use these features.
- Extensive experience in data profiling, data modelling, data quality, data standardization, and data stewardship.
- Experience or knowledge of marketing modelling is mandatory.
- The candidate should be able to play a major role in our analytics DataMart project and help develop a modern end-to-end modelling solution in the marketing domain.
- Combination of Python and Snowflake SnowSQL; writing SQL queries against Snowflake (a brief Python sketch follows after this section).
- Experience working with different source data: RDBMS, flat files, XML, JSON.
- Expertise in SQL, especially within cloud-based data warehouses such as Snowflake and Azure.
- Expertise with SnowSQL, advanced concepts (query performance tuning, time travel, etc.) and features/tools (data sharing, events, Snowpipe, etc.).
- Experience with an end-to-end implementation of a Snowflake cloud data warehouse, or end-to-end on-premise data warehouse implementations.
- Proven analytical and problem-solving skills.
- Strong understanding of ETL concepts and work experience in an ETL tool such as Informatica, DataStage, or Talend.
- Ability to work independently and on multiple tasks/initiatives with multiple deadlines.
- Effective oral, presentation, and written communication skills.
- Data modelling and data integration.
- Support/troubleshooting.

Minimum qualifications:
- BE/MCA/B.Tech in Computer Science, Information Systems, Engineering, or related fields.
- Full-lifecycle Snowflake project implementation experience (both migration and new implementation).
- Must have designed and executed at least one Snowflake implementation project (ideally on Azure).
- Snowflake certification / Azure certification preferred (good to have).
- Extensive experience on data analysis projects and expertise in writing complex DB SQL queries.
- Good understanding of data warehousing concepts (worked on traditional data warehouse projects, not just familiarity with a database).
- Good to have hands-on core Python scripting experience; specialisation in data analysis with Python scripting to write functions in Azure.
- Solid experience in consulting or client service delivery on Snowflake and Azure.
- Good interpersonal, problem-solving, and verbal communication skills. (ref:hirist.tech)
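The role repeatedly calls out writing SQL against Snowflake from Python and working with features such as zero-copy clone and time travel. The sketch below is a minimal, hypothetical illustration of that combination, assuming the snowflake-connector-python client; the account details, warehouse, database, and table names (e.g. campaign_facts) are placeholders, not details from this posting.

# A minimal sketch, assuming the snowflake-connector-python package is installed.
# Account, credentials, and object names below are placeholders/assumptions,
# not details taken from this job posting.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",      # hypothetical virtual warehouse
    database="MARKETING_DM",       # hypothetical marketing data mart
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Zero-copy clone: creates a metadata-only copy; no data is physically duplicated.
    cur.execute("CREATE OR REPLACE TABLE campaign_facts_dev CLONE campaign_facts")
    # Time travel: query the table as it existed 30 minutes ago (offset in seconds).
    cur.execute("SELECT COUNT(*) FROM campaign_facts AT(OFFSET => -60*30)")
    print(cur.fetchone()[0])
finally:
    conn.close()

The same pattern extends to the other features named above (resource monitors, Snowpipe, RBAC grants), which are likewise issued as SQL statements through the cursor.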
Location: in, IN
Posted Date: 11/26/2024
Contact Information
Contact: Human Resources, GENPACT India Private Limited