skillVentory
Principal Big Data Architect - Commercial Data Platform
Job Location
Pune, India
Job Description
In this role you will primarily focus on the data architecture of the Commercial Solutions group. You will lead the design, implementation, and management of the Commercial data platform, work closely with leadership to form an opinion on the choice of tools, technologies, and architecture, and execute the overall data strategy. You will be responsible for data modelling, database design, data warehousing, and big data technologies, establishing the path to cloud for the entire data platform, and for guiding and mentoring teams on sound engineering principles.
Responsibilities:
- Develop and implement a comprehensive data strategy that aligns with the organization's goals.
- Define data architecture standards, principles, and guidelines.
- Define and design the data warehouse and data lakes, and establish the data integration strategy.
- Establish data governance policies and procedures, and implement data quality assurance processes.
- Be hands-on and work closely with a team of Data Engineers, guiding them on the associated data maintenance, integration, enhancement, load, and transformation processes for the organization.
- Own and drive the evaluation of data management technologies and lead their implementation with a high bar for performance, scalability, and availability.
Requirements:
- AWS and Snowflake are expected to be key technologies; we are looking for the following combination of skills: (Snowflake OR Databricks) AND (AWS OR Azure OR GCP) AND (Python OR Java OR Scala).
- Experience building and deploying production-level data-driven applications and data processing workflows/pipelines and/or implementing machine learning systems at scale.
- Experience leading the design and implementation of complex features.
- Experience leading a large project and working with other data and software engineers.
- Experience working with enterprise databases and following industry best practices around data privacy.
- Expertise in using Python or Scala, Spark, Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products and platforms.
- Experience in Java/.NET, Scala, or Python, delivering analytics across all phases: data ingestion, feature engineering, modelling, tuning, evaluation, monitoring, and presentation.
- Experience in anonymizing data, data product development, analytical models, and AI governance.
- Experience with SQL, multi-threading, message queuing, and distributed systems.
(ref:hirist.tech)
Location: Pune, IN
Posted Date: 2/6/2025
Contact Information
Contact: Human Resources, skillVentory