Firesoft People

Cloud Data Engineer


Job Location

South Yarra, Australia

Job Description

My client is a Premier Google Partner for Google Cloud Platform (GCP). They are extremely agile and innovative, and offer the complete spectrum of Google Cloud services. They have been named one of the top three service and integration partners by Google Cloud in Germany and the European region, and are soon to be recognised as the ANZ leader in GCP transformation services. For ongoing programs, they are looking for a senior data professional who can engineer data architecture on GCP, define integration between services, and drive performance and best practices.

Deep Data Skillset:

- At least 2 years building large-scale enterprise data solutions using one or more third-party tools such as PySpark, Talend, Matillion, or Informatica, or native utilities such as Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, etc.
- All-round experience in information technology and enterprise data-centre technologies
- Strong documentation, communication, and client-facing skills
- Cloud certification on GCP, AWS, or Azure
- Understanding of the current state of infrastructure automation, continuous integration/deployment (CI/CD), SQL/NoSQL, security, networking, and cloud-based delivery models
- Working knowledge of software development and automation tools and methodologies (Jenkins, Git, Terraform, Chef, Python, etc.)
- Experience managing and using Git repositories (GitLab, GitHub)
- Scripting ability in one or more general-purpose programming languages, including but not limited to Java, C/C++, C#, Objective-C, Python, and JavaScript
- Experience with on-premises-to-cloud migrations or IT transformations
- Experience with containerization technologies (Kubernetes, Docker, etc.)
- Strong understanding of cloud security requirements, tools, and best practices
- Knowledge of process automation

Techno experience 360:

- Migrating data from legacy systems using tools like Hadoop, Exadata, Oracle, Teradata, or Netezza
- Implementing DevOps on GCP
- Multi-cloud experience

Benefits:

- Flexible working arrangements: work here, there, or anywhere
- Work on the most innovative projects
- Personal development to enhance your career
- A culture that rewards curiosity and collaboration


Posted Date: 1/30/2025

Contact Information

Contact Human Resources
Firesoft People

UID: 5013615737_south-yarra
