We are hiring a Senior Data DevOps Engineer with expertise in Google Cloud Platform to join our expanding data team. You will be responsible for cloud data infrastructure design, automation, and system optimization. Join us to advance your career working with innovative cloud data technologies.
Responsibilities
- Build cloud data infrastructure using GCP services such as Dataflow, Cloud Storage (GCS), and Cloud Composer
- Implement Infrastructure as Code solutions using Terraform to automate deployments and monitoring
- Partner with data engineers to develop efficient Python-based workflows
- Configure CI/CD pipelines with Jenkins, GitLab CI, or GitHub Actions for reliable deployments
- Improve the performance and availability of data platforms in collaboration with cross-functional teams
- Manage cloud data tools such as Apache Spark, Apache Kafka, and Apache Airflow
- Identify and address cloud data system reliability and scalability issues
Requirements
- 3+ years of experience in GCP cloud environments including BigQuery, Cloud Composer, and Dataproc
- Strong skills in Python and SQL for data pipeline operations
- Knowledge of Infrastructure as Code tools like Terraform or CloudFormation
- Experience integrating CI/CD pipelines using Jenkins, GitHub Actions, or GitLab CI
- Background in Linux operating systems and shell scripting
- Understanding of networking protocols and concepts, including TCP/IP, DNS, and NAT
- Competency with Apache Spark, Apache Airflow, or ELK Stack
Nice to have
- Experience with AWS services (e.g., ECS, S3) or Azure services (e.g., Data Lake Storage, Synapse Analytics)
- Ability to work with other IaC and configuration management tools, including Ansible
- Demonstrated skill with additional data workflow automation tools
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities