We are looking for a Senior DevOps Engineer to drive efficient and secure delivery pipelines, ensure robust infrastructure reliability, and uphold security and compliance standards. This pivotal role requires expertise in automation, cloud technologies, and modern deployment practices to enable our teams to deliver high-quality solutions at scale.
Responsibilities
- Design and implement CI/CD pipelines, optimizing delivery processes and minimizing risks
- Build and maintain scalable infrastructure using infrastructure-as-code practices
- Enhance Identity and Access Management systems to secure data and infrastructure
- Manage cloud-based services (e.g., Databricks Lakehouse, Unity Catalog) and optimize usage across teams
- Oversee and enhance deployment workflows while adhering to security and compliance frameworks
- Actively monitor operational systems to ensure uptime and resolve performance bottlenecks
- Collaborate across teams to establish automation best practices and eliminate manual processes
- Continuously evaluate and implement new tools to improve efficiency in DevOps operations
Requirements
- Proven experience of over 3 years in DevOps practices, with a strong understanding of automation tools and deployment pipelines (e.g., Jenkins, GitLab CI/CD)
- Proficiency in Databricks, including Databricks Lakebase and Databricks Unity Catalog
- Expertise in Infrastructure as Code tooling, with the ability to build and maintain environments (Terraform, Ansible)
- Background in managing Identity and Access Management systems, ensuring secure authentication and role management
- Knowledge of cloud platforms such as AWS, Microsoft Azure, or Google Cloud Platform, including scalable architecture practices
- Familiarity with security compliance processes in DevOps workflows
- Strong problem-solving skills and the ability to work collaboratively in a fast-paced environment
- Excellent communication skills in English, with a minimum proficiency level of B2
Nice to have
- Familiarity with AWS Secrets Manager and Amazon VPC
- Experience with Apache Airflow for pipeline orchestration
- Understanding of Databricks Asset Bundles for job and environment provisioning
- Exposure to monitoring tools like Datadog or PagerDuty to handle on-call responsibilities
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
Location: Santiago, Santiago Metropolitan Region, Chile