Overview
We are seeking a highly skilled remote Senior Data DevOps Engineer to join our team, working on a cutting-edge project that involves the development and deployment of large-scale data processing pipelines.
In this position, you will play a critical role in designing, implementing, and maintaining the infrastructure that enables data processing, storage, and analysis. You will work with a team of experienced professionals, tackling complex challenges and driving innovation in the field of data engineering. If you are passionate about DevOps and have a solid understanding of data processing technologies, we invite you to apply for this exciting opportunity.
Responsibilities
- Design, implement, and maintain large-scale data processing pipelines
- Develop and maintain CI/CD pipelines for data processing applications, ensuring efficient and reliable deployment
- Implement and manage containerization technologies to enable scalable and flexible infrastructure
- Collaborate with data scientists and analysts to design and implement data storage and retrieval solutions
- Ensure the security and availability of data processing infrastructure, implementing best practices for data protection and disaster recovery
- Monitor and troubleshoot data processing pipelines and infrastructure, identifying and resolving issues in a timely manner
- Continuously improve data processing infrastructure, staying up-to-date with the latest technologies and industry trends
Requirements
- A minimum of 3 years of experience in DevOps, with a focus on data engineering and infrastructure management
- Expertise in CI/CD processes and tools, including Git, Jenkins, and TeamCity
- Hands-on experience with containerization technologies such as Docker, as well as container orchestration tools like Kubernetes or Amazon ECS
- In-depth knowledge of Amazon Web Services (AWS), including EC2, S3, and Lambda
- Strong proficiency in Linux system administration and shell scripting
- Experience with infrastructure as code tools such as Terraform, Ansible, or CloudFormation
- Familiarity with the Elastic Stack (Elasticsearch, Logstash, and Kibana) for log management and analysis
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment
- Fluent spoken and written English at an Upper-Intermediate level or higher (B2+)
Nice to have
- Experience with other cloud providers such as Google Cloud Platform or Microsoft Azure
- Experience with Big Data technologies such as Hadoop, Hive, and Pig
- Familiarity with configuration management tools such as Chef or Puppet
- Knowledge of scripting languages such as Python or Ruby
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn