We are looking for an experienced Senior Data Quality Engineer to join our team and take responsibility for ensuring the accuracy, reliability, and performance of our data systems and workflows. In this role, you will lead key initiatives to improve data quality, leveraging advanced technologies to deliver impactful results. If you are passionate about refining data processes and enjoy working with cutting-edge solutions, this position offers the chance to shape the future of our data infrastructure.
Responsibilities
- Create and implement data quality strategies to ensure consistent accuracy across data systems and products
- Lead initiatives to improve data workflows by incorporating best practices across teams and projects
- Develop and apply advanced testing frameworks and methodologies to meet enterprise data quality standards
- Manage complex data quality tasks efficiently, prioritizing work to deliver under tight deadlines
- Design testing strategies tailored to evolving system architectures and data pipeline requirements
- Provide recommendations on resource allocation and testing priorities that align with compliance and business needs
- Establish and refine governance frameworks to ensure alignment with industry standards
- Build and optimize automated validation pipelines to support production environments
- Collaborate with cross-functional teams to resolve infrastructure challenges and improve system performance
- Mentor junior engineers and maintain comprehensive documentation of testing processes and strategies
Requirements
- At least 3 years of professional experience in Data Quality Engineering or related roles
- Advanced proficiency in Python for data validation and workflow automation
- Expertise in Big Data platforms such as Hadoop tools (HDFS, Hive, Spark) and modern streaming technologies like Kafka, Flume, or Kinesis
- Hands-on experience with NoSQL databases like Cassandra, MongoDB, or HBase for managing large-scale datasets
- Proficiency in data visualization tools such as Tableau, Power BI, or TIBCO Spotfire for analytics and reporting
- Extensive experience with cloud platforms like AWS, Azure, or GCP, with knowledge of multi-cloud architectures
- Advanced knowledge of relational databases and SQL technologies like PostgreSQL, MSSQL, MySQL, and Oracle in high-volume environments
- Proven ability to implement and scale ETL processes using tools like Talend, Informatica, or similar platforms
- Familiarity with MDM tools and performance testing applications like JMeter
- Strong experience with version control systems like Git, GitLab, or SVN, and automation for large-scale systems
- Deep understanding of testing methodologies such as TDD, DDT, and BDD for data-focused environments
- Experience implementing CI/CD pipelines using tools like Jenkins or GitHub Actions
- Strong analytical and problem-solving abilities, with the capability to derive actionable insights from complex datasets
- Excellent verbal and written communication skills in English (B2 level or higher), with experience engaging stakeholders
Nice to have
- Experience with additional programming languages like Java, Scala, or advanced Bash scripting for production-level solutions
- Advanced knowledge of XPath for data validation and transformation workflows
- Proficiency in designing custom data generation tools and synthetic data techniques for testing scenarios
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn