Quality Engineer

Information Technology · SDET & QA
ETL · SQL · Selenium

Cincinnati, OH
$100k – $112k per year
September 16, 2025
Contract · In Person
Randi Brofft

Vice President
Information Technology · Retail · Transportation

About the Job

Gentis Solutions is looking for a Quality Engineer to join our team. This role is part of our Professional Consultants group, working directly with clients to accomplish their goals. Consulting engagements last 12 months on average and often extend beyond that.

Compensation for this position ranges from $100k to $112k per year, based on experience. Pay is bi-weekly and covers all hours worked, without eligibility for overtime.

This position requires the individual to be located in the Cincinnati, OH area.

Requirements

  • 5–7+ years of experience in QA engineering with a strong focus on data testing and automation
  • Proven experience with end-to-end integration and regression testing for complex, distributed systems
  • Hands-on experience working with ETL pipelines, SFTP processes, and data ingestion frameworks
  • Advanced SQL skills for data validation, query writing, and verifying transformations across raw, curated, and shared data layers
  • Proficiency with automation frameworks such as Selenium, TestNG, JUnit, or PyTest, and scripting languages like Python or Java
  • Experience with CI/CD pipelines and integrating automated tests into release workflows (e.g., Jenkins, GitHub Actions, Azure DevOps)
  • Ability to analyze transformation rules, data mappings, and business rules to create comprehensive test cases
  • Knowledge of performance and SLA testing for data pipelines and APIs
  • Strong problem-solving skills and ability to collaborate across business, product, and engineering teams
  • Excellent communication skills for stakeholder engagement and clear documentation of test strategies and issues

Nice to Have

  • Familiarity with Azure or other cloud and on-premises platforms
  • Experience with data pipeline orchestration tools such as Airflow or Informatica