- Design and execute test plans for data pipelines, ETL processes, and API endpoints built on BigQuery, Trino/Starburst, and Looker.
- Develop automated test scripts using tools like Great Expectations for data validation or pytest for API testing (a sample data-validation test follows this list).
- Perform end-to-end testing of data ingestion, transformation, and reporting workflows, including integration with Google Analytics.
- Identify and document defects in large datasets, ensuring data accuracy, completeness, and performance under load.
- Collaborate with engineers to implement CI/CD pipelines with testing gates (e.g., using Jenkins or GitHub Actions).
- Conduct performance and security testing for APIs, simulating high-volume query loads and verifying access controls.
- Monitor production environments for issues and contribute to post-mortem analyses.
- Stay current on testing best practices for cloud-native data stacks, including tools like dbt tests or Airflow DAG validation (see the DAG-integrity sketch below).
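
To illustrate the kind of automated data validation referenced above, here is a minimal pytest sketch that runs SQL quality checks against a BigQuery table. The project, dataset, table, and column names are hypothetical placeholders, not part of the listing, and credentials are assumed to be configured in the environment.

```python
# test_events_quality.py -- minimal data-quality checks against BigQuery.
# The table `my-project.analytics.events` and its columns are hypothetical.
import pytest
from google.cloud import bigquery

TABLE = "my-project.analytics.events"

@pytest.fixture(scope="module")
def client():
    return bigquery.Client()

def scalar(client, sql):
    """Run a query and return the single value in the first row."""
    row = next(iter(client.query(sql).result()))
    return row[0]

def test_primary_key_is_unique(client):
    sql = f"SELECT COUNT(*) - COUNT(DISTINCT event_id) FROM `{TABLE}`"
    assert scalar(client, sql) == 0, "duplicate event_id values found"

def test_required_columns_not_null(client):
    sql = f"SELECT COUNTIF(user_id IS NULL) + COUNTIF(event_ts IS NULL) FROM `{TABLE}`"
    assert scalar(client, sql) == 0, "NULLs in required columns"

def test_no_future_timestamps(client):
    sql = f"SELECT COUNTIF(event_ts > CURRENT_TIMESTAMP()) FROM `{TABLE}`"
    assert scalar(client, sql) == 0, "events dated in the future"
```

The same checks could be expressed declaratively as Great Expectations expectations or dbt tests; the plain pytest form is used here only to keep the sketch self-contained.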
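Likewise, Airflow DAG validation typically starts with a DagBag integrity test. The sketch below assumes a conventional Airflow project layout with DAG definitions under `dags/`.

```python
# test_dag_integrity.py -- verify every DAG file imports cleanly and defines work.
# Assumes DAG definitions live under dags/ (a project-layout assumption).
from airflow.models import DagBag

def test_dags_import_without_errors():
    dagbag = DagBag(dag_folder="dags/", include_examples=False)
    assert dagbag.import_errors == {}, f"DAG import failures: {dagbag.import_errors}"

def test_every_dag_has_at_least_one_task():
    dagbag = DagBag(dag_folder="dags/", include_examples=False)
    for dag_id, dag in dagbag.dags.items():
        assert len(dag.tasks) > 0, f"{dag_id} defines no tasks"
```

Tests like these are commonly wired into a CI gate (for example, a Jenkins stage or GitHub Actions job) so that broken DAGs or failing data checks block a merge.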
- Bachelor’s degree in Computer Science, Quality Assurance, or a related field.
- 5+ years of experience in QA/testing, with a focus on data pipelines or backend systems.
- Proficiency in SQL for querying datasets and scripting in Python or Java for automation.
- Experience testing systems on cloud platforms (GCP preferred), including tools like BigQuery or Looker.
- Strong understanding of API testing (REST/GraphQL) using tools like Postman or Karate (a comparable check in Python appears after this list).
- Detail-oriented with excellent problem-solving skills and experience in agile methodologies.
- Familiarity with data quality frameworks (e.g., Great Expectations) or orchestration testing (Airflow).
- Knowledge of load testing tools (e.g., JMeter) and compliance testing (e.g., data privacy).
- Exposure to Trino/Starburst for query validation or dbt for transformation testing.
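
For the API-testing qualification above, the listing names Postman and Karate; a comparable contract check written with pytest and the requests library might look like the following sketch. The base URL, auth token, endpoint path, and response fields are all hypothetical assumptions.

```python
# test_reports_api.py -- basic contract checks for a REST endpoint.
# The base URL, token, endpoint, and response schema are hypothetical placeholders.
import os
import requests

BASE_URL = os.environ.get("REPORTS_API_URL", "https://api.example.com")
HEADERS = {"Authorization": f"Bearer {os.environ.get('API_TOKEN', '')}"}

def test_report_endpoint_returns_expected_shape():
    resp = requests.get(f"{BASE_URL}/v1/reports/daily", headers=HEADERS, timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Contract assumption: the endpoint returns a list of rows with these keys.
    assert isinstance(body, list)
    for row in body:
        assert {"date", "sessions", "revenue"} <= row.keys()

def test_unauthenticated_request_is_rejected():
    resp = requests.get(f"{BASE_URL}/v1/reports/daily", timeout=10)
    assert resp.status_code in (401, 403)
```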