We are looking for a Data Engineer with experience in Google Cloud Platform and the BigQuery ecosystem. You will support the modernization of an existing Oracle-based data warehouse by building cloud-native pipelines and models in GCP. You will work on data ingestion, transformation, and optimization tasks while collaborating with engineers, analysts, and product partners. This role suits someone who enjoys learning through hands-on work, pays close attention to detail, and brings steady ownership to day-to-day engineering activities.
What You Will Do
Pipeline Development
Build and maintain data pipelines using Apache Beam and Dataflow under the guidance of senior engineers.
Develop ingestion patterns across batch and near-real-time workflows.
Write Python and SQL for transformations, validations, and automation tasks.
Create BigQuery tables with sound partitioning and clustering choices.
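To give a flavor of the partitioning and clustering decisions this involves, here is a minimal sketch; the dataset, table, and column names (`analytics.events`, `customer_id`, and so on) are illustrative placeholders, not a real schema from this role.

```python
# Hypothetical BigQuery DDL illustrating date partitioning and clustering.
# All names below are placeholders, not an actual production schema.
DDL = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts    TIMESTAMP,
  customer_id STRING,
  event_type  STRING
)
PARTITION BY DATE(event_ts)        -- prune scanned bytes by day
CLUSTER BY customer_id, event_type -- co-locate rows for common filters
"""
print(DDL)
```

Partitioning on the event timestamp limits how much data a date-filtered query scans, while clustering on frequently filtered columns sorts rows within each partition to reduce cost further.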
Transformation and Modeling
Use dbt or Dataform to manage transformations and testing.
Contribute to data model implementation following established standards.
Document logic and assumptions clearly for partner teams.
Production Operations
Support production workloads by monitoring pipelines, analyzing issues, and applying fixes.
Contribute to performance tuning efforts across BigQuery and Dataflow.
Participate in the implementation of CI and CD practices for data workflows.
Collaboration and Growth
Work with analysts, scientists, and engineers to understand requirements.
Participate in code reviews and apply feedback to improve your craft.
Learn modern GCP approaches through close coordination with senior engineers and architects.
What Will Help You Succeed
Technical Skills
Experience with Dataflow, Apache Beam, BigQuery, Cloud Storage, or similar cloud-native tools.
Solid proficiency in Python for data tasks and automation.
Strong SQL skills and a clear understanding of analytic query patterns.
Experience with dbt or Dataform for transformations and testing.
Understanding of common data modeling concepts used in analytics environments.
Engineering Skills
Familiarity with CI and CD practices.
Comfortable working with logging, metrics, and monitoring tools.
Interest in data quality practices and validation frameworks.
Strong debugging instincts and patience with iterative problem solving.
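As a sketch of the data-quality and validation work mentioned above, the following minimal example checks rows for required fields; the function name, field names, and rule set are illustrative assumptions, not a specific framework used on this team.

```python
# Minimal row-validation sketch: collect failure messages instead of raising,
# so a pipeline can route bad rows aside. Names here are illustrative only.

def validate_rows(rows, required=("id", "event_ts")):
    """Return (valid_rows, errors) after basic required-field checks."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            errors.append(f"row {i}: missing {', '.join(missing)}")
        else:
            valid.append(row)
    return valid, errors

rows = [{"id": 1, "event_ts": "2024-01-01"}, {"id": None, "event_ts": ""}]
valid, errors = validate_rows(rows)
```

Returning errors rather than raising keeps the check composable: the valid rows continue downstream while the failures can be logged or dead-lettered.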
Professional Qualities
Clear communication with teammates and partner groups.
Desire to grow your technical depth through real project experience.
Steady focus on reliability, clarity, and maintainability.
Intermediate or higher English proficiency.
Nice to Have
Experience with Pub/Sub or other event streaming tools.
Exposure to Dataproc or Spark from legacy environments.
Familiarity with Vertex AI or ML-related workflows.
Understanding of orchestration tools such as Composer or Airflow.