Anthill by Exadel Bulgaria

Middle Data Engineer (GCP, BigQuery) (8414)


The job listing is published in the following categories

  • Anywhere

    Tech Stack / Requirements

    About the Role

    We are looking for a Data Engineer with experience in Google Cloud Platform and the BigQuery ecosystem. You will support the modernization of an existing Oracle-based data warehouse by building cloud-native pipelines and models in GCP. You will work on data ingestion, transformation, and optimization tasks while collaborating with engineers, analysts, and product partners. This role suits someone who enjoys learning through hands-on work, pays close attention to detail, and brings steady ownership to day-to-day engineering activities.

     

    What You Will Do

    Pipeline Development

    • Build and maintain data pipelines using Apache Beam and Dataflow under the guidance of senior engineers (a minimal sketch follows this list).
    • Develop ingestion patterns across batch and near real-time workflows.
    • Write Python and SQL for transformations, validations, and automation tasks.
    • Create BigQuery tables with sound partitioning and clustering choices.

    Transformation and Modeling

    • Use dbt or Dataform to manage transformations and testing.
    • Contribute to data model implementation following established standards.
    • Document logic and assumptions clearly for partner teams.

    Production Operations

    • Support production workloads by monitoring pipelines, analyzing issues, and applying fixes.
    • Contribute to performance tuning efforts across BigQuery and Dataflow.
    • Participate in the implementation of CI/CD practices for data workflows.

    Collaboration and Growth

    • Work with analysts, data scientists, and engineers to understand requirements.
    • Participate in code reviews and apply feedback to improve your craft.
    • Learn modern GCP approaches through close coordination with senior engineers and architects.
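
    For illustration only (not part of the listing's requirements): a minimal sketch of the kind of batch pipeline described above, using assumed project, bucket, dataset, and field names. It reads newline-delimited JSON from Cloud Storage with Apache Beam and appends the rows to a day-partitioned, clustered BigQuery table.

    import json

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import BigQueryDisposition, WriteToBigQuery
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(line: str) -> dict:
        # Map one JSON line onto the target BigQuery schema.
        record = json.loads(line)
        return {
            "event_id": record["event_id"],
            "customer_id": record["customer_id"],
            "event_ts": record["event_ts"],  # ISO 8601 timestamp string
            "amount": float(record.get("amount", 0)),
        }


    def run() -> None:
        # "DirectRunner" works for local tests; "DataflowRunner" submits the job to GCP.
        options = PipelineOptions(
            runner="DataflowRunner",
            project="example-project",                # assumed project id
            region="europe-west1",
            temp_location="gs://example-bucket/tmp",  # assumed staging bucket
        )
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
                | "ParseJson" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> WriteToBigQuery(
                    table="example-project:analytics.events",
                    schema="event_id:STRING,customer_id:STRING,event_ts:TIMESTAMP,amount:FLOAT",
                    write_disposition=BigQueryDisposition.WRITE_APPEND,
                    create_disposition=BigQueryDisposition.CREATE_IF_NEEDED,
                    # Partitioning and clustering are applied when Beam creates the table.
                    additional_bq_parameters={
                        "timePartitioning": {"type": "DAY", "field": "event_ts"},
                        "clustering": {"fields": ["customer_id"]},
                    },
                )
            )


    if __name__ == "__main__":
        run()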

     

    What Will Help You Succeed

     

    Technical Skills

    • Experience with Dataflow, Apache Beam, BigQuery, Cloud Storage, or similar cloud-native tools.
    • Solid proficiency in Python for data tasks and automation.
    • Strong SQL skills and a clear understanding of analytic query patterns (an example follows this list).
    • Experience with dbt or Dataform for transformations and testing.
    • Understanding of common data modeling concepts used in analytics environments.
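
    As a point of reference only (assumed table and column names): the kind of analytic query pattern mentioned above, run through the google-cloud-bigquery client. The filter on the partitioning column limits the scan to recent partitions before aggregating per customer and day.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # assumed project id

    # Daily revenue per customer over the last 7 days; the filter on the
    # partition column (event_ts) lets BigQuery prune older partitions.
    query = """
    SELECT
      customer_id,
      DATE(event_ts) AS event_date,
      COUNT(*)       AS events,
      SUM(amount)    AS daily_revenue
    FROM `example-project.analytics.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY customer_id, event_date
    ORDER BY event_date, daily_revenue DESC
    """

    for row in client.query(query).result():
        print(row.customer_id, row.event_date, row.daily_revenue)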

     

    Engineering Skills

    • Familiarity with CI/CD practices.
    • Comfortable working with logging, metrics, and monitoring tools.
    • Interest in data quality practices and validation frameworks (a short sketch follows this list).
    • Strong debugging instincts and patience with iterative problem solving.
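
    Purely as an illustration of the data quality angle (hypothetical field names, no particular framework implied): a small row-level check that logs problems and reports counts.

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("row_checks")

    REQUIRED_FIELDS = ("event_id", "customer_id", "event_ts")


    def validate_row(row: dict) -> list[str]:
        """Return a list of human-readable problems found in one row."""
        problems = [f"missing {field}" for field in REQUIRED_FIELDS if not row.get(field)]
        if row.get("amount") is not None and row["amount"] < 0:
            problems.append("negative amount")
        return problems


    def validate_batch(rows: list[dict]) -> tuple[int, int]:
        """Log every invalid row and return (valid_count, invalid_count)."""
        valid = invalid = 0
        for row in rows:
            problems = validate_row(row)
            if problems:
                invalid += 1
                logger.warning("bad row %s: %s", row.get("event_id"), ", ".join(problems))
            else:
                valid += 1
        return valid, invalid


    if __name__ == "__main__":
        sample = [
            {"event_id": "e1", "customer_id": "c1", "event_ts": "2024-05-01T10:00:00Z", "amount": 12.5},
            {"event_id": "e2", "customer_id": "", "event_ts": "2024-05-01T10:05:00Z", "amount": -3.0},
        ]
        print(validate_batch(sample))  # -> (1, 1)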

     

    Professional Qualities

    • Clear communication with teammates and partner groups.
    • Desire to grow your technical depth through real project experience.
    • Steady focus on reliability, clarity, and maintainability.
    • Intermediate or higher English proficiency.

     

    Nice to Have

    • Experience with Pub/Sub or other event streaming tools.
    • Exposure to Dataproc or Spark from legacy environments.
    • Familiarity with Vertex AI or ML-related workflows.
    • Understanding of orchestration tools such as Composer or Airflow.