EGT Digital

Senior Data Engineer, Core Platform

The job listing is published in the following categories

  • Anywhere

    Tech Stack / Requirements

    EGT Digital is a next-generation tech company focused on all online gaming products, and its portfolio includes Casino Games, Sportsbook, and the all-in-one solution – a Gambling Platform.

    EGT Digital is a part of the Euro Games Technology (EGT) Group, headquartered in Sofia, Bulgaria. EGT Group is one of the fastest-growing enterprises in the gaming industry. Our global network includes offices in 25 countries and our products are installed in over 85 jurisdictions in Europe, Asia, Africa, and North, Central, and South America.

    Being part of such a fast-moving industry as iGaming, the company knows no limits and is growing rapidly through its dedication to innovation and constant improvement. That is why we are expanding our Platform Department and are now looking for fresh, enthusiastic people to join us in the exciting digital world of iGaming.

    About the role:

    EGT Digital is building a unified data platform that powers reporting, analytics, regulatory compliance, and AI‑driven insights. Our lakehouse combines streaming and batch pipelines with governed schemas and tooling to support multiple brands and geographies. As a senior data platform engineer, you will help shape the architecture of this platform, ensuring it scales, stays reliable, and meets regulatory requirements across regions.

    Responsibilities:

    • Design and implement robust batch and streaming pipelines that ingest, transform, and deliver data to lakehouse and warehouse layers
    • Define and maintain scalable data models and schemas; lead decisions on normalization, denormalization, and partitioning strategies
    • Architect and build metadata‑driven and contract‑driven workflows for schema evolution, validation, and data quality
    • Develop and optimize Python services, APIs, and utilities that support data ingestion, orchestration, observability, and platform automation
    • Work with Airflow and similar orchestrators to design resilient, modular DAGs and manage complex dependencies (see the sketch after this list)
    • Lead the development of distributed processing jobs using Spark or other frameworks; tune performance and resource usage
    • Collaborate on infrastructure: Docker and Kubernetes deployments, CI/CD pipelines, and cloud/on‑prem integration
    • Set coding standards, implement testing strategies, and establish monitoring and incident‑response practices
    • Mentor engineers, perform code reviews, and provide guidance on system design and implementation choices
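
    As an illustration of the Airflow item above, the following is a minimal sketch of the kind of modular, retry-aware DAG this role involves, assuming Airflow 2.4+ (the "schedule" argument) and the classic PythonOperator API. The DAG id, task names, and callables ("extract_orders", "load_to_lakehouse") are hypothetical placeholders, not EGT Digital's actual code.

        # Hypothetical daily ingestion DAG: extract raw records, then load them into the lakehouse layer.
        from datetime import datetime, timedelta

        from airflow import DAG
        from airflow.operators.python import PythonOperator


        def extract_orders(**context):
            """Pull one day of raw records from a hypothetical source system."""
            ...


        def load_to_lakehouse(**context):
            """Write the extracted records to the lakehouse layer (placeholder)."""
            ...


        default_args = {
            "owner": "data-platform",
            "retries": 3,                        # retry transient failures automatically
            "retry_delay": timedelta(minutes=5),
        }

        with DAG(
            dag_id="orders_daily_ingest",
            start_date=datetime(2024, 1, 1),
            schedule="@daily",
            catchup=False,
            default_args=default_args,
        ) as dag:
            extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
            load = PythonOperator(task_id="load_to_lakehouse", python_callable=load_to_lakehouse)

            extract >> load                      # keep dependencies explicit and linear where possible

    In practice, DAGs like this are usually generated from metadata rather than hand-written per source, which is the metadata-driven workflow mentioned earlier in this list.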

    Requirements:

    • Python – 5+ years of experience building production‑grade services, APIs, data pipelines, and libraries
    • SQL – 5+ years of expertise in complex joins, window functions, performance tuning, and analytical modeling
    • Deep experience with streaming systems such as Kafka, Pub/Sub, or Kinesis, including design of topics, consumer groups, and exactly‑once semantics
    • Proficient with Airflow or a comparable orchestrator at scale: dynamic DAGs, custom operators, error handling, and multi‑tenant setups
    • Extensive hands‑on knowledge of DBT or similar tools for transformations, modeling, and testing
    • Strong experience with Docker and Kubernetes, including deployment patterns, Helm/manifest management, and resource tuning
    • Solid understanding of distributed data processing (Spark/Flink/Beam) and how to optimize jobs for large datasets
    • Proven ability to design data models, handle schema evolution, and implement data quality and observability frameworks (a toy validation sketch follows this list)
    • Experience building and maintaining CI/CD pipelines and working with Git‑based workflows
    • Excellent communication skills in English; able to collaborate across engineering, analytics, and product teams
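
    To make the data-contract and data-quality point above concrete, here is a toy, standard-library-only sketch of validating records against a declared schema before loading; the "orders" contract and its columns are invented for illustration and say nothing about EGT Digital's real data model.

        # Hypothetical data contract check: each incoming record is validated against a declared column list.
        from dataclasses import dataclass


        @dataclass(frozen=True)
        class Column:
            name: str
            dtype: type
            nullable: bool = False


        ORDER_CONTRACT = [                     # invented contract for an example "orders" feed
            Column("order_id", int),
            Column("player_id", int),
            Column("amount", float),
            Column("currency", str),
            Column("promo_code", str, nullable=True),
        ]


        def validate(record: dict, contract: list[Column]) -> list[str]:
            """Return a list of violations for one record; an empty list means it passes."""
            errors = []
            for col in contract:
                value = record.get(col.name)
                if value is None:
                    if not col.nullable:
                        errors.append(f"{col.name}: missing non-nullable value")
                elif not isinstance(value, col.dtype):
                    errors.append(f"{col.name}: expected {col.dtype.__name__}, got {type(value).__name__}")
            return errors


        if __name__ == "__main__":
            bad = {"order_id": 1, "player_id": "abc", "amount": 9.99, "currency": "EUR"}
            print(validate(bad, ORDER_CONTRACT))   # ['player_id: expected int, got str']

    Real platforms would typically express such rules as DBT tests, schema-registry policies, or a dedicated quality service rather than hand-rolled checks; the sketch only shows the shape of the idea.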

    Nice to Have:

    • Experience with ClickHouse, PostgreSQL, or other analytical/operational databases at scale
    • Familiarity with lakehouse architectures, object storage formats (Parquet, Iceberg, Delta), and partitioning strategies
    • Knowledge of Infrastructure as Code tools like Terraform or Helm; exposure to GitOps
    • Experience with Flink, Pulsar, or other streaming analytics frameworks
    • Background in building internal data platform products (catalogs, quality services, lineage tools)
    • Exposure to multi‑region or hybrid cloud architectures and compliance challenges

    What we offer:

    • Competitive salary
    • Performance-based annual bonus
    • Performance evaluation & salary review twice a year
    • 25 days paid annual leave
    • Work from home option – 2 days weekly
    • Flexible working schedule
    • Additional health insurance – premium package
    • Fully paid annual transportation card
    • Fully paid Sports card
    • Free company shuttle service
    • Sports Teams/Sports events
    • Professional development, supportive company culture, and challenging projects
    • Company-sponsored training
    • Tickets for conferences and seminars
    • Team building events and office parties
    • Referral Program
    • Free snacks, soft drinks, coffee, and fruit are always available
    • Birthday, newborn baby, and first-grader bonuses
    • Corporate discounts in various shops and restaurants
    • State-of-the-art modern office
    • Positive working environment and chill-out zone (PS4, foosball table, and lazy chairs)

    All applications will be treated in strict confidence, and only approved candidates will be invited to an interview.