FIRST. is a global technology company providing B2B SaaS solutions for online sports platforms. We empower gaming partners with cutting-edge sportsbook software, premium data feeds, and an unwavering commitment to partner success, helping to redefine the gaming industry.
Our success stems from data, with our Data Insights team playing a vital role in collecting and organizing data and streamlining access to it across CoreTech’s supplier systems. This allows stakeholders to uncover valuable insights for both strategic and operational decisions.
We’re a small but growing team – data engineers, analysts, and a product owner – building scalable data infrastructure on the Google Cloud Platform (GCP). As we expand, we aim to strengthen our data-driven culture and make data more accessible to stakeholders and R&D teams.
If you love Python coding, tackling data engineering challenges, working with modern cloud tech, and fine-tuning data pipelines, this role is perfect for you!
As a Senior Data Platform Engineer, you will:
Build, enhance, and maintain batch and streaming data pipelines, focusing on performance and scalability.
Refactor existing data workflows to align with best practices in BigQuery, Airflow, and Dataflow.
Build and maintain Python-based applications and tools for data processing and infrastructure management.
Design and improve data models and schemas to enhance accessibility and efficiency.
Collaborate with the team to define best practices for event-driven, metadata-driven systems.
Participate in on-call rotations (optional) to monitor and troubleshoot data pipelines.
Required Skills:
Python (5+ years) – Strong experience in backend development, including API creation.
SQL (5+ years) – Knowledge of data modeling, indexing, window functions, and query optimization.
Experience with async/await, multiprocessing, and multi-threading techniques in Python.
Experience with Pub/Sub, Kafka, or similar messaging systems for event-driven data processing.
Proficiency in Docker, docker-compose, and understanding of Kubernetes fundamentals.
Strong grasp of TDD/BDD, SOLID principles, and software design patterns.
Good understanding of Cloud Build and Jenkins, and the ability to work with DevOps processes when needed.
Ability to collaborate effectively with engineers and analysts in English.
Experience with Google Cloud Platform (GCP) or another cloud provider.
Nice to Have:
Experience with BigQuery and PostgreSQL, and an interest in Airflow, Dataflow, or similar tools.
Familiarity with Apache Spark and distributed data processing.
Experience with Business Intelligence (BI) tools or Machine Learning (ML).
Exposure to Terraform, GitOps, or infrastructure automation.
What we offer:
An open-minded environment that values you
An international multicultural team
Career development
An amazing office environment (table tennis, PlayStation, all the fun stuff)
Food vouchers (180 BGN)
Additional health insurance
Sports card
21 days paid leave
Flexible working hours
Manicure, barber, massage, breakfast, and snacks in the office
Wedding and new baby bonus
Team building activities
Professional training
If you are interested in joining our diverse and dynamic team, we look forward to your application. Only short-listed candidates will be contacted. Confidentiality of all applications is assured!
*With regard to GDPR (EU) 2016/679, by applying you consent to the personal data included in your CV/resume/motivation letter being processed for the purposes of the recruitment and hiring process at the company.