Betty is an innovative entertainment company pioneering at the intersection of real-money online casino and casual mobile gaming. Accredited by the Alcohol and Gaming Commission of Ontario (AGCO) as a B2C operator in February 2023, we have set a new standard in the industry. Our mission is to redefine the online casino experience by offering a uniquely transparent environment where players can relax, unwind, and enjoy themselves safely. We are committed to accessibility, fairness, and inclusivity, fostering a community of like-minded people who value ethical gaming practices, and we prioritize our players’ safety and enjoyment above everything else.
Our Values:
We are honest – we value honesty in all aspects.
Bring the Olives – we offer premium customer experience.
Think Big – we believe in always striving for more.
Key Responsibilities:
Collaborate with cross-functional teams, including Engineering and business stakeholders, to define and understand data requirements.
Develop and maintain scalable data pipelines for loading data from different sources into the data warehouse (DWH).
Optimize and fine-tune existing data processes for improved performance.
Collaborate with infrastructure teams to implement a scalable and secure data platform.
Create and maintain reports for regulatory requirements and internal business needs.
Evaluate and implement new technologies, tools, and frameworks to enhance data processing and storage capabilities.
Mentor junior colleagues, providing guidance in data engineering best practices.
Requirements:
Bachelor’s degree in Computer Science, Data Science, or a related field.
3+ years of experience in a data warehouse engineering or related role.
Experience with cloud data warehousing technologies, such as Amazon Redshift, Snowflake, or Google BigQuery.
Good programming skills in Python.
Proficiency with SQL and data modelling.
Experience with pipeline orchestration tools (e.g., Airflow, Dagster).
Understanding of CI/CD principles and experience implementing automated deployment processes.
Excellent analytical and problem-solving skills.
Strong communication and teamwork skills.
Fluency in English.
Nice to have:
Experience with dbt.
Experience with stream processing frameworks (e.g., Apache Kafka).
Experience with containerization technologies (e.g., Docker, Kubernetes).
Experience with data visualization tools (e.g., Looker, Tableau, Power BI).