Our client is a renowned software and consulting provider with 29 years of market presence. They are a team of passionate software enthusiasts dedicated to building cutting-edge solutions in IoT, Digital Transformation, MR/VR/AR simulation, and AI/ML-driven technologies. What unites them is their commitment to knowledge leadership, a strong focus on R&D, and a mindset that embraces even the toughest technical challenges.
Position Overview:
Our team at CADABRA is looking for a Senior Azure Databricks Engineer to support and maintain the client's internal BI platform, used by the Finance and Business Operations teams. This is a hands-on technical role focused on backend data operations, including data ingestion, transformation, and CI/CD support, within a cloud-based data warehouse environment.
Primary Duties:
Ensure stable operation of the internal BI platform used by Finance and Business Operations
Develop, maintain, and troubleshoot data pipelines for ingestion, transformation, and load using Azure Databricks (PySpark, SQL)
Support and optimize CI/CD pipelines (Azure DevOps) for smooth deployments and minimal downtime
Collaborate with BI front-end analysts, IT teams, and business stakeholders to ensure alignment of data needs and delivery
Monitor and improve system performance, resolve incidents, and ensure data quality and consistency
Maintain data architecture standards and support platform scalability and compliance
Integrate data from systems like D365 Finance & Operations and other business applications
Work with Azure services such as Data Lake, Key Vault, Service Principals, and SQL Database
Maintain proper documentation of processes, configurations, and procedures
Participate in improvement initiatives to enhance platform efficiency and usability
Your Expertise:
7+ years of experience with business data analytics platforms
Strong hands-on experience with Azure Databricks, PySpark, and SparkSQL
Solid understanding of CI/CD pipelines (preferably with Azure DevOps) and troubleshooting deployment issues
Proficiency in Python and working knowledge of Shell scripting
Experience with data ingestion, ETL processes, and managing large-scale data pipelines
Experience with Azure services such as Azure Key Vault, Azure SQL, Azure Data Lake, and Service Principals
Understanding of data governance, security standards, and the handling of sensitive data
Ability to work closely with both IT and finance/business stakeholders
Good knowledge of data integration from sources such as D365 F&O, Unit4, and the Azure Portal
Strong analytical, problem-solving, and communication skills
Fluency in English, both spoken and written
Reasons to Join:
Working on innovative and unique IT projects
Friendly and collaborative working environment
Attractive remuneration package
Work-life balance: flexible working hours and 20 to 24 days of paid vacation, depending on years with the company
Office perks and additional benefits (sports card, subsidized supplementary healthcare package, special discounts, and more)