We are looking for a motivated Senior Data Engineer (AI) who is ready to dive into a new project with a modern stack. If you’re driven by a curiosity to learn and a desire to produce meaningful results, please apply!
Work at Exadel – Who We Are
We don’t just follow trends—we help define them. For 25+ years, Exadel has transformed global enterprises. Now, we’re leading the charge in AI-driven solutions that scale with impact. And it’s our people who make it happen—driven, collaborative, and always learning.
About Our Customer
You will work with the 6th-largest privately owned organization in the United States. The customer is one of the “Big Four” accounting organizations and the largest professional services network in the world in terms of revenue and number of professionals. The company provides audit, tax, consulting, enterprise risk, and financial advisory services and employs 263,900 professionals globally.
About the Project
As a Data Engineer, you’ll become part of a cross-functional development team working on GenAI solutions for digital transformation across Enterprise Products.
The prospective team is responsible for the design, development, and deployment of innovative enterprise technology, tools, and standard processes to support the delivery of tax services. The team focuses on delivering comprehensive, value-added, and efficient tax services to clients. It is a dynamic team with professionals from varied backgrounds, including tax, technology development, change management, and project management. The team consults and executes on a wide range of process and tool initiatives, including training development, engagement management, and tool design and implementation.
Project Tech Stack
Azure Cloud, Microservices Architecture, .NET 8, ASP.NET Core services, Python, Mongo, Azure SQL, Angular 18, Kendo, GitHub Enterprise with Copilot
Requirements
6+ years of hands-on experience in software development
Experience coding in SQL/Python, with solid CS fundamentals including data structure and algorithm design
Hands-on implementation experience working with a combination of the following technologies: Hadoop, MapReduce, Kafka, Hive, Spark, SQL and NoSQL data warehouses
Experience with the Azure cloud data platform
Experience working with vector databases (Milvus, Postgres, etc.)
Knowledge of embedding models and retrieval-augmented generation (RAG) architectures (see the illustrative retrieval sketch after this list)
Understanding of LLM pipelines, including data preprocessing for GenAI models
Experience deploying data pipelines for AI/ML workloads, ensuring scalability and efficiency
Familiarity with model monitoring, feature stores (Feast, Vertex AI Feature Store), and data versioning
Experience with CI/CD for ML pipelines (Kubeflow, MLflow, Airflow, SageMaker Pipelines)
Understanding of real-time streaming for ML model inference (Kafka, Spark Streaming)
Knowledge of data warehousing design, implementation, and optimization
Knowledge of data quality testing, automation, and results visualization
Knowledge of BI report and dashboard design and implementation (Power BI)
Experience supporting data scientists and complex statistical use cases is highly desirable
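To make the RAG-related requirements above more concrete, here is a minimal, illustrative retrieval sketch. It uses sentence-transformers and an in-memory index purely as stand-ins for an embedding model and a vector database (such as Milvus or Postgres/pgvector); the documents, model name, and retrieve helper are hypothetical examples, not project code.

```python
# Minimal RAG retrieval sketch: embed documents, embed a query, return the
# closest matches. A real deployment would store the vectors in a vector
# database such as Milvus or Postgres/pgvector instead of in memory.
import numpy as np
from sentence_transformers import SentenceTransformer  # illustrative embedding model

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Quarterly tax filing deadlines for corporate clients.",
    "Engagement management checklist for audit teams.",
    "Guidance on enterprise risk assessment workflows.",
]
doc_vectors = model.encode(documents, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    query_vector = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vectors @ query_vector  # dot product == cosine on normalized vectors
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

print(retrieve("When are corporate tax returns due?"))
```

The retrieved chunks would then be passed to an LLM as context, which is the generation half of a RAG pipeline.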
English level
Intermediate+
Responsibilities
Build, deploy, and maintain mission-critical analytics solutions that process terabytes of data at big-data scale
Contribute design, code, and configuration; manage data ingestion, real-time streaming, batch processing, and ETL across multiple data stores (see the streaming sketch after this list)
Tune the performance of complex SQL queries and data flows
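To illustrate the ingestion and streaming responsibilities above, here is a minimal PySpark Structured Streaming sketch that reads events from Kafka and lands them as Parquet for downstream batch processing. The broker address, topic name, schema, and paths are placeholders, not actual project configuration.

```python
# Illustrative ingestion job: stream events from Kafka, parse the JSON payload,
# and write it out as Parquet. Requires the spark-sql-kafka connector on the
# classpath; all names and paths below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("ingest-events").getOrCreate()

event_schema = StructType([
    StructField("client_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "tax-events")                  # placeholder topic
    .load()
)

events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/bronze/tax_events")                 # placeholder output path
    .option("checkpointLocation", "/data/checkpoints/tax_events")
    .start()
)
query.awaitTermination()
```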
Advantages of Working with Exadel
Exadel is a global company, and benefits can vary depending on your location and contract type. Your recruiter will provide specific information about the benefits available to you.