*This position is fully remote and available only for employment in Bulgaria. However, you are welcome to work from one of our offices in Sofia or Varna if you prefer.
About DataArt
DataArt is a global software engineering firm and a trusted technology partner for market leaders and visionaries. Our world-class team designs and engineers data-driven, cloud-native solutions to deliver immediate and enduring business value.
We promote a culture of radical respect, prioritizing your personal well-being as much as your expertise. We stand firmly against prejudice and inequality, valuing each of our employees equally.
We respect personal autonomy above all else, offering remote, onsite, and hybrid work options. Our learning and development centers, R&D labs, and mentorship programs encourage professional growth.
Our long-term approach to collaboration with clients and colleagues alike focuses on building partnerships that extend beyond one-off projects. We provide the ability to switch between projects and technology stacks, creating opportunities for exploration through our learning and networking systems to advance your career.
Position Overview
Our client, a UK-based digital bank, delivers innovative and accessible financial solutions for today’s consumers. Known for its user-friendly platform and customer-centered approach, it helps people manage their finances with ease and transparency. Recent expansions into flexible payment options and new services strengthen its position as a leader in the UK’s digital banking sector.
We are supporting a key data engineering initiative aimed at building a secure and scalable AWS-based environment for processing Personally Identifiable Information (PII). The project will enable ingestion and transformation of sensitive data to support regulatory, analytical, and operational use cases.
Responsibilities
Design and build robust data pipelines to ingest and transform PII data from internal sources (Kafka, Aurora, MS SQL, S3)
Operate within a secure AWS environment, strictly following internal data security policies and access controls
Collaborate with engineering and compliance teams to ensure regulatory and security standards are met
Write production-ready, testable code with automated deployment practices
Contribute to documentation, knowledge sharing, and implementation support
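To illustrate the kind of PII-safe transformation step the pipelines above would perform, here is a minimal sketch in Python. The field names, salt handling, and hashing scheme are hypothetical examples, not the client's actual implementation; in production the salt would come from a secrets manager and the logic would typically run inside a PySpark or Glue job.

```python
import hashlib

# Hypothetical salt for illustration only; in a real pipeline this would
# be loaded from a secrets manager, never hard-coded.
SALT = b"example-salt"

def pseudonymize(value: str) -> str:
    """Return a deterministic, non-reversible token for a PII value."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()

def transform_record(record: dict) -> dict:
    """Replace direct identifiers with tokens before loading downstream."""
    out = dict(record)
    for field in ("email", "phone"):  # hypothetical PII fields
        if out.get(field) is not None:
            out[field] = pseudonymize(out[field])
    return out

# Usage: non-PII fields pass through untouched, identifiers are tokenized.
row = {"customer_id": 42, "email": "alice@example.com", "balance": 100.0}
masked = transform_record(row)
```

Deterministic tokenization (rather than random masking) preserves the ability to join records on the pseudonymized field, which matters for the analytical use cases mentioned above.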
Requirements
Strong proficiency in Python, PySpark, and SQL
Experience with CI/CD workflows and version control systems (e.g., Git)
Proven experience in designing and delivering data pipelines that handle sensitive or regulated data in cloud environments
Hands-on experience with AWS Glue and Apache Airflow (e.g., Amazon MWAA) for ETL/ELT orchestration
Familiarity with Amazon Redshift and SageMaker Studio for analytical workloads
Knowledge of Terraform or other Infrastructure-as-Code tools
Experience with dbt and implementing data quality testing
What We Offer
Unique corporate culture – no micromanagement, friendly atmosphere, freedom, and mutual respect
Flexible schedule – the ability to change projects, work from home, and try out different roles
Professional Development Map – a comprehensive map of your professional development within DataArt
We hire people for the company, not for a single project. If a project (or your role in it) ends, you move to another project or to a paid “Idle” period.
Social benefits – additional health insurance, life insurance, sports card, etc.
Opportunity to work from another DataArt office in a different city or country (temporarily or permanently)