
HR agency Elevate

Data and Cloud Engineer


The job listing is published in the following categories:

  • Anywhere
    Tech Stack / Requirements

    Your Tasks:


    • Developing tools to automate processes and data pipelines across all development stages;
    • Transforming and adapting solutions to a cloud-native environment;
    • Integrating AI methodologies, especially LLM, into products;
    • Creating user-friendly front-ends and data visualizations;
    • Automating data analyses using advanced analytics and big data methodologies;
    • Evolving and updating products to meet changing requirements;
    • Ensuring appropriate testing, maintenance, stable operation, and compliance of IT products;
    • Advising auditors on leveraging technology to address their professional needs;
    • Supporting program quality assurance within the team.


    Your Profile:

    • Practical experience in developing applications within a financial or industrial company;
    • Extensive knowledge in programming, e.g. with Python;
    • Good experience with development and deployment processes, including CI/CD pipelines;
    • Strong background in utilizing cloud platforms and developing cloud-native applications, preferably with Azure or GCP;
    • Experience in implementing AI methodologies, such as Large Language Models;
    • Experience in data handling and preprocessing;
    • Familiarity with tools like Jira, Confluence, and Bitbucket;
    • Background in data analysis within Big Data or Advanced Analytics, using tools such as SQL, Hive-QL, or Spark;
    • Practical experience in data visualization with Qlik or equivalent tools;
    • Experience in front-end programming, preferably with JavaScript and React, is considered an advantage;
    • Strong analytical skills to decompose complex datasets and tasks;
    • Excellent problem-solving skills demonstrating creative and strategic thinking to produce effective results and develop innovative tools;
    • Strong self-responsibility to independently develop and enhance products;
    • Fluency in English, both written and spoken; German is considered an advantage;
    • University degree in IT, finance, or a related field.

    Our offer:

    • Good work-life balance, including 25 days of annual paid leave (increasing by 1 day per year up to 31 in total), flexible working hours, and work-from-home and work-from-abroad opportunities;
    • Luxury package of additional health and dental insurance;
    • Food vouchers in the amount of 128 BGN monthly;
    • 6 additional annual days off for exceptional circumstances;
    • Employee assistance program for psychological, financial and legal consultations;
    • Multisport card;
    • Annual contribution of 300 BGN net per child for a summer camp, school, or kindergarten, for children up to the age of 15;
    • Possibilities for building career-advancing skills by covering training/certification courses and conferences based on individual learning and development needs, access to an online learning platform;
    • Opportunities for long-term professional development in a stable company while contributing to the vision and mission of a new organizational unit;
    • Friendly and supportive multicultural environment, open to new opinions and ideas.


    Apply if you have experience in Python and cloud-native development on Azure or GCP. You should be skilled in integrating AI methodologies, including Large Language Models, and working with big data tools like Spark and Hive.