
N-iX Bulgaria

Senior Data Engineer (HPC) (2847)


The job listing is published in the following categories:

  • Anywhere

    Tech Stack / Requirements

    About us:

    N-iX is a software development service company that helps businesses across the globe develop successful software products. Founded in 2002 in Lviv, N-iX has come a long way and expanded its presence to eight countries: Poland, Ukraine, Sweden, Bulgaria, Malta, the UK, the US, and Colombia. Today, we are a strong community of 2,000+ professionals and a reliable partner for global industry leaders and Fortune 500 companies.

    About our customer:

    Our client is a global company specializing in software development and consulting that combines science and technology with deep industry expertise to solve complex subsurface and surface challenges in the evolving energy sector.

    The client is headquartered in Canada, with offices around the world. The company serves organizations globally, providing cutting-edge software technology and unparalleled customer support.

     

    Key Software Solutions:

    Our client offers a range of reservoir simulation software, including:

    IMEX – A black oil simulator for primary, secondary, and tertiary recovery processes.

    GEM – An advanced simulator for compositional, chemical, and unconventional reservoir modeling.

    STARS – The industry standard for thermal and advanced recovery processes.

    CMOST – An intelligent optimization and analysis tool that integrates statistical analysis, machine learning, and unbiased data interpretation to determine optimal reservoir solutions.

    The company invests in research and development, continuously improving its products and delivering state-of-the-art solutions for energy modeling and optimization.

     

    About the Role:

    We are looking for an experienced Data Service Module Engineer to develop and deploy the data service module for the HPC modeling project. This role focuses on implementing high-performance data storage and retrieval systems using HDF5 or similar formats, with parallel and concurrent I/O capabilities. The ideal candidate will have expertise in designing scalable data services optimized for HPC or distributed workflows, ensuring low latency and high throughput.

     

    Key Responsibilities:

    • Design and implement the data service module using HDF5 for efficient data storage and retrieval.
    • Develop parallel and concurrent I/O mechanisms to optimize performance for large-scale datasets.
    • Ensure the module is tightly integrated with HPC and visualization workflows.
    • Optimize I/O operations for CPU/GPU-based workflows to minimize bottlenecks.
    • Implement caching, compression, and other strategies to enhance performance.
    • Design data structures and schemas suitable for storing 3D grid data and other simulation outputs.
    • Ensure data integrity and consistency during concurrent read/write operations.
    • Develop and execute test cases to validate module performance and reliability under various load conditions.
    • Conduct benchmarking to ensure scalability across different hardware configurations.
    • Document the architecture, APIs, and usage guidelines for the data service module.
    • Provide technical support to the development and visualization teams for data integration.
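    To give a flavor of the storage-related responsibilities above, here is a minimal sketch of chunked, compressed 3D grid storage and partial retrieval in HDF5, using the h5py Python bindings. The file name, dataset path, grid shape, and metadata are hypothetical, chosen only for illustration.

    ```python
    import numpy as np
    import h5py

    # Hypothetical 3D grid (e.g., a pressure field from a simulation timestep).
    grid = np.random.rand(64, 64, 64).astype(np.float32)

    with h5py.File("reservoir.h5", "w") as f:
        dset = f.create_dataset(
            "timestep_000/pressure",   # intermediate groups are created automatically
            data=grid,
            chunks=(16, 16, 16),       # chunked layout enables partial reads/writes
            compression="gzip",        # transparent compression reduces I/O volume
            compression_opts=4,
        )
        dset.attrs["units"] = "kPa"    # self-describing metadata travels with the data

    with h5py.File("reservoir.h5", "r") as f:
        # Read only one chunk-aligned subvolume instead of the whole grid.
        subvolume = f["timestep_000/pressure"][0:16, 0:16, 0:16]

    print(subvolume.shape)  # (16, 16, 16)
    ```

    Chunk-aligned access like this is also the starting point for concurrent and parallel I/O: with parallel HDF5 (MPI-enabled builds), each rank can read or write disjoint chunked regions of the same dataset.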

     

    Requirements:

    • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or related fields.
    • 3+ years of experience in developing and deploying data services for HPC or similar systems.
    • Proven expertise with HDF5 or similar formats in parallel I/O operations; equivalent experience with distributed systems is also acceptable.
    • Programming: Strong proficiency in at least one of C++, Python, Go, or Fortran.
    • HDF5 Expertise: In-depth knowledge of HDF5 APIs and advanced features like parallel HDF5.
    • Parallel I/O: Experience with MPI I/O, POSIX I/O, or similar frameworks for concurrent/parallel data access.
    • Performance Optimization: Skills in profiling and optimizing I/O operations for large datasets.
    • Proficiency in SQL and experience with any RDBMS.
    • Nice to have: knowledge of at least one orchestration and scheduling tool, for example, Airflow, Prefect, or Dagster.
    • Strong problem-solving skills and ability to work in a multidisciplinary team.
    • Excellent communication skills for cross-team collaboration and documentation.

     

    Preferred Qualifications:

    • Familiarity with data formats used in scientific computing, 3D visualization, and simulation workflows.

    We offer:

    • Flexible working format – remote, office-based or flexible
    • A competitive salary and good compensation package
    • Personalized career growth
    • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
    • Active tech communities with regular knowledge sharing
    • Education reimbursement
    • Memorable anniversary presents
    • Corporate events and team buildings
    • Other location-specific benefits