Medior Staff Data Engineer

GSCF

Booth number:

B38

GSCF is the leading global provider of working capital solutions. It empowers companies and their financial institution partners to accelerate growth, unlock liquidity, and manage the risk and complexity of the end-to-end working capital cycle. GSCF’s innovative Working Capital-as-a-Service offering combines the power of an end-to-end connected capital technology platform with expert managed services and alternative capital solutions.

GSCF’s team of working capital experts operates in over 75 countries, offering a truly global and holistic perspective to solve working capital efficiency challenges.

 

The Role:

We are seeking a highly motivated and skilled Data Engineer to join GSCF in a hybrid role. As a Data Engineer, you will be responsible for creating, maintaining, and continuously improving the company’s data platform. You will design and implement complex data models, develop robust ETL/ELT pipelines, and ensure the high performance and scalability of our data warehousing solutions.

How You Will Make an Impact:

  • Data Platform Development:
    • Create, maintain, and continuously improve the company’s data platform.
    • Develop and optimize data warehousing solutions for high performance and scalability.
  • Self-Service Solutions: 
    • Create simple and effective self-service solutions to democratize data.
  • Data Modeling: 
    • Design and implement complex data models to support both operational and analytical
      needs.
  • Stakeholder Collaboration: 
    • Work closely with data analysts and business stakeholders to understand data
      requirements and deliver solutions that meet their needs.
  • ETL/ELT Pipeline Development: 
    • Design, develop, and maintain robust ETL/ELT pipelines to ingest, process, and store
      datasets from various sources.
  • Software Engineering Practices: 
    • Apply and endorse software engineering best practices (version control, testing, CI/CD)
      throughout the data platform.
    • Ensure the quality of datasets by implementing thorough data testing.

What You Bring to the Team:

  • Strong coding skills in SQL and Python.
  • Strong experience with modern data processing tools (e.g., dbt, Kafka).
  • Experience with pipeline orchestration solutions like Airflow and Dagster.
  • Experience with a public cloud provider (e.g., AWS, Azure).
  • Proven experience with cloud data warehousing solutions such as Snowflake, Databricks, or Redshift.
  • Experience managing and optimizing classic RDBMS systems like MySQL, PostgreSQL, or SQL Server.
  • Solid data modeling skills, with experience designing and optimizing data models for various use cases.
  • Familiarity with infrastructure as code tools (Terraform, CDK) and Kubernetes is beneficial.
  • Familiarity with the modern data stack is a plus.
     

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age,
disability, sexual orientation, national origin, or any other category protected by law.

 

We look forward to your application at booth B38!



1 October 2025 (Wednesday), 10:00–19:00

2 October 2025 (Thursday), 10:00–17:00

BOK Hall "A"

(Budapest, Dózsa György út 1.)