GCP Data Engineer @ Antal

Poland

  • Strong proficiency in Java and Spring Boot
  • Understanding of key software design principles: KISS, SOLID, DRY
  • Hands-on experience building data processing pipelines (preferably with Apache Beam)
  • Experience designing and building RESTful APIs (see the sketch after this list)
  • Familiarity with relational and NoSQL databases, especially PostgreSQL and Bigtable
  • Basic knowledge of DevOps and CI/CD tools, including Jenkins and Groovy scripting
  • Experience with integration frameworks and patterns (e.g., Saga, Lambda)
  • Strong problem-solving and analytical skills
  • Excellent communication skills and ability to thrive in a collaborative team environment
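
To give a flavour of the RESTful API work mentioned above, here is a minimal sketch of a Spring Boot endpoint. It is illustrative only: the RiskFactorController class, the /api/v1/risk-factors path, and the in-memory map are hypothetical stand-ins (a real service would back this with PostgreSQL or Bigtable), and it assumes the spring-boot-starter-web dependency.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.List;
import java.util.Map;

@RestController
@RequestMapping("/api/v1/risk-factors") // hypothetical path, not from the posting
public class RiskFactorController {

    // In-memory stand-in for a PostgreSQL/Bigtable-backed repository.
    private final Map<String, String> store = Map.of(
            "IR_USD_3M", "Interest rate, USD, 3-month tenor");

    // GET /api/v1/risk-factors — list all known risk factor IDs.
    @GetMapping
    public List<String> listRiskFactors() {
        return List.copyOf(store.keySet());
    }

    // GET /api/v1/risk-factors/{id} — look up a single risk factor.
    @GetMapping("/{id}")
    public String getRiskFactor(@PathVariable String id) {
        return store.getOrDefault(id, "unknown");
    }
}
```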

Nice to Have


  • Experience with Google Cloud Platform (GCP) services: GKE, Cloud SQL, Dataflow, Bigtable
  • Familiarity with OpenTelemetry, Prometheus, and Grafana
  • Knowledge of Kubernetes, Docker, and Terraform
  • Messaging/streaming experience with Kafka
  • UI experience with Vaadin
  • Exposure to Apache Beam in large-scale data environments

Are you ready to build impactful solutions on a global scale? Join a forward-thinking team that powers critical risk calculations in one of the world's leading financial institutions.

About the Role


We are looking for a talented GCP Data Engineer with a strong Java background to join the STAR platform team. STAR is HSBC’s strategic cloud-native platform designed to generate and deliver risk factor definitions, historical market data, and scenarios for Value at Risk (VaR) and Expected Shortfall (ES) calculations.

The platform leverages data pipelines and microservices, combining both real-time and batch processing to handle large-scale datasets. You’ll be joining a global team of developers within the Global Traded Risk Technology department, working in an open, inclusive, and innovation-driven environment.
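
As an illustration of the kind of pipeline work described above, the sketch below is a minimal Apache Beam batch pipeline in Java. It is not STAR code: the WordCountPipeline class, the gs://example-bucket paths, and the word-count logic are hypothetical placeholders, chosen only to show Beam's read-transform-write shape that applies equally to batch and streaming sources.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

import java.util.Arrays;

public class WordCountPipeline {
    public static void main(String[] args) {
        // Runner and project settings come from the command line,
        // e.g. --runner=DataflowRunner for GCP Dataflow.
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline p = Pipeline.create(options);

        p.apply("ReadLines", TextIO.read().from("gs://example-bucket/input/*.txt"))
         .apply("SplitWords", FlatMapElements
                 .into(TypeDescriptors.strings())
                 .via((String line) -> Arrays.asList(line.split("\\W+"))))
         .apply("CountWords", Count.perElement())
         .apply("Format", MapElements
                 .into(TypeDescriptors.strings())
                 .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
         .apply("WriteResults", TextIO.write().to("gs://example-bucket/output/counts"));

        p.run().waitUntilFinish();
    }
}
```

The same PTransform chain would run unchanged against a streaming source (e.g. Pub/Sub or Kafka IO) with windowing applied, which is the unified batch/streaming model the role calls for.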


Responsibilities

  • Translate complex business requirements into secure, scalable, and high-performance data solutions
  • Design and implement performant data processing pipelines (batch and streaming)
  • Develop REST APIs and data ingestion patterns in a cloud-native architecture
  • Integrate internal systems with a focus on cost optimization and fast data processing
  • Modernize and enhance existing pipelines and microservices
  • Create and maintain solution blueprints and documentation
  • Conduct peer code reviews and provide constructive feedback
  • Promote test-centric development practices, including unit and regression tests
  • Ensure consistent logging, monitoring, error handling, and automated recovery aligned with industry standards
  • Collaborate closely with engineers, analysts, and stakeholders across regions

Category

backend

  • Detailed information about the job offer
    Company: Antal
    Location: Work in Poland
    Job sector: backend
    Position: GCP Data Engineer @ Antal
    Working hours: full-time, 40 hours per week
    Start date: immediately
    Offered salary: not specified
    Offer posted: 21 June 2025
    Position active

GCP Data Engineer @ Antal: Frequently Asked Questions

👉 In which city is the GCP Data Engineer @ Antal position offered?

The position is offered in Kraków.

👉 Which company is hiring for this position?

This job offer is with the company Antal.

Interested in the GCP Data Engineer @ Antal position in Kraków? Send your CV to Antal today.
If you are looking for similar job offers, see the current openings in Kraków – backend.