Big Data Engineer @ ITDS
You're ideal for this role if you have:
- A degree in Computer Science, IT, or a related discipline
- Proven experience in designing and developing big data systems
- Hands-on experience with Spark and distributed computing
- Solid Java, Python, and Groovy development skills
- Strong knowledge of the Spring ecosystem (Boot, Batch, Cloud)
- Familiarity with REST APIs, Web Services, and API Gateway technologies
- Practical experience in DevOps tooling like Jenkins and Ansible
- Proficiency in using RDBMS, especially PostgreSQL
- Hands-on experience with public cloud platforms, particularly GCP
- Excellent communication in English
It is a strong plus if you have:
- Experience with streaming technologies like Apache Beam or Flink
- Knowledge of OLAP solutions and data modeling
- Background in financial risk management or the banking industry
- Exposure to container technologies such as Docker and Kubernetes
- Familiarity with Traded Risk domain concepts
- Experience with RPC frameworks like gRPC
- Knowledge of data lakehouse tools like Dremio or Trino
- Hands-on experience with BI or UI development
- Scrum Master or PMP certification
Big Data Engineer
Join us and build data solutions that drive global innovation!
Kraków-based opportunity with a hybrid work model (3 days/month in the office).
As a Big Data Engineer, you will be working for our client, a leading global financial institution, contributing to the design and development of cutting-edge data solutions for risk management and analytics. The client is undergoing a strategic digital transformation, focusing on scalable, cloud-based big data platforms that support advanced analytics and regulatory compliance. You will be part of a high-performing Agile team, collaborating closely with business stakeholders and technical teams to build and maintain robust distributed systems that process large volumes of data efficiently.
Your main responsibilities:
- Designing and developing distributed big data solutions using Spark
- Implementing microservices and APIs for data ingestion and analytics
- Managing cloud-native deployments, primarily on GCP
- Writing and maintaining test automation frameworks using tools like JUnit, Cucumber, or Karate
- Collaborating with cross-functional teams to translate business requirements into technical specifications
- Developing and scheduling data workflows using Apache Airflow
- Maintaining and optimizing existing big data pipelines
- Utilizing DevOps tools such as Jenkins and Ansible for CI/CD automation
- Participating in Agile ceremonies and contributing to sprint planning and retrospectives
- Monitoring, troubleshooting, and improving data systems and services
Additionally: sport subscription and private healthcare.
Detailed information about the job offer
Company: ITDS
Location: Kraków, Poland
Job sector: businessIntelligence
Position: Big Data Engineer @ ITDS
Working hours: full-time, 40 hours per week
Start date: immediate
Offered salary: not specified
Offer added: 29. 10. 2025