Global financial institution with an R&D hub in Prague.
Define the steps for migrating solutions from legacy systems to a cloud Lakehouse
Evaluate different cloud services to build out open-format data layers
Ensure the data platform is operational, secure, scalable, and reliable
Contribute to defining the Data Mesh paradigm across different data domains
Collaborate with data scientists, analysts, and other stakeholders to understand their data needs and provide solutions
Write and maintain technical documentation and work within an Agile methodology
Hands-on experience with cloud-native big data technologies (GCP/Azure)
Ability to design and implement engineering solutions for business opportunities
Knowledge of data management, monitoring, security, and privacy
Experience in building data pipelines with tools such as Data Factory, Apache Beam, or Apache Airflow
Familiarity with at least one data platform or processing framework such as Kafka, Spark, or Flink; experience with Delta Lake and Databricks is a big plus
Demonstrated experience in one or more programming languages, preferably Python
Knowledge of CI/CD tools such as GitHub Actions is a plus
Strong team player willing to cooperate with colleagues across office locations and functions
Fluency in English
Flexible start/end of working hours
Contributions to pension / life insurance
Contributions to sport / culture / leisure activities
Education allowance
Individual budget for personal growth
Educational courses and training
Cafeteria
Refreshments in the workplace
Corporate events
5 weeks of holiday
Sick days