Food, machinery or T-shirts: Hapag-Lloyd moves goods around the globe with over 250 container ships. We connect more than 600 ports on all continents and are one of the largest liner shipping companies. More than 13,000 employees work on board, ashore or in one of our 350 offices. Together, we transport around 12 million containers per year. Our corporate values "We care. We move. We deliver." serve as our coordinates along the way. They guide our collaboration with each other as colleagues and with our customers to achieve the best possible quality. We have a long-term commitment to climate and environmental protection as well as human rights and many other social issues.

Data Architect

About the Knowledge Center:
The Knowledge Center, located in Gdańsk, is a hub for innovation that develops state-of-the-art business and technology solutions to help us navigate the future. And we want to do that together with you.

Team and Project overview:

We aim to build a big data analytics platform for Hapag-Lloyd AG, a global shipping company. In Gdańsk, we established a Knowledge Center, and within it Solutions for Analytics (SFA): a group of top IT experts, designers, and executors who apply AWS cloud technology in the maritime and shipping industry. We are quality-driven and passionate about the cloud (especially AWS), analytics, and the positive changes that technology solutions bring to business globally. The Data Architect will work closely with data analysts, data engineers, data scientists, and other stakeholders to design and maintain data models.


Responsibilities:

  • Develop a reference architecture for our new technology software solutions
  • Design secure, scalable, and highly available cloud infrastructure on AWS
  • Recommend and assist the cross-functional development team in the automation, improvement, and management of the ETL (Extract, Transform and Load) and ELT (Extract, Load and Transform) work
  • Review data structures and achieve the highest possible system performance through software optimizations
  • Continually identify and implement improvements through technical debt reduction
  • Meet regulatory compliance standards

Requirements – must have:

  • A minimum of 3 years of experience in a similar role
  • Bachelor’s degree in computer science, computer engineering, or relevant field
  • Designing diagrams (or similar); knowledge of the theory of modern big data warehousing
  • Experience with AWS services: S3, EC2, Lambda, SNS, SQS, Athena, Glue, KMS/HMS (or similar)
  • In-depth understanding of Snowflake and Databricks/Spark
  • Proven work experience in the development, management, and monitoring of data pipelines
  • Comfortable in one or more of Java, Scala, or Python, with the ability to be hands-on when required
  • Experience working with unstructured, semi-structured, and structured data

Desirable Skills:

  • Experience with, and a passion for, working in a DevOps approach

Requirements – nice to have:

  • Data Integration tool (IICS)
  • BI tools (QlikView/Qlik Sense)
  • Scheduling (currently Airflow)
  • CI/CD (Terraform, Docker, GitLab)
  • Replication layer (DMS, Qlik Replicate)
  • Distributed streaming (Kafka)
  • SQL and NoSQL databases

We offer:

  • Private medical care (Luxmed)
  • Gym card (Multisport)
  • Attractive annual bonus of up to 22.5% (depending on company performance results)
  • Group life insurance and employee capital plan (PPK)
  • Cafeteria benefit system (cinema tickets, vouchers, etc.)
  • Focus on a healthy lifestyle (fruit days, bike competitions, football training)
  • Charity and volunteer initiatives
  • Modern and well-connected office (Alchemia complex in Gdańsk Oliwa)
  • Relocation support (financial support, covering the immigration process for non-Polish citizens)
  • Internal learning management system
  • Development budget (sharing the costs of certifications and conferences/ IT events)
  • Flexible working hours and home office possibility (hybrid work model)

Contact person

Małgorzata Pióro