Food, machinery or T-shirts: Hapag-Lloyd moves goods around the globe with over 250 container ships. We connect more than 600 ports on all continents and are one of the largest liner shipping companies. More than 13,000 employees work on board, ashore or in one of our 350 offices. Together, we transport around 12 million containers per year. Our corporate values "We care. We move. We deliver" serve as our coordinates along the way: they guide our collaboration with each other as colleagues and with our customers to achieve the best possible quality. We are committed for the long term to climate and environmental protection, as well as to human rights and many other social issues.

Data Engineer

About the Knowledge Center:

The Knowledge Center, located in Gdańsk, is a hub for innovation that develops state-of-the-art business and technology solutions to help us navigate the future. And we want to do that together with you.

Team and Project overview:

We aim to build a big data analytics platform for a global shipping company, Hapag-Lloyd AG. In Gdańsk, we have established a Knowledge Center, and within it the Data Management Solutions for Analytics (DMSA) team: a group of top IT experts, designers, and executors applying AWS cloud technology to the maritime and shipping industry. We are quality-driven and have a passion for the cloud (especially AWS), analytics, and the power of positive change that technology solutions bring to business globally.

We are looking for passionate Data Engineers who can create insights from this data by running analytics at scale. We use the AWS cloud extensively, but we are also open to people who have experience with other cloud platforms, such as GCP or Azure, and want to switch to AWS. Only one programming language (Scala, Java, or Python) is required, along with strong knowledge of SQL.

Responsibilities:

  • Building and maintaining complex cloud (AWS) data management systems that combine core data sources into data warehouses and analytical services
  • Identifying optimal solutions to scale data flows efficiently, whether on-premises or in the cloud
  • Working closely with Data Analysts, Data Engineers, Data Scientists, and other stakeholders to design and maintain data models

Requirements:

  • Knowledge of AWS cloud services (EC2, S3, ECS, EKS, RDS)
  • Hands-on experience with Spark (Scala/Python/Java) - we use Spark in all our data pipelines
  • Proficiency in SQL
  • Hands-on experience with CI/CD and version control using GitLab or GitHub
  • Experience with containerization technologies (basic Docker knowledge required)
  • Knowledge of Airflow for scheduling and data orchestration
  • Scripting skills: Bash / Python
  • Familiarity and flexibility to work with structured, semi-structured, and unstructured datasets, from gigabyte to petabyte scale
  • Expert-level knowledge of data integration and familiarity with common data integration challenges
  • Ability to convert data types, handle errors, and translate between different technology stacks
  • Experience optimizing ETL pipelines at both the software (code) and hardware level

We offer:

  • Private medical care (Luxmed)
  • Gym card (Multisport)
  • Attractive annual bonus of up to 22.5% (depending on company performance)
  • Group life insurance and employee capital plan (PPK)
  • Cafeteria benefit system (cinema tickets, vouchers, etc.)
  • Focus on a healthy lifestyle (fruit days, bike competitions, football training)
  • Charity and volunteer initiatives
  • Modern and well-connected office (Alchemia complex in Gdańsk Oliwa)
  • Relocation support (financial support, covering the immigration process for non-Polish citizens)
  • Internal learning management system
  • Development budget (sharing the costs of certifications, conferences and IT events)
  • Flexible working hours and the option to work from home (hybrid work model)

Contact person

Patrycja Bogucka