DHL

AZURE BIG DATA ARCHITECT (M/F/X)


Pay: Competitive
Location: Bonn, North Rhine-Westphalia
Employment type: Full-Time

This job is now closed

  • Job Description

      Req#: AV-167825
      DO YOU WANT TO WORK & LEARN ON STATE-OF-THE-ART TECHNOLOGY IN A TEAM WITH STRONG ENGINEERING KNOWLEDGE AND ENJOY THE BENEFITS OF A CORPORATE ENVIRONMENT?

      JOIN OUR TEAM IN BONN, BERLIN, PRAGUE, CHENNAI, CYBERJAYA OR FULLY REMOTE IN GERMANY, THE CZECH REPUBLIC OR INDIA FOR A FULL-TIME OR PART-TIME POSITION, STARTING AS SOON AS POSSIBLE

      #Cloud, #CloudArchitecture, #DataLake, #Python, #Scala, #Spark, #DeltaLake, #Kubernetes, #Azure, #GCP, #DevOps, #GitHubActions, #Terraform

      DO YOU WANT TO MAKE A DIFFERENCE?

      WE OFFER EXCELLENT OPPORTUNITIES FOR PROBLEM SOLVERS.

      Deutsche Post DHL Group is the world’s leading mail and logistics service provider. As one of the planet’s largest employers, operating in over 220 countries and territories, we see the world differently. Join our team and discover how an international network that’s focused on service, quality and sustainability is able to connect people and improve lives through the power of global trade. And not just for our customers, but for every member of our team, too.
      Join a great international team (> 15 nationalities) of data engineers, cloud engineers, DevOps engineers, data scientists and architects to learn from and to share your experiences with. The team language is English, so you don’t need to speak German. In our family-friendly environment, we offer part-time work, flextime and sabbaticals.

      Your tasks
      DPDHL has set up a global big data architecture as part of the Group’s digitalization agenda. In this context, we are looking for an experienced Azure big data architect to join our growing team of data experts. You are the right candidate if you are excited by the prospect of designing and optimizing our company’s data architecture to support the next generation of data products and solutions in the Azure cloud.

      You will

      • Conduct requirements workshops and analyses for Azure data platforms and related solutions with business departments and product owners
      • Create and optimize best-in-class technical architectures for our Azure data platforms (data lakes, data lakehouses) and our solutions on top of these platforms, catering to data engineers, business analysts and data scientists
      • Drive the implementation of these architectures by leading data engineers, DevOps engineers and architects and by doing hands-on technical engineering work
      • Investigate emerging technology trends, concepts and solutions, e.g. by building PoCs and prototypes, to enhance our data platforms and to supplement our best practices
      • Create technical documentation on our platforms, related solutions and best practices in general
      • Mentor junior architects and engineers

      Your profile

      • Strong expertise in data platform and solution architecture, specifically:
        • Experience in architecting, designing and building enterprise-grade, scalable and robust big data platforms and solutions in the cloud, preferably on Azure; on-premises experience (Cloudera, MapR) is a plus
        • Ability to gather and assess business and technical requirements and to map them to technical architectures
        • Ability to evaluate architecture variations (e.g. Spark Streaming vs. Flink, Databricks vs. open-source Spark, Cassandra vs. HBase, Synapse vs. Cosmos DB, …)
        • Knowledge of best practices for selecting the optimal mix of managed services (e.g. ADF, Databricks, Synapse) and open-source components
      • Strong expertise in data engineering:
        • 6+ years of experience working as a data engineer or software developer, and/or demonstrable contributions to open-source projects
        • Expertise in designing, building and maintaining large-scale data pipelines, including processing (transforming, aggregating, wrangling) data (a minimal illustrative sketch follows after this list)
        • Ability to understand and write complex SQL queries in data analytics projects
        • Experience in performance tuning of distributed applications, specifically in optimizing big data (Spark, Delta Lake, SQL) jobs
        • Experience in at least 2 of the following programming languages: Python, Scala, Java, Kotlin, Go, Rust, C#, F#, C, C++
      • Strong hands-on expertise in platform and DevOps engineering:
        • Cloud computing concepts and Azure cloud platform concepts (e.g. networking, security and monitoring)
        • Applying IaC, CI/CD and DevOps practices in data analytics projects, preferably using Azure DevOps and Terraform
      • Strong hands-on expertise in specific big data technologies:
        • At least 3 Azure data services: ADLS Gen2, Azure Data Factory, Azure Databricks, Cosmos DB, Synapse Serverless or Synapse Dedicated SQL Pool, Azure Data Explorer, Stream Analytics, etc.
        • At least 2 big data technologies and frameworks: HDFS/S3/ADLS Gen2/GCS, Apache Spark/EMR/Dataproc, Delta Lake/Iceberg, Flink, Hive/Impala, Presto/Drill, etc.
        • NoSQL technologies such as HBase, MongoDB, Cassandra, Azure Cosmos DB, etc.
        • Streaming technologies such as Kafka, Spark Structured Streaming and Flink, or equivalent cloud services such as Azure Event Hubs, Azure Stream Analytics and Confluent (see the second sketch after this list)
      • Personal skills:
        • Experience in customer communication and in managing business stakeholders
        • Ability to work effectively both independently and in a team
        • Strong verbal and written communication and presentation skills, flexibly adaptable to different target audiences (business, technical, developers, …)
        • Strong analytical skills
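
      To give a flavor of the day-to-day engineering work described above, here is a minimal PySpark sketch of a batch pipeline that reads a Delta table, wrangles and aggregates the data and writes a partitioned result. The paths, table layout and column names are illustrative assumptions only, not DPDHL's actual data model, and Delta Lake support (e.g. Databricks or the open-source delta-spark package) is assumed to be available.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        # Assumes a Spark session with Delta Lake configured (e.g. Databricks
        # or an open-source Spark cluster with the delta-spark package).
        spark = SparkSession.builder.appName("daily-route-aggregation").getOrCreate()

        # Hypothetical bronze-layer table; path and columns are placeholders.
        parcels = spark.read.format("delta").load("/mnt/datalake/bronze/parcels")

        # Typical wrangling step: filter out incomplete records, derive a date
        # column and aggregate parcel volume and weight per route and day.
        daily_volume = (
            parcels
            .filter(F.col("status").isNotNull())
            .withColumn("event_date", F.to_date("event_timestamp"))
            .groupBy("route_id", "event_date")
            .agg(
                F.count("*").alias("parcel_count"),
                F.avg("weight_kg").alias("avg_weight_kg"),
            )
        )

        # Partitioning the output by date keeps downstream reads and
        # incremental loads cheap, one of the tuning levers mentioned above.
        (
            daily_volume.write.format("delta")
            .mode("overwrite")
            .partitionBy("event_date")
            .save("/mnt/datalake/silver/daily_route_volume")
        )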
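
      Similarly, for the streaming technologies listed above, the following sketch shows how events could be consumed from Kafka with Spark Structured Streaming and appended to a Delta table. The broker address, topic name and event schema are hypothetical, and the Kafka connector (spark-sql-kafka) as well as Delta Lake are assumed to be on the classpath.

        from pyspark.sql import SparkSession
        from pyspark.sql.functions import col, from_json
        from pyspark.sql.types import StructType, StructField, StringType, TimestampType

        spark = SparkSession.builder.appName("shipment-events-stream").getOrCreate()

        # Hypothetical event schema, for illustration only.
        event_schema = StructType([
            StructField("shipment_id", StringType()),
            StructField("status", StringType()),
            StructField("event_time", TimestampType()),
        ])

        # Read a Kafka topic as a stream; broker and topic are placeholders.
        raw = (
            spark.readStream.format("kafka")
            .option("kafka.bootstrap.servers", "broker:9092")
            .option("subscribe", "shipment-events")
            .load()
        )

        # Kafka delivers the payload as bytes; parse the JSON value into columns.
        events = (
            raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
            .select("e.*")
        )

        # Append to a Delta table; the checkpoint location gives the sink
        # fault tolerance and exactly-once semantics.
        query = (
            events.writeStream.format("delta")
            .option("checkpointLocation", "/mnt/checkpoints/shipment-events")
            .outputMode("append")
            .start("/mnt/datalake/silver/shipment_events")
        )
        query.awaitTermination()
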
      Your benefits

      We offer excellent employee benefits, a competitive salary package and great development opportunities, such as conference attendance and paid trainings.
      We welcome full-time (40 hours) and part-time work and offer hybrid working from home and in our offices.
      If you relocate to Bonn or Berlin to join one of our offices, we support you with the move, all official paperwork, an interim flat and the search for a permanent flat, as well as finding a kindergarten to suit your family's needs. A dedicated team supports you with your visa, and we sponsor all required activities.

      Your contact

      Ole Vollertsen, VP Data Solutions, Deutsche Post DHL headquarters, will be happy to answer your questions via email at ole.vollertsen@dpdhl.com.
      Interested in this responsible position with its varied tasks? Please click on “Apply Here” and send us your complete application, including a cover letter, CV, references, your desired salary and your earliest possible starting date. You can find further information at dpdhl.jobs.
      We are looking forward to your application.

      CONNECTING PEOPLE. IMPROVING LIVES.
      #datasciencejobs

  • About the company

      We are an international team of over 400,000 shipping professionals, united by a passion for logistics. And we work in a unique environment. DHL is as innovative as a start-up, with the power of an international organization.
