Company Overview:
Founded in 2002, Webb Fontaine is a world-leading technology company powered by AI, transforming the future of Trade. The company is headquartered in Dubai, U.A.E., and benefits from a worldwide presence through its subsidiaries and branch offices in Europe, the Middle East, Asia, and Africa. Trusted by governments globally, Webb Fontaine provides industry-wide solutions to accelerate Trade development and modernization.
At Webb Fontaine, people are at the heart of our organization. Our team members are recognized for their unwavering commitment to excellence and innovation and their continuous drive to learn and grow their skills. Central to our ethos is teamwork: we foster a culture of care, support, integrity, and openness.
Position Summary:
We are looking for a motivated Data Engineer to join our brand-new product team. We are developing a cutting-edge platform that optimizes logistics processes, enhances transparency, and personalizes the client experience. Collaborating with logistics experts, your work will be instrumental in bridging the gap between real-world industry challenges and innovative tech solutions. This position offers a unique opportunity to spearhead a transformative journey within the logistics sector: by leveraging your expertise and working alongside industry professionals, you will play a pivotal role in redefining the logistics industry and propelling it into a new era.
Key responsibilities:
– Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies
– Create big data tools for project analytics, scalable ad-hoc analysis, and automated reporting
– Create testable and high-quality software with minimal risks
– Actively participate in the full project cycle – architecture, design, and implementation
– Effectively communicate with team members in diverse locations
Requirements:
– Experience with big data technologies: Hadoop ecosystem, Spark, Hive, Kafka
– Experience with the integration of data from multiple data sources
– Experience with object-oriented/functional programming languages such as Java or Scala
– Proficient in designing efficient and robust ETL workflows
– Strong analytical skills with attention to detail and accuracy
– Knowledge of industry trends, best practices, and new technologies, and the ability to apply trends to architectural needs
– Fluent in English
Benefits:
– Competitive base salary
– Comprehensive benefits package, including medical insurance and days off
– Exposure to diverse international clients and industries
How to apply:
We look forward to meeting you in person to discuss the role in detail and to hear about your career goals. Please apply for the vacancy by pressing the “Apply” button below.