Data Engineer
Etraveli Group is a leading global flight technology provider, specializing in flight sales and offering flight content delivery and fintech products. We are here to solve complexity by connecting millions of flights and travelers across the globe, from search and selection to the trip and beyond.
We partner with major global platforms such as Booking.com, Google Flights, Skyscanner, and Kayak, providing seamless flight booking and related services. Our B2B portfolio includes a separate fintech entity whose flagship product, PRECISION, is a risk management solution; Sweden's leading flight comparison site, Flygresor.se; Tripstack, our B2B Flights-as-a-Service provider and world leader in virtual interlining; and Wenrix, the embedded AI platform for flights. We also operate our own online travel agency brands, including Gotogate, Mytrip, and Flightnetwork.
Every day we strive to make the world smaller for our customers and bigger for our people. Our diverse team of more than 3200 passionate professionals is what makes us the industry’s tech wonder and the best in the world at what we do.
Major offices are located in Sweden (HQ), Greece, India, Canada, Israel, Poland, the UK, and Uruguay.
The Role
At ETG, we rely on insightful data to inform our systems and solutions. We are seeking an experienced, pipeline-centric data engineer to build, maintain, and optimize the data infrastructure that supports our critical customer service functions. The ideal candidate combines strong technical expertise in data engineering with analytical rigor and excellent communication skills, translating complex data insights into actionable strategies for the business.
Duties and Responsibilities
As a Customer Service Data Engineer, you will own the complete data lifecycle within the Customer Service (CS) department, ensuring data reliability and accessibility. Your responsibilities include:
- Data collection & preparation: Gather data from a variety of sources, including databases, APIs, event tracking tools, and spreadsheets, ensuring it is accurate, complete, and relevant to the business questions at hand.
- Design and build scalable data pipelines: Develop, maintain, and optimize ETL/ELT processes that collect, transform, and load data from multiple sources into our data warehouse.
- Automation & reliability: Automate data workflows, monitoring, and quality checks to ensure data consistency, accuracy, and reliability across all systems.
- Collaboration: Work with analytics and BI teams to ensure data needs are met, from ingestion to analysis, and that data is accessible and well-documented.
- Data integrity: Ensure that data is secure, well-documented, and accessible to those who need it.
Requirements
- A university degree in Computer Science, Information Technology, Engineering, or a related quantitative discipline.
- 3+ years of experience with Python and SQL for data manipulation, transformation, and analysis.
- Strong understanding of data architecture, data modeling (dimensional/relational), data governance principles, and advanced ETL/ELT processes.
- 3+ years of professional experience building data-intensive applications, including robust data pipelines and APIs.
- Hands-on experience with ETL/workflow orchestration tools (e.g., dbt or Apache Airflow).
- Excellent verbal and written communication skills in English, with a proven ability to clearly explain complex data issues and solutions to both technical and non-technical stakeholders.
- Ability to thrive in a dynamic, research-oriented team while managing concurrent projects.
- Nice to have: Experience with data platform tools such as Databricks, Dremio, or similar.
- Department: Customer Service
- Locations: Mumbai, Pune
- Remote status: Hybrid