We are seeking a seasoned Data Engineering Specialist
to join the Data Platform team of our global business on a permanent basis. You will be at the forefront of building scalable ETL pipelines, managing massive data lakes, and driving analytics solutions that impact millions of lives across Thailand.
If you thrive in a high-stakes, "Big Data" environment and love working with modern tech stacks like Databricks, PySpark, and Azure, this is the place for you!
Key Responsibilities:
- Design and document scalable data models, integration patterns, and architectural blueprints to ensure long-term consistency and performance
- Build and optimize end-to-end ETL pipelines across cloud and on-premise environments, focusing on reusable components and automation
- Provision high-quality, reliable datasets to power advanced AI use cases and enterprise-wide reporting
- Implement robust data validation, observability, and automated recovery mechanisms to ensure reliable and high-quality data delivery
- Lead technical Proof of Concepts (POCs) and prototypes to experiment with emerging tools and validate new data sources for future scaling
- Translate complex business needs into technical requirements and collaborate with cross-functional teams to drive data governance and platform rollouts
Qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, or a related IT field
- At least 7 years of proven professional experience in data engineering
- Expertise in building and managing large-scale data pipelines within distributed architectures (data lake or data warehouse)
- Hands-on experience with Azure Databricks and Azure Data Factory
- Strong programming skills in Python, PySpark, and SQL
- Good command of English
Argyll Scott Asia is acting as an Employment Agency in relation to this vacancy.