249 Reporting Engineer jobs in Thailand
Data Engineer/Senior Data Engineer
Posted today
Job Description
Responsibilities:
- Designing and building data pipelines: Creating automated workflows (pipelines) that efficiently ingest, transform, and load data from disparate sources into a central repository.
- Data warehousing and storage: Designing and managing databases, data lakes, and data warehouses to ensure data is stored in an organized, efficient, and accessible manner.
- Building data platforms: Building and maintaining the foundational platforms and tools that enable data scientists and analysts to perform their work effectively.
- Data quality and governance: Implementing processes to monitor and ensure the accuracy, consistency, and reliability of data.
- Performance optimization: Monitoring and tuning the data infrastructure to improve performance, reduce latency, and lower costs.
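As an illustration of the pipeline work described above, a batch ETL job can be sketched with only the Python standard library; this is a minimal sketch, and the `orders` table and its columns are hypothetical stand-ins for real sources:

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Read raw rows from a CSV source (here an in-memory string)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Normalize types and drop rows with missing amounts."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # skip incomplete records
        cleaned.append((row["order_id"], row["customer"], float(row["amount"])))
    return cleaned

def load(rows, conn):
    """Write cleaned rows into a central repository table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

raw = "order_id,customer,amount\n1,Ann,10.5\n2,Bob,\n3,Cai,7.25\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

In production the same three stages would typically be separate orchestrated tasks rather than one script.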
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or Data Engineering.
- 5+ years of experience in data engineering or a related field.
- Experience with data warehouse and data lake architectures.
- Excellent problem-solving, communication and leadership skills.
- Experience with ETL/ELT tools and frameworks (e.g., Oracle Data Integrator) is a plus.
Data Engineer
Posted today
Job Description
- Design, develop, and maintain data pipelines and data infrastructure to support business needs.
- Develop end-to-end data processes: extract, transform, and analyze data for reporting and insights.
- Collaborate with Data Analysts, Data Scientists, and Software Engineers to enable data-driven decisions.
- Ensure data quality, accuracy, and performance across systems.
- Automate workflows and improve efficiency through scripting and tool development.
Qualifications:
- Bachelor's or Master's degree in Computer Science, IT, Engineering, or related field.
- 0–5 years of experience in Data Engineering or ETL development (new graduates welcome).
- Strong skills in SQL and programming (Python or Scala).
- Understanding of Data Pipeline, Data Warehouse, and Data Integration.
- Experience with Airflow, dbt, Spark, or Cloud Platforms (AWS, GCP, Azure) is an advantage.
- Good analytical, problem-solving, and communication skills.
Data Engineer
Posted today
Job Description
Responsibilities
- Design, develop, and maintain ETL (Extract, Transform, Load) pipelines for integrating data from multiple sources (databases, APIs, files, streaming, etc.).
- Build and optimize data pipelines to support both batch and real-time data processing.
- Collaborate closely with Data Analysts, Data Scientists, and business teams to deliver clean and reliable data.
- Manage and optimize data warehouse / data lake platforms (e.g., BigQuery, Snowflake, Redshift, Hadoop, Spark).
- Ensure data quality, consistency, and accuracy across systems.
- Develop scripts and automation tools to improve workflow efficiency and reduce manual work.
- Monitor, troubleshoot, and resolve issues in data pipelines and related infrastructure.
- Bachelor's degree or higher in Computer Science, Information Technology, Data Engineering, or related fields.
- 2–3 years of experience in Data Engineering or ETL development.
- Strong experience in SQL (query optimization, stored procedures, complex joins).
- Hands-on experience with ETL/ELT tools such as Airflow, dbt, Talend, Informatica, SSIS, or custom Python scripts.
- Experience working with both relational databases (MySQL, PostgreSQL, SQL Server, Oracle) and NoSQL databases (MongoDB, Cassandra, DynamoDB).
- Familiarity with at least one cloud platform (AWS, GCP, Azure).
- Knowledge of data modeling, data warehousing, and data governance.
- Proficiency in Python/Scala/Java for data processing.
- Familiarity with Git, Docker, and CI/CD pipelines is a plus.
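The SQL skills listed above (complex joins, query optimization) can be illustrated with a small self-contained sketch; the schema, index, and data below are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
-- An index on the join key lets the planner avoid a full table scan.
CREATE INDEX idx_orders_customer ON orders(customer_id);
""")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "North"), (2, "South")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# Aggregate revenue per region via a join on the indexed key.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount)
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
```

On real warehouses the same principle applies at larger scale: indexing (or partitioning/clustering) the join key is usually the first optimization to check.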
Data Engineer
Posted today
Job Description
- Evaluate business requirements
- Design solutions and processes in accordance with the designed architecture
- Develop data pipelines and data acquisition processes using programming languages and tools
- Work with data analysts on several projects
Qualifications
- Bachelor's degree in Computer Science, MIS, Statistics, Computer Engineering, Mathematics, Economics, or a related field.
- 0–1 year of work experience.
- Experience in developing a data warehouse for financial institutions.
Skills
- Experience with data integration / data ingestion / data pipeline tools such as SSIS, Talend, ODI, Airflow, Databricks, etc.
- Intermediate SQL query skills
- Basic knowledge of Python
- Knowledge of BI tools would be a plus
- Intermediate Excel reporting and functions
Data Engineer
Posted today
Job Description
Job Title: Data Engineer (Fixed-Term Contract, 2 Years)
Role Purpose
The Data Engineer (AI) is responsible for building and maintaining robust data pipelines and infrastructure that power AI and advanced analytics across Mitr Phol. This role ensures that data is collected, cleaned, transformed, and made available in an AI-ready format for Data Scientists, Generative AI/LLM specialists, and business stakeholders.
Key Responsibilities
Data Pipeline Development
o Develop and maintain scalable ETL/ELT pipelines for structured, semi-structured, and unstructured data.
o Integrate data from multiple sources including ERP, IoT/sensors, APIs, external datasets, and files.
o Support both batch and streaming ingestion (real-time pipelines).
Data Management & Transformation
o Implement data cleansing, transformation, and normalization processes.
o Ensure data consistency, accuracy, and integrity.
o Build curated datasets and feature stores for AI/ML models.
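The curated-dataset/feature-store responsibility above can be sketched as a plain-Python aggregation; the record layout and feature names are hypothetical, and a real feature store would add versioning and storage:

```python
from datetime import date

# Hypothetical raw transaction records for two customers.
transactions = [
    {"customer": "A", "day": date(2024, 1, 1), "amount": 20.0},
    {"customer": "A", "day": date(2024, 1, 3), "amount": 40.0},
    {"customer": "B", "day": date(2024, 1, 2), "amount": 15.0},
]

def build_features(records):
    """Curate per-customer features an ML model could consume."""
    features = {}
    for r in records:
        f = features.setdefault(
            r["customer"],
            {"txn_count": 0, "total_spend": 0.0, "last_seen": None},
        )
        f["txn_count"] += 1
        f["total_spend"] += r["amount"]
        if f["last_seen"] is None or r["day"] > f["last_seen"]:
            f["last_seen"] = r["day"]
    return features

features = build_features(transactions)
```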
Collaboration & Support
o Work closely with Data Scientists to prepare and deliver training and inference datasets.
o Support AI Product Managers with data requirements for new use cases.
o Collaborate with AI Data Governance team to implement metadata, lineage, and access controls.
Operations & Monitoring
o Monitor pipeline performance, latency, and costs.
o Troubleshoot data ingestion or quality issues.
o Automate workflows using orchestration tools (Airflow, dbt, Azure Data Factory, etc.).
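Orchestration tools such as Airflow run tasks in dependency order; the core idea can be sketched with the standard-library `graphlib` (Python 3.9+) using a hypothetical extract → transform → load DAG:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: transform depends on extract, load on transform.
dag = {"transform": {"extract"}, "load": {"transform"}}

# An orchestrator resolves the DAG into a valid execution order.
run_order = list(TopologicalSorter(dag).static_order())

executed = []
tasks = {
    "extract": lambda: executed.append("extract"),
    "transform": lambda: executed.append("transform"),
    "load": lambda: executed.append("load"),
}
for name in run_order:
    tasks[name]()  # a real orchestrator would also retry and log each task
```

Airflow, dbt, and Azure Data Factory add scheduling, retries, and monitoring on top of this same dependency-resolution idea.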
Documentation & Best Practices
o Maintain documentation of pipelines, schemas, and data dictionaries.
o Follow coding standards, version control (Git), and CI/CD practices.
o Ensure compliance with Responsible AI and data security guidelines.
Required Qualifications
o Bachelor's degree in Computer Science, Data Engineering, IT, or a related field.
o –5 years of experience in data engineering, data integration, or ETL development.
o Proficiency in Python, SQL (and optionally Scala/Java).
o Hands-on experience with big data and processing tools (e.g., Spark, Kafka/EventHub, Hadoop, dbt, Airflow).
o Experience with relational and non-relational databases (e.g., Postgres, SQL Server, MongoDB, Cassandra).
o Familiarity with cloud data platforms (Azure Data Factory, AWS Glue/Redshift, GCP BigQuery).
Data Engineer
Posted today
Job Description
As a Data Engineer at Data Wow, you will work closely with Data Scientists and Data Analysts to create data pipelines. You will get to work with data warehouse technologies as well as data ingestion tools. You will also get to work with Data Scientists to create proof of concept products for clients. You will be part of a team that delivers exciting data-oriented products.
Responsibilities
- Creating, maintaining, and optimizing various data pipelines
- Work with data ingestion tools
- Optimize data warehouse utilization
- Create proof of concept products that integrate with machine learning models
- Researching new ways to improve existing architecture
Requirements
- Expertise in Python; experience with web frameworks (Flask, Django, FastAPI) is a plus
- Working knowledge of data warehousing technologies
- Working knowledge of ETL technologies
- Working knowledge of SQL
- Experience with Google Cloud Platform and AWS
- Working knowledge of Docker
- Experience with Kubernetes
- Experience with Apache Kafka
- Problem-solving attitude
- Understanding of best practices regarding security
- Collaborative team spirit
- Good command of English
Benefits
- Hybrid work
- Health insurance
- Annual health check
- Laptop and other equipment
- Free snacks & drinks
- Weekly massage
- Grab transportation credit
- Education allowance
- Performance bonus
Data Engineer
Posted today
Job Description
· Develop and maintain ETL/ELT processes for extracting, transforming, and loading data from multiple sources, including SQL/NoSQL databases, APIs, CSV/Excel files, ERP systems, and third-party services
· Write scripts and workflows to support data integration and data cleansing processes
· Design, build, and manage efficient data storage systems such as Data Lakes, Data Warehouses, and Cloud Storage
· Monitor and optimize query performance and data flow on a regular basis to ensure fast and reliable data access and storage
· Verify and ensure data quality by conducting ongoing data validation and correction
· Manage data access controls to comply with relevant security standards and data protection regulations, including PDPA and GDPR
· Design, develop, and maintain analytical data structures that support business insights
· Collaborate with Data Analysts, IT teams, and other departments to analyze data requirements and support strategic decision-making through system data extraction
· Create and maintain a comprehensive Data Dictionary and related system documentation
· Research and implement new technologies or modern methodologies to improve data management efficiency
· Contribute to team development strategies and promote best practices in Data Engineering
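The data-validation responsibility above can be sketched as simple rule-based checks; the column names and rules below are hypothetical:

```python
def validate(rows):
    """Split records into valid rows and (index, reason) errors."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        if row.get("email") is None or "@" not in row["email"]:
            errors.append((i, "invalid email"))
        elif row.get("age") is None or not (0 <= row["age"] <= 120):
            errors.append((i, "age out of range"))
        else:
            valid.append(row)
    return valid, errors

rows = [
    {"email": "a@x.com", "age": 30},
    {"email": "bad-address", "age": 25},
    {"email": "c@y.com", "age": 300},
]
valid, errors = validate(rows)
```

In practice these checks would run continuously inside the pipeline, with failed rows quarantined and reported rather than silently dropped.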
Qualifications
· Bachelor's degree or higher in Computer Engineering, Computer Science, Data Engineering, or a related field
· Minimum of 3 years of experience in Data Engineering, preferably in a senior or lead-level role
· Proficiency in working with database systems and tools within the Microsoft SQL Server family
· Strong skills in SQL and programming languages such as Python
· Hands-on experience with ETL/ELT tools such as Apache Airflow, dbt, or equivalent platforms
· Experience with cloud platforms such as Google Cloud Platform (GCP) or Microsoft Azure, including data pipeline tools like BigQuery, Redshift, Dataflow, or Glue
· Solid understanding of Data Warehouse design principles and Data Modeling techniques
· Familiarity with Power BI and related business intelligence tools
· Knowledge of Data Governance, Data Security, and Privacy Compliance standards
Preferred Skills
· Strong understanding and experience with Power BI
· Experience with data orchestration tools or real-time data processing frameworks
· Excellent communication, teamwork, and systematic technical problem-solving skills
· Ability to plan, manage, and lead projects effectively with demonstrated leadership capabilities
Data Engineer
Posted today
Job Description
Key Responsibilities:
- Build and maintain data pipelines using SQL, Python, and Databricks.
- Optimize workflows for data transformation and automation.
- Collaborate with teams to ensure reliable, high-quality data.
- Troubleshoot and resolve issues in data workflows.
- Document processes and support data integration projects.
Qualifications:
- Proficiency in SQL and Python.
- Hands-on experience with Databricks and ETL processes.
- Familiarity with cloud platforms like Azure or AWS is a plus.
- Strong problem-solving and collaboration skills.
Data Engineer
Posted today
Job Description
We are looking for:
- Data Engineer - Information Technology
Location: Amata City Chonburi (100% onsite)
Our Team
This position is under the IT Application Department, based at the factory, and coordinates with operations users.
What You Will Do: Job Details
- Design and implement scalable and robust data pipelines to support analytics and data processing needs.
- Develop and maintain database architectures, including data lakes and data warehouses.
- Ensure data quality and consistency through data cleaning, transformation, and validation processes.
- Collaborate with data scientists and analysts to gather requirements and deliver data solutions that support business objectives.
- Optimize data retrieval and develop dashboards and reports for various user needs.
Qualification
- Bachelor's degree in Computer Science, Engineering, Business Management (IT), or a related field
- Fresh graduates with the right skill set, or with internship experience in a related field, are also welcome.
- Proven experience with SQL and database management systems.
- Proficiency in programming languages such as Python, Java, or Scala
- Familiarity with data modeling and ETL processes
- Good command of English, both written and spoken
Data Engineer
Posted today
Job Description
- Work very closely with data analysts, SREs, and developers to develop data models and pipelines for research and reporting
- Share hands-on experience and technical skills with other team members
- Initiate ideas and lead the team to solve problems together
- Create technical documentation for the implemented data infrastructure/software, such as API documents, code structure design, architecture design, etc.
- Write code for data gathering, ETL, and specific areas that cannot be solved with the current software setup
- Improve/maintain/refactor/redesign the code based on the scale of collected data
- Set up and maintain data pipelines, ETL, data lakes, data warehouses, and data visualization
- Understand the objectives of the data analyst/business teams and build solutions or research new technologies to support them
- Monitor and troubleshoot data pipeline issues
- Optimize the data pipeline for performance and scalability
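The monitoring and troubleshooting duties above often come down to retrying transient failures; a minimal retry wrapper might look like this (the flaky extract task is hypothetical):

```python
import time

def run_with_retry(task, retries=3, delay=0.0):
    """Run a flaky pipeline step, retrying on failure and recording attempts."""
    attempts = []
    for attempt in range(1, retries + 1):
        try:
            result = task()
            attempts.append((attempt, "ok"))
            return result, attempts
        except Exception as exc:
            attempts.append((attempt, f"failed: {exc}"))
            time.sleep(delay)  # back off before the next attempt
    raise RuntimeError(f"task failed after {retries} attempts")

calls = {"n": 0}
def flaky_extract():
    """Simulated source that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return "rows"

result, attempts = run_with_retry(flaky_extract)
```

Orchestrators such as Airflow provide this behavior as built-in task retry settings; the sketch just shows the underlying pattern.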
Qualifications:
- Degree in, or experience with, a technology/data engineering field
- At least 2 years of experience in data engineering or a related field
Experience in at least three of the following languages/technologies:
- Python
- Scala
- Java
- R
- SQL
- Docker/Kubernetes
Experience in setting up or working with at least three of the following data technologies:
- Apache Spark
- Apache Hadoop
- Delta Lake
- PrestoDB
- Superset
- Elasticsearch/Kibana
- Airflow
- Cloud-based data technology (AWS/GCP/Azure)
- Experience with data warehousing and ETL processes is a must
- Understand the fundamentals of programming languages
- Understand service deployment pipeline or CI/CD
- Self-motivated and able to work both independently and in collaboration with other technologists at all levels
- Understand data architecture design
- Understand OOP programming and SOLID principles
- Basic understanding of networking between services and the protocols at each layer
- Some experience with the various technologies used in setting up data pipelines, ETL, data lakes, data warehouses, and data visualization