292 Cloud Data Engineer jobs in Malaysia
ETL Cloud Data Engineer
Mission Consultancy Services Sdn. Bhd.
Posted 11 days ago
Job Description
Job Summary:
We are looking for a highly skilled Python / ETL Data Engineer to design, develop, and optimize data pipelines using AWS Glue, AWS Athena, AWS Data Pipeline, and other ETL tools. The ideal candidate will have strong expertise in Python, SQL, AWS, and data pipeline management, ensuring smooth data extraction, transformation, and loading (ETL) processes.
Key Responsibilities:
- Develop and maintain data pipelines using AWS Glue, AWS Athena, AWS Data Pipeline, and other ETL tools.
- Implement monitoring and early detection processes for pipeline issues such as missing or lagging data.
- Extract, transform, and load (ETL) data from various sources, including relational databases, non-relational databases, and flat files.
- Optimize and transform data to meet business requirements and load it into target data stores.
- Troubleshoot and optimize performance of data pipelines for scalability and efficiency.
- Collaborate with Data Scientists and Business Analysts to understand data requirements and implement solutions.
- Build and maintain REST APIs for data access and integration with external systems.
- Ensure data security and compliance best practices across all pipelines.
- Document and test ETL processes to maintain high-quality data flows.
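The extract-transform-load cycle these responsibilities describe can be sketched in a few lines of plain Python. This is a minimal illustration only, using the standard library with SQLite standing in for the target data store; the table, columns, and file contents are invented for the example:

```python
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from a flat-file source, transform them, and load into SQLite."""
    # Extract: parse the flat-file source.
    rows = csv.DictReader(io.StringIO(csv_text))

    # Transform: normalise names and cast amounts, skipping malformed records.
    cleaned = []
    for r in rows:
        try:
            cleaned.append((r["name"].strip().title(), float(r["amount"])))
        except (KeyError, ValueError):
            continue  # a real pipeline would route these to a dead-letter store

    # Load: write the cleaned rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

source = "name,amount\n alice ,10.5\nBOB,oops\ncarol,3\n"
conn = sqlite3.connect(":memory:")
loaded = run_etl(source, conn)  # the row with a non-numeric amount is dropped
```

In a production AWS Glue job the same three stages appear, but extraction reads from the Glue Data Catalog and loading writes partitioned Parquet to S3 rather than a local database.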
Required Skills:
- Strong Python programming skills for data processing and ETL automation.
- Proficiency in SQL and database design (relational and non-relational databases).
- In-depth knowledge of AWS services, including AWS Glue, Redshift, S3, and EMR.
- Experience in building ETL pipelines on cloud-based environments (preferably AWS).
- Hands-on experience with REST API development and integration.
- Familiarity with data warehousing concepts and best practices.
- Understanding of data modeling, API testing, and troubleshooting.
- Experience with data governance, data security, and compliance frameworks.
- Exposure to big data technologies like Spark, Hadoop, or Kafka.
- Knowledge of containerization (Docker, Kubernetes) and DevOps practices.
Qualifications:
- Minimum 5 years of experience as a Data Engineer with ETL cloud experience, preferably AWS.
- AWS or Azure certification preferred.
- Strong Python programming skills and experience building APIs.
- Malaysian citizen or permanent resident.
- Oil & Gas industry experience is a plus.
- Degree in Computer Science, Engineering, or equivalent.
- Seniority level Mid-Senior level
- Employment type Full-time
- Job function Information Technology
- Industries IT Services and IT Consulting
Senior Cloud Engineer – AWS & Data Lake DBA
Posted 11 days ago
Job Description
Join to apply for the Senior Cloud Engineer – AWS & Data Lake DBA role at Bitdeer (NASDAQ: BTDR)
About Bitdeer
Bitdeer Technologies Group (Nasdaq: BTDR) is a world-leading technology company for Bitcoin mining. Bitdeer is committed to providing comprehensive computing solutions for its customers. The Company handles complex processes involved in computing such as equipment procurement, transport logistics, datacenter design and construction, equipment management, and daily operations. The Company also offers advanced cloud capabilities to customers with high demand for artificial intelligence. Headquartered in Singapore, Bitdeer has deployed datacenters in the United States, Norway, and Bhutan.
What You Will Be Responsible For
- Manage and maintain cloud infrastructure (AWS), focusing on reliability, performance, and cost-efficiency.
- Build and maintain scalable, secure, and cost-effective Data Lake architecture using services such as Amazon S3, AWS Glue, Lake Formation, and Athena.
- Administer large-scale database platforms (RDS, Redshift, EMR, DynamoDB) and perform backup/recovery, tuning, and upgrades.
- Automate infrastructure provisioning and operations using Infrastructure-as-Code (IaC) tools such as Terraform or AWS CloudFormation.
- Design and operate ETL/ELT pipelines for ingesting data into the data lake.
- Ensure data quality, security, access control, and compliance for all lake-stored data.
- Ensure data platforms are highly available, backed up, and meet disaster recovery requirements.
- Implement and maintain monitoring, alerting, and logging solutions for infrastructure, pipelines and data workloads.
- Participate in on-call rotation for cloud operations and database issue escalation.
- Work cross-functionally with data engineers, developers, and product teams to meet analytical and operational data needs.
- Collaborate with security teams to ensure compliance and best practices are followed for data protection and cloud resource access.
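One practical detail behind the Data Lake responsibilities above: S3, Glue, and Athena rely on Hive-style `key=value` prefixes in object keys so the catalog can prune partitions at query time. A small sketch of building such a key; the bucket, table, and file names are hypothetical:

```python
from datetime import date

def partition_key(bucket: str, table: str, day: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 object key, as Glue and Athena
    expect (year=/month=/day= prefixes enable partition pruning)."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={day.year}/month={day.month:02d}/day={day.day:02d}/{filename}"
    )

key = partition_key("analytics-lake", "orders", date(2024, 7, 3), "part-0000.parquet")
```

Zero-padding the month and day keeps lexicographic key order aligned with chronological order, which matters for listing and lifecycle rules.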
How You Will Stand Out
- Bachelor's degree or above in Computer Science, Information Systems, or a related field.
- Over 5 years of experience in cloud operations, with a focus on AWS.
- In-depth knowledge of AWS services including S3, Glue, Lake Formation, Redshift, Athena, EMR, IAM, and CloudWatch.
- Solid hands-on experience with designing and managing Data Lake and big data analytics platforms.
- Strong experience in managing relational and NoSQL databases (e.g., PostgreSQL, MySQL, DynamoDB).
- Strong scripting and automation skills in Python, Bash, or Go.
- Familiar with tools such as Terraform, Ansible, Jenkins, Git, etc.
- Experienced in setting up monitoring and cost optimization for AWS-based infrastructure.
- Understanding of security, encryption, data governance, and compliance best practices for public cloud data platforms.
Preferred Qualifications
- AWS Certified Solutions Architect – Associate/Professional.
- AWS Certified Data Analytics – Specialty or Database – Specialty.
- Experience with modern data stack tools (e.g., Apache Iceberg, Delta Lake, dbt) is a plus.
- Kubernetes or containerization certification is preferred.
What You Will Experience Working With Us
- A culture that values authenticity and diversity of thoughts and backgrounds;
- An inclusive and respectable environment with open workspaces and exciting start-up spirit;
- Fast-growing company with the chance to network with industrial pioneers and enthusiasts;
- Ability to contribute directly and make an impact on the future of the digital asset industry;
- Involvement in new projects, developing processes/systems;
- Personal accountability, autonomy, fast growth, and learning opportunities;
- Attractive welfare benefits and developmental opportunities such as training and mentoring.
Bitdeer is committed to providing equal employment opportunities in accordance with country, state, and local laws. Bitdeer does not discriminate against employees or applicants based on conditions such as race, colour, gender identity and/or expression, sexual orientation, marital and/or parental status, religion, political opinion, nationality, ethnic background or social origin, social status, disability, age, indigenous status, and union membership.
- Seniority level Mid-Senior level
- Employment type Full-time
- Job function Engineering and Information Technology
- Industries Software Development
Data Engineer
Posted today
Job Description
Kasagi Labo is building the definitive heartbeat of anime and manga culture worldwide. We create vibrant digital platforms and global communities that connect fans with creators and rights-holders in authentic, scalable ways. Our mission is bold, and our team thrives on experimentation, ownership, and a winning spirit.
Role Overview
We’re hiring a Data/AI Engineer with strong foundations in data engineering who’s excited to step into the world of AI. You’ll design scalable data systems and collaborate with data scientists to bring machine learning applications to life. This is an ideal role for someone eager to evolve from traditional data work into AI/ML-powered innovation.
Your Responsibilities
- Build and maintain scalable data pipelines for structured and unstructured data
- Develop infrastructure to support analytics, AI, and ML applications
- Partner with software engineers to integrate and deploy models into production
- Ensure data governance, quality, and security standards are upheld
- Explore AI frameworks and tools to test and implement new ideas
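A recurring task behind the "structured and unstructured data" bullet is normalising nested JSON into flat, tabular rows that analytics and ML systems can consume. A minimal sketch; the field names are invented for illustration:

```python
import json

def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten nested JSON into dot-separated columns suitable for a warehouse row."""
    out = {}
    for key, value in record.items():
        column = f"{prefix}{key}"
        if isinstance(value, dict):
            # Recurse into nested objects, extending the column prefix.
            out.update(flatten(value, prefix=f"{column}."))
        else:
            out[column] = value
    return out

raw = '{"user": {"id": 7, "tags": "anime"}, "score": 0.9}'
row = flatten(json.loads(raw))
```

The same idea underlies schema-on-read tooling: unstructured payloads stay raw at ingestion, and a deterministic flattening step produces the structured view downstream.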
Who You Are
- Proficient in Python and SQL, with experience in distributed data systems (e.g., Spark, Hadoop)
- Expert in both relational and NoSQL databases (PostgreSQL, Firebase, MongoDB)
- Hands-on with cloud data platforms (BigQuery, Redshift, Snowflake, or similar)
- Familiar with (or eager to learn) ML/AI frameworks such as TensorFlow or PyTorch
- Curious, adaptable, and motivated to grow your expertise in data engineering and AI
- Bonus: Experience deploying fine-tuned AI or ML models into production
Culture
At Kasagi Labo, our culture is built around shared values:
- Move with Purpose – We act with clarity and decisiveness.
- Carve Your Own Path – Initiative and ownership drive our work.
- Say It Like It Is – Honesty and feedback shape how we grow.
- We Play to Win – We bring bold energy and succeed as a team.
- Seniority level Mid-Senior level
- Employment type Full-time
- Industries Entertainment Providers
Data Engineer
Posted today
Job Description
CloudMile Federal Territory of Kuala Lumpur, Malaysia
We are looking for a Data Engineer to help build and manage robust data pipelines and platforms that power advanced analytics and AI solutions.
Key Responsibilities
- Develop and maintain batch and streaming data pipelines using GCP services such as BigQuery, Dataflow, Dataproc, Composer, Dataform, and Cloud Functions.
- Load, transform, and optimize data in BigQuery for analytics and reporting.
- Integrate data from multiple sources including APIs, databases, and files.
- Assist in data migration from legacy systems such as Oracle and MicroStrategy.
- Ensure data quality, governance, and security compliance.
- Collaborate with analysts and business teams to support reporting needs.
Requirements
- 3–5 years’ experience in data engineering or ETL development.
- Hands-on experience with GCP Data Stack (BigQuery, Dataflow, Composer, Dataproc).
- Solid SQL and Python skills.
- Familiarity with Azure Data Stack is a plus.
- Understanding of data modelling concepts and performance optimization.
- Willingness to learn and work on large-scale migration projects.
- Seniority level Associate
- Employment type Contract
- Job function Information Technology
- Industries IT Services and IT Consulting
Referrals increase your chances of interviewing at CloudMile by 2x
Get notified about new Data Engineer jobs in Federal Territory of Kuala Lumpur, Malaysia .
Data Analyst - SG Business Intelligence (Open for fresh graduates) Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia 2 weeks ago
Junior Data Engineer (Recent graduates are encouraged to apply) Federal Territory of Kuala Lumpur, Malaysia 1 week ago
Data Engineer / Python Developer - Data Lake Federal Territory of Kuala Lumpur, Malaysia 2 days ago
Data Engineer
Posted 1 day ago
Job Viewed
Job Description
Brooks is a leading provider of automation solutions with over 40 years of experience in the semiconductor industry, offering precision robotics, integrated automation systems, and contamination control solutions that empower chip manufacturers worldwide. Our product portfolio includes robots, vacuum systems, and atmospheric robotic solutions for semiconductor manufacturing.
Are you looking for a place where you can be part of a transformation? Join us at Brooks Automation and be a part of a dynamic organization that is shaping the future of technology.
Data Engineer
Job Description
- Proven analytical and problem-solving skills.
- Proficient with SQL as a tool to solve problems, analyze data, create ad hoc reports, and test.
- Nice to have: knowledge of Oracle ERP Order Management.
- Hands-on experience with ETL tools such as SSIS or Informatica Cloud.
- Strong oral and written communication skills.
- Proficient with BI tools such as Power BI or Tableau.
- Ability to work in a team environment.
- Proficient in prioritizing, multitasking, and managing multiple projects.
- Proven independent worker with strong organizational and time-management skills.
- Flexible and able to adapt to change.
What you’ll bring:
- 5 to 10 years' experience as a Data Engineer
- Bachelor of Science in Computer Engineering or Information Systems
Work Location & Flexibility
At Brooks, we aim to foster a collaborative and engaging environment while offering flexibility where possible. Work arrangements may include a mix of in-office and remote work, depending on the nature of the role and business needs. Specific expectations will be shared during the interview process.
Brooks is committed to fostering a diverse and inclusive workplace and proudly serves as an equal-opportunity employer. We welcome all qualified applicants regardless of race, color, religion, gender identity or expression, sexual orientation, national origin, genetics, disability, age, veteran status, or any other legally protected characteristics.
Diversity enhances our innovative capabilities and strengthens our ability to serve our customers and communities effectively. At Brooks Automation, we celebrate the unique experiences and perspectives each individual brings, believing they are essential to our collective success. Join us in building a workplace where every team member is valued and can thrive.
For applicants with disabilities requiring accommodations, please get in touch with us to discuss your needs.
Data Engineer
Posted 3 days ago
Job Viewed
Job Description
MoneyLion is a leader in financial technology powering the next generation of personalized products and content, with a top consumer finance super app, a premier embedded finance platform for enterprise businesses and a world-class media arm. MoneyLion’s mission is to give everyone the power to make their best financial decisions. We pride ourselves on serving the many, not the few; providing confidence through guidance, choice, personalization; and shortening the distance to an informed action.
In our go-to money app for consumers, we deliver curated content on finance and related topics, through a tailored feed that engages people to learn and share. People take control of their finances with our innovative financial products and marketplace - including our full-fledged suite of features to save, borrow, spend, and invest - seamlessly bringing together the best offers and content from MoneyLion and our 1,100+ Enterprise Partner network, together in one experience. MoneyLion’s enterprise technology provides the definitive search engine and marketplace for financial products, enabling any company to add embedded finance to their business, with advanced AI-backed data and tools through our platform and API. Established in 2013, MoneyLion connects millions of people with the financial products and content they need, when and where they need it.
About the Role
The Kuala Lumpur office is the technology powerhouse of MoneyLion. We pride ourselves on innovative initiatives and thrive in a fast paced and challenging environment. Join our multicultural team of visionaries and industry rebels in disrupting the traditional finance industry!
MoneyLion is building a world-class data ecosystem that will dramatically improve the ability of internal data users and analysts to create products and experiences that captivate our customers. As the Data Engineer, you will participate in technical and architectural planning as well as contribute as a developer, interacting with data scientists, software engineers, and analysts as we build and scale the world's most rewarding program for financial wellness.
Key Responsibilities
- Assist with reviewing and interpreting business and technical requirements with a focus on data objectives of the product
- Identify requirements and develop data delivery solutions
- Implement company policies on data access and data distribution
- Support members of various teams to determine database structural requirements by analyzing their applications and client operations
- Collaborate with highly talented engineers while striving to help them build their skills and achieve their career goals
- Support data flow and ETL processes from end to end for projects and initiatives including troubleshooting of data issues, integrity checks, data model design, querying of data, and performance resulting from data issues
- Assist in training employees on the data systems used by the company, specifically running training programs for new hires on the reports and models available in our data visualization tools
- Analyze large, sometimes messy data sets to solve real-world business problems, while charting paths toward data cleanup and preventing future data-quality issues
About You
- Familiarity with financial and banking products
- Demonstrated skills in good software engineering practices
- Building and shipping large-scale engineering products and/or infrastructure
- Scalable data architecture, fault-tolerant ETL, and monitoring data quality in the cloud (We use AWS)
- Collaborating with a diverse set of engineers, data scientists and product managers
- Deep expertise in data engineering and data processing at scale, with a focus on building pipelines that create data lakes for exploration and enable machine learning model training, deployment, and scoring at massive scale
- Familiarity with data science techniques and frameworks
- NoSQL, Relational databases and Presto (We use MongoDB, MySQL, PostgreSQL, ElasticSearch)
- The AWS stack combined with technologies such as Java, Python, Spark, and Kafka
- Familiarity with Apache Airflow or equivalent workflow management tools
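To give a flavor of the fault-tolerant ETL and data-quality monitoring work described above, here is a minimal, purely illustrative extract-transform-load sketch. It uses an in-memory SQLite database as a stand-in target store; the table, column, and record values are hypothetical and do not come from this posting.

```python
import sqlite3

def extract():
    # Stand-in for reading from a source system (e.g. an API, S3, or a queue).
    return [
        {"user_id": 1, "amount": "120.50"},
        {"user_id": 2, "amount": "75.00"},
        {"user_id": None, "amount": "10.00"},  # bad record: missing key
    ]

def transform(rows):
    # Cast types and route records that fail the quality gate to a reject list
    # instead of letting one bad row fail the whole batch.
    clean, rejected = [], []
    for row in rows:
        if row["user_id"] is None:
            rejected.append(row)
            continue
        clean.append((row["user_id"], float(row["amount"])))
    return clean, rejected

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS txns (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO txns VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean, rejected = transform(extract())
load(clean, conn)
loaded = conn.execute("SELECT COUNT(*) FROM txns").fetchone()[0]
print(f"loaded={loaded}, rejected={len(rejected)}")
```

In a production pipeline the same extract/transform/load steps would typically run as separate tasks in a workflow manager such as Apache Airflow, with the reject count feeding an alerting threshold.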
What's Next?
After you submit your application, you can expect the following steps in the recruitment process:
- Online Preliminary HackerRank test
- Interview - Talent Acquisition Team (Virtual or face-to-face)
- Take-home Assessment - To be discussed in the technical round
- Interview - Hiring Manager (Virtual or face-to-face)
We value growth-minded and collaborative people with high learning agility who embody our core values of teamwork, customer-first, and innovation. Every member of the MoneyLion Team is passionate about fintech and ready to give 100% in helping us achieve our mission.
Working At MoneyLion
At MoneyLion, we want you to be well and thrive. Our generous benefits package includes:
- Competitive salary packages
- Wellness perks
- Paid parental leave
- Generous Paid Time Off
- Learning and Development resources
MoneyLion is committed to equal employment opportunities for all employees. Inside our company, every decision we make regarding our employees is based on merit, competence, and performance, completely free of discrimination. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. Within that team, no one will feel more “other” than anyone else. We realize the full promise of diversity and want you to bring your whole self to work every single day.