862 Big Data jobs in Malaysia
Big Data Engineer
Posted 4 days ago
Job Description
Position Summary
We are looking for a full-time Big Data Engineer II to join our team at Shirlyn Technology in Petaling Jaya, Malaysia and Palo Alto, California. The Big Data team is at the heart of our operations, playing a critical role in scaling and optimizing our data infrastructure. As we continue to grow, we are seeking an experienced and driven Big Data Engineer to help us tackle complex data challenges, enhance data solutions, and ensure the security and quality of our data systems. You will be instrumental in building and deploying scalable data pipelines to ensure seamless data flow across systems, driving business success through reliable data solutions.
Job Responsibilities
- Collaborate with global teams in data, security, infrastructure, and business functions to understand data requirements and provide scalable solutions.
- Design, implement, and maintain efficient ETL (Extract, Transform, Load) pipelines to enable smooth data flow between systems.
- Conduct regular data quality checks to ensure accuracy, consistency, and completeness within our data pipelines (see the sketch after this list).
- Continuously improve data pipeline performance and reliability, identifying and addressing any inefficiencies or bottlenecks.
- Ensure data integrity, security, privacy, and availability across all systems.
- Proactively monitor data pipelines, quickly diagnosing issues and taking action to resolve them and maintain system uptime.
- Conduct root cause analysis of data-related issues and work on long-term solutions to prevent recurring problems.
- Document pipeline designs, data processes, and troubleshooting procedures, keeping stakeholders informed with clear communication of updates.
- Provide on-call support for critical data operations, ensuring systems are running 24/7, with rotational responsibilities.
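As an illustration of the data quality checks mentioned above, the following is a minimal PySpark sketch; the table and column names (orders, order_id, amount) are hypothetical, and a real pipeline would feed the results into project-specific alerting or job-failure logic.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").enableHiveSupport().getOrCreate()

df = spark.table("orders")  # hypothetical source table

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()          # completeness
duplicate_keys = total - df.dropDuplicates(["order_id"]).count()    # uniqueness
negative_amounts = df.filter(F.col("amount") < 0).count()           # validity

for name, value in [("null_keys", null_keys),
                    ("duplicate_keys", duplicate_keys),
                    ("negative_amounts", negative_amounts)]:
    if value > 0:
        # in production this would raise an alert or fail the job run
        print(f"DQ check failed: {name}={value} of {total} rows")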
Job Requirements
- Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- 5+ years of hands-on experience building and optimizing data pipelines using big data technologies (Hive, Presto, Spark, Flink).
- Expertise in writing complex SQL queries and proficiency in Python programming, focusing on maintainable and clean code.
- Solid knowledge of scripting languages (e.g., Shell, Perl).
- Experience working in Unix/Linux environments.
- Familiarity with cloud platforms such as Azure, AWS, or GCP.
- High personal integrity and professionalism in handling confidential information, exercising sound judgment.
- Ability to remain calm and resolve issues swiftly during high-pressure situations, adhering to SLAs.
- Strong leadership skills, including the ability to guide junior team members and lead projects across the team.
- Excellent verbal and written communication skills, with the ability to collaborate effectively with remote teams globally.
Big Data Hadoop Developer
Posted 2 days ago
Job Description
Job Summary:
We are looking for a Big Data Hadoop Developer to design, develop, and maintain large-scale data processing solutions. The ideal candidate should have strong hands-on experience with the Hadoop ecosystem and integration with relational databases such as MariaDB or Oracle DB for analytics and reporting.
Key Responsibilities:
- Design, develop, and optimize Hadoop-based big data solutions for batch and real-time data processing.
- Work with data ingestion frameworks to integrate data from MariaDB/Oracle DB into Hadoop (Sqoop, Apache NiFi, Kafka); see the ingestion sketch after this list.
- Implement Hive, Spark, and MapReduce jobs for data transformation and analytics.
- Optimize Hive queries, Spark jobs, and HDFS usage for performance and cost efficiency.
- Create and maintain ETL pipelines for structured and unstructured data.
- Troubleshoot and resolve issues in Hadoop jobs and database connectivity.
- Collaborate with BI, analytics, and data science teams for data provisioning.
- Ensure data security, governance, and compliance in all solutions.
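As a hedged sketch of the RDBMS-to-Hadoop ingestion described above: the posting names Sqoop, Apache NiFi and Kafka, but the same pattern can be illustrated with Spark's built-in JDBC reader, which is what this example uses instead. The connection details, table names and partition column are hypothetical, and the MariaDB JDBC driver is assumed to be on the classpath.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mariadb_ingest").enableHiveSupport().getOrCreate()

src = (spark.read.format("jdbc")
       .option("url", "jdbc:mariadb://db-host:3306/sales")    # hypothetical connection
       .option("driver", "org.mariadb.jdbc.Driver")
       .option("dbtable", "transactions")
       .option("user", "etl_user")
       .option("password", "***")
       .option("partitionColumn", "txn_id")                   # parallelise the read
       .option("lowerBound", 1)
       .option("upperBound", 10000000)
       .option("numPartitions", 8)
       .load())

# Land the rows in a partitioned Hive table for downstream Hive/Spark analytics.
spark.sql("CREATE DATABASE IF NOT EXISTS raw")
(src.write.mode("append")
    .partitionBy("txn_date")   # assumes a txn_date column exists in the source
    .saveAsTable("raw.transactions"))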
Technical Skills:
- Big Data Ecosystem: Hadoop (HDFS, YARN), Hive, Spark, Sqoop, MapReduce, Oozie, Flume.
- Databases: MariaDB and/or Oracle DB (SQL, PL/SQL).
- Programming: Java, Scala, or Python for Spark/MapReduce development.
- Data Ingestion: Sqoop, Kafka, NiFi (for integrating RDBMS with Hadoop).
- Query Optimization: Hive tuning, partitioning, bucketing, indexing (see the DDL sketch after this list).
- Tools: Ambari, Cloudera Manager, Git, Jenkins.
- OS & Scripting: Linux/Unix shell scripting.
Soft Skills:
- Strong analytical skills and problem-solving abilities.
- Good communication skills for working with cross-functional teams.
- Ability to manage priorities in a fast-paced environment.
Nice to Have:
- Experience with cloud-based big data platforms (AWS EMR, Azure HDInsight, GCP Dataproc).
- Knowledge of NoSQL databases (HBase, Cassandra).
- Exposure to machine learning integration with Hadoop/Spark.
Senior Big Data Engineer
Posted 4 days ago
Job Description
Company: Web3 & Blockchain-focused Company
Job Advantages
- Cutting-edge technology (Web3 and blockchain)
- Globalized company
Job Responsibilities
As a Senior Java Engineer, you will play a pivotal role in the development and optimization of the company’s data infrastructure, ensuring it supports the evolving needs of our blockchain-focused business. Your primary responsibilities will include:
- Architectural Design & Development: Lead the design and development of the company’s data center, ensuring robust performance and scalability.
- Data Platform Optimization: Provide ongoing solutions for platform architecture and performance optimization to support the company’s rapid growth and business needs.
- Business Enablement: Drive data-driven solutions that accelerate business development and continuously enhance the company's core competitiveness.
- Big Data Iteration: Lead the rapid iteration of big data platforms, ensuring efficiency, cost-effectiveness, and high-quality output.
Job Requirements
We are seeking candidates who are passionate about blockchain technology and possess a strong technical foundation. Ideal candidates will meet the following criteria:
- Education: Bachelor’s degree or above in Computer Science or related majors.
- Experience: More than 3 years of experience in Java development, with a deep understanding of JVM principles and Java development best practices.
- Frameworks: Proficient in frameworks such as Spring, Spring MVC, gRPC, and MyBatis, with the ability to understand their underlying principles and mechanisms.
- Core Computer Fundamentals: Strong knowledge of computer operating systems, network architecture, and proficiency in commonly used algorithms, data structures, and design patterns.
- Distributed Systems: Familiarity with distributed systems, caching mechanisms (Redis), messaging systems (Kafka), and big data processing frameworks like Spark, Flink, and ZooKeeper. Experience with TiDB is a plus.
- Experience in Microservices: Experience in the design and development of data centers or microservices, with an emphasis on high availability and scalability.
- Tagging Systems/Recommendation Systems: Experience with tagging systems or algorithmic recommendation systems is highly desirable.
- Passion for Blockchain: A strong enthusiasm for the blockchain industry and a commitment to contributing to its growth and development.
Why Join Us?
This role offers the opportunity to work at the forefront of blockchain and Web3 technologies within a global company. You will have the chance to develop and optimize critical infrastructure that powers innovative and scalable solutions in the blockchain space. If you’re ready to work in a fast-paced and cutting-edge environment, this role could be the perfect fit for you.
Apply now to be part of an exciting journey in revolutionizing the Web3 ecosystem!
Senior Big Data Engineers
Posted 7 days ago
Job Description
Senior Big Data Engineer at RAPSYS TECHNOLOGIES PTE LTD. We are seeking an experienced Senior Big Data Engineer to design, develop, and maintain large-scale data processing systems. The ideal candidate will have expertise in big data technologies, data architecture, and analytics to drive data-driven insights and support business objectives.
Location: Kuala Lumpur, Malaysia
Work Mode: Work From Office
Role: Senior Big Data Engineer
Responsibilities
- Design and evolve the overall data architecture, ensuring scalability, flexibility, and compliance with enterprise standards.
- Build efficient, secure, and reliable data pipelines using the Bronze-Silver-Gold architecture within EDL (see the sketch after this list).
- Develop and orchestrate scheduled jobs in the EDL environment to support continuous ingestion and transformation.
- Implement Apache Iceberg for data versioning, governance, and optimization.
- Leverage the Medallion framework to standardize data product maturity and delivery.
- Govern metadata, data lineage, and business glossary using tools like Apache Atlas.
- Ensure data security, privacy, and regulatory compliance across all data processes.
- Support Data Mesh principles by collaborating with domain teams to design and implement reusable Data Products.
- Integrate data across structured, semi-structured, and unstructured sources from enterprise systems such as ODS and CRM systems.
- Drive adoption of DataOps/MLOps best practices and mentor peers across units.
- Generate and manage large-scale batch files using Spark and Hive for high-volume data processing.
- Design and implement document-based data models and transform relational models into NoSQL document-oriented structures.
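As a sketch of the Bronze-Silver-Gold flow referenced above, with the curated layer stored as an Apache Iceberg table: the catalog name (edl), warehouse path, and the events schema (event_id, event_ts) are assumptions, and the Iceberg Spark runtime plus the real EDL catalog configuration would come from the platform.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder.appName("bronze_to_silver")
         # assumes the Iceberg runtime is available and "edl" is the catalog name
         .config("spark.sql.catalog.edl", "org.apache.iceberg.spark.SparkCatalog")
         .config("spark.sql.catalog.edl.type", "hadoop")
         .config("spark.sql.catalog.edl.warehouse", "hdfs:///warehouse/edl")
         .getOrCreate())

bronze = spark.table("edl.bronze.events")          # raw, as-ingested layer

silver = (bronze
          .dropDuplicates(["event_id"])            # de-duplicate
          .filter(F.col("event_ts").isNotNull()))  # basic quality gate

(silver.writeTo("edl.silver.events")               # curated, query-ready layer
       .using("iceberg")
       .partitionedBy(F.days("event_ts"))          # Iceberg hidden partitioning by day
       .createOrReplace())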
Qualifications
- Bachelor’s, Master’s, or PhD in Computer Science, Data Engineering, or a related discipline.
- 5–7 years of experience in data engineering and distributed data systems.
- Strong hands-on experience with Apache Hive, HBase, Kafka, Solr, Elasticsearch.
- Proficient in data architecture, data modelling, and pipeline scheduling/orchestration.
- Operational experience with Data Mesh, Data Product development, and hybrid cloud data platforms.
- Familiarity with CRM systems and data sourcing/mapping strategies.
- Proficient in managing metadata, glossary, and lineage tools like Apache Atlas.
- Proven experience in generating large-scale batch files using Spark and Hive.
- Strong understanding of document-based data models and the transformation of relational schemas into document-oriented structures (see the sketch after this list).
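As a small sketch of the relational-to-document transformation described above, in plain Python: the customer/order fields are hypothetical, and a real implementation would write the documents through the chosen NoSQL store's client API.

from collections import defaultdict

rows = [  # flattened result of a customer-orders join
    {"customer_id": 1, "name": "Aisyah", "order_id": 101, "amount": 250.00},
    {"customer_id": 1, "name": "Aisyah", "order_id": 102, "amount": 80.50},
    {"customer_id": 2, "name": "Ravi", "order_id": 103, "amount": 40.00},
]

docs = defaultdict(lambda: {"orders": []})
for r in rows:
    doc = docs[r["customer_id"]]
    doc["customer_id"] = r["customer_id"]
    doc["name"] = r["name"]
    doc["orders"].append({"order_id": r["order_id"], "amount": r["amount"]})

print(list(docs.values()))  # one nested document per customer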
Additional Technical & Business Competencies
- Expertise in data administration, modelling, mapping, collection, and distribution.
- Strong understanding of business workflows to support metadata governance.
- Hands-on experience with analytics and DWH tools (e.g., SAS, Oracle, MS SQL, Python, R Programming).
- Familiarity with data modelling tools (e.g., ERWIN), and enterprise databases (Oracle, IBM DB2, MS SQL, Hadoop, Object Store).
- Experience working across hybrid cloud environments (e.g., AWS, Azure Data Factory).
- In-depth knowledge of ETL/ELT processes and automation frameworks.
- Analytical thinker with strong problem-solving and communication skills.
- Able to collaborate effectively across technical and business teams.
Assistant Manager - Big Data
Posted 10 days ago
Job Description
Overview
Sunway Digital Hub, Digital Technology Solutions & Group Cybersecurity
Responsibilities
- Design, build, and maintain big data systems with stakeholder requirements in mind.
- Develop data processing solutions using technologies such as Hadoop, Spark, and Kafka and ensure data accuracy, consistency, and integrity.
- Leverage advanced data science techniques (such as predictive modeling, clustering, regression, NLP, etc.) to extract actionable insights and build customer-centric data products (see the clustering sketch after this list).
- Collaborate with cross-functional teams (marketing, engineering, product, etc.) to understand business needs and translate them into technical specifications for models and data solutions.
- Ensure compliance with data privacy and security regulations during development.
- Stay current with industry trends in data science, big data technologies, and marketing technology to inform new approaches and solutions.
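The following is a minimal sketch of the clustering use case mentioned above, using Spark MLlib for a simple customer segmentation; the feature table and columns (marts.customer_rfm with recency, frequency, monetary) are hypothetical.

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("customer_segments").enableHiveSupport().getOrCreate()

customers = spark.table("marts.customer_rfm")   # hypothetical RFM feature table

assembler = VectorAssembler(inputCols=["recency", "frequency", "monetary"],
                            outputCol="features")
features = assembler.transform(customers)

model = KMeans(k=4, seed=42, featuresCol="features").fit(features)
segments = model.transform(features).withColumnRenamed("prediction", "segment")

segments.groupBy("segment").count().show()      # rough view of segment sizes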
Requirements
- Experience working with big data technologies such as Spark (preferred), Hadoop, or Kafka.
- Strong programming skills in Python and SQL, with experience in tools such as Terraform or Docker; PySpark is a plus.
- Experience working with big data analytics platforms such as Huawei Cloud (DLI), Databricks, or Amazon EMR.
- Familiarity with data warehousing concepts and technologies such as DWS (Huawei), Airflow, BigQuery, Redshift, or Snowflake.
- Familiarity with data modelling (supervised and unsupervised training).
- Experience with data visualization tools such as Power BI (preferred), Tableau, or Qlik Sense.
- Proven experience working with databases such as MSSQL (preferred), Postgres, and NoSQL.
- Strong problem-solving skills and attention to detail.
- Experience working in Agile development environments is a plus.
- Experience in CDP / Martech tools / Marketing industry will be a plus.
- Manage and lead a team of junior engineers to deliver tasks.
Benefits
- Leaves: Annual Leave, Medical Leave, Hospitalization Leave, Special Leave.
- Medical Benefits – Sunway Medical Insurance for Outpatient & Inpatient inclusive for dependents.
- Dental and Optical benefits for confirmed executives.
- Group Term Life & Personal Accident Insurance Scheme.
- Salary increment based on individual performance.
- Bonus based on company & individual performance.
- Career Development: Training and certification sponsored by the company, Annual Talent Review, Career Planning.
- Rewards and recognition: Long Service Award.
- Additional Benefits: Staff Discount (ThemePark, Hospitality, Education, Property, Medical, Retail, Food & Beverages), Sports and Recreational, Family Day, Annual Dinner, Flexible Working Arrangement for working mothers.
- Hybrid Working Arrangement, Flexible Working Hours.
- Open communication. Young, energetic and fun working environment.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology and Consulting
- Industries: IT Services and IT Consulting
Are you ready to elevate your skills and experience? Click ‘Apply Now’ to take the first step toward an outstanding career. Our recruitment team will reach out to shortlisted candidates only.
Senior Software Engineer (Big Data)
Posted 3 days ago
Job Description
Overview
EPAM Systems, Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia
Software Engineers at EPAM are the driving force behind strategic initiatives for our clients. As a Senior Software Engineer at EPAM Malaysia, you will use your expertise in Big Data to collaborate with product and engineering teams and combine the functional and technical aspects of Software Development with Big Data technology in the project space of cloud services.
Responsibilities
- Design and implement innovative analytical solutions using Hadoop, NoSQL and other Big Data related technologies, evaluating new features and architecture in cloud / on-premise / hybrid solutions
- Build collaborative partnerships with architects, technical leads and key individuals within other functional groups
- Perform detailed analysis of business problems and technical environments to design quality technical solutions
- Participate in code review and test solutions
Requirements
- At least 6 years of working experience with Big Data technologies and Enterprise Software Development
- Solid skills in enterprise software development, practical experience in infrastructure & bug troubleshooting
- Proven expertise in overseeing and evaluating deliverables produced by team members
- Strong skills in enterprise software development, including infrastructure troubleshooting, incident investigation, performance tuning and root cause analysis
- Hands-on experience in Spark/pandas, Airflow and Python (see the DAG sketch after this list)
- Hands-on cloud experience (preferably AWS) and NoSQL
- Experience with component / integration testing, unit testing, and hands-on experience with GitHub, Kubernetes and Docker
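As a small illustration of the Airflow and Python items above, here is a minimal DAG sketch (assuming Airflow 2.4+); the DAG id, schedule and task callables are placeholders rather than a real pipeline.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")       # placeholder extraction step

def transform():
    print("run Spark / pandas transformations")     # placeholder transformation step

with DAG(
    dag_id="daily_big_data_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task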
We offer
- By choosing EPAM, you're getting a job at one of the most loved workplaces according to Newsweek in 2021, 2022 and 2023.
- Employee ideas are the main driver of our business. We have a very supportive environment where your voice matters
- You will be challenged while working side-by-side with the best talent globally. We work with top-notch technologies, constantly seeking new industry trends and best practices
- We offer a transparent career path and an individual roadmap to engineer your future & accelerate your journey
- At EPAM, you can find vast opportunities for self-development: online courses and libraries, mentoring programs, partial grants of certification, and experience exchange with colleagues around the world. You will learn, contribute, and grow with us
Life at EPAM
- EPAM is a leader in the fastest-growing segment (product development/digital platform engineering) of the IT industry. We acquired Just-BI in 2021 to reinforce our leading position as a global Business Intelligence services provider and have been growing rapidly. With a talented multinational team, we provide data and analytics expertise.
- We are currently involved in end-to-end BI design and implementation projects in major national and international companies. We are proud of our entrepreneurial start-up culture and are focused on investing in people by creating continuous learning and development opportunities for our employees who deliver engineering excellence for our clients