653 Big Data jobs in Kuala Lumpur

Big Data Engineer

Kuala Lumpur, Kuala Lumpur | MYR90,000 – MYR120,000 | Private Advertiser

Posted today


Job Description:

1. Gain insight into the competitiveness and serviceability requirements of HUAWEI CLOUD Big Data services such as MRS, DWS, Data Lake Insight, and Elasticsearch.

2. Take responsibility for delivering key global projects, including solution design, resource management, data migration, and resource expansion and optimization based on HUAWEI CLOUD Big Data services (a hedged migration sketch follows this list).

3. Manage key-project requirements, develop project plans, identify project risks, ensure timely delivery, and take charge of operations for some HUAWEI CLOUD sites.

4. Provide technical support for HUAWEI CLOUD customers and keep customer services running properly at the technical level.

5. Troubleshoot faults in Big Data services across scenarios, communicate solutions to customers, and track implementation results.
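
Responsibility 2 centres on data migration onto HUAWEI CLOUD. Purely as a hedged illustration (the posting prescribes no tooling), here is a minimal PySpark job copying HDFS data into S3-compatible object storage such as OBS; the endpoint, bucket, and credentials are invented placeholders.

    # Minimal sketch; assumes Spark with the hadoop-aws (s3a) connector on the classpath.
    # Endpoint, bucket, and keys are hypothetical placeholders, not values from the posting.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder.appName("hdfs-to-obs-migration")
        # OBS is S3-compatible, so the s3a connector can address it.
        .config("spark.hadoop.fs.s3a.endpoint", "obs.ap-southeast-1.myhuaweicloud.com")
        .config("spark.hadoop.fs.s3a.access.key", "<ACCESS_KEY>")
        .config("spark.hadoop.fs.s3a.secret.key", "<SECRET_KEY>")
        .getOrCreate()
    )

    # Read the source dataset from HDFS and land it in object storage as Parquet.
    df = spark.read.parquet("hdfs:///warehouse/events/")
    df.write.mode("overwrite").parquet("s3a://migration-bucket/events/")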

Job Requirements:

1. Solid project management experience in the ICT or cloud industry.

2. Familiarity with Big Data technologies such as Hadoop, MapReduce, Kafka, HBase, Spark, and Data Lake Insight.

3. Experience in cloud development, delivery, or O&M; a cloud-industry background is preferred.

4. HCIA/HCIP/HCIE certification in cloud computing, or equivalent industry certificates (e.g., AWS, Azure, GCP), preferred.

5. Strong teamwork and organizational coordination skills.


Big Data Engineer

Kuala Lumpur, Kuala Lumpur | MYR120,000 – MYR240,000 | LHK CENTURY SDN. BHD

Posted today


Job Description

Are you a dedicated IT Operations Executive with a passion for maintaining and optimizing IT systems? Do you thrive in the fast-paced, dynamic gaming industry, where your expertise keeps systems running smoothly for millions of users? If so, we want YOU on our team!

Why Join Us?

  • 100% work from home
  • 13th-month salary plus attractive bonuses and increments
  • Career growth opportunities: promotion reviews every half year and professional development
  • Generous public holiday entitlement
  • Annual leave starting from 14 days, and more
  • Medical claims, annual dinner, team building, sports activities, meal gatherings, and more
  • Birthday gifts and celebrations, festival gifts, and many more surprises
  • Annual dinner cash-prize lucky draw and many more rewards waiting for you

As our business expands, we need more talent to join our team, and this opportunity is worth your time. If the above matches your expectations, click the "Quick apply" button right away. Missing out on such a strong, stable company would be a real pity, and not trying would be a lost opportunity. Seize the chance and join our team!

Big Data Responsibilities

  1. Design, develop, and tackle technical challenges for core components of the big data platform;

  2. Participate in data middleware construction, including the development and upgrades of components such as data integration, metadata management, and task management;

  3. Research big data technologies (e.g., ELK, Flink, Spark, ClickHouse) to optimize cluster architecture, troubleshoot issues, and resolve performance bottlenecks (a minimal sketch follows this list).
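
On that third point, and purely as a hedged sketch (none of these tool or path choices come from the posting), a minimal PySpark Structured Streaming job of the kind such a platform runs, pulling events from Kafka and landing Parquet for later batch analysis; the broker, topic, and paths are invented placeholders.

    # Minimal sketch; assumes the spark-sql-kafka package is available.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

    # Kafka rows arrive as binary key/value; cast to strings for downstream parsing.
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
        .option("subscribe", "game-events")                 # hypothetical topic
        .load()
        .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
    )

    # Continuously land raw events as Parquet; the checkpoint makes restarts safe.
    query = (
        events.writeStream.format("parquet")
        .option("path", "hdfs:///lake/raw/game_events/")
        .option("checkpointLocation", "hdfs:///lake/_chk/game_events/")
        .start()
    )
    query.awaitTermination()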


Requirements

  1. Proficient in big data platform principles, with expertise in building and optimizing platforms using Hadoop ecosystem tools (e.g., Spark, Impala, Flume, Kafka, HBase, Hive, ZooKeeper) and ClickHouse;

  2. Hands-on experience in developing and deploying ELK-based big data analytics platforms;

  3. Strong grasp of big data modeling methodologies and techniques;

  4. Skilled in Java/Scala programming, design patterns, and big data technologies (Spark, Flink);

  5. Experience in large-scale data warehouse architecture/model/ETL design, with capabilities in massive data processing and performance tuning;

  6. Extensive database design/development experience, familiar with relational databases and NoSQL.

  7. Fluent in Mandarin in order to liaise with Mandarin-speaking associates.


Candidates with PHP or Java experience will be prioritized.


Big Data Engineer

Kuala Lumpur, Kuala Lumpur | MYR80,000 – MYR120,000 | POWER IT SERVICES

Posted today


Job Description:


• Gather operational intel on business processes and policies from multiple sources
• Prepare periodic and ad-hoc reports using operational data (a short sketch follows this list)
• Develop a semantic core to align data with business processes
• Support the operations team's work streams for data processing, analysis, and reporting
• Analyze data and create dashboards for senior management
• Design and implement optimal processes
• Regression-test releases
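
As a hedged sketch of the reporting work above (the posting prescribes no implementation), a short PySpark job that aggregates operational data into a summary table a Power BI or Tableau dashboard could sit on; the table and column names are invented.

    # Minimal sketch; table and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("ops-report").enableHiveSupport().getOrCreate()

    ops = spark.table("ops.transactions")

    # Periodic report: daily volume and failure rate per business process.
    report = (
        ops.groupBy(F.to_date("event_ts").alias("day"), "process_name")
        .agg(
            F.count("*").alias("txn_count"),
            F.avg(F.when(F.col("status") == "FAILED", 1).otherwise(0)).alias("failure_rate"),
        )
    )
    report.write.mode("overwrite").saveAsTable("ops.daily_process_summary")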

Skills Required:


• Big Data: Spark, Hive, Databricks
• Languages: SQL, Java/Python
• BI & Analytics: Power BI (DAX), Tableau, Dataiku
• Operating System: Unix
• Experience with data migration, data engineering, and data analysis
• Big Data: Scala, Hadoop
• Tools: DbVisualizer, JIRA, Git, Bitbucket, Control-M
• Strong problem-solving skills and the ability to work independently and in a team environment
• Excellent communication skills and the ability to work effectively with cross-functional teams


Big Data Engineer

Kuala Lumpur, Kuala Lumpur | MYR120,000 – MYR240,000 | Private Advertiser

Posted today


Job Description:

1. Gain insight into the competitiveness and serviceability requirements of HUAWEI CLOUD Big Data services such as MRS, DWS, Data Lake Insight, and Elasticsearch.

2. Take responsibility for delivering key global projects, including solution design, resource management, data migration, and resource expansion and optimization based on HUAWEI CLOUD Big Data services.

3. Manage key-project requirements, develop project plans, identify project risks, ensure timely delivery, and take charge of operations for some HUAWEI CLOUD sites.

4. Provide technical support for HUAWEI CLOUD customers and keep customer services running properly at the technical level.

5. Troubleshoot faults in Big Data services across scenarios, communicate solutions to customers, and track implementation results.

Job Requirements:

1. Solid project management experience in the ICT or cloud industry.

2. Familiarity with Big Data technologies such as Hadoop, MapReduce, Kafka, HBase, Spark, and Data Lake Insight.

3. Experience in cloud development, delivery, or O&M; a cloud-industry background is preferred.

4. HCIA/HCIP/HCIE certification in cloud computing, or equivalent industry certificates (e.g., AWS, Azure, GCP), preferred.

5. Strong teamwork and organizational coordination skills.


Big Data Engineer

Kuala Lumpur, Kuala Lumpur | MYR80,000 – MYR120,000 | Accord Innovations

Posted today


Job Description

  • Design, develop, and optimize big data pipelines using Apache Spark (batch and streaming).
  • Implement data ingestion, transformation, and processing frameworks to handle structured, semi-structured, and unstructured data.
  • Work with NoSQL databases (Cassandra, MongoDB, HBase, DynamoDB, or Couchbase) for large-scale data storage and retrieval (a minimal sketch follows this list).
  • Integrate big data solutions with data lakes, cloud platforms (AWS, Azure, GCP), and traditional RDBMS systems.
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • 3–6+ years of hands-on experience in big data development.
  • Strong expertise in Apache Spark (PySpark/Scala/Java) for data processing and optimization.
  • Proficiency in NoSQL databases (Cassandra, MongoDB, HBase, Dynamo
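
Since the role leans on NoSQL stores, here is a minimal, hedged pymongo sketch of the storage-and-retrieval pattern; the connection URI, database, collection, and documents are invented, and the posting does not single out MongoDB over the other stores listed.

    # Minimal sketch using pymongo; all names and values are hypothetical.
    from pymongo import ASCENDING, MongoClient

    client = MongoClient("mongodb://localhost:27017")
    events = client["analytics"]["events"]

    # Index the query keys up front so lookups stay fast as volume grows.
    events.create_index([("user_id", ASCENDING), ("ts", ASCENDING)])

    events.insert_many([
        {"user_id": 1, "ts": 1700000000, "action": "login"},
        {"user_id": 1, "ts": 1700000060, "action": "purchase", "amount": 9.9},
    ])

    # Retrieve one user's most recent activity, newest first.
    for doc in events.find({"user_id": 1}).sort("ts", -1).limit(10):
        print(doc)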

Big Data Engineer

Kuala Lumpur, Kuala Lumpur | MYR60,000 – MYR120,000 | Beyondsoft Malaysia

Posted today


Job Description

Responsibilities

  • ETL & Data Pipeline Development (80%): Design and implement ETL processes, data pipelines, and data warehouse management using Airflow, Spark, and Python/Java (a minimal Airflow sketch follows this list).
  • Data Extraction (20%): Provide data tables or CSV files according to game operators' requirements.
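
A minimal, hedged sketch of the Airflow-orchestrated ETL named above; the DAG id, schedule, and task bodies are invented placeholders rather than the team's actual pipeline, and Airflow 2.4+ is assumed for the "schedule" argument.

    # Minimal sketch; DAG id, schedule, and extract/transform logic are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Pull raw game events from the source system (stubbed here).
        print("extracting raw events")

    def transform_load():
        # Clean the events and load them into the warehouse (stubbed here).
        print("transforming and loading")

    with DAG(
        dag_id="game_events_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",   # requires Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ):
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="transform_load", python_callable=transform_load)
        extract_task >> load_task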

Qualifications

  • Degree in Computer Science or a related technical or engineering field
  • Experience with a programming language such as Python, Java, Scala, or Go; SQL is a must
  • Good understanding of various database types, including relational and distributed databases
  • Tableau, Power BI, or Looker experience is a plus
  • Good analytical and technical skills in building batch or streaming big data pipelines
  • Business proficiency (written and spoken) in Mandarin to gather requirements and collaborate with non-English-speaking counterparts based in China

Big Data Analyst

Kuala Lumpur, Kuala Lumpur | MYR80,000 – MYR120,000 | Beyondsoft Malaysia

Posted today


Job Description

Responsible for security strategy development, operations, and related statistical analysis for Tencent's overseas mobile game business.

  1. Bachelor's degree or above in computer science or a related major, with at least 1 year of relevant working experience;

  2. Proficiency in at least one development language (Lua/Python/C++, etc.); familiarity with Lua is preferred;

  3. Experience with Linux, and proficiency in Python and SQL for big data analysis and processing (a small example follows this list);

  4. Solid computer science fundamentals, good coding ability, and strong problem localization, analysis, and resolution skills;

  5. Strong sense of responsibility, logical thinking, communication skills, and tolerance for pressure;

  6. Experience with FPS games such as "PUBG MOBILE"; high-ranking players are preferred.
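
As a hedged, invented illustration of requirement 3 (Python for statistical analysis in a game-security setting): a small pandas sketch that flags players whose stats sit far outside the population, the sort of signal a security strategy might review; the data, metric, and threshold are all made up.

    # Minimal sketch; the data, metric, and threshold are hypothetical.
    import pandas as pd

    stats = pd.DataFrame({
        "player_id": [1, 2, 3, 4, 5, 6],
        "headshot_rate": [0.21, 0.19, 0.93, 0.24, 0.22, 0.18],
    })

    # Z-score each player's metric against the population, then flag extreme
    # outliers as candidates for closer review.
    mean = stats["headshot_rate"].mean()
    std = stats["headshot_rate"].std()
    stats["zscore"] = (stats["headshot_rate"] - mean) / std
    print(stats[stats["zscore"].abs() > 2.0])  # flags player_id 3 here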


Senior Big Data Engineer

Kuala Lumpur, Kuala Lumpur | MYR150,000 – MYR250,000 | POWER IT SERVICES

Posted today


Job Description

Required Skills & Qualifications


• Bachelor's, Master's, or PhD in Computer Science, Data Engineering, or a related discipline.
• 5–7 years of experience in data engineering and distributed data systems.
• Strong hands-on experience with Apache Hive, HBase, Kafka, Solr, and Elasticsearch.
• Proficient in data architecture, data modelling, and pipeline scheduling/orchestration.
• Operational experience with Data Mesh, Data Product development, and hybrid cloud data platforms.
• Familiarity with CRM systems and data sourcing/mapping strategies.
• Proficient in managing metadata, glossary, and lineage tools such as Apache Atlas.
• Proven experience in generating large-scale batch files using Spark and Hive.
• Strong understanding of document-based data models and the transformation of relational schemas into document-oriented structures (a sketch follows this list).
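
On the last requirement, a minimal, hedged PySpark sketch of folding a relational parent/child pair into document-shaped rows; the tables and columns are invented placeholders, not a schema from the posting.

    # Minimal sketch; table and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("rel-to-doc").enableHiveSupport().getOrCreate()

    customers = spark.table("crm.customers")   # customer_id, name
    orders = spark.table("crm.orders")         # order_id, customer_id, total

    # Nest each customer's orders into an array of structs: one document per customer.
    docs = (
        customers.join(orders, "customer_id", "left")
        .groupBy("customer_id", "name")
        .agg(F.collect_list(F.struct("order_id", "total")).alias("orders"))
    )

    # JSON-lines output maps directly onto a document store's shape.
    docs.write.mode("overwrite").json("hdfs:///export/customer_docs/")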

Additional Technical & Business Competencies


• Expertise in data administration, modelling, mapping, collection, and distribution.
• Strong understanding of business workflows to support metadata governance.
• Hands-on experience with analytics and DWH tools (e.g., SAS, Oracle, MS SQL, Python, R Programming).
• Familiarity with data modelling tools (e.g., ERWIN) and enterprise databases (Oracle, IBM DB2, MS SQL, Hadoop, Object Store).
• Experience working across hybrid cloud environments (e.g., AWS, Azure Data Factory).
• In-depth knowledge of ETL/ELT processes and automation frameworks.
• Analytical thinker with strong problem-solving and communication skills.
• Able to collaborate effectively across technical and business teams.
• Proven ability to deliver high-quality outcomes within


Big Data Hadoop Developer

Kuala Lumpur, Kuala Lumpur | Unison Consulting Pte Ltd

Posted 14 days ago


Job Description

Job Summary:
We are looking for a Big Data Hadoop Developer to design, develop, and maintain large-scale data processing solutions. The ideal candidate should have strong hands-on experience with the Hadoop ecosystem and integration with relational databases such as MariaDB or Oracle DB for analytics and reporting.

Key Responsibilities:

  • Design, develop, and optimize Hadoop-based big data solutions for batch and real-time data processing.
  • Work with data ingestion frameworks to integrate data from MariaDB/Oracle DB into Hadoop (Sqoop, Apache NiFi, Kafka); a minimal ingestion sketch follows this list.
  • Implement Hive, Spark, and MapReduce jobs for data transformation and analytics.
  • Optimize Hive queries, Spark jobs, and HDFS usage for performance and cost efficiency.
  • Create and maintain ETL pipelines for structured and unstructured data.
  • Troubleshoot and resolve issues in Hadoop jobs and database connectivity.
  • Collaborate with BI, analytics, and data science teams for data provisioning.
  • Ensure data security, governance, and compliance in all solutions.
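
The posting names Sqoop, NiFi, and Kafka for ingestion; purely as a hedged alternative sketch (not the prescribed tooling), the same MariaDB-to-Hadoop pull can be expressed with Spark's JDBC reader. The URL, credentials, and table and column names below are invented, and a MariaDB JDBC driver is assumed on the classpath.

    # Minimal sketch; connection details and names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("mariadb-ingest").enableHiveSupport().getOrCreate()

    src = (
        spark.read.format("jdbc")
        .option("url", "jdbc:mariadb://db-host:3306/sales")
        .option("dbtable", "transactions")
        .option("user", "etl_user")
        .option("password", "<SECRET>")
        # Parallelise the pull, similar in spirit to Sqoop's --split-by.
        .option("partitionColumn", "id")
        .option("lowerBound", "1")
        .option("upperBound", "1000000")
        .option("numPartitions", "8")
        .load()
    )

    # Land the snapshot in a partitioned Hive table for downstream Hive/Spark jobs.
    src.write.mode("overwrite").partitionBy("txn_date").saveAsTable("staging.transactions")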

Technical Skills:

  • Big Data Ecosystem: Hadoop (HDFS, YARN), Hive, Spark, Sqoop, MapReduce, Oozie, Flume.
  • Databases: MariaDB and/or Oracle DB (SQL, PL/SQL).
  • Programming: Java, Scala, or Python for Spark/MapReduce development.
  • Data Ingestion: Sqoop, Kafka, NiFi (for integrating RDBMS with Hadoop).
  • Query Optimization: Hive tuning, partitioning, bucketing, indexing.
  • Tools: Ambari, Cloudera Manager, Git, Jenkins.
  • OS & Scripting: Linux/Unix shell scripting.

Soft Skills:

  • Strong analytical skills and problem-solving abilities.
  • Good communication skills for working with cross-functional teams.
  • Ability to manage priorities in a fast-paced environment.

Nice to Have:

  • Experience with cloud-based big data platforms (AWS EMR, Azure HDInsight, GCP Dataproc).
  • Knowledge of NoSQL databases (HBase, Cassandra).
  • Exposure to machine learning integration with Hadoop/Spark.

Senior Big Data Engineers

Kuala Lumpur, Kuala Lumpur | MYR80,000 – MYR120,000 | Rapsys Technologies Pte. Ltd

Posted today


Job Description

Key Responsibilities

  • Design and evolve the overall data architecture, ensuring scalability, flexibility, and compliance with enterprise standards.
  • Build efficient, secure, and reliable data pipelines using the Bronze-Silver-Gold architecture within EDL.
  • Develop and orchestrate scheduled jobs in the EDL environment to support continuous ingestion and transformation.
  • Implement Apache Iceberg for data versioning, governance, and optimization (a minimal sketch follows this list).
  • Leverage the Medallion framework to standardize data product maturity and delivery.
  • Govern metadata, data lineage, and business glossary using tools like Apache Atlas.
  • Ensure data security, privacy, and regulatory compliance across all data processes.
  • Support Data Mesh principles by collaborating with domain teams to design and implement reusable Data Products.
  • Integrate data across structured, semi-structured, and unstructured sources from enterprise systems such as ODS and CRM systems.
  • Drive adoption of DataOps/MLOps best practices and mentor peers across units.
  • Generate and manage large-scale batch files using Spark and Hive for high-volume data processing.
  • Design and implement document-based data models and transform relational models into NoSQL document-oriented structures (e.g., a NoSQL database or similar system).
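
Below is a minimal, hedged sketch of one Bronze-to-Silver hop over Apache Iceberg tables, in the spirit of the Medallion flow above; the catalog name, schema, and cleansing rules are invented placeholders, not the EDL's actual configuration.

    # Minimal sketch; assumes an Iceberg-enabled Spark session (iceberg-spark-runtime
    # on the classpath). The "edl" catalog and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder.appName("bronze-to-silver")
        .config("spark.sql.catalog.edl", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.edl.type", "hive")
        .getOrCreate()
    )

    # Bronze: raw ingested rows, kept as-is for replay and lineage.
    bronze = spark.table("edl.bronze.customer_events")

    # Silver: deduplicated, typed, and filtered records ready for data products.
    silver = (
        bronze.dropDuplicates(["event_id"])
        .filter(F.col("event_ts").isNotNull())
        .withColumn("event_date", F.to_date("event_ts"))
    )

    # DataFrameWriterV2; Iceberg snapshots provide the versioning/time travel.
    silver.writeTo("edl.silver.customer_events").using("iceberg").createOrReplace()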

Required Skills & Qualifications

  • Bachelor's, Master's, or PhD in Computer Science, Data Engineering, or a related discipline.
  • 5–7 years of experience in data engineering and distributed data systems.
  • Strong hands-on experience with Apache Hive, HBase, Kafka, Solr, Elasticsearch.
  • Proficient in data architecture, data modelling, and pipeline scheduling/orchestration.
  • Operational experience with Data Mesh, Data Product development, and hybrid cloud data platforms.
  • Familiarity with CRM systems and data sourcing/mapping strategies.
  • Proficient in managing metadata, glossary, and lineage tools like Apache Atlas.
  • Proven experience in generating large-scale batch files using Spark and Hive.
  • Strong understanding of document-based data models and the transformation of relational schemas into document-oriented structures.

Additional Technical & Business Competencies

  • Expertise in data administration, modelling, mapping, collection, and distribution.
  • Strong understanding of business workflows to support metadata governance.
  • Hands-on experience with analytics and DWH tools (e.g., SAS, Oracle, MS SQL, Python, R Programming).
  • Familiarity with data modelling tools (e.g., ERWIN), and enterprise databases (Oracle, IBM DB2, MS SQL, Hadoop, Object Store).
  • Experience working across hybrid cloud environments (e.g., AWS, Azure Data Factory).
  • In-depth knowledge of ETL/ELT processes and automation frameworks.
  • Analytical thinker with strong problem-solving and communication skills.
  • Able to collaborate effectively across technical and business teams.
  • Proven ability to deliver high-quality outcomes within

Job Type: Contract

Contract length: 12 months

Pay: RM12,000.00 – RM15,000.00 per month

Work Location: In person
