146 Big Data jobs in Kuala Lumpur

Big Data Hadoop Developer

Kuala Lumpur, Kuala Lumpur – Unison Consulting Pte Ltd

Posted 2 days ago

Job Description

Job Summary:
We are looking for a Big Data Hadoop Developer to design, develop, and maintain large-scale data processing solutions. The ideal candidate should have strong hands-on experience with the Hadoop ecosystem and integration with relational databases such as MariaDB or Oracle DB for analytics and reporting.

Key Responsibilities:

  • Design, develop, and optimize Hadoop-based big data solutions for batch and real-time data processing.
  • Work with data ingestion frameworks to integrate data from MariaDB/Oracle DB into Hadoop (Sqoop, Apache NiFi, Kafka); a minimal ingestion sketch follows this list.
  • Implement Hive, Spark, and MapReduce jobs for data transformation and analytics.
  • Optimize Hive queries, Spark jobs, and HDFS usage for performance and cost efficiency.
  • Create and maintain ETL pipelines for structured and unstructured data.
  • Troubleshoot and resolve issues in Hadoop jobs and database connectivity.
  • Collaborate with BI, analytics, and data science teams for data provisioning.
  • Ensure data security, governance, and compliance in all solutions.
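
A minimal, hedged sketch of the RDBMS-to-Hadoop ingestion described above. It uses Spark's JDBC reader purely to keep the example self-contained; in practice the same transfer is often done with Sqoop or NiFi. The host, credentials, table names, and Hive database (`sales`, `analytics.orders_raw`) are placeholders, and it assumes a Hive-enabled Spark build with the MariaDB JDBC driver on the classpath.

```python
from pyspark.sql import SparkSession

# Minimal sketch: pull one table from MariaDB over JDBC and land it in Hive.
# Connection details, table names, and the partition column are placeholders.
spark = (
    SparkSession.builder
    .appName("mariadb-to-hive-ingest")
    .enableHiveSupport()                                   # requires a Hive-enabled Spark build
    .getOrCreate()
)

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mariadb://db-host:3306/sales")    # placeholder host/schema
    .option("driver", "org.mariadb.jdbc.Driver")           # driver jar must be on the classpath
    .option("dbtable", "orders")                           # placeholder source table
    .option("user", "etl_user")
    .option("password", "***")                             # use a secrets manager in practice
    .option("partitionColumn", "order_id")                 # parallelise the read
    .option("lowerBound", "1")
    .option("upperBound", "10000000")
    .option("numPartitions", "8")
    .load()
)

# Land the data as a partitioned Hive table for downstream analytics and reporting.
(
    orders.write.mode("overwrite")
    .partitionBy("order_date")                             # assumes the source has an order_date column
    .format("parquet")
    .saveAsTable("analytics.orders_raw")                   # placeholder target table
)
```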

Technical Skills:

  • Big Data Ecosystem: Hadoop (HDFS, YARN), Hive, Spark, Sqoop, MapReduce, Oozie, Flume.
  • Databases: MariaDB and/or Oracle DB (SQL, PL/SQL).
  • Programming: Java, Scala, or Python for Spark/MapReduce development.
  • Data Ingestion: Sqoop, Kafka, NiFi (for integrating RDBMS with Hadoop).
  • Query Optimization: Hive tuning, partitioning, bucketing, indexing (see the partitioning sketch after this list).
  • Tools: Ambari, Cloudera Manager, Git, Jenkins.
  • OS & Scripting: Linux/Unix shell scripting.
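
To illustrate the Hive tuning, partitioning, and bucketing point above, here is a hedged sketch using Spark SQL's datasource DDL (Hive's own DDL declares the partition column separately rather than in the schema). The database, table, and column names are illustrative only.

```python
from pyspark.sql import SparkSession

# Minimal sketch of partitioning and bucketing for query performance.
# Database, table, and column names are illustrative only.
spark = SparkSession.builder.enableHiveSupport().getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS analytics")

# Partition by date so filters on txn_date prune whole partitions,
# and bucket by customer_id to speed up joins/aggregations on that key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.transactions (
        txn_id      BIGINT,
        customer_id BIGINT,
        amount      DOUBLE,
        txn_date    DATE
    )
    USING PARQUET
    PARTITIONED BY (txn_date)
    CLUSTERED BY (customer_id) INTO 32 BUCKETS
""")

# A query that benefits from partition pruning: only one date partition is scanned.
spark.sql("""
    SELECT customer_id, SUM(amount) AS total
    FROM analytics.transactions
    WHERE txn_date = DATE '2024-01-01'
    GROUP BY customer_id
""").show()
```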

Soft Skills:

  • Strong analytical skills and problem-solving abilities.
  • Good communication skills for working with cross-functional teams.
  • Ability to manage priorities in a fast-paced environment.

Nice to Have:

  • Experience with cloud-based big data platforms (AWS EMR, Azure HDInsight, GCP Dataproc).
  • Knowledge of NoSQL databases (HBase, Cassandra).
  • Exposure to machine learning integration with Hadoop/Spark.

Senior Big Data Engineer

Kuala Lumpur, Kuala Lumpur – Bybit

Posted 4 days ago

Job Description

Company: Web3 & Blockchain-focused Company

Job Advantages

  • Cutting-edge technology (Web3 and blockchain)
  • Globalized company

Job Responsibilities

As a Senior Java Engineer, you will play a pivotal role in the development and optimization of the company’s data infrastructure, ensuring it supports the evolving needs of our blockchain-focused business. Your primary responsibilities will include:

  • Architectural Design & Development: Lead the design and development of the company’s data center, ensuring robust performance and scalability.
  • Data Platform Optimization: Provide ongoing solutions for platform architecture and performance optimization to support the company’s rapid growth and business needs.
  • Business Enablement: Drive data-driven solutions that accelerate business development and continuously enhance the company's core competitiveness.
  • Big Data Iteration: Lead the rapid iteration of big data platforms, ensuring efficiency, cost-effectiveness, and high-quality output.

Job Requirements

We are seeking candidates who are passionate about blockchain technology and possess a strong technical foundation. Ideal candidates will meet the following criteria:

  • Education: Bachelor’s degree or above in Computer Science or related majors.
  • Experience: More than 3 years of experience in Java development, with a deep understanding of JVM principles and Java development best practices.
  • Frameworks: Proficient in frameworks such as Spring, Spring MVC, gRPC, and MyBatis, with the ability to understand their underlying principles and mechanisms.
  • Core Computer Fundamentals: Strong knowledge of computer operating systems, network architecture, and proficiency in commonly used algorithms, data structures, and design patterns.
  • Distributed Systems: Familiarity with distributed systems, caching mechanisms (Redis), messaging systems (Kafka), and big data processing frameworks such as Spark, Flink, and ZooKeeper (a minimal Kafka-to-Spark sketch follows this list). Experience with TiDB is a plus.
  • Microservices: Experience in the design and development of data centers or microservices, with an emphasis on high availability and scalability.
  • Tagging Systems/Recommendation Systems: Experience with tagging systems or algorithmic recommendation systems is highly desirable.
  • Passion for Blockchain: A strong enthusiasm for the blockchain industry and a commitment to contributing to its growth and development.
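
Although this is primarily a Java role, the Kafka-plus-Spark pairing in the distributed-systems bullet can be illustrated with a short, hedged PySpark Structured Streaming sketch. The broker address, topic name, and event schema are placeholders (not the company's systems), and the job assumes the spark-sql-kafka connector package is available.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

# Minimal sketch: consume trade events from Kafka and compute 1-minute volumes.
# Broker, topic, and schema are placeholders; requires the spark-sql-kafka package.
spark = SparkSession.builder.appName("kafka-trades-sketch").getOrCreate()

schema = StructType([
    StructField("symbol", StringType()),
    StructField("qty", DoubleType()),
    StructField("ts", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "trades")                       # placeholder topic
    .load()
)

# Parse the JSON payload and aggregate quantity per symbol in 1-minute windows.
trades = raw.select(from_json(col("value").cast("string"), schema).alias("t")).select("t.*")
volumes = (
    trades
    .withWatermark("ts", "2 minutes")
    .groupBy(window(col("ts"), "1 minute"), col("symbol"))
    .sum("qty")
)

query = volumes.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```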

Why Join Us?

This role offers the opportunity to work at the forefront of blockchain and Web3 technologies within a global company. You will have the chance to develop and optimize critical infrastructure that powers innovative and scalable solutions in the blockchain space. If you’re ready to work in a fast-paced and cutting-edge environment, this role could be the perfect fit for you.

Apply now to be part of an exciting journey in revolutionizing the Web3 ecosystem!


Senior Big Data Engineers

Kuala Lumpur, Kuala Lumpur – RAPSYS TECHNOLOGIES PTE LTD

Posted 7 days ago

Job Description

Senior Big Data Engineer at RAPSYS TECHNOLOGIES PTE LTD. We are seeking an experienced Senior Big Data Engineer to design, develop, and maintain large-scale data processing systems. The ideal candidate will have expertise in big data technologies, data architecture, and analytics to drive data-driven insights and support business objectives.

Location

Kuala Lumpur, Malaysia

Work Mode

Work From Office

Role

Senior Big Data Engineer

Responsibilities
  • Design and evolve the overall data architecture, ensuring scalability, flexibility, and compliance with enterprise standards.
  • Build efficient, secure, and reliable data pipelines using the Bronze-Silver-Gold architecture within EDL (a minimal sketch follows this list).
  • Develop and orchestrate scheduled jobs in the EDL environment to support continuous ingestion and transformation.
  • Implement Apache Iceberg for data versioning, governance, and optimization.
  • Leverage the Medallion framework to standardize data product maturity and delivery.
  • Govern metadata, data lineage, and business glossary using tools like Apache Atlas.
  • Ensure data security, privacy, and regulatory compliance across all data processes.
  • Support Data Mesh principles by collaborating with domain teams to design and implement reusable Data Products.
  • Integrate data across structured, semi-structured, and unstructured sources from enterprise systems such as ODS and CRM systems.
  • Drive adoption of DataOps/MLOps best practices and mentor peers across units.
  • Generate and manage large-scale batch files using Spark and Hive for high-volume data processing.
  • Design and implement document-based data models and transform relational models into NoSQL document-oriented structures.
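
As referenced in the pipelines bullet above, here is a minimal, hedged sketch of a Bronze-Silver-Gold flow on Apache Iceberg tables. The catalog name (`edl`), warehouse path, namespaces, and columns are placeholders, and it assumes a Spark session configured with the Iceberg runtime package; it is not the EDL's actual configuration.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

# Minimal Bronze -> Silver -> Gold sketch on Iceberg tables.
# Catalog, warehouse path, namespaces, and columns are placeholders.
spark = (
    SparkSession.builder
    .appName("medallion-iceberg-sketch")
    .config("spark.sql.catalog.edl", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.edl.type", "hadoop")
    .config("spark.sql.catalog.edl.warehouse", "hdfs:///warehouse/edl")
    .getOrCreate()
)

# Bronze: land raw CRM extracts as-is.
raw = spark.read.json("hdfs:///landing/crm/customers/")          # placeholder source path
raw.writeTo("edl.bronze.crm_customers").createOrReplace()

# Silver: deduplicate and conform types.
silver = (
    spark.table("edl.bronze.crm_customers")
    .dropDuplicates(["customer_id"])
    .withColumn("signup_date", to_date(col("signup_ts")))        # placeholder column names
)
silver.writeTo("edl.silver.crm_customers").createOrReplace()

# Gold: a consumable data product, e.g. sign-ups per day.
gold = silver.groupBy("signup_date").count()
gold.writeTo("edl.gold.crm_signups_daily").createOrReplace()

# Iceberg retains snapshots, so earlier table versions remain queryable for audit or rollback.
spark.sql("SELECT snapshot_id, committed_at FROM edl.silver.crm_customers.snapshots").show()
```
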
Qualifications
  • Bachelor’s, Master’s, or PhD in Computer Science, Data Engineering, or a related discipline.
  • 5–7 years of experience in data engineering and distributed data systems.
  • Strong hands-on experience with Apache Hive, HBase, Kafka, Solr, Elasticsearch.
  • Proficient in data architecture, data modelling, and pipeline scheduling/orchestration.
  • Operational experience with Data Mesh, Data Product development, and hybrid cloud data platforms.
  • Familiarity with CRM systems and data sourcing/mapping strategies.
  • Proficient in managing metadata, glossary, and lineage tools like Apache Atlas.
  • Proven experience in generating large-scale batch files using Spark and Hive.
  • Strong understanding of document-based data models and the transformation of relational schemas into document-oriented structures (see the sketch below).
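
The relational-to-document transformation in the last point can be sketched as follows. The customers/orders tables and their columns are hypothetical, and the output is written as JSON files rather than to any specific document store.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import collect_list, struct

# Minimal sketch: fold a parent/child relational pair into nested documents.
# Table and column names are hypothetical.
spark = SparkSession.builder.enableHiveSupport().getOrCreate()

customers = spark.table("analytics.customers")   # customer_id, name, segment
orders = spark.table("analytics.orders")         # order_id, customer_id, amount, order_date

# Nest each customer's orders as an array of structs - one document per customer.
docs = (
    customers.join(orders, "customer_id", "left")
    .groupBy("customer_id", "name", "segment")
    .agg(collect_list(struct("order_id", "amount", "order_date")).alias("orders"))
)

# Write as JSON documents; a real pipeline would target a document store via a connector.
docs.write.mode("overwrite").json("hdfs:///exports/customer_documents/")
```
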
Additional Technical & Business Competencies
  • Expertise in data administration, modelling, mapping, collection, and distribution.
  • Strong understanding of business workflows to support metadata governance.
  • Hands-on experience with analytics and DWH tools (e.g., SAS, Oracle, MS SQL, Python, R Programming).
  • Familiarity with data modelling tools (e.g., ERWIN), and enterprise databases (Oracle, IBM DB2, MS SQL, Hadoop, Object Store).
  • Experience working across hybrid cloud environments (e.g., AWS, Azure Data Factory).
  • In-depth knowledge of ETL/ELT processes and automation frameworks.
  • Analytical thinker with strong problem-solving and communication skills.
  • Able to collaborate effectively across technical and business teams.

Senior Software Engineer (Big Data)

Kuala Lumpur, Kuala Lumpur – EPAM Systems

Posted 3 days ago

Job Description

Overview

EPAM Systems – Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia

Software Engineers at EPAM are the driving force behind strategic initiatives for our clients. As a Senior Software Engineer at EPAM Malaysia, you will use your expertise in Big Data to collaborate with product and engineering teams and combine the functional and technical aspects of Software Development with Big Data technology in the project space of cloud services.

Responsibilities
  • Design and implement innovative analytical solutions using Hadoop, NoSQL and other Big Data related technologies, evaluating new features and architecture in cloud / on-premise / hybrid solutions
  • Build collaborative partnerships with architects, technical leads and key individuals within other functional groups
  • Perform detailed analysis of business problems and technical environments to design quality technical solutions
  • Participate in code review and test solutions
Requirements
  • At least 6 years of working experience with Big Data technologies and Enterprise Software Development
  • Solid skills in enterprise software development, practical experience in infrastructure & bug troubleshooting
  • Proven expertise in overseeing and evaluating deliverables produced by team members
  • Strong skills in enterprise software development, including infrastructure troubleshooting, incident investigation, performance tuning and root cause analysis
  • Hands-on experience in Spark / Pandas, Airflow and Python (a minimal DAG sketch follows this list)
  • Hands-on cloud experience, preferably AWS, and NoSQL
  • Experience with component / integration testing, unit testing, and hands-on experience with GitHub, Kubernetes and Docker
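
As a hedged illustration of the Spark, Airflow, and Python combination listed above: a minimal daily DAG that validates inputs in Python and then submits a Spark job. The DAG id, script path, and schedule are placeholders, and nothing here reflects EPAM project code; it simply uses the standard PythonOperator and BashOperator.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

# Minimal sketch: a daily pipeline that validates inputs, then submits a Spark job.
# DAG id, paths, and schedule are placeholders.

def check_inputs(ds, **_):
    # Hypothetical validation step; a real pipeline might probe S3/HDFS partitions here.
    print(f"validating input partitions for {ds}")

with DAG(
    dag_id="daily_events_etl",                  # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                          # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    validate = PythonOperator(task_id="validate_inputs", python_callable=check_inputs)

    run_spark = BashOperator(
        task_id="run_spark_job",
        bash_command="spark-submit --master yarn /opt/jobs/transform_events.py {{ ds }}",
    )

    validate >> run_spark
```
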
We offer
  • By choosing EPAM, you're getting a job at one of the most loved workplaces according to Newsweek (2021, 2022, and 2023).
  • Employee ideas are the main driver of our business. We have a very supportive environment where your voice matters
  • You will be challenged while working side-by-side with the best talent globally. We work with top-notch technologies, constantly seeking new industry trends and best practices
  • We offer a transparent career path and an individual roadmap to engineer your future & accelerate your journey
  • At EPAM, you can find vast opportunities for self-development: online courses and libraries, mentoring programs, partial grants of certification, and experience exchange with colleagues around the world. You will learn, contribute, and grow with us
Life at EPAM
  • EPAM is a leader in the fastest-growing segment (product development/digital platform engineering) of the IT industry. We acquired Just-BI in 2021 to reinforce our leading position as a global Business Intelligence services provider and have been growing rapidly. With a talented multinational team, we provide data and analytics expertise
  • We are currently involved in end-to-end BI design and implementation projects in major national and international companies. We are proud of our entrepreneurial start-up culture and are focused on investing in people by creating continuous learning and development opportunities for our employees who deliver engineering excellence for our clients

Deployment Engineer – MY (Big Data / Hadoop Admin)

Kuala Lumpur, Kuala Lumpur – Tookitaki Holding PTE LTD

Posted 7 days ago

Job Description

Position Overview
Job Title: Deployment Engineer
Department: Services Delivery
Reporting To: Regional Head of Service Delivery


The Deployment Engineer is a critical technical role within Tookitaki’s Services Delivery team. This position is responsible for deploying Tookitaki’s FinCense platform across both on-premise and cloud-hosted (CaaS) environments. The role requires a strong understanding of infrastructure, system integrations, and big data technologies to ensure successful deployments for banking and fintech clients.

Position Purpose

The Deployment Engineer ensures smooth and efficient deployment of Tookitaki’s FinCense platform by working closely with internal teams and client stakeholders. This includes end-to-end deployment, configuration, and initial troubleshooting, ensuring the platform is fully functional and integrated into the client’s infrastructure.

This role is critical for achieving a seamless transition during the implementation phase, setting the foundation for client success.

Key Responsibilities

1. Deployment of FinCense Platform
  • On-Premise Deployment:

    • Install and configure the entire FinCense platform on the client’s infrastructure.

    • Ensure full integration with the client’s existing systems and databases.

    • Conduct rigorous testing to validate deployment success.

  • CaaS Deployment:

    • Install and configure client tenants on Tookitaki’s cloud-hosted infrastructure (AWS/GCP).

    • Ensure scalability and seamless operation within Tookitaki’s hosted cloud environment.

2. System Configuration and Integration
  • Configure system settings, APIs, and data pipelines to meet client-specific requirements.

  • Collaborate with Data Engineers to integrate client data into the platform.

  • Optimize system performance to ensure stability and scalability.

3. Client Collaboration and Support
  • Work closely with the Client Enablement team to gather deployment requirements and ensure alignment with client needs.

  • Provide technical guidance to client teams during the deployment phase.

  • Act as the primary technical point of contact for all deployment-related queries.

4. Post-Deployment Validation
  • Conduct end-to-end system tests to validate the platform’s performance and functionality.

  • Resolve any deployment issues and ensure the platform meets agreed-upon SLAs.

  • Document deployment processes and configurations for future reference.

5. Collaboration with Cross-Functional Teams
  • Work closely with Product Management and Engineering teams to address technical challenges during deployment.

  • Provide feedback on deployment experiences to improve product features and deployment efficiency.

  • Support the Client Enablement and Support teams during the handover process.

Qualifications and Skills

Education
  • Required: Bachelor’s degree in Computer Science, IT, or a related technical field.

  • Preferred: Master’s degree in IT, Cloud Computing, or Big Data Analytics.

Experience
  • Minimum: 6 years of experience in system deployment, cloud computing, or IT infrastructure roles.

  • Proven expertise in deploying SaaS platforms or big data systems for financial or regulated industries.


Technical Expertise
  • Big Data Technologies: Strong knowledge of Hadoop, Spark, Hive, Kubernetes, and Docker.

  • Cloud Infrastructure: Hands-on experience with AWS (preferred) or GCP, including EC2, S3, and VPC configurations.

  • System Integration: Proficient in integrating systems via APIs, connectors, and middleware solutions.

  • Scripting and Automation: Proficiency in scripting languages such as Python, Bash, or PowerShell (see the validation sketch below).
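
As a hedged sketch of the scripting and post-deployment validation work described in this posting: a minimal Python smoke test that checks a service health endpoint and an S3 landing bucket using the common requests and boto3 libraries. The endpoint URL and bucket name are placeholders, not FinCense internals.

```python
import sys

import boto3
import requests

# Minimal post-deployment smoke test: hit a health endpoint and confirm the
# landing bucket is reachable. URL and bucket name are placeholders.
HEALTH_URL = "https://fincense.example.internal/health"   # placeholder endpoint
LANDING_BUCKET = "client-landing-bucket"                  # placeholder bucket

def check_health() -> bool:
    resp = requests.get(HEALTH_URL, timeout=10)
    print(f"health endpoint returned {resp.status_code}")
    return resp.status_code == 200

def check_bucket() -> bool:
    s3 = boto3.client("s3")
    s3.head_bucket(Bucket=LANDING_BUCKET)                 # raises if missing or inaccessible
    print(f"bucket {LANDING_BUCKET} is reachable")
    return True

if __name__ == "__main__":
    ok = check_health() and check_bucket()
    sys.exit(0 if ok else 1)
```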

Soft Skills
  • Excellent problem-solving and troubleshooting skills.

  • Strong communication skills to interact with technical and non-technical stakeholders.

  • Ability to manage multiple deployments simultaneously while meeting deadlines.

Preferred
  • Certifications in AWS, Kubernetes, or Big Data technologies.

  • Experience with AML and fraud detection systems is a strong plus.

Key Competencies

  • Client-Centric Approach: Focused on delivering high-quality deployments tailored to client needs.

  • Technical Acumen: Expertise in big data and cloud technologies to ensure flawless deployments.

  • Collaboration: Works effectively with cross-functional teams to ensure deployment success.

  • Ownership: Takes full responsibility for deployment activities and outcomes.

  • Adaptability: Thrives in dynamic environments with changing requirements.

Success Metrics

  1. Deployment Accuracy: 100% of deployments completed successfully without post-deployment issues.

  2. Timeliness: Deployments delivered within agreed timelines for both on-premise and CaaS clients.

  3. System Performance: Achieve target SLAs for platform performance and stability post-deployment.

  4. Client Satisfaction: Positive feedback from clients on deployment experience and system functionality.

  5. Knowledge Sharing: Maintain and share deployment documentation to improve team efficiency.
