157 Global Data jobs in Malaysia

Global Data Warehouse - Teradata, ETL, Informatica

Kuala Lumpur, Kuala Lumpur RAPSYS TECHNOLOGIES PTE LTD

Posted 11 days ago


Job Description

We're Hiring: Global Data Warehouse Specialist!

We are seeking an experienced Data Warehouse professional to join our global team. The ideal candidate will have expertise in Teradata, ETL processes, and Informatica to design, develop, and maintain our enterprise data warehouse solutions.

Location: Kuala Lumpur, Malaysia

Work Mode: Work From Office

Role: Global Data Warehouse - Teradata, ETL, Informatica

What You'll Do

Design and develop Teradata data warehouse solutions

Build and optimize ETL processes using Informatica

Perform data modeling and database performance tuning

Create and maintain data pipelines for business intelligence

Collaborate with cross-functional teams on data requirements

Troubleshoot and resolve data warehouse issues

What We're Looking For

3+ years of experience with Teradata and data warehousing

Strong expertise in Informatica PowerCenter/IICS

Proficiency in SQL, ETL design patterns, and data modeling

Experience with performance tuning and optimization

Knowledge of data governance and quality processes

Strong analytical and problem-solving skills
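
For a sense of what "ETL design patterns and data modeling" means in practice for this role, here is a minimal, hypothetical sketch of the classic staging-to-dimension upsert. It uses Python's built-in sqlite3 module only so the example is self-contained and runnable; on Teradata the same pattern would typically be a MERGE statement driven by an Informatica mapping or a BTEQ script, and every table and column name below is invented for illustration.

    import sqlite3

    # In-memory database so the sketch runs anywhere; a real pipeline would
    # connect to Teradata instead (for example via the teradatasql driver).
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE stg_customer (customer_id INTEGER, name TEXT, city TEXT);
        CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    """)

    # Pretend these rows were extracted from a source system.
    con.executemany(
        "INSERT INTO stg_customer VALUES (?, ?, ?)",
        [(1, "Alice", "Kuala Lumpur"), (2, "Bob", "Penang")],
    )

    # Upsert pattern: update rows that already exist in the dimension,
    # then insert the rows that are new.
    con.executescript("""
        UPDATE dim_customer
        SET name = (SELECT s.name FROM stg_customer s WHERE s.customer_id = dim_customer.customer_id),
            city = (SELECT s.city FROM stg_customer s WHERE s.customer_id = dim_customer.customer_id)
        WHERE customer_id IN (SELECT customer_id FROM stg_customer);

        INSERT INTO dim_customer (customer_id, name, city)
        SELECT s.customer_id, s.name, s.city
        FROM stg_customer s
        WHERE s.customer_id NOT IN (SELECT customer_id FROM dim_customer);
    """)

    print(con.execute("SELECT * FROM dim_customer").fetchall())

The same two-step update-then-insert logic is what an Informatica mapping or a single Teradata MERGE would express; the sketch only shows the pattern the bullet points refer to.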

Ready to make an impact? Apply now and let's grow together!

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Other
Industries: IT Services and IT Consulting


Big Data Engineer

Shirlyn Technology

Posted 11 days ago


Job Description

Position Summary

We are looking for a full-time Big Data Engineer II to join our team at Shirlyn Technology in Petaling Jaya, Malaysia and Palo Alto, California. The Big Data team is at the heart of our operations, playing a critical role in scaling and optimizing our data infrastructure. As we continue to grow, we are seeking an experienced and driven Big Data Engineer to help us tackle complex data challenges, enhance data solutions, and ensure the security and quality of our data systems. You will be instrumental in building and deploying scalable data pipelines to ensure seamless data flow across systems, driving business success through reliable data solutions.

Job Responsibilities

  • Collaborate with global teams in data, security, infrastructure, and business functions to understand data requirements and provide scalable solutions.
  • Design, implement, and maintain efficient ETL (Extract, Transform, Load) pipelines to enable smooth data flow between systems.
  • Conduct regular data quality checks to ensure accuracy, consistency, and completeness within our data pipelines.
  • Continuously improve data pipeline performance and reliability, identifying and addressing any inefficiencies or bottlenecks.
  • Ensure data integrity, security, privacy, and availability across all systems.
  • Proactively monitor data pipelines, quickly diagnosing issues and taking action to resolve them and maintain system uptime.
  • Conduct root cause analysis of data-related issues and work on long-term solutions to prevent recurring problems.
  • Document pipeline designs, data processes, and troubleshooting procedures, keeping stakeholders informed with clear communication of updates.
  • Provide on-call support for critical data operations, ensuring systems are running 24/7, with rotational responsibilities.

Job Requirements

  • Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
  • 5+ years of hands-on experience building and optimizing data pipelines using big data technologies (Hive, Presto, Spark, Flink).
  • Expertise in writing complex SQL queries and proficiency in Python programming, focusing on maintainable and clean code.
  • Solid knowledge of scripting languages (e.g., Shell, Perl).
  • Experience working in Unix/Linux environments.
  • Familiarity with cloud platforms such as Azure, AWS, or GCP.
  • High personal integrity and professionalism in handling confidential information, exercising sound judgment.
  • Ability to remain calm and resolve issues swiftly during high-pressure situations, adhering to SLAs.
  • Strong leadership skills, including the ability to guide junior team members and lead projects across the team.
  • Excellent verbal and written communication skills, with the ability to collaborate effectively with remote teams globally.
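
As a rough illustration of the pipeline-plus-quality-check work described in the responsibilities above, here is a minimal PySpark sketch. The paths, column names, and quality rule are hypothetical, and it assumes a local Spark installation; it is a sketch of the general pattern, not this team's actual pipeline.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.master("local[*]").appName("orders_etl").getOrCreate()

    # Extract: read raw events (hypothetical path and schema).
    raw = spark.read.json("/data/raw/orders/")

    # Transform: keep completed orders and derive a daily partition column.
    orders = (
        raw.filter(F.col("status") == "COMPLETED")
           .withColumn("order_date", F.to_date("created_at"))
    )

    # Data-quality check: no null order IDs and no duplicate order IDs.
    null_ids = orders.filter(F.col("order_id").isNull()).count()
    dup_ids = orders.count() - orders.dropDuplicates(["order_id"]).count()
    if null_ids or dup_ids:
        raise ValueError(f"quality check failed: {null_ids} null ids, {dup_ids} duplicates")

    # Load: write a partitioned, columnar copy for downstream consumers.
    orders.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders/")

    spark.stop()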


Big Data Hadoop Developer

Kuala Lumpur, Kuala Lumpur Unison Consulting Pte Ltd

Posted 7 days ago


Job Description

Job Summary:
We are looking for a Big Data Hadoop Developer to design, develop, and maintain large-scale data processing solutions. The ideal candidate should have strong hands-on experience with the Hadoop ecosystem and integration with relational databases such as MariaDB or Oracle DB for analytics and reporting.

Key Responsibilities:

  • Design, develop, and optimize Hadoop-based big data solutions for batch and real-time data processing.
  • Work with data ingestion frameworks to integrate data from MariaDB/Oracle DB into Hadoop (Sqoop, Apache Nifi, Kafka).
  • Implement Hive, Spark, and MapReduce jobs for data transformation and analytics.
  • Optimize Hive queries, Spark jobs, and HDFS usage for performance and cost efficiency.
  • Create and maintain ETL pipelines for structured and unstructured data.
  • Troubleshoot and resolve issues in Hadoop jobs and database connectivity.
  • Collaborate with BI, analytics, and data science teams for data provisioning.
  • Ensure data security, governance, and compliance in all solutions.

Technical Skills:

  • Big Data Ecosystem: Hadoop (HDFS, YARN), Hive, Spark, Sqoop, MapReduce, Oozie, Flume.
  • Databases: MariaDB and/or Oracle DB (SQL, PL/SQL).
  • Programming: Java, Scala, or Python for Spark/MapReduce development.
  • Data Ingestion: Sqoop, Kafka, Nifi (for integrating RDBMS with Hadoop).
  • Query Optimization: Hive tuning, partitioning, bucketing, indexing.
  • Tools: Ambari, Cloudera Manager, Git, Jenkins.
  • OS & Scripting: Linux/Unix shell scripting.
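
To make the RDBMS-to-Hadoop integration step concrete, here is a hedged PySpark sketch of one common approach: reading a table from Oracle (or MariaDB, with a jdbc:mariadb:// URL) over JDBC and landing it in a partitioned Hive table. Connection details, table names, and the partition column are placeholders, the appropriate JDBC driver must be on Spark's classpath, and Sqoop or NiFi, as listed above, are equally valid tools for this step.

    from pyspark.sql import SparkSession, functions as F

    # Hive support lets saveAsTable create a managed, partitioned Hive table.
    spark = (
        SparkSession.builder.appName("rdbms_to_hive")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Ingest: read the source table over JDBC (all connection values are placeholders).
    src = (
        spark.read.format("jdbc")
        .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1")
        .option("dbtable", "SALES.TRANSACTIONS")
        .option("user", "etl_user")
        .option("password", "change_me")
        .option("fetchsize", "10000")
        .load()
    )

    # Land it in Hive, partitioned by load date, for downstream Hive/Spark queries.
    (
        src.withColumn("load_date", F.current_date())
           .write.mode("append")
           .partitionBy("load_date")
           .saveAsTable("staging.transactions")
    )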

Soft Skills:

  • Strong analytical skills and problem-solving abilities.
  • Good communication skills for working with cross-functional teams.
  • Ability to manage priorities in a fast-paced environment.

Nice to Have:

  • Experience with cloud-based big data platforms (AWS EMR, Azure HDInsight, GCP Dataproc).
  • Knowledge of NoSQL databases (HBase, Cassandra).
  • Exposure to machine learning integration with Hadoop/Spark.

Senior Big Data Engineer

Kuala Lumpur, Kuala Lumpur NEAR

Posted 11 days ago


Job Description

Senior Java Engineer (based in Kuala Lumpur)

Job Advantages

Cutting-edge technology (web3 and blockchain), globalized company

Job Responsibilities

  1. Responsible for the architectural design and development of the company's data center.
  2. Continuously deliver data value to the company's businesses with high quality, low cost and high efficiency.
  3. Provide solutions for platform architecture and performance optimization, and lead the rapid iteration of the big data platform.
  4. Enable the rapid development of the company's business through data-driven approaches, continuously improving the company's core competitiveness.

Job Requirements

  1. Bachelor's degree or above in computer science or a related major, with more than 3 years of Java development experience and an in-depth understanding of JVM principles.
  2. Familiar with commonly used frameworks (Spring, Spring MVC, gRPC, Mybatis) and able to master their principles and mechanisms.
  3. Solid computer fundamentals, mastery of computer operating systems and network architecture, familiar with commonly used algorithms, data structures, and design patterns.
  4. Familiar with distributed systems, caching, messaging, and other mechanisms; experience with Redis, Kafka, Spark, Flink, Zookeeper, and TiDB is preferred.
  5. Experience in data center or microservice design and development, with a focus on high availability and scalability of architecture is preferred.
  6. Experience in tagging systems or algorithmic recommendation systems is preferred.
  7. Enthusiastic about the blockchain industry and committed to contributing to its development.
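
The distributed-systems item above names Redis for caching; the role itself is Java, but the cache-aside pattern it refers to is language-agnostic. Below is a small, hypothetical Python sketch using the redis-py client; the key scheme, TTL, and the load_user_from_db stand-in are invented for illustration.

    import json
    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)
    CACHE_TTL_SECONDS = 300  # invented TTL for the example

    def load_user_from_db(user_id: int) -> dict:
        # Stand-in for a real database (or TiDB) lookup.
        return {"id": user_id, "name": f"user-{user_id}"}

    def get_user(user_id: int) -> dict:
        # Cache-aside: try Redis first, fall back to the database, then populate the cache.
        key = f"user:{user_id}"
        cached = r.get(key)
        if cached is not None:
            return json.loads(cached)
        user = load_user_from_db(user_id)
        r.setex(key, CACHE_TTL_SECONDS, json.dumps(user))
        return user

    print(get_user(42))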

Senior Big Data Engineer

Kuala Lumpur, Kuala Lumpur Bybit

Posted 11 days ago


Job Description

Company: Web3 & Blockchain-focused Company

Job Advantages

  • Cutting-edge technology (Web3 and blockchain)
  • Globalized company

Job Responsibilities

As a Senior Java Engineer, you will play a pivotal role in the development and optimization of the company’s data infrastructure, ensuring it supports the evolving needs of our blockchain-focused business. Your primary responsibilities will include:

  • Architectural Design & Development: Lead the design and development of the company’s data center, ensuring robust performance and scalability.
  • Data Platform Optimization: Provide ongoing solutions for platform architecture and performance optimization to support the company’s rapid growth and business needs.
  • Business Enablement: Drive data-driven solutions that accelerate business development and continuously enhance the company's core competitiveness.
  • Big Data Iteration: Lead the rapid iteration of big data platforms, ensuring efficiency, cost-effectiveness, and high-quality output.

Job Requirements

We are seeking candidates who are passionate about blockchain technology and possess a strong technical foundation. Ideal candidates will meet the following criteria:

  • Education: Bachelor’s degree or above in Computer Science or related majors.
  • Experience: More than 3 years of experience in Java development, with a deep understanding of JVM principles and Java development best practices.
  • Frameworks: Proficient in frameworks such as Spring, Spring MVC, gRPC, Mybatis, and an ability to understand their underlying principles and mechanisms.
  • Core Computer Fundamentals: Strong knowledge of computer operating systems, network architecture, and proficiency in commonly used algorithms, data structures, and design patterns.
  • Distributed Systems: Familiarity with distributed systems, caching mechanisms (Redis), messaging systems (Kafka), and big data processing frameworks like Spark, Flink, and Zookeeper. Experience with TiDB is a plus.
  • Experience in Microservices: Experience in the design and development of data centers or microservices, with an emphasis on high availability and scalability.
  • Tagging Systems/Recommendation Systems: Experience with tagging systems or algorithmic recommendation systems is highly desirable.
  • Passion for Blockchain: A strong enthusiasm for the blockchain industry and a commitment to contributing to its growth and development.
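
The messaging requirement above is likewise framed for Java, but as a rough, language-agnostic illustration of the produce/consume flow, here is a minimal sketch using the kafka-python client. The broker address, topic name, and message shape are all assumptions, not details from the posting.

    import json
    from kafka import KafkaProducer, KafkaConsumer

    BROKERS = "localhost:9092"   # placeholder broker address
    TOPIC = "user-events"        # placeholder topic

    # Produce one JSON-encoded event.
    producer = KafkaProducer(
        bootstrap_servers=BROKERS,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, {"user_id": 42, "action": "login"})
    producer.flush()

    # Consume events from the beginning of the topic, giving up after five idle seconds.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKERS,
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print(message.value)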

Why Join Us?

This role offers the opportunity to work at the forefront of blockchain and Web3 technologies within a global company. You will have the chance to develop and optimize critical infrastructure that powers innovative and scalable solutions in the blockchain space. If you’re ready to work in a fast-paced and cutting-edge environment, this role could be the perfect fit for you.

Apply now to be part of an exciting journey in revolutionizing the Web3 ecosystem!





Global Master Data Coordinator

Petaling Jaya, Selangor Givaudan

Posted 11 days ago


Job Description

Givaudan is the global leader in the creation of flavours and fragrances. In close collaboration with food, beverage, consumer product and fragrance partners, Givaudan develops tastes and scents that delight consumers the world over. With a passion to understand consumers’ preferences and a relentless drive to innovate, Givaudan is at the forefront of creating flavours and fragrances that ‘engage your senses’. The Company achieved sales of CHF 5.1 billion in 2017. Headquartered in Switzerland with a local presence in over 100 locations, the company has more than 11,100 employees worldwide. Givaudan invites you to discover more.

Join us and celebrate the beauty of human experience. Create for happier, healthier lives, with love for nature. Together, with kindness and humility, we deliver food innovations, craft inspired fragrances and develop beauty and wellbeing solutions that make people look and feel good. There’s much to learn and many to learn from, with more than 16,000 employees around the world to explore ideas and ambitions with. Dive into varied, flexible, and stimulating environments. Meet empowered professionals to partner with, befriend, and stretch your skills alongside. Every day, your energy, your creativity, and your determination will shape our future, making a positive difference on billions of people. Every essence of you enriches our world. We are Givaudan. Human by nature.

Overall mission:

The EDM Master Data Coordinator creates and maintains master data, monitors processes to proactively identify potential delays, and extracts, analyzes, and mass-loads master data into our core systems. The individual also supports continuous improvement, contributing to better service levels in the business by making master data available in a timely and accurate manner.

What you will be accountable for:

Daily activities

  • Data Management – create and maintain master data within the agreed service level, and execute daily Information Management activities within the agreed service level
  • Basic master data service requests – execution of reports and mass uploads
  • Execute issue-resolution requests related to Data/Information Management
  • Run proactive monitoring and follow up on expected activities
  • Support and advise the business on master data/information management matters
  • Perform data quality checks and correction activities (illustrated in the sketch after this list)
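
As a small illustration of the data quality checks mentioned in the last item, here is a hypothetical pandas sketch that flags duplicate keys and missing mandatory fields in a material master extract. The column names and rules are invented, and in practice such checks often run inside SAP or the ETL tool itself rather than in a standalone script.

    import pandas as pd

    # Hypothetical master data extract (for example, exported from the ERP to CSV).
    materials = pd.DataFrame(
        {
            "material_id": ["M-100", "M-101", "M-101", "M-102"],
            "description": ["Citrus base", "Vanilla extract", "Vanilla extract", None],
            "base_unit": ["KG", "L", "L", "KG"],
        }
    )

    mandatory = ["material_id", "description", "base_unit"]

    # Rule 1: material_id must be unique.
    duplicates = materials[materials["material_id"].duplicated(keep=False)]

    # Rule 2: mandatory fields must be populated.
    incomplete = materials[materials[mandatory].isna().any(axis=1)]

    print(f"{len(duplicates)} duplicate rows, {len(incomplete)} incomplete rows")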

Process-related activities

  • Contribute to Master Data Projects & Change Requests
  • Propose process improvements – idea management
  • Support initiatives/projects/data cleansing activities
  • Act as contact person for key internal and external stakeholders for operational topics
  • Build constructive and effective relationships with cross-functional teams and stakeholders
  • Communicate effectively with internal and external partners by phone or via email
  • Mentor and train new team members, including delivering training to the team when new processes are implemented
  • Act as the reference point for junior team members


Your professional profile includes

  • Bachelor's degree in a relevant field
  • Fresh graduates are welcome to apply
  • Good English communication skills; additional language knowledge is an added advantage
  • Experienced candidates with 1-2 years of relevant experience in a function-specific role or equivalent are preferred
  • Experience of working in a shared service environment is an advantage
  • A growth mindset: driven, organized and customer-centric
  • Good working knowledge of SAP or other ERP systems
  • Initial experience with ETL tools and processes
  • MS Excel and good knowledge of Microsoft Office or an equivalent tool
  • Strong numeracy and administration skills
  • Understanding of legal and statutory requirements, corporate policies, work instructions and regulations

What we offer

  • Best-in-class benefits, competitive pay, and a nurturing and progressive environment
  • Excellent opportunities for progressive learning and development
  • A creative team environment that will inspire you
  • Comprehensive healthcare plans


At Givaudan, you contribute to delightful taste and scent experiences that touch people’s lives.
You work within an inspiring teamwork culture – where you can thrive, collaborate and learn from other talented and passionate people across disciplines, regions and divisions.
Every essence of you enriches our world.
Diversity drives innovation and creates closer connections with our employees, customers and partners.
Givaudan embraces diversity and is committed to building an inclusive environment where everyone impacts our world.



Join us and Impact Your World
