115 Data Integration jobs in Malaysia

Senior Software Engineer (Data Integration)

Kuala Lumpur, Kuala Lumpur EPAM Systems

Posted 5 days ago

Job Description

Senior Software Engineer (Data Integration)

EPAM Systems Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia

We are currently seeking a data integration expert for a Back End Developer backfill role in Kuala Lumpur. Your Azure expertise will be put to good use, and you will focus on building our Python ETL processes and writing excellent SQL. If your passion lies in making the most of data and working with equally brilliant and driven teammates, this is the ideal role in which to grow your career.
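
For illustration only, since the posting itself contains no code, the snippet below sketches the kind of Python ETL step this role describes: extract from a source table, apply a simple data-quality rule, and load into a warehouse table. The connections, table names, and transformation are placeholder assumptions (SQLite is used only so the example is self-contained and runnable); a real pipeline here would point at Azure SQL or Synapse.

```python
# Minimal ETL sketch (illustrative, not EPAM's actual pipeline).
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("sqlite:///:memory:")   # placeholder for the real source connection
target = create_engine("sqlite:///:memory:")   # placeholder for the real warehouse connection

# Seed a hypothetical source table so the example runs end to end.
pd.DataFrame(
    {"order_id": [1, 2, 3], "amount": [120.0, 80.5, None], "country": ["MY", "SG", "MY"]}
).to_sql("orders", source, index=False)

# Extract
df = pd.read_sql("SELECT order_id, amount, country FROM orders", source)

# Transform: a basic data-quality rule (drop rows with missing amounts) plus a derived column.
df = df.dropna(subset=["amount"])
df["amount_rounded"] = df["amount"].round(2)

# Load
df.to_sql("fact_orders", target, if_exists="replace", index=False)

print(pd.read_sql("SELECT * FROM fact_orders", target))
```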

Responsibilities


  • Develop and maintain data integration solutions
  • Collaborate with the team to design and implement ETL processes
  • Create and maintain database structures and schema
  • Develop and implement data quality controls
  • Assist with troubleshooting and issue resolution


Requirements


  • At least 5 years of experience working in Data Integration with 3 years of Azure Data Factory experience
  • Azure Synapse Analytics experience
  • DWH & DB Concepts knowledge
  • ETL/ELT Solutions experience
  • SAP Data Services experience
  • SQL proficiency
  • Microsoft Power BI familiarity is a bonus


We offer


  • By choosing EPAM, you're getting a job at one of the most loved workplaces according to Newsweek in 2021, 2022, and 2023
  • Employee ideas are the main driver of our business. We have a very supportive environment where your voice matters
  • You will be challenged while working side-by-side with the best talent globally. We work with top-notch technologies, constantly seeking new industry trends and best practices
  • We offer a transparent career path and an individual roadmap to engineer your future & accelerate your journey
  • At EPAM, you can find vast opportunities for self-development: online courses and libraries, mentoring programs, partial grants of certification, and experience exchange with colleagues around the world. You will learn, contribute, and grow with us


Life at EPAM


  • EPAM is a leader in the fastest-growing segment (product development/digital platform engineering) of the IT industry. We acquired Just-BI in 2021 to reinforce our leading position as a global Business Intelligence services provider and have been growing rapidly. With a talented multinational team, we provide data and analytics expertise
  • We are currently involved in end-to-end BI design and implementation projects in major national and international companies. We are proud of our entrepreneurial start-up culture and are focused on investing in people by creating continuous learning and development opportunities for our employees who deliver engineering excellence for our clients

Seniority level
  • Mid-Senior level
Employment type
  • Full-time
Job function
  • Business Development, Information Technology, and Engineering
Industries
  • Software Development and IT Services and IT Consulting

AHCS & GL – Data & Integration Engineer

Kuala Lumpur, Kuala Lumpur Prudential Services Asia

Posted 19 days ago

Job Description

AHCS & GL – Data & Integration Engineer

Apply locations: Kuala Lumpur (Group Head Office)

Time type: Full time

Posted on: 5 days ago

Job requisition id: 24110415

Prudential’s purpose is to be partners for every life and protectors for every future. Our purpose encourages everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people’s career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.

The AHCS & GL – Data & Integration Engineer is responsible for co-creating the technical design and working closely with the Finance GTP Accounting Lead on the end-to-end (E2E) implementation of the integration platform supporting insurance/reinsurance feeds, investment feeds, and the IFRS 17/Consolidation platform, covering flows from source systems into Oracle Fusion Financials ERP (including AHCS) and outbound from Oracle Fusion Financials ERP to the IFRS 17 platform.

This role collaborates closely with cross-geo teams (insurance and non-insurance entities), cross-functional teams (other FFP workstreams) and cross-systems teams (source & consuming systems teams from both local business entities and group office) to ensure the integrity, security, and performance of Prudential’s E2E closing throughout the system lifecycle.

Role and Responsibilities

1. Co-create the technical design for the integration platform and patterns, and support the build team, together with the Finance GTP Accounting Lead, to:

  • Ensure compliance of the data architecture with relevant financial regulations, accounting standards, and data privacy requirements.
  • Design and implement data security controls, such as data encryption, access controls, and audit trails, across the Oracle Fusion Financials ERP ecosystem.
  • Collaborate with Security and Compliance teams to ensure adherence to industry standards and regulatory requirements across SEA for the insurance industry.

2. Support E2E QA (with a focus on SIT & NFT), Data Migration and Integration, and the Oracle Quarterly Release Upgrade:

  • Support the definition of test plans and migration plans, and execute data migration activities from legacy systems to Oracle Fusion Financials ERP.
  • Help design and implement test cases and data migration strategies, considering data mapping, data transformation, and data validation requirements (a minimal reconciliation sketch follows this list).
  • Collaborate with the integration specialists to ensure seamless data flow and integration between Oracle Fusion Financials modules and other enterprise systems.
  • Develop and maintain data integration specifications and documentation.
  • Provide functional and technical support around the Oracle quarterly release upgrade.
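
As a purely generic illustration of the data-validation idea mentioned above, the sketch below compares row counts, key coverage, and per-column sums between a legacy extract and the loaded target. It does not use any Oracle Fusion API, and the table and column names are hypothetical.

```python
# Generic migration reconciliation sketch (illustrative; names are hypothetical).
import pandas as pd

def reconcile(legacy, target, key, measures):
    """Compare a legacy extract against the migrated target on counts, keys and sums."""
    report = {
        "row_count_match": len(legacy) == len(target),
        "missing_keys": sorted(set(legacy[key]) - set(target[key])),
    }
    for col in measures:
        report[f"{col}_sum_diff"] = float(legacy[col].sum() - target[col].sum())
    return report

legacy = pd.DataFrame({"journal_id": [1, 2, 3], "debit": [100.0, 50.0, 25.0]})
target = pd.DataFrame({"journal_id": [1, 2, 3], "debit": [100.0, 50.0, 25.0]})

print(reconcile(legacy, target, key="journal_id", measures=["debit"]))
```
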
Qualifications
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 7+ years of experience in designing and implementing data architectures for complex financial systems, preferably in the Oracle ecosystem.
  • 2-3 full-cycle implementations of Oracle Fusion Financials ERP, including AHCS.
  • Experience with data migration, data quality management, and data governance frameworks.
  • Excellent problem-solving and analytical skills.
  • Strong communication and leadership abilities.
  • Ability to work collaboratively in a fast-paced, multi-stakeholder environment.
  • Willing to travel as the project may require.
Mandatory Skills
  • Strong expertise in Oracle Fusion Financials ERP modules including AHCS and their underlying data structures.
  • Experience with data architecture, data engineering, and integration patterns in Financial Services (insurance preferred).
  • Familiarity with financial accounting concepts and processes (ideally with IFRS 17 exposure).
Preferred Skills
  • Technology delivery project management experience, especially legacy systems migration.
  • Experience with DevOps, CI/CD practices, master data management and data governance.
  • In-depth knowledge of Oracle Financials data structure and data integration techniques.

Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability or part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.

About Us

Prudential plc provides life and health insurance and asset management, with a focus on Asia and Africa. We help people get the most out of life, by making healthcare affordable and accessible and by promoting financial inclusion. We protect people’s wealth, help them grow their assets, and empower them to save for their goals. The business has more than 18 million life customers in Asia and Africa. Prudential has been providing trusted financial security for 95 years and is listed on stock exchanges in London, Hong Kong, Singapore, and New York.

We are proud to be included in the 2023 Bloomberg Gender Equality Index. The index measures gender equality across five pillars: female leadership and talent pipeline, equal pay and gender pay parity, inclusive culture, sexual harassment policies, and pro-women brand. Our inclusion in this global index is testament to our commitment to nurturing diverse talent.

Prudential plc is not affiliated in any manner with Prudential Financial, Inc., a company whose principal place of business is in the United States of America or with the Prudential Assurance Company, a subsidiary of M&G plc, a company incorporated in the United Kingdom.

AVP, Data Integration (pySpark, Nifi, Hadoop)

Kuala Lumpur, Kuala Lumpur Maybank

Posted 5 days ago

Job Description

AVP, Data Integration (pySpark, Nifi, Hadoop)

Maybank WP. Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia

  • Implement ETL systems that are operationally stable, efficient, and automated. This includes technical solutions that are scalable, aligned with the enterprise architecture, and adaptable to business changes.
  • Collaborate with internal and external teams to define requirements for data integrations, specifically for Data Warehouse/Data Marts implementations.

Responsibilities of the Role

  • Review business and technical requirements to ensure the data integration platform meets specifications.
  • Apply industry best practices for ETL design and development.
  • Produce technical design documents, system testing plans, and implementation documentation.
  • Conduct system testing: execute job flows, investigate and resolve system defects, and document results.
  • Work with DBAs, application specialists, and technical support teams to optimize ETL system performance and meet SLAs.
  • Assist in developing, documenting, and applying best practices and procedures.
  • Strong SQL writing skills are required.
  • Familiarity with ETL tools such as PySpark, NiFi, Informatica, and Hadoop is preferred (a minimal PySpark sketch follows this list).
  • Understanding of data integration best practices, including master data management, entity resolution, data quality, and metadata management.
  • Experience with data warehouse architecture, source system data analysis, and data profiling.
  • Ability to work effectively in a fast-paced, adaptive environment.
  • Financial domain experience is a plus.
  • Ability to work independently and communicate effectively across various levels, including product owners, executive sponsors, and team members.
  • Experience working in an Agile environment is advantageous.
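
As a rough sketch of the PySpark-style ETL work referenced in the list above (not Maybank's actual pipeline), the example below reads a hypothetical landing-zone extract, applies simple cleansing rules, and writes a partitioned staging table. The paths, columns, and rules are assumptions for illustration only.

```python
# Illustrative PySpark ETL step; paths, columns and rules are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Hypothetical source: daily transaction extracts landed as CSV files.
txns = spark.read.option("header", True).csv("/landing/transactions/*.csv")

cleaned = (
    txns.withColumn("amount", F.col("amount").cast("double"))
        .filter(F.col("amount").isNotNull())
        .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
        .dropDuplicates(["txn_id"])
)

# Write partitioned output ready for downstream Data Warehouse / Data Mart loads.
cleaned.write.mode("overwrite").partitionBy("txn_date").parquet("/warehouse/stg_transactions")
```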

Qualifications

  • Bachelor’s Degree in Computer Science, Information Technology, or equivalent.
  • Over 5 years of total work experience, with experience programming ETL processes using Informatica, NiFi, pySpark, and Hadoop.
  • At least 4 years of experience in data analysis, profiling, and designing ETL systems/programs.
Seniority level
  • Mid-Senior level
Employment type
  • Full-time
Job function
  • Information Technology
Industries
  • Banking

Research Officer (Research Data Integration), BII

Negeri Sembilan, Negeri Sembilan A*STAR RESEARCH ENTITIES

Posted 5 days ago

Job Description

About us:

Data science is an important component of biomedical and translational research, where data of multiple modalities are constantly being generated at unprecedented scale. The Research Data Integration group in the Biomedical Datahub Division of the Bioinformatics Institute (BII), A*STAR, aims to bridge the complexity of computational biology and data science with the needs of biologists and clinicians to drive biological discoveries and predict translational outcomes. One of our immediate challenges is to integrate and analyze multi-omics, imaging and clinical data generated by biomedical institutes in A*STAR, healthcare institutions and national initiatives in Singapore to improve the usability and interpretability of large-scale multimodal datasets of cancer, metabolic diseases, and skin diseases. We seek motivated individuals to join us to push the potential of biomedical data in truly benefitting patients.

Job description:

We are seeking a highly motivated research officer to drive data management, analysis and integration of large-scale and diverse multi-omics (genomics, transcriptomics, proteomics, metabolomics, lipidomics) and clinical data related to cancer and other diseases. The candidate will be expected to handle a variety of tasks, including curating and quality-controlling incoming large-scale multi-omics data, implementing and benchmarking bioinformatics workflows, performing downstream analysis, developing data integration platforms comprising file catalogues, databases, knowledge graphs and visualization of datasets through interactive dashboards, and implementing international data standards to ensure data interoperability. The successful candidate will also extensively mine publicly available resources and link them up with our datasets, and will manage and maintain all data in line with the necessary security requirements. This position offers the candidate an opportunity to gain expert knowledge in multi-dimensional data from cancer and other diseases, and to become computationally competent in delivering this knowledge to a diverse group of stakeholders including clinicians and wet-lab scientists. There are also many opportunities to apply AI/ML across these tasks and data to drive efficiency and uncover new insights.
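
As a toy illustration of one integration task described above (sample identifiers, values, and column names are invented), the sketch below aligns two omics tables with clinical metadata on a shared sample ID and computes a simple per-column completeness summary as a QC check.

```python
# Toy multi-omics integration sketch; all identifiers and values are invented.
import pandas as pd

clinical = pd.DataFrame({"sample_id": ["S1", "S2", "S3"], "diagnosis": ["tumour", "normal", "tumour"]})
rnaseq = pd.DataFrame({"sample_id": ["S1", "S2", "S3"], "TP53_expr": [8.2, 5.1, 9.0]})
proteome = pd.DataFrame({"sample_id": ["S1", "S3"], "TP53_prot": [1.4, 2.1]})

# Outer-join on sample_id so missing modalities stay visible instead of being silently dropped.
integrated = (
    clinical.merge(rnaseq, on="sample_id", how="outer")
            .merge(proteome, on="sample_id", how="outer")
)

# Simple QC summary: fraction of non-missing values per column.
completeness = integrated.notna().mean().rename("completeness")

print(integrated)
print(completeness)
```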

Requirements:

  • Bachelor's or Master's degree in Bioinformatics, Computational Biology, Data Science, Computer Science, Analytics, Engineering, Genetics or a related field.
  • Strong programming skills (e.g. Python, R, RStudio, Jupyter Notebook, Shinyapps).
  • Highly experienced with Unix/Linux environment and cloud architecture (AWS).
  • Familiar with bioinformatics, database development (structured and unstructured, SQL), visualization tools, AI/ML and other big data technologies.
  • Experience with large datasets and knowledge in data management and data standards.
  • Understanding of biological concepts and methodologies relevant to multi-omics data and cancer.
  • Strong analytical and problem-solving skills, and attention to detail.
  • Excellent oral and written communication and presentation skills.
  • Able to work independently, and work collaboratively in a team environment, with a positive and enthusiastic learning attitude for new methodologies and technologies.
  • Competent project management and organizational skills will be very valuable.

The above eligibility criteria are not exhaustive. A*STAR may include additional selection criteria based on its prevailing recruitment policies. These policies may be amended from time to time without notice. We regret that only shortlisted candidates will be notified.

Computational Scientist - AI & Data Integration (Metabolic Disease), A*STAR BII

Negeri Sembilan, Negeri Sembilan A*STAR RESEARCH ENTITIES

Posted 2 days ago

Job Description

About us

AI/ML and data science are important components of biomedical and translational research, where data of multiple modalities are constantly being generated at an unprecedented scale. The Research Data Integration group in the Bioinformatics Institute (BII), A*STAR, aims to bridge the complexity of computational biology and data science with the hypotheses and findings of biologists and clinicians. One of our immediate challenges is to integrate and analyze multi-omics, imaging and clinical data generated by biomedical institutes in A*STAR, healthcare institutions and national initiatives in Singapore in the areas of cancer, metabolic, eye and skin diseases. We seek motivated individuals to join us to push the potential of biomedical data in driving biological discoveries and creating translational impact that benefits patients.

Position summary

We are seeking a highly motivated Computational Scientist with expertise in artificial intelligence, multi-omics data integration, and metabolic disease biology. The successful candidate will play a critical role in developing AI/ML frameworks to integrate and analyze high-dimensional biological datasets (e.g., genomics, transcriptomics, proteomics, imaging) with clinical data to identify novel mechanisms, biomarkers, and therapeutic targets in metabolic diseases such as MASLD/MASH and obesity. This position offers the candidate an opportunity to work closely with a diverse and collaborative team of computational and experimental scientists and clinicians on cutting-edge research at the intersection of AI and human health, and to drive translational impact in metabolic diseases.

Key Responsibilities

  • Develop and implement AI/ML models to analyze and integrate multi-omics and clinical datasets related to metabolic disorders.
  • Collaborate with biologists and clinicians to generate testable hypotheses from integrated datasets, and contribute actively to laboratory validations and screening studies through analytical findings.
  • Design pipelines for preprocessing, normalization, and harmonization of heterogeneous data types.
  • Apply and develop novel computational methodologies to uncover disease mechanisms.
  • Drive biomarker discovery, patient stratification, and target identification using machine learning approaches (a minimal sketch follows this list).
  • Build a database integrating and linking multi-dimensional clinical and multi-omics data for metabolic diseases.
  • Develop visualizations and dashboards to communicate complex data insights to interdisciplinary teams.
  • Stay up to date with emerging computational techniques and tools in systems biology and AI.
  • Drive high-impact scientific publications, patent filings, presentations and grant proposals.
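
As a minimal, hypothetical sketch of the stratification step mentioned in the list above (the data are random placeholders, not a real cohort), the example below standardises integrated features and clusters samples into candidate subgroups with scikit-learn.

```python
# Minimal patient-stratification sketch on synthetic data (illustrative only).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))   # 200 samples x 50 integrated omics features (synthetic)

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)

# Candidate subgroups that would then be compared against clinical phenotypes.
print(np.bincount(labels))
```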

Qualifications

  • Ph.D. in Computational Biology, Bioinformatics, Computer Science, Data Science, Systems Biology, or a related field.
  • Strong background in programming and AI/ML (e.g., deep learning, ensemble methods, graph-based learning, agentic AI, explainable AI).
  • Proficiency in programming languages such as Python and R; experience with AI/ML frameworks like TensorFlow, PyTorch, or scikit-learn.
  • Demonstrated experience in integrating and analysing complex multi-omics datasets (e.g., RNA-seq, WGS, proteomics, GWAS).
  • Highly experienced with Unix/Linux environments and/or cloud architecture.
  • Solid understanding of metabolic disease biology and relevant clinical phenotypes.
  • Experience working with large-scale, multi-dimensional datasets from biobanks, cohorts, or clinical trials.
  • Track record of peer-reviewed publications in computational biology or bioinformatics.
  • Experience in a cross-functional, collaborative environment in academia or industry.
  • Knowledge of data security, data standards and interoperability, and reproducible research practices.
  • Strong analytical and problem-solving skills, and attention to detail.
  • Excellent oral and written communication and presentation skills.
  • Able to work independently and collaboratively in a multi-disciplinary team environment.
  • Competent project and data management, and organizational skills.

The above eligibility criteria are not exhaustive. A*STAR may include additional selection criteria based on its prevailing recruitment policies. These policies may be amended from time to time without notice. We regret that only shortlisted candidates will be notified.
