46 Data Pipeline jobs in Malaysia
Data Engineer (Python, SQL, ETL Pipeline, PowerBI, etc)
Posted 1 day ago
Job Description
This job is for a Data Engineer who builds and manages data pipelines for reporting. You might like this job because you’ll work with tools like Python and SQL, collaborate with others, and ensure data is accurate and well-organized!
- Assist in the design, development, and maintenance of data pipelines to support PowerBI reporting.
- Help integrate data from various third-party applications into our data warehouse.
- Collaborate with senior team members to understand reporting requirements and translate them into technical specifications.
- Support the optimization and troubleshooting of data workflows to ensure efficient data processing.
- Contribute to ensuring data quality and integrity across all data sources.
- Assist in developing and maintaining documentation for data processes and systems.
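The pipeline duties above can be pictured with a minimal, self-contained sketch. Everything here is illustrative: the CSV source, column names, and `fact_orders` table are invented, and SQLite stands in for whatever warehouse actually backs the Power BI reports.

```python
# Minimal ETL sketch: pull records from a (hypothetical) third-party
# export, clean them, and load them into a warehouse table that a
# Power BI report could query.
import io
import sqlite3
import pandas as pd

# Inline CSV stands in for a vendor file drop or API export.
RAW_CSV = """order_id,order_date,amount
1001,2024-01-05,250.00
1002,2024-01-06,99.50
,2024-01-07,10.00
"""

def extract(source) -> pd.DataFrame:
    # In practice this might be an API pull or an SFTP/S3 download.
    return pd.read_csv(source)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["order_id"])            # drop rows missing the key
    df["order_date"] = pd.to_datetime(df["order_date"])
    return df

def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    # SQLite stands in for the real warehouse here.
    df.to_sql("fact_orders", conn, if_exists="replace", index=False)

conn = sqlite3.connect(":memory:")
load(transform(extract(io.StringIO(RAW_CSV))), conn)
rows = conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
```

A Power BI dataset would then point at `fact_orders`; the real load step would target the production warehouse rather than SQLite.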
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Minimum 1 to 2 years’ experience in data engineering or a related field.
- Basic proficiency in Python for data manipulation and automation.
- Familiarity with SQL, Power BI, and database management.
- Must have experience in Power BI management.
- Understanding of data integration tools and ETL processes.
- Eagerness to learn and develop new skills.
- Strong problem-solving skills and attention to detail.
- Coding experience in Python and data management.
- A proven track record in pipeline maintenance and data migration.
- Familiarity with data warehouses, SQL, and Zendesk.
Skills
Data Analysis
Extract Transform Load (ETL)
SQL (Programming Language)
Microsoft PowerPoint
Python (Programming Language)
Company Benefits
- Dental claim & medical insurance: we support employee well-being through our subsidy.
- Opportunity for exposure to an international career.
- Travel allowance: a fixed monthly travel allowance to ease your commute.
- Employee referral program: refer a friend and get rewarded!
- Annual performance bonus: a discretionary bonus, awarded if you and the company perform well!
Hytech is a leading management consulting firm headquartered in Australia and Singapore, specializing in digital transformation for fintech and financial services companies. We provide comprehensive consulting solutions, as well as middle- and back-office support, to empower our clients with streamlined operations and cutting-edge strategies. Our key clients include top trading platforms and cryptocurrency exchanges.
Senior Software Engineer (Data Integration)
Posted 11 days ago
Job Description
EPAM Systems Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia
We are currently seeking a data integration expert for a Back End Developer backfill role in Kuala Lumpur. Your Azure expertise will be put to good use and you will focus on building our Python ETL processes and writing excellent SQL. If your passion is in making use of data and working with equally brilliant and driven teammates, this is the ideal role for you to grow your career.
Responsibilities
- Develop and maintain data integration solutions
- Collaborate with the team to design and implement ETL processes
- Create and maintain database structures and schema
- Develop and implement data quality controls
- Assist with troubleshooting and issue resolution
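As an illustration of the "data quality controls" bullet above, one common pattern is to express each control as a SQL query that counts violations and flag any check that returns a nonzero count. This is a sketch only: the table, columns, and SQLite backend are invented, not EPAM's actual setup.

```python
# Declarative data quality checks: each check is a SQL query that
# returns the number of violating rows; zero means the check passes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customers (customer_id INTEGER, email TEXT);
    INSERT INTO stg_customers VALUES
        (1, 'a@x.com'), (2, 'b@x.com'), (2, 'c@x.com');
""")

# (description, violation-counting SQL) pairs.
checks = [
    ("no NULL keys",
     "SELECT COUNT(*) FROM stg_customers WHERE customer_id IS NULL"),
    ("unique keys",
     "SELECT COUNT(*) FROM (SELECT customer_id FROM stg_customers "
     "GROUP BY customer_id HAVING COUNT(*) > 1)"),
]

failures = [name for name, sql in checks
            if conn.execute(sql).fetchone()[0] > 0]
```

In a real pipeline the `failures` list would drive an alert or abort the load; here the duplicated `customer_id` makes the "unique keys" check fail.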
Requirements
- At least 5 years of experience working in Data Integration, with 3 years of Azure Data Factory experience
- Azure Synapse Analytics experience
- DWH & DB Concepts knowledge
- ETL/ELT Solutions experience
- SAP Data Services experience
- SQL proficiency
- Microsoft Power BI familiarity is a bonus
We offer
- By choosing EPAM, you're getting a job at one of the most loved workplaces according to Newsweek in 2021, 2022, and 2023.
- Employee ideas are the main driver of our business. We have a very supportive environment where your voice matters.
- You will be challenged while working side by side with the best talent globally. We work with top-notch technologies, constantly seeking new industry trends and best practices.
- We offer a transparent career path and an individual roadmap to engineer your future and accelerate your journey.
- At EPAM, you can find vast opportunities for self-development: online courses and libraries, mentoring programs, partial grants for certification, and experience exchange with colleagues around the world. You will learn, contribute, and grow with us.
Life at EPAM
- EPAM is a leader in the fastest-growing segment (product development/digital platform engineering) of the IT industry. We acquired Just-BI in 2021 to reinforce our leading position as a global Business Intelligence services provider and have been growing rapidly. With a talented multinational team, we provide data and analytics expertise.
- We are currently involved in end-to-end BI design and implementation projects at major national and international companies. We are proud of our entrepreneurial start-up culture and are focused on investing in people by creating continuous learning and development opportunities for our employees, who deliver engineering excellence for our clients.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Business Development, Information Technology, and Engineering
- Industries: Software Development, IT Services, and IT Consulting
AHCS & GL – Data & Integration Engineer
Posted 25 days ago
Job Description
AHCS & GL – Data & Integration Engineer
Apply locations: Kuala Lumpur (Group Head Office)
Time type: Full time
Posted on: 5 days ago
Job requisition id: 24110415
Prudential’s purpose is to be partners for every life and protectors for every future. Our purpose encourages everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact to the business, and we support our people’s career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.
The AHCS & GL – Data & Integration Engineer is responsible for co-creating the technical design and working closely with the Finance GTP Accounting lead in the E2E implementation of the integration platform supporting insurance/reinsurance feeds, investment feeds and IFRS 17/Consolidation platform from source systems to Oracle Fusion Financials ERP (inclusive of AHCS) and outbound from Oracle Fusion Financials ERP to IFRS 17 platform.
This role collaborates closely with cross-geo teams (insurance and non-insurance entities), cross-functional teams (other FFP workstreams) and cross-systems teams (source & consuming systems teams from both local business entities and group office) to ensure the integrity, security, and performance of Prudential’s E2E closing throughout the system lifecycle.
Role and Responsibilities
1. Co-create the technical design of the integration platform and patterns, and support the build team, with the Finance GTP Accounting Lead, to:
- Ensure compliance of the data architecture with relevant financial regulations, accounting standards, and data privacy requirements.
- Design and implement data security controls, such as data encryption, access controls, and audit trails, across the Oracle Fusion Financials ERP ecosystem.
- Collaborate with Security and Compliance teams to ensure adherence to industry standards and regulatory requirements across SEA for the insurance industry.
2. Support E2E QA (with a focus on SIT & NFT), data migration and integration, and the Oracle quarterly release upgrade:
- Support the definition of the test plan and migration plan, and execute data migration activities from legacy systems to Oracle Fusion Financials ERP.
- Help design and implement test cases and data migration strategies, considering data mapping, data transformations, and data validation requirements.
- Collaborate with the integration specialists to ensure seamless data flow and integration between Oracle Fusion Financials modules and other enterprise systems.
- Develop and maintain data integration specifications and documentation.
- Provide functional and technical support around the Oracle quarterly release upgrade.
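The data-validation requirement above is often met with a reconciliation step between the legacy extract and the target ledger. The sketch below is purely illustrative (field names and the tolerance are assumptions, not Prudential's design): compare record counts, control totals, and key coverage.

```python
# Illustrative migration reconciliation: count, control total, and key
# coverage between a legacy extract and the target table.
legacy = [
    {"journal_id": "J1", "amount": 100.0},
    {"journal_id": "J2", "amount": -40.0},
]
target = [
    {"journal_id": "J1", "amount": 100.0},
    {"journal_id": "J2", "amount": -40.0},
]

def reconcile(src, dst, key="journal_id", measure="amount"):
    """Return a small pass/fail report for the migrated dataset."""
    return {
        "count_match": len(src) == len(dst),
        "total_match": abs(sum(r[measure] for r in src)
                           - sum(r[measure] for r in dst)) < 1e-9,
        "missing_keys": {r[key] for r in src} - {r[key] for r in dst},
    }

result = reconcile(legacy, target)
```

Any failed dimension (count, total, or missing keys) would send the migration batch back for investigation before sign-off.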
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7+ years of experience in designing and implementing data architectures for complex financial systems, preferably in the Oracle ecosystem.
- 2–3 full-cycle implementations of Oracle Fusion Financials ERP, including AHCS.
- Experience with data migration, data quality management, and data governance frameworks.
- Excellent problem-solving and analytical skills.
- Strong communication and leadership abilities.
- Ability to work collaboratively in a fast-paced, multi-stakeholder environment.
- Willing to travel as the project may require.
Mandatory Skills
- Strong expertise in Oracle Fusion Financials ERP modules, including AHCS, and their underlying data structures.
- Experience with data architecture, data engineering, and integration patterns in financial services (insurance preferred).
- Familiarity with financial accounting concepts and processes (ideally with IFRS 17 exposure).
Preferred Skills
- Technology delivery project management experience, especially legacy systems migration.
- Experience with DevOps, CI/CD practices, master data management and data governance.
- In-depth knowledge of Oracle Financials data structure and data integration techniques.
Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability, part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job, and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.
About Us
Prudential plc provides life and health insurance and asset management, with a focus on Asia and Africa. We help people get the most out of life, by making healthcare affordable and accessible and by promoting financial inclusion. We protect people’s wealth, help them grow their assets, and empower them to save for their goals. The business has more than 18 million life customers in Asia and Africa. Prudential has been providing trusted financial security for 95 years and is listed on stock exchanges in London, Hong Kong, Singapore, and New York.
We are proud to be included in the 2023 Bloomberg Gender Equality Index. The index measures gender equality across five pillars: female leadership and talent pipeline, equal pay and gender pay parity, inclusive culture, sexual harassment policies, and pro-women brand. Our inclusion in this global index is testament to our commitment to nurturing diverse talent.
Prudential plc is not affiliated in any manner with Prudential Financial, Inc., a company whose principal place of business is in the United States of America or with the Prudential Assurance Company, a subsidiary of M&G plc, a company incorporated in the United Kingdom.
AVP, Data Integration (pySpark, Nifi, Hadoop)
Posted 11 days ago
Job Description
Maybank WP. Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia
- Implement ETL systems that are operationally stable, efficient, and automated. This includes technical solutions that are scalable, aligned with the enterprise architecture, and adaptable to business changes.
- Collaborate with internal and external teams to define requirements for data integrations, specifically for Data Warehouse/Data Marts implementations.
Responsibilities of the Role
- Review business and technical requirements to ensure the data integration platform meets specifications.
- Apply industry best practices for ETL design and development.
- Produce technical design documents, system testing plans, and implementation documentation.
- Conduct system testing: execute job flows, investigate and resolve system defects, and document results.
- Work with DBAs, application specialists, and technical support teams to optimize ETL system performance and meet SLAs.
- Assist in developing, documenting, and applying best practices and procedures.
Requirements of the Role
- Strong SQL writing skills are required.
- Familiarity with ETL tools such as pySpark, NiFi, Informatica, and Hadoop is preferred.
- Understanding of data integration best practices, including master data management, entity resolution, data quality, and metadata management.
- Experience with data warehouse architecture, source system data analysis, and data profiling.
- Ability to work effectively in a fast-paced, adaptive environment.
- Financial domain experience is a plus.
- Ability to work independently and communicate effectively across various levels, including product owners, executive sponsors, and team members.
- Experience working in an Agile environment is advantageous.
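The "data profiling" requirement above can be pictured with a short sketch (the column names and data are invented for the example): summarise each source column's null rate and distinct count before designing the ETL mappings.

```python
# Source-system profiling sketch: per-column null rate and distinct
# count, the kind of summary used to design ETL mappings.
import pandas as pd

source = pd.DataFrame({
    "acct_no": ["A1", "A2", "A2", None],
    "balance": [100.0, None, 250.0, 80.0],
})

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """One row per source column: fraction of nulls and distinct values."""
    return pd.DataFrame({
        "null_rate": df.isna().mean(),
        "distinct": df.nunique(dropna=True),
    })

stats = profile(source)
```

In practice the same summary would be run against the real source tables (via PySpark or SQL at warehouse scale) to spot nullable keys and low-cardinality columns before mapping them into the Data Warehouse.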
Qualifications
- Bachelor’s Degree in Computer Science, Information Technology, or equivalent.
- Over 5 years of total work experience, with experience programming ETL processes using Informatica, NiFi, pySpark, and Hadoop.
- At least 4 years of experience in data analysis, profiling, and designing ETL systems/programs.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industries: Banking
Research Officer (Research Data Integration), BII
Posted 11 days ago
Job Description
About us:
Data science is an important component of biomedical and translational research, where data of multiple modalities are constantly being generated at unprecedented scale. The Research Data Integration group in the Biomedical Datahub Division of the Bioinformatics Institute (BII), A*STAR, aims to bridge the complexity of computational biology and data science with the needs of biologists and clinicians, to drive biological discoveries and predict translational outcomes. One of our immediate challenges is to integrate and analyze multi-omics, imaging and clinical data generated by biomedical institutes in A*STAR, healthcare institutions, and national initiatives in Singapore, to improve the usability and interpretability of large-scale multimodal datasets of cancer, metabolic diseases, and skin diseases. We seek motivated individuals to join us in pushing the potential of biomedical data to truly benefit patients.
Job description:
We are seeking a highly motivated research officer to drive data management, analysis, and integration of large-scale and diverse multi-omics (genomics, transcriptomics, proteomics, metabolomics, lipidomics) and clinical data related to cancer and other diseases. The candidate will be expected to handle a variety of tasks, including curating and quality-controlling incoming large-scale multi-omics data, implementing and benchmarking bioinformatics workflows, performing downstream analysis, developing data integration platforms comprising file catalogues, databases, knowledge graphs, and visualization of datasets through interactive dashboards, and implementing international data standards to ensure data interoperability. The successful candidate will also extensively mine publicly available resources and link them with our datasets, and will manage and maintain all data according to the necessary security requirements. This position offers the candidate an opportunity to gain expert knowledge of multi-dimensional data from cancer and other diseases, and to become computationally competent in delivering this knowledge to a diverse group of stakeholders, including clinicians and wet-lab scientists. There are also many opportunities to apply AI/ML across these tasks and data to drive efficiency and uncover new insights.
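As a small illustration of the QC step described above (the threshold and all names are invented, not the group's actual pipeline): flag samples in an expression matrix whose missing-value fraction is too high before integrating them with other modalities.

```python
# QC sketch for an omics matrix: flag samples with too many missing
# values before downstream integration.
import numpy as np
import pandas as pd

# Rows are features (genes), columns are samples.
expr = pd.DataFrame(
    {"S1": [1.2, 0.8, np.nan],
     "S2": [np.nan, np.nan, np.nan],
     "S3": [0.5, 0.7, 0.9]},
    index=["geneA", "geneB", "geneC"],
)

def qc_samples(matrix: pd.DataFrame, max_missing: float = 0.5) -> list:
    """Return sample IDs whose missing-value fraction exceeds the cutoff."""
    missing = matrix.isna().mean(axis=0)   # per-sample missingness
    return missing[missing > max_missing].index.tolist()

flagged = qc_samples(expr)
```

Here sample `S2` is entirely missing and gets flagged; a real pipeline would log the exclusion and apply analogous feature-level filters.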
Requirements:
- Bachelor's or Master's degree in Bioinformatics, Computational Biology, Data Science, Computer Science, Analytics, Engineering, Genetics, or a related field.
- Strong programming skills (e.g. Python, R, RStudio, Jupyter Notebook, Shinyapps).
- Highly experienced with Unix/Linux environment and cloud architecture (AWS).
- Familiar with bioinformatics, database development (structured and unstructured, SQL), visualization tools, AI/ML and other big data technologies.
- Experience with large datasets and knowledge in data management and data standards.
- Understanding of biological concepts and methodologies relevant to multi-omics data and cancer.
- Strong analytical and problem-solving skills, and attention to detail.
- Excellent oral and written communication and presentation skills.
- Able to work both independently and collaboratively in a team environment, with a positive and enthusiastic attitude toward learning new methodologies and technologies.
- Competent project management and organizational skills will be very valuable.
The above eligibility criteria are not exhaustive. A*STAR may include additional selection criteria based on its prevailing recruitment policies. These policies may be amended from time to time without notice. We regret that only shortlisted candidates will be notified.