43 Data Architect jobs in Kuala Lumpur
Data Architect
Posted 2 days ago
Job Description
Responsibilities
- Minimum 5 years of data architecture experience on the respective platform.
- Proficiency in data modelling tools and concepts.
- Knowledge of various database types (SQL, NoSQL) and cloud platforms (e.g., AWS, Snowflake).
- Understanding of distributed architectures, data warehousing, and data lakes.
- Familiarity with data security and privacy principles.
- Experience working with large customers on enterprise application development in the specified skill sets.
- Data Modelling: Creating conceptual, logical, and physical data models to define data structures, entities, and relationships (an illustrative sketch follows this list).
- System Design: Designing and overseeing the data infrastructure, including data warehouses, data lakes, and cloud platforms.
- Technology Selection: Choosing appropriate databases, cloud services, and other tools to meet performance and scalability needs.
- Data Governance: Establishing policies for data quality, security, access, and lifecycle management to ensure compliance with regulations and standards.
- Integration: Designing solutions for integrating disparate data sources into a unified framework.
- Collaboration: Working with business analysts, data scientists, engineers, and other stakeholders to translate business needs into technical solutions.
- Optimization: Continuously optimizing data storage and processing for performance, scalability, and security.
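To make the data modelling responsibility concrete, here is a minimal sketch (not part of the posting) of a logical model rendered as a physical schema with SQLAlchemy. The Customer/Order entities, their columns, and the SQLite backend are invented for illustration.

```python
# Minimal sketch: a hypothetical Customer/Order logical model rendered as a
# physical schema. Entity and column names are invented for illustration.
from sqlalchemy import Column, ForeignKey, Integer, Numeric, String, create_engine
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class Customer(Base):
    __tablename__ = "customer"
    customer_id = Column(Integer, primary_key=True)            # surrogate key
    name = Column(String(100), nullable=False)
    orders = relationship("Order", back_populates="customer")  # one-to-many

class Order(Base):
    __tablename__ = "order_header"   # avoids the reserved word "order"
    order_id = Column(Integer, primary_key=True)
    customer_id = Column(Integer, ForeignKey("customer.customer_id"), nullable=False)
    total_amount = Column(Numeric(12, 2), nullable=False)
    customer = relationship("Customer", back_populates="orders")

if __name__ == "__main__":
    engine = create_engine("sqlite:///:memory:")  # stand-in for a real platform
    Base.metadata.create_all(engine)              # emit the physical DDL
    print([t.name for t in Base.metadata.sorted_tables])
```

In practice the conceptual model (entities and relationships only) would come first, with the physical model then tuned per target platform.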
Qualifications
- Minimum 5 years of data architecture experience on the respective platform.
- Proficiency in data modelling tools and concepts.
- Knowledge of various database types (SQL, NoSQL) and cloud platforms (e.g., AWS, Snowflake).
- Understanding of distributed architectures, data warehousing, and data lakes.
- Familiarity with data security and privacy principles.
- Experience working with large customers on enterprise application development in the specified skill sets.
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information Technology
Industries: IT Services and IT Consulting
Data Architect
Posted 3 days ago
Job Description
SCOPE & AUTHORITY
Data Architecture:
- Execute enterprise data initiatives and programs to establish a governed, curated, and agile data ecosystem that enables the business to make data-driven decisions.
- Translate strategic requirements into a usable enterprise information architecture (e.g., enterprise data model, associated metamodel, common business vocabulary, and naming taxonomy).
- Develop and maintain architecture artifacts, frameworks, and patterns as references for development teams across the Group.
- Deliver tangible Data & AI solutions: choose the right technology, evaluate architecture evolution, and create and maintain architectures using leading Data & AI technology frameworks.
- Participate in key transformational project design reviews as part of the methodology process to ensure application designs adhere to enterprise information architecture guidelines.
- Monitor regulatory guidelines such as consumer privacy laws, data retention policies, outsourced data, and industry-specific guidelines to determine their impact on the enterprise information architecture.
- Provide a capability assessment tool for Data Management & Governance across all dimensions of data management, adhering to DMBOK v2.
- Establish and monitor the operations of the data governance organization across the Group.
- Drive the implementation of corrective measures to ensure data governance policies and procedures are followed.
- Publish the data reference architecture, data architecture principles, best practices, and design patterns to enable data engineering teams to build scalable and resilient platforms.
- Explore new technology trends and leverage them to simplify the data ecosystem (e.g., development of architectures, strategies, and policies around data governance, including master data management, metadata management, data quality, and data profiling; a minimal profiling sketch follows this list).
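For the data profiling item above, a minimal illustrative sketch in Python with pandas; the sample frame and column names are invented, not requirements from the posting.

```python
# Minimal data-profiling sketch: per-column null rate and cardinality.
# The sample frame and column names are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@x.com", None, "b@x.com", "b@x.com"],
})

profile = pd.DataFrame({
    "null_rate": df.isna().mean(),        # share of missing values per column
    "distinct": df.nunique(dropna=True),  # cardinality per column
})
print(profile)
```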
Data Management (Regulatory & Compliance):
Academic Qualification:
Bachelor's degree in computer science, computer engineering, electrical engineering, systems analysis or a related field of study
Min 8 years of experience architecting, designing, and developing large-scale data solutions utilizing a mixture of Big Data and relational database platforms. Data/information modelling expertise at the enterprise level.
Skills Required:
Requires advanced knowledge of Big Data analysis and data management tools in order to recommend and provide industry best practices.
You will drive end-to-end data solutions and the data management strategy across data and analytics platforms.
Enterprise-scale expertise in data analysis, modelling, data security, data warehousing, metadata management, and data quality.
Extensive knowledge of and experience in architecting modern data ingestion frameworks and highly scalable distributed systems using open-source and emerging data architecture patterns.
Data/information modelling expertise at the enterprise level.
Experience with Master Data Management, Metadata Management, and Data Quality tools, as well as data security and privacy methods and frameworks.
- Hands-on experience in the data management lifecycle, data modelling, and data governance.
- Experience with Hadoop clusters, in-memory processing, GPGPU processing, and parallel distributed computing systems.
- Experience building data pipelines using Kafka, Flume, and accelerated stream processing (a minimal Kafka sketch appears at the end of this posting).
- Deep understanding of Apache Hadoop 1/2 and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro).
- Good knowledge of Elasticsearch and Solr.
- Experience designing NoSQL, HDFS, Hive, and HBase data marts and creating data lakes.
- Familiarity with one or more SQL-on-Hadoop technologies (Hive, Pig, Impala, Spark SQL, Presto).
- At least 10 years of experience with data warehouse design for RDBMSs such as Oracle, MS SQL, PostgreSQL, and MySQL.
- Experience with Service-Oriented Architecture (SOA), web services, enterprise data management, information security, applications development, and cloud-based architectures.
- Experience with enterprise data management technologies, including database platforms, ETL tools such as Talend/Pentaho (developing Spark ETL jobs), and SQL.
- Experience in languages such as Java, PHP, Python, and/or R on Linux; JavaScript, Scala, and Windows experience are also relevant.
- Experience implementing machine-learning solutions, developing in multiple languages, and performing statistical analysis. Familiarity with the broader range of approaches used in practical machine-learning applications is also required.
- Experience in AI integration, natural language processing, and AI application programming.
- Experience with telecommunications, IoT, data visualization, and GIS projects; current hands-on implementation experience is required.
- Database administration experience on big data projects and TOGAF certification are an advantage.
- Customer-facing skills to represent Big Data architectures well within OpCo environments and drive discussions with senior personnel regarding trade-offs, best practices, project management, and risk mitigation.
Collaborate with the OpCos' analytics, data science, and full-stack teams to ensure structured and unstructured data capture and ingestion per design requirements.
Work with Axiata's IT and security architects to ensure compliance with data security and privacy policy.
Manage and lead end-to-end data lifecycle management activities and ensure data consistency, quality, and availability (responsibilities that overlap with the data engineer role).
Excellent communication skills and the ability to convey complex topics through effective documentation as well as presentation.
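For the Kafka pipeline requirement above, a minimal producer/consumer sketch using the kafka-python client. The broker address and topic name are assumptions for illustration, not details from the posting.

```python
# Minimal Kafka pipeline sketch using the kafka-python client.
# Assumes a broker at localhost:9092; the topic name is hypothetical.
import json
from kafka import KafkaConsumer, KafkaProducer

TOPIC = "customer-events"  # hypothetical topic

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"customer_id": 42, "event": "signup"})
producer.flush()  # block until the record is acknowledged

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",            # read from the start of the topic
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    consumer_timeout_ms=5000,                # stop iterating when idle
)
for record in consumer:
    print(record.value)
```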
Data Architect
Posted 7 days ago
Job Description
Maybank WP. Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia
- Subject matter expertise in technologies (TD, HD, BDA, R, Python) and data management
- Understand and deliver data modeling requirements: physical and logical data modeling
- Drive data quality enhancements at scale to solve business problems in a prioritized manner
- Build and support data flow integration between different products, increasing the capabilities of our product and expanding its use across the firm
- Ability to understand and deliver client requirements end to end for reference data and data lake aligned use cases
- Ensure the proper risk management process is adhered to, mitigating risk exposure for the firm
Hard Skills / Requirements
Technology Infrastructure Expertise (HW, SW & hosted applications)
- Ability to understand in depth how technology products are delivered as per requirements
- Advanced understanding of reference data, data lake, and other information system solutions
- Experience integrating multiple systems to create an intelligent and resilient information system solution
- Ability to use industry standard tools to create logical and physical data models for technology infrastructure and reference data
Information System Design
- Experience delivering or supporting information systems using extract, transform, load (ETL) and publishing methods
- Understanding the right technical solution to represent the data appropriately to meet the client requirement
- Experience with performance tuning of information systems (e.g. streamlining ETLs to run faster)
Data Quality Experience
- Understanding the core concepts of data quality: how to measure and improve it (a minimal measurement sketch follows this section)
- Understanding which data quality issues pose the greatest risks to Maybank's operations, and raising those to governance effectively
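As an illustrative sketch of measuring data quality (the rules, thresholds, and sample data below are invented, not Maybank requirements), rule-based checks can be rolled up into the kind of scores a data quality scorecard reports:

```python
# Minimal data-quality scorecard sketch: rule-based checks rolled up into
# per-rule pass rates. Rules, column names, and data are invented examples.
import pandas as pd

df = pd.DataFrame({
    "account_id": ["A1", "A2", None, "A4"],
    "email": ["x@bank.com", "bad-email", "y@bank.com", None],
})

checks = {
    "completeness.account_id": df["account_id"].notna(),
    "completeness.email": df["email"].notna(),
    "validity.email": df["email"].fillna("").str.match(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
}

# roll up each boolean check into a percentage of passing records
scorecard = {name: round(passed.mean() * 100, 1) for name, passed in checks.items()}
for rule, score in scorecard.items():
    print(f"{rule}: {score}% of records passing")
```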
Functional Skills:
Must Have (Experience in any of the areas):
- Data Quality Scorecards, Data Management, Data Governance
- Dashboards, Visualization Scorecards, SharePoint
Good to have (Experience in any of the areas):
- Data Warehousing (5 years)
- Big Data (4 years)
- Master Data Management, Reference Data Management (4 years)
- Worked as an SI/SA in the financial domain (6 years)
Technical Skills:
Must Have (certified in any one of the below, but with multiple skill sets):
- R, Python
- Hadoop-Cloudera
- ETL/ELT tools: Trillium, Informatica (any data modelling or data profiling tools)
Good to have (Experience in any of the following):
- Pig, MapReduce, Hive, HBase
- Big Data Integration, Ingestion Framework (4 years)
- Data Lab, Data Lake (4 years)
- ITIL, TOGAF (any one)
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: Banking
Data Architect
Posted 24 days ago
Job Description
We are a global leader in partnering with companies to transform and manage their business by harnessing the power of technology.
Data Architects drive change that creates business opportunity through data-driven insights. Architects shape and translate business strategy and data product needs into realisable, sustainable technology solutions. Architects take end-to-end solution delivery ownership, from idea to benefits delivery.
Job Description - Grade Specific
Managing Data Architect: Design, deliver, and manage complete data architecture solutions. Demonstrate leadership of topics in the architect community, and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Be seen as a role model in the (local) community. Certification: preferably Capgemini Architects certification level 2 or above, relevant data architecture certifications, IAF, and/or industry certifications such as TOGAF 9 or equivalent.
Ref. code -en_GB
Posted on 23 Jun 2025
Experience level: Experienced Professionals
When you join Capgemini, you don’t just start a new job. You become part of something bigger.
Learn about how the recruitment process works – how to apply, where to follow your application, and next steps.
To help you bring out the best of yourself during the interview process, we’ve got some great interview tips to share before the big day.
Data Architect
Posted today
Job Description
Responsibilities of the Role
- Subject matter expertise in technologies (TD, HD, BDA, R, Python) and data management
- Understand and deliver data modeling requirements: physical and logical data modeling
- Drive data quality enhancements at scale to solve business problems in a prioritized manner
- Build and support data flow integration between different products, increasing the capabilities of our product and expanding its use across the firm
- Ability to understand and deliver client requirements end to end for reference data and data lake aligned use cases
- Ensure the proper risk management process is adhered to, mitigating risk exposure for the firm
Hard Skills / Requirements
Technology Infrastructure Expertise (HW, SW & hosted applications)
- Ability to understand in depth how technology products are delivered as per requirements
- Strong data architecture experience
- Advanced understanding of reference data, data lake, and other information system solutions
- Experience integrating multiple systems to create an intelligent and resilient information system solution
- Ability to use industry standard tools to create logical and physical data models for technology infrastructure and reference data
Information System Design
- Experience delivering or supporting information systems using extract, transform, load (ETL) and publishing methods
- Understanding the right technical solution to represent the data appropriately to meet the client requirement
- Experience with performance tuning of information systems (e.g. streamlining ETLs to run faster)
Data Quality Experience
- Understanding the core concepts of data quality: how to measure and improve it
- Understanding which data quality issues pose the greatest risks to Maybank's operations, and raising those to governance effectively
Functional Skills:
Must Have (Experience in any of the areas):
- Data Mining, Data Modelling, Data Standards
- Data Lineage, Masking, Encryption/Decryption
- Data Quality Scorecards, Data Management, Data Governance
- Dashboards, Visualization Scorecards, SharePoint
Good to have (Experience in any of the areas):
- Data Warehousing (5 years)
- Big Data (4 years)
- Master Data Management, Reference Data Management (4 years)
- Worked as an SI/SA in the financial domain (6 years)
Technical Skills:
Must Have (certified in any one of the below, but with multiple skill sets):
- R, Python
- Teradata
- Hadoop-Cloudera
- ETL/ELT tools: Trillium, Informatica (any data modelling or data profiling tools)
- Data Visualization Tools (Oracle DV, Denodo, Clairvoyant, OEDQ, Paxata)
- Cloud (AWS, Azure, GCP)
- OS - Scripting (Any OS)
Good to have (Experience in any of the following):
- Pig, MapReduce, Hive, HBase
- Big Data Integration, Ingestion Framework (4 years)
- Data Lab, Data Lake (4 years)
- ITIL, TOGAF (any one)
Data Architect
Posted today
Job Description
Responsibilities:
- Design and implement enterprise data architecture strategy.
- Define data models, pipelines, and integration frameworks.
- Work with stakeholders to translate business needs into scalable data solutions.
- Oversee data governance, quality, and security compliance.
- Optimize data storage and retrieval for large datasets.
- Support advanced analytics, AI/ML, and reporting initiatives.
Requirements:
- Bachelor's/Master's in Computer Science, Data Engineering, or related field.
- 10+ years of IT experience with at least 5 years in data architecture.
- Strong hands-on experience in SQL, NoSQL, and cloud data platforms (Azure Synapse, Snowflake, Databricks, BigQuery).
- Expertise in ETL/ELT tools and data pipeline frameworks (a minimal sketch follows this list).
- Knowledge of data security and governance standards.
- Excellent analytical and communication skills.
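For the ETL/ELT requirement above, a minimal extract-load-transform sketch. SQLite stands in for a warehouse such as Snowflake or Azure Synapse, and all table, column, and sample values are invented for illustration.

```python
# Minimal ELT sketch: extract rows from a source, load them unmodified into a
# warehouse (SQLite stands in for Snowflake/Synapse/etc.), then transform in SQL.
# Table, column, and sample values are invented for illustration.
import sqlite3
import pandas as pd

raw = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [120.0, 80.5, 99.9],
    "country": ["MY", "MY", "SG"],
})  # a real extract step would read from files, APIs, or source databases

con = sqlite3.connect(":memory:")
raw.to_sql("raw_orders", con, index=False)  # load: land the data as-is

# transform inside the warehouse (the "T" after "EL")
summary = pd.read_sql_query(
    "SELECT country, COUNT(*) AS orders, SUM(amount) AS revenue "
    "FROM raw_orders GROUP BY country",
    con,
)
print(summary)
```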
Job Types: Full-time, Contract
Contract length: 12 months
Pay: RM7, RM20,183.18 per month
Ability to commute/relocate:
- Kuala Lumpur: Reliably commute or planning to relocate before starting work (Required)
Education:
- Bachelor's (Required)
Experience:
- Azure Synapse: 3 years (Required)
- Databricks: 3 years (Required)
Work Location: In person