Data Architect

Kuala Lumpur, Kuala Lumpur Net2Source (N2S)

Posted 2 days ago

Job Description

Responsibilities

  • Data Modelling: Creating conceptual, logical, and physical data models to define data structures, entities, and relationships.
  • System Design: Designing and overseeing the data infrastructure, including data warehouses, data lakes, and cloud platforms.
  • Technology Selection: Choosing appropriate databases, cloud services, and other tools to meet performance and scalability needs.
  • Data Governance: Establishing policies for data quality, security, access, and lifecycle management to ensure compliance with regulations and standards.
  • Integration: Designing solutions for integrating disparate data sources into a unified framework.
  • Collaboration: Working with business analysts, data scientists, engineers, and other stakeholders to translate business needs into technical solutions.
  • Optimization: Continuously optimizing data storage and processing for performance, scalability, and security.
Qualifications
  • Minimum 5 years of data architecture experience on the relevant platform.
  • Proficiency in data modelling tools and concepts.
  • Knowledge of various database types (SQL, NoSQL) and cloud platforms (e.g., AWS, Snowflake).
  • Understanding of distributed architectures, data warehousing, and data lakes.
  • Familiarity with data security and privacy principles.
  • Experience working with large enterprise customers on application development in the specified skill sets.
Seniority level
  • Mid-Senior level
Employment type
  • Contract
Job function
  • Information Technology
Industries
  • IT Services and IT Consulting

This advertiser has chosen not to accept applicants from your region.

Data Architect

Kuala Lumpur, Kuala Lumpur Two95 International Inc.

Posted 3 days ago

Job Description

SCOPE & AUTHORITY

Data Architecture:

  • Execute enterprise data initiatives and programs to establish a governed, curated, and agile data ecosystem that enables the business to make data-driven decisions.
  • Translate strategic requirements into a usable enterprise information architecture (e.g. enterprise data model, associated metamodel, common business vocabulary, and naming taxonomy).
  • Develop and maintain architecture artifacts, frameworks, and patterns as references for the development teams across the Group.
  • Deliver tangible Data & AI solutions: choose the right technology, evaluate architecture evolution, and create and maintain architectures using leading Data & AI technology frameworks.
  • Participate in key transformational project design reviews as part of the methodology process to ensure application designs adhere to enterprise information architecture guidelines.
  • Monitor regulatory guidelines such as consumer privacy laws, data retention policies, outsourced data, and specific industry guidelines to determine impact on the enterprise information architecture.
  • Provide a capability assessment tool for Data Management & Governance across all dimensions of data management, adhering to DMBOK V2.
  • Establish and monitor the operations of the data governance organization across the Group.
  • Drive the implementation of corrective measures to ensure data governance policies and procedures are followed.
  • Publish data reference architecture, data architecture principles, best practices, and design patterns to enable data engineering teams to build scalable and resilient platforms.
  • Explore new technology trends and leverage them to simplify the data ecosystem (e.g. by developing architectures, strategies, and policies around data governance, including master data management, metadata management, data quality, and data profiling).

Data Management (Regulatory & Compliance):

Academic Qualification:

Bachelor's degree in computer science, computer engineering, electrical engineering, systems analysis or a related field of study

Min 8 years of experience architecting, designing, and developing large-scale data solutions utilizing a mixture of Big Data and relational database platforms. Data/information modelling expertise at the enterprise level (Space, NLTK).

Skills Required:

Requires advanced knowledge of Big Data analysis and data management tools in order to recommend and provide industry best practices.

You will drive end to end data solutions and data management strategy across data and analytics platforms.

Enterprise scale expertise in data analysis, modelling, data security, data warehousing, metadata management and data quality.

Extensive knowledge and experience in architecting modern data ingestion frameworks, highly scalable distributed systems using open source and emerging data architecture patterns.

Data/information modelling expertise at the enterprise level

Experience with Master Data Management, Metadata Management, and Data Quality tools, Data Security and Privacy methods and frameworks

  • Hands-on experience in Data Management Lifecycle, Data Modelling, and Data Governance
  • Experience with Hadoop clusters, in-memory processing, GPGPU processing, and parallel distributed computing systems
  • Experience building data pipelines using Kafka, Flume, and accelerated stream processing; deep understanding of Apache Hadoop 1/2 and the Hadoop ecosystem, with experience in one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro)
  • Good knowledge of Elasticsearch and Solr
  • Experience designing NoSQL, HDFS, Hive, and HBase data marts and creating data lakes; familiarity with one or more SQL-on-Hadoop technologies (Hive, Pig, Impala, Spark SQL, Presto)
  • At least 10 years of experience with data warehouse design for RDBMSs such as Oracle, MS SQL, PostgreSQL, and MySQL
  • Experience with Service-Oriented Architecture (SOA), web services, enterprise data management, information security, applications development, and cloud-based architectures
  • Experience with enterprise data management technologies, including database platforms, ETL tools such as Talend/Pentaho (developing Spark ETL jobs), and SQL
  • Experience in languages such as Java, PHP, Python, and/or R on Linux, as well as JavaScript, Scala, and Windows
  • Experience implementing machine-learning solutions, developing in multiple languages, and performing statistical analysis, with familiarity across the range of approaches used in practical applications of machine learning
  • Experience in AI integration, Natural Language Processing, and AI application programming
  • Experience with telecommunications, IoT, data visualization, and GIS projects; current hands-on implementation experience required
  • Database Administrator experience on big data projects and TOGAF certification are an advantage
  • Customer-facing skills to represent Big Data architectures within the OpCo environments and drive discussions with senior personnel regarding trade-offs, best practices, project management, and risk mitigation

Collaborate with the OpCo's analytics, data science, and full-stack teams to ensure structured and unstructured data capture and ingestion as per design requirements.

Work with Axiata's IT and security architects to ensure compliance with data security and privacy policy.

Manage and lead end-to-end data lifecycle management activities, ensuring consistency, quality, and availability across data management.

Excellent communication skills and ability to convey complex topics through effective documentation as well as presentation.


Data Architect

Kuala Lumpur, Kuala Lumpur Maybank

Posted 7 days ago

Job Description

Maybank WP. Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia

Data Architect

  • Subject matter expertise in technologies (TD, HD, BDA, R, Python) and Data Management
  • Understand and deliver data modeling requirements – physical and logical data modeling
  • Drive data quality enhancements at scale to solve business problems in a prioritized manner
  • Build and support data flow integration between different products increasing the capabilities of our product and expanding its use across the firm
  • Ability to understand and deliver client requirements end to end for reference data and data lake aligned use cases
  • Ensure proper risk management processes are adhered to in order to mitigate risk exposure for the firm

Hard Skills / Requirements

Technology Infrastructure (HW, SW & Applications hosted) Expertise

  • Ability to understand in depth how technology products are delivered as per requirement
  • Advanced understanding of reference data, data lake, and other information system solutions
  • Experience integrating multiple systems to create an intelligent and resilient information system solution
  • Ability to use industry standard tools to create logical and physical data models for technology infrastructure and reference data

Information System Design

  • Experience delivering or supporting information systems using extract, transform, load (ETL) and publishing methods
  • Understanding the right technical solution to represent the data appropriately to meet the client requirement
  • Experience with performance tuning of information systems (e.g. streamlining ETLs to run faster)

Data Quality Experience

  • Understanding the core concepts of data quality – how to measure and improve it
  • Understanding which data quality issues pose the greatest risks to Maybank's operations and raising those to governance effectively

Functional Skills:

Must Have (Experience in any of the areas):

  • Data Quality Score Cards, Data Management, Data Governance
  • Dashboards, Visualization Scorecards, Sharepoint

Good to have (Experience in any of the areas):

  • Data Warehousing (5 years)
  • Big Data (4 years)
  • Master Data Management, Reference Data Management (4 years)
  • Worked as SI/SA in the financial domain (6 years)

Technical Skills:

Must Have (Certified in any one of the below, but should have multiple skill sets):

  • R, Python
  • Hadoop-Cloudera
  • ETL/ELT Tools – Trillium, Informatica (any data modelling or data profiling tools)

Good to have (Experience in any of the following):

  • Pig, MapReduce, Hive, Hbase
  • Big Data Integration, Ingestion Framework (4 years)
  • Data Lab, Data Lake (4 years)
  • ITIL, TOGAF (any one)
Seniority level

Mid-Senior level

Employment type

Full-time

Job function

Information Technology

Banking


Data Architect

Kuala Lumpur, Kuala Lumpur Capgemini

Posted 24 days ago

Job Description

We are a global leader in partnering with companies to transform and manage their business by harnessing the power of technology.

Data Architects drive change that creates business opportunity through data-driven insights. Architects shape and translate business strategy and data product needs into realisable, sustainable technology solutions. Architects take end-to-end solution delivery ownership from idea to benefits delivery.

Job Description - Grade Specific

Managing Data Architect - Design, deliver and manage complete data architecture solutions. Demonstrate leadership of topics in the architect community and show a passion for technology and business acumen. Work as a stream lead at CIO/CTO level for an internal or external client. Lead Capgemini operations relating to market development and/or service delivery excellence. Are seen as a role model in their (local) community. Certification: preferably Capgemini Architects certification level 2 or above, relevant data architecture certifications, IAF and/or industry certifications such as TOGAF 9 or equivalent.

Ref. code -en_GB

Posted on 23 Jun 2025

Experience level Experienced Professionals

When you join Capgemini, you don’t just start a new job. You become part of something bigger.

Learn about how the recruitment process works – how to apply, where to follow your application, and next steps.

To help you bring out the best of yourself during the interview process, we’ve got some great interview tips to share before the big day.


Data Architect

Kuala Lumpur, Kuala Lumpur MYR150000 - MYR250000 Y Maybank

Posted today

Job Description

Responsibilities of the Role

  • Subject matter expertise in technologies (TD, HD, BDA, R, Python) and Data Management
  • Understand and deliver data modeling requirements – physical and logical data modeling
  • Drive data quality enhancements at scale to solve business problems in a prioritized manner
  • Build and support data flow integration between different products increasing the capabilities of our product and expanding its use across the firm
  • Ability to understand and deliver client requirements end to end for reference data and data lake aligned use cases
  • Ensure proper risk management processes are adhered to in order to mitigate risk exposure for the firm

Hard Skills / Requirements

Technology Infrastructure (HW, SW & Applications hosted) Expertise

  • Ability to understand in depth how technology products are delivered as per requirement

Strong Data Architecture experience

  • Advanced understanding of reference data, data lake, and other information system solutions
  • Experience integrating multiple systems to create an intelligent and resilient information system solution
  • Ability to use industry standard tools to create logical and physical data models for technology infrastructure and reference data

Information System Design

  • Experience delivering or supporting information systems using extract, transform, load (ETL) and publishing methods
  • Understanding the right technical solution to represent the data appropriately to meet the client requirement
  • Experience with performance tuning of information systems (e.g. streamlining ETLs to run faster)

Data Quality Experience

  • Understanding the core concepts of data quality – how to measure and improve it
  • Understanding which data quality issues pose the greatest risks to Maybank's operations and raising those to governance effectively

Functional Skills:

Must Have (Experience in any of the areas):

  • Data Mining, Data Modelling, Data Standards
  • Data Lineage, Masking, Encrypt/Decrypt
  • Data Quality Score Cards, Data Management, Data Governance
  • Dashboards, Visualization Scorecards, Sharepoint

Good to have (Experience in any of the areas):

  • Data Warehousing (5 years)
  • Big Data (4 years)
  • Master Data Management, Reference Data Management (4 years)
  • Worked as SI/SA in the financial domain (6 years)

Technical Skills:

Must Have (Certified in any one of the below, but should have multiple skill sets):

  • R, Python
  • Teradata
  • Hadoop-Cloudera
  • ETL/ELT Tools – Trillium, Informatica (any data modelling or data profiling tools)
  • Data Visualization Tools (Oracle DV, Denodo, Clairvoyant, OEDQ, Paxata)
  • Cloud (AWS, Azure, GCP)
  • OS - Scripting (Any OS)

Good to have (Experience in any of the following):

  • Pig, MapReduce, Hive, Hbase
  • Big Data Integration, Ingestion Framework (4 years)
  • Data Lab, Data Lake (4 years)
  • ITIL, TOGAF (any one)

Data Architect

Kuala Lumpur, Kuala Lumpur MYR121220 - MYR242440 Y ISCISTECH Business Solution Sdn Bhd

Posted today

Job Description

Responsibilities:

  • Design and implement enterprise data architecture strategy.
  • Define data models, pipelines, and integration frameworks.
  • Work with stakeholders to translate business needs into scalable data solutions.
  • Oversee data governance, quality, and security compliance.
  • Optimize data storage and retrieval for large datasets.
  • Support advanced analytics, AI/ML, and reporting initiatives.

Requirements:

  • Bachelor's/Master's in Computer Science, Data Engineering, or related field.
  • 10+ years of IT experience with at least 5 years in data architecture.
  • Strong hands-on experience in SQL, NoSQL, and cloud data platforms (Azure Synapse, Snowflake, Databricks, BigQuery).
  • Expertise in ETL/ELT tools and data pipeline frameworks.
  • Knowledge of data security and governance standards.
  • Excellent analytical and communication skills.

Job Types: Full-time, Contract

Contract length: 12 months

Pay: RM7, RM20,183.18 per month

Ability to commute/relocate:

  • Kuala Lumpur: Reliably commute or planning to relocate before starting work (Required)

Education:

  • Bachelor's (Required)

Experience:

  • Azure Synapse: 3 years (Required)
  • Databricks: 3 years (Required)

Work Location: In person
