339 Data Engineer jobs in Malaysia
Big Data Engineer
Posted 9 days ago
Job Description
Position Summary
We are looking for a full-time Big Data Engineer II to join our team at Shirlyn Technology in Petaling Jaya, Malaysia, and Palo Alto, California. The Big Data team is at the heart of our operations, playing a critical role in scaling and optimizing our data infrastructure. As we continue to grow, we are seeking an experienced and driven Big Data Engineer to help us tackle complex data challenges, enhance our data solutions, and safeguard the security and quality of our data systems. You will be instrumental in building and deploying scalable data pipelines that keep data flowing seamlessly across systems and drive business success through reliable data solutions.
Job Responsibilities
- Collaborate with global teams in data, security, infrastructure, and business functions to understand data requirements and provide scalable solutions.
- Design, implement, and maintain efficient ETL (Extract, Transform, Load) pipelines to enable smooth data flow between systems.
- Conduct regular data quality checks to ensure accuracy, consistency, and completeness within our data pipelines (see the sketch after this list).
- Continuously improve data pipeline performance and reliability, identifying and addressing any inefficiencies or bottlenecks.
- Ensure data integrity, security, privacy, and availability across all systems.
- Proactively monitor data pipelines, quickly diagnosing issues and taking action to resolve them and maintain system uptime.
- Conduct root cause analysis of data-related issues and work on long-term solutions to prevent recurring problems.
- Document pipeline designs, data processes, and troubleshooting procedures, keeping stakeholders informed with clear communication of updates.
- Provide on-call support for critical data operations, ensuring systems are running 24/7, with rotational responsibilities.
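To illustrate the kind of data quality check described above, here is a minimal PySpark sketch. It is not part of the posting; the table name (staging.orders) and its columns are hypothetical placeholders.

```python
# Minimal sketch (not from the posting): a completeness/consistency check of the
# kind described above, written with PySpark. "staging.orders" and the column
# names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
orders = spark.read.table("staging.orders")  # assumed metastore table

# Completeness: key columns should never be null.
null_counts = (
    orders.select(
        [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in ("order_id", "customer_id")]
    )
    .first()
    .asDict()
)

# Consistency: order amounts should be non-negative.
negative_rows = orders.filter(F.col("amount") < 0).count()

issues = {col: n for col, n in null_counts.items() if n > 0}
if negative_rows:
    issues["negative_amount_rows"] = negative_rows
if issues:
    raise ValueError(f"Data quality check failed: {issues}")
```

A check like this would typically run as a step in the pipeline itself, failing the job before bad data propagates downstream.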
Job Requirements
- Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- 5+ years of hands-on experience building and optimizing data pipelines using big data technologies (Hive, Presto, Spark, Flink).
- Expertise in writing complex SQL queries and proficiency in Python programming, focusing on maintainable and clean code.
- Solid knowledge of scripting languages (e.g., Shell, Perl).
- Experience working in Unix/Linux environments.
- Familiarity with cloud platforms such as Azure, AWS, or GCP.
- High personal integrity and professionalism in handling confidential information, exercising sound judgment.
- Ability to remain calm and resolve issues swiftly during high-pressure situations, adhering to SLAs.
- Strong leadership skills, including the ability to guide junior team members and lead projects across the team.
- Excellent verbal and written communication skills, with the ability to collaborate effectively with remote teams globally.
Senior Big Data Engineer
Posted 9 days ago
Job Description
Senior Java Engineer (based in Kuala Lumpur)
Job Advantages
Cutting-edge technology (Web3 and blockchain); globalized company
Job Responsibilities
- Responsible for the architectural design and development of the company's data center.
- Continuously deliver data value to the company's business lines with high quality, low cost, and high efficiency.
- Provide solutions for platform architecture and performance optimization, and lead the rapid iteration of the big data platform.
- Enable the rapid development of the company's business through data-driven approaches, continuously improving the company's core competitiveness.
Job Requirements
- Bachelor's degree or above in Computer Science or a related major, with more than 3 years of Java development experience and an in-depth understanding of JVM principles.
- Familiar with commonly used frameworks (Spring, Spring MVC, gRPC, MyBatis), with a solid grasp of their underlying principles and mechanisms.
- Solid computer fundamentals, including mastery of operating systems and network architecture, and familiarity with commonly used algorithms, data structures, and design patterns.
- Familiar with distributed systems, caching, messaging, and other mechanisms; experience with Redis, Kafka, Spark, Flink, ZooKeeper, and TiDB is preferred.
- Experience in data center or microservice design and development, with a focus on highly available and scalable architecture, is preferred.
- Experience with tagging systems or algorithmic recommendation systems is preferred.
- Genuine enthusiasm for the blockchain industry and a commitment to contributing to its development.
Senior Big Data Engineer
Posted 9 days ago
Job Description
Company: Web3 & Blockchain-focused Company
Job Advantages
- Cutting-edge technology (Web3 and blockchain)
- Globalized company
Job Responsibilities
As a Senior Java Engineer, you will play a pivotal role in the development and optimization of the company's data infrastructure, ensuring it supports the evolving needs of our blockchain-focused business. Your primary responsibilities will include:
- Architectural Design & Development: Lead the design and development of the company's data center, ensuring robust performance and scalability.
- Data Platform Optimization: Provide ongoing solutions for platform architecture and performance optimization to support the company's rapid growth and business needs.
- Business Enablement: Drive data-driven solutions that accelerate business development and continuously enhance the company's core competitiveness.
- Big Data Iteration: Lead the rapid iteration of big data platforms, ensuring efficiency, cost-effectiveness, and high-quality output.
Job Requirements
We are seeking candidates who are passionate about blockchain technology and possess a strong technical foundation. Ideal candidates will meet the following criteria:
- Education: Bachelor's degree or above in Computer Science or related majors.
- Experience: More than 3 years of experience in Java development, with a deep understanding of JVM principles and Java development best practices.
- Frameworks: Proficient in frameworks such as Spring, Spring MVC, gRPC, and MyBatis, with the ability to understand their underlying principles and mechanisms.
- Core Computer Fundamentals: Strong knowledge of computer operating systems and network architecture, and proficiency in commonly used algorithms, data structures, and design patterns.
- Distributed Systems: Familiarity with distributed systems, caching mechanisms (Redis), messaging systems (Kafka), and big data processing frameworks like Spark, Flink, and ZooKeeper. Experience with TiDB is a plus.
- Experience in Microservices: Experience in the design and development of data centers or microservices, with an emphasis on high availability and scalability.
- Tagging Systems/Recommendation Systems: Experience with tagging systems or algorithmic recommendation systems is highly desirable.
- Passion for Blockchain: A strong enthusiasm for the blockchain industry and a commitment to contributing to its growth and development.
Why Join Us?
This role offers the opportunity to work at the forefront of blockchain and Web3 technologies within a global company. You will have the chance to develop and optimize critical infrastructure that powers innovative and scalable solutions in the blockchain space. If you’re ready to work in a fast-paced and cutting-edge environment, this role could be the perfect fit for you.
Apply now to be part of an exciting journey in revolutionizing the Web3 ecosystem!
Big Data Engineer (Spark, Flink, Java)
Posted 4 days ago
Job Description
Binance Kuala Lumpur City, Federal Territory of Kuala Lumpur, Malaysia
Binance is a leading global blockchain ecosystem behind the world’s largest cryptocurrency exchange by trading volume and registered users. We are trusted by over 250 million people in 100+ countries for our industry-leading security, user fund transparency, trading engine speed, deep liquidity, and an unmatched portfolio of digital-asset products. Binance offerings range from trading and finance to education, research, payments, institutional services, Web3 features, and more. We leverage the power of digital assets and blockchain to build an inclusive financial ecosystem to advance the freedom of money and improve financial access for people around the world.
Responsibilities
- Design and develop microservices architecture using Java Spring Boot, ensuring system performance, scalability, and reliability.
- Implement and manage microservices components such as service discovery, configuration management, and load balancing with Spring Cloud.
- Analyze, process, and explore data using big data technologies (see the sketch after this list).
- Optimize and manage data storage and retrieval systems such as ES/HBase, ensuring efficient data handling.
- Work closely with business stakeholders to understand requirements and deliver data solutions that align with business goals.
- Monitor, troubleshoot, and improve backend services to ensure smooth operation.
- Participate in the full development lifecycle, including requirements gathering, system design, coding, testing, and deployment.
- Maintain and enhance existing backend services, ensuring code quality and performance.
- Collaborate with cross-functional teams to integrate data solutions into existing applications.
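As a rough illustration of the data exploration work described above (and of the Hive/Spark stack named in the requirements below), here is a minimal PySpark sketch. It is not Binance code; the table warehouse.trades and its columns are hypothetical.

```python
# Minimal sketch (assumption, not Binance code): exploring a Hive table with
# PySpark. The table "warehouse.trades" and its columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("data-exploration")
    .enableHiveSupport()  # assumes a configured Hive metastore
    .getOrCreate()
)

daily_volume = (
    spark.table("warehouse.trades")
    .groupBy(F.to_date("event_time").alias("trade_date"), "symbol")
    .agg(F.sum("quantity").alias("total_quantity"))
    .orderBy("trade_date", "symbol")
)
daily_volume.show(20, truncate=False)
```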
Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8 years of professional experience in Java development, with a minimum of 3 years focused on Spring Boot microservices.
- Expertise in designing and implementing Java Spring-based microservices architecture, with experience in Spring Cloud.
- Hands-on experience with data development and big data technologies, including Hive, Spark, and Flink.
- Familiar and proficient with components such as the Apollo configuration center, Kafka message middleware, the XXL-Job scheduler, Pinpoint distributed tracing, and Prometheus monitoring.
- Proficient in optimizing and managing large-scale databases and data processing workflows.
- Strong problem-solving skills, with a proven ability to troubleshoot and optimize backend services.
- Excellent communication and teamwork skills, with the ability to work in a cross-functional environment.
- Experience with CI/CD pipelines, such as Jenkins or GitLab CI, is a plus.
- Knowledge of cloud services such as AWS is a plus.
- Strong knowledge of financial or blockchain business models and processes is a plus.
Why Binance
- Shape the future with the world’s leading blockchain ecosystem
- Collaborate with world-class talent in a user-centric global organization with a flat structure
- Tackle unique, fast-paced projects with autonomy in an innovative environment
- Thrive in a results-driven workplace with opportunities for career growth and continuous learning
- Competitive salary and company benefits
- Work-from-home arrangement (the arrangement may vary depending on the work nature of the business team)
Binance is committed to being an equal opportunity employer. We believe that having a diverse workforce is fundamental to our success.
By submitting a job application, you confirm that you have read and agree to our Candidate Privacy Notice.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology and Engineering
- Industries: Technology, Information and Internet
Data Engineer
Posted 1 day ago
Job Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, deliver digital marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them to save time and money.
We operate across a range of markets, from financial services to healthcare, automotive, agrifinance, insurance, and many more industry segments.
We invest in talented people and new advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 23,300 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.
We're looking for a skilled Data Engineer with a strong product-builder mindset to join our Product Development Team. In this role, you'll be instrumental in building and maintaining the data infrastructure that powers our data products. You'll work closely with product owners, engineers, developers, and data scientists to ensure the quality, consistency, and scalability of our solutions.
This is a hands-on role where you'll help shape the future of our data products—translating business needs into robust, scalable data solutions that empower our clients to make smarter decisions.
What you'll do
- Design and build robust, scalable data pipelines to ingest and process large volumes of data.
- Develop complex data models to support client-facing teams and product features.
- Lead and support the migration of data infrastructure from on-premise to cloud environments.
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Implement and maintain optimal cloud-based architectures to support our data products.
- Own the full lifecycle of data processes—from ideation to deployment and maintenance.
What we are looking for
- Hands-on experience with AWS and Azure cloud platforms.
- Strong Python skills and advanced SQL proficiency.
- Experience with Dagster, DBT, Snowflake, and SQL Server (see the sketch after this list).
- Deep understanding of conceptual, logical, and physical data modeling for both OLTP and OLAP systems.
- Familiarity with AWS EC2, Terraform, CI/CD pipelines, Git, Docker, ECR, and EKS.
- Excellent collaboration skills to work effectively with cross-functional teams.
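Because the list above names Dagster among the expected tools, here is a minimal Dagster sketch of a two-asset pipeline. It is an assumption for illustration only; the asset names and CSV path are hypothetical rather than taken from the role.

```python
# Minimal sketch (assumption): two Dagster assets forming a tiny pipeline.
# The asset names and "data/customers.csv" path are illustrative placeholders.
import pandas as pd
from dagster import Definitions, asset

@asset
def raw_customers() -> pd.DataFrame:
    # Hypothetical extract step: read a daily CSV drop.
    return pd.read_csv("data/customers.csv")

@asset
def cleaned_customers(raw_customers: pd.DataFrame) -> pd.DataFrame:
    # Basic transform: deduplicate and drop rows without a customer_id.
    return raw_customers.drop_duplicates().dropna(subset=["customer_id"])

defs = Definitions(assets=[raw_customers, cleaned_customers])
```

Asset dependencies are expressed through parameter names, so Dagster can infer the pipeline graph without extra wiring.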
Our uniqueness is that we truly value yours.
Experian Asia Pacific's culture, people, flexibility, and environments are key differentiators. We take our people and equal opportunity agenda very seriously. We focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering; the list goes on. We're an award-winning organisation thanks to our strong people focus (Great Place To Work, Top Employer, and Employer of Choice).
Experian Asia Pacific leverages cutting-edge data science, inclusion, and start-up mindsets to build tomorrow's credit solutions. Innovation is a critical part of Experian's DNA and practices, as is our diverse workforce, which drives our success. Everyone can succeed at Experian, irrespective of their gender, ethnicity, colour, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.
Experian Careers - Creating a better tomorrow together
Data Engineer
Posted 1 day ago
Job Description
MoneyLion is a leader in financial technology powering the next generation of personalized products and content, with a top consumer finance super app, a premier embedded finance platform for enterprise businesses and a world-class media arm. MoneyLion’s mission is to give everyone the power to make their best financial decisions. We pride ourselves on serving the many, not the few; providing confidence through guidance, choice, personalization; and shortening the distance to an informed action.
In our go-to money app for consumers, we deliver curated content on finance and related topics, through a tailored feed that engages people to learn and share. People take control of their finances with our innovative financial products and marketplace - including our full-fledged suite of features to save, borrow, spend, and invest - seamlessly bringing together the best offers and content from MoneyLion and our 1,100+ Enterprise Partner network, together in one experience. MoneyLion’s enterprise technology provides the definitive search engine and marketplace for financial products, enabling any company to add embedded finance to their business, with advanced AI-backed data and tools through our platform and API. Established in 2013, MoneyLion connects millions of people with the financial products and content they need, when and where they need it.
About the Role
The Kuala Lumpur office is the technology powerhouse of MoneyLion. We pride ourselves on innovative initiatives and thrive in a fast paced and challenging environment. Join our multicultural team of visionaries and industry rebels in disrupting the traditional finance industry!
MoneyLion is building a world-class data ecosystem that will dramatically improve the ability of internal data users and analysts to create products and experiences that captivate our customers. As the Data Engineer, you will participate in technical and architectural planning as well as contribute as a developer, interacting with data scientists, software engineers, and analysts as we build and scale the world's most rewarding program for financial wellness.
Key Responsibilities
- Assist with reviewing and interpreting business and technical requirements with a focus on data objectives of the product
- Identify requirements and develop data delivery solutions
- Implement company policies on data access and data distribution
- Support members of various teams to determine database structural requirements by analyzing their applications and client operations
- Collaborate with highly talented engineers while striving to help them build their skills and achieve their career goals
- Support data flow and ETL processes end to end for projects and initiatives, including troubleshooting data issues, integrity checks, data model design, querying of data, and performance problems caused by data issues
- Assist in training employees on the data systems used by the company, specifically training programs for new hires on the reports and models available in the data visualization tools we use
- Analyze large data sets, some of them messy, to answer and solve real-world business problems while charting paths toward data cleanup and preventing data from becoming messy
About You
- Familiarity with financial and banking products
- Demonstrated skills in good software engineering practices
- Building and shipping large-scale engineering products and/or infrastructure
- Scalable data architecture, fault-tolerant ETL, and monitoring data quality in the cloud (We use AWS)
- Collaborating with a diverse set of engineers, data scientists and product managers
- Deep expertise in data engineering and data processing at scale, with a focus on developing pipelines that build data lakes for exploration as well as machine learning model training, deployment, and scoring at massive scale
- Familiarity with data science techniques and frameworks
- NoSQL and relational databases, and Presto (we use MongoDB, MySQL, PostgreSQL, and Elasticsearch)
- The AWS stack combined with technologies such as Java, Python, Spark, and Kafka
- Familiarity with Apache Airflow or equivalent workflow management tools (see the sketch after this list)
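As a small illustration of the Airflow familiarity mentioned above, here is a minimal DAG sketch assuming Airflow 2.x. The DAG id and task bodies are hypothetical placeholders, not MoneyLion pipelines.

```python
# Minimal sketch (assumption, Airflow 2.x): a daily DAG with an extract task
# feeding a load task. The task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from the source system")  # placeholder

def load():
    print("write transformed records to the warehouse")  # placeholder

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```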
What's Next.
After you submit your application, you can expect the following steps in the recruitment process:
- Online Preliminary HackerRank test
- Interview - Talent Acquisition Team (Virtual or face-to-face)
- Take-home Assessment - To be discussed in the technical round
- Interview - Hiring Manager (Virtual or face-to-face)
We value growth-minded and collaborative people with high learning agility who embody our core values of teamwork, customer-first, and innovation. Every member of the MoneyLion Team is passionate about fintech and ready to give 100% in helping us achieve our mission.
Working At MoneyLion
At MoneyLion, we want you to be well and thrive. Our generous benefits package includes:
- Competitive salary packages
- Wellness perks
- Paid parental leave
- Generous Paid Time Off
- Learning and Development resources
MoneyLion is committed to equal employment opportunities for all employees. Inside our company, every decision we make regarding our employees is based on merit, competence, and performance, completely free of discrimination. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. Within that team, no one will feel more “other” than anyone else. We realize the full promise of diversity and want you to bring your whole self to work every single day.