Job Summary
The Data Engineer will be primarily responsible for data integration (ETL) between our operational data sources and the central data lake, and for creating specific data marts according to the data analytics team's needs. The role covers data integration, unification, cleansing, and data quality management.
Job Responsibilities
Management of data inflow
Create and maintain optimal data pipeline architecture; adopt new technologies
to improve existing data flow and monitoring frameworks
Assemble large, complex data sets that meet functional / non-functional
business requirements
Build the infrastructure required for optimal extraction, transformation, and
loading of data from a wide variety of data sources using SQL and ‘big data’
technologies; take the steps needed to implement changes in the IT
infrastructure, such as MDM tool acquisition, data lake design, and cloud
solution implementation, in coordination with IT and the technology coordinator
Create data tools for analytics and data science team members that assist
them in building models by automating and simplifying data preparation
Translate the customer data strategy into actionable data integration plans
and execute them
Maintain a 360-degree view of the customer; enhance the customer data marts by
continuously integrating new data sources
Data cleansing and unification
Create automated data anomaly detection systems and continuously track their
performance
Process, cleanse, and verify the integrity of data used for analysis; actively
use the built-in data quality dashboard on the CDP and coordinate corrective
actions
Develop algorithms to de-duplicate and export customer data from multiple
business units (BUs) to ensure data unification
Ensure continuous unification of customer records and associated profile and
transactional data
Candidate Requirements
• Bachelor’s degree in Computer Science, Computer Engineering, Mathematics, or a related field