CDW is committed to employment equality. If you need any adjustments to support your application or to access our facilities, please contact us.

Azure Data Engineer at CDW Careers (UK)

Job ID: 11878
Team: Information Technology
Post Date: 29 May 2024
Location: Cape Town, South Africa
Eligible For Remote Work: Yes - Hybrid
Contract Type: Permanent (Full-time)
Travel: No
Security Clearance Required: No

This job posting is no longer active

Description

BACKGROUND

CDW has a mission to become the leading B2B integrated technology solutions provider in the markets that we serve. Enabling, supporting and accelerating our ambitious plans to grow from a £1.35bn business to £5bn by 2025 requires a major programme of transformation: understanding the needs of our customers, partners and co-workers, redefining and optimising our operating models, and modernising our business systems.

CDW UK’s Business Transformation team is responsible for driving and accelerating change and transformation across people, process and systems. Its role is to:

  • Provide portfolio management for all change initiatives ensuring they are assessed, prioritised, sequenced and governed to maximise benefit to CDW’s customers and co-workers, supported by robust Change Management
  • Support change initiatives with additional resource and skills, from Project Managers to Business Analysts and technical experts
  • Own some of the 70+ major business-wide initiatives, including Robotic Process Automation, ServiceNow implementation, process re-engineering and ERP replacement.

KEY DUTIES

CDW is rolling out our Azure data platform, and we are looking for a Data Engineer to join a growing team to ingest, cleanse and model data into single-source-of-truth Kimball datasets used for Business Intelligence reporting and Data Science machine learning use cases.

  • Ingest data from various sources, including on-premises SQL databases, REST APIs, Apache Kafka streams, etc.
  • Define and apply cleansing rules to the data to ensure it meets quality expectations.
  • Define and model data into single-source-of-truth Kimball datasets, e.g. dimensions and facts (a minimal sketch of this flow follows the list).
  • DevOps approach to software: create high-quality code using established software practices, such as builds, linting, unit and integration testing, repositories, CI/CD, peer reviews, etc.
  • DevOps approach to support: monitor pipelines to ensure business-critical data pipelines are processed within time constraints and meet data quality expectations (including implementing reactive changes where applicable).
  • Drive and integrate into the data team’s ways of working, including backlog refinement, sprint planning, demonstrations and retrospectives; translate user requirements into technical requirements (including complexity estimation), implement them in sprint cycles, and collaborate on process improvement.
  • Based on requirements, perform ad-hoc analysis of structured and unstructured data across multiple data sources to inform solution design.
  • Document datasets in data catalogue, including ownership, stewardship, dictionaries, glossaries, lineage, sensitivity, etc.
  • Take ownership of work items and work with data owners/stewards to ensure high-quality, aligned deliveries and compliance with legislation such as GDPR (e.g. PII vs non-PII, data retention, etc.).
  • Document solution designs in the wiki.
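
For illustration only, the following is a minimal PySpark sketch of the ingest, cleanse and model flow described above; the paths, table names and columns (e.g. gold.dim_customer, customer_id) are hypothetical assumptions, not CDW datasets.

    # Minimal sketch of an ingest -> cleanse -> model flow on the Azure data
    # platform; paths, table and column names are illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dim_customer_sketch").getOrCreate()

    # Ingest: read a raw extract landed in the data lake (a JDBC read from an
    # on-premises SQL database or a Kafka stream would slot in here instead).
    raw = spark.read.parquet("/lake/raw/crm/customers")

    # Cleanse: apply simple quality rules - trim names, drop rows missing the
    # business key, and de-duplicate.
    clean = (
        raw.withColumn("customer_name", F.trim(F.col("customer_name")))
           .dropna(subset=["customer_id"])
           .dropDuplicates(["customer_id"])
    )

    # Model: shape into a Kimball-style dimension with a surrogate key and
    # persist it as a Delta table for downstream fact joins and BI reporting.
    dim_customer = clean.select(
        F.xxhash64("customer_id").alias("customer_key"),
        "customer_id",
        "customer_name",
        "country",
    )
    dim_customer.write.format("delta").mode("overwrite").saveAsTable("gold.dim_customer")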

KNOWLEDGE AND EXPERIENCE

Must Have:

  • Architecture, modelling and leadership skills
  • Strong Azure data skills, including:
      • Azure Data Factory V2
      • Azure Data Lake Storage V2
      • Azure Databricks
      • Azure Function Apps & Logic Apps
      • Azure Stream Analytics
      • Azure Resource Manager skills (Terraform, Azure Portal, Az CLI and Az PowerShell)
  • Strong PySpark, Delta Lake, Unity Catalog and Python skills, including the ability to write unit and integration tests in Python with unittest, pytest, etc. (a short pytest sketch follows this list).
  • Strong understanding of software development practices, such as SOLID principles, structuring code, testing, IoC, etc.
  • Strong repository and CI/CD skills.
  • Strong knowledge of Kimball data modelling, such as star schema, snowflake, etc.
  • Strong SQL skills.
  • Strong data analysis skills.
  • Excellent written and verbal communication skills
  • A minimum of 2 years of experience as an Architect.
  • A demonstrable track record of getting stuff done whilst managing competing pressures and deadlines and retaining an eye for detail and quality.
  • A passion for technology and its ability to have a positive impact on business.
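
As an illustration of the unit-testing expectation above, a short pytest sketch is shown below; the cleansing function, column names and test data are hypothetical and used only to show the style of test expected.

    # Hypothetical pytest unit test for a simple cleansing rule; the function
    # and column names are assumptions used only for illustration.
    import pytest
    from pyspark.sql import SparkSession, functions as F

    def cleanse_customers(df):
        """Trim customer names and drop rows without a business key."""
        return (
            df.withColumn("customer_name", F.trim(F.col("customer_name")))
              .dropna(subset=["customer_id"])
        )

    @pytest.fixture(scope="module")
    def spark():
        # Local SparkSession so the test runs without a cluster.
        return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()

    def test_cleanse_customers_drops_rows_without_key(spark):
        raw = spark.createDataFrame(
            [(1, "  Alice "), (None, "Bob")],
            ["customer_id", "customer_name"],
        )
        result = cleanse_customers(raw)
        assert result.count() == 1
        assert result.first().customer_name == "Alice"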

Nice To Have:

  • Azure DevOps (Git, multi-stage YAML pipelines, etc.)
  • Other languages, such as C#, PowerShell
  • IaC, e.g. Terraform, ARM, Bicep, etc.
  • Test driven development (TDD)
  • Streaming technologies, such as Azure Stream Analytics, Spark Structured Streaming, etc.
  • Power BI Engineering experience
  • Certified SCRUM Developer (CSD)
  • Machine learning and Artificial intelligence

PERSONAL ATTRIBUTES

  • Self-driven, doesn’t require micromanagement.
  • Organised and structured
  • Comfortable in fast-paced environments with shifting & ambiguous requirements.
  • Passionate about both process and technology and the impact they can have on business and our customers.
  • Articulate and credible
  • Quality and detail orientated.
  • Positive attitude and influence on others
  • Fast learner and able to adapt to new technology and keep abreast of current industry trends and practice.
  • Excellent communicator in all forms to key stakeholders
  • Excellent organisation and time management skills

 
