Data Engineer - Azure / GCP, Data Lake, Snowflake

Up to £700 per day (Inside IR35)
London / Hybrid (1-2 days per week hybrid working)
6 months

I am currently working with an instantly recognisable, high-profile client who urgently requires a Data Engineer with expertise in Azure / GCP and Data Lakes to join a major transformation programme. The role involves expanding and identifying new Data sources to produce more metrics, driving Data capability across the entire organisation and helping bridge the gap between Data Engineering and Security.

Key Requirements:
- Proven experience as a Data Engineer in a large, complex, regulated organisation
- Expertise with Cloud Platforms (Azure and GCP preferred)
- Previous experience of working with Data Lakes
- Demonstrable experience of ingesting, extracting and analysing Data from diverse sources
- Ability to create a centralised, standardised view using Data from multiple Business / Market Units across the entire organisation
- Understanding of future hosting model(s)
- Capability to guide Market Units on Data capability, working with vendors / 3rd parties and identifying what more can be done
- Strong communication skills and the ability to work autonomously and drive innovation

Nice to have:
- Familiarity with Data Architecture
- Exposure to Cyber Security tooling or working closely with InfoSec / Risk teams
- Understanding of Data Management frameworks (DCAM, DMBOK)
- Working knowledge of GraphQL / Data Bricks / Snowflake / Ora