AWS Data Engineer x4

Job description

Location: Sunbury (remote until Covid restrictions allow)

Pay rate: £500 per day (umbrella companies only; no own Ltd companies)

Duration: 29th March to 26th November 2021 (extensions possible)

Position: AWS Data Engineer - AWS Migration from CDL (x4)

We are looking for four AWS Data Engineers (AWS Migration from CDL) to join our client.

- Data Engineers develop and maintain data infrastructure, and write, deploy and maintain the software used to build, integrate, manage and quality-assure data at bp. Data Engineers collaborate with their business stakeholders, Data Managers, Data Scientists, Software Engineers and Architects.

- Data Engineers will have broader responsibilities than the narrower data-pipeline-building role that was previously emphasized. For example, Data Engineers will now also be responsible for distributed systems architecture design, data warehousing, and executing on GDPR and other privacy requirements from digital security, and will need business context and knowledge of the data domains they work with (more overlap with Data Managers).

- Data Engineers will own the end-to-end technical data lifecycle and the corresponding data technology stack for their data domain, and will have a deep understanding of the bp technology stack.

- Data Manipulation: Develop, actively contribute to, debug and maintain software across the entire data lifecycle (from ingestion to deprecation); augment the data tech stack in your immediate scope and integrate it with existing data systems. Understand the end-to-end lifecycle of the data and the data tech stack deployed across your business entity / enabler area. Deepen understanding of the business entity's / enabler area's key data technologies and research alternatives to bp platform offerings. Influence your team's technical decisions.

- Software Engineering: Proficient in at least one object-oriented programming language, with advanced SQL skills. Understand how to use the standard bp (e.g. cloud) tools to efficiently produce software at scale. Write clean, re-usable code and commit it in manageable chunks. Write efficient, optimized algorithms. Ensure code has extensive test, automated monitoring and alerting coverage. Proactively find and address bugs, technical debt and inefficient practices / tools. Demonstrate the ability to debug and solve complex issues by translating business problems into technical solutions. Understand the immediate codebase as well as its surrounding systems. Software solutions delivered perform as described, minimizing the number of unexpected failures or changes required. Have in-depth knowledge of, and apply, software engineering best practices and processes, e.g. Continuous Integration and Continuous Deployment, Kubernetes, Azure DevOps or similar.

- Scalability, Reliability, Maintenance: Ensure existing tools / code bases are re-used where possible / logical. Advocate for tech-debt removal in addition to building new features. Identify areas of improvement and lead through execution (e.g. writing small scripts and sharing them with the team, adding documentation to existing services, improving service health). Approach problems with a mindset that prioritizes automation and long-term productivity over short-term speed, and execute on those opportunities to improve products or services. Design systems and products to be future-facing and engage with the team to ensure your work can be extended.

- Data Domain Knowledge: Deep understanding of at least one data domain area within the business entity, and awareness of related domains.

- Standards (internal + external): Create or update documentation and ensure metadata is kept up to date (e.g. descriptions and purposes for data sets). Advocate for and adhere to data privacy and digital security standards (e.g. GDPR, digital security policy) to ensure the security of bp data.

- Best practices: Provide leading insight into industry and technology trends and best practices for data infrastructure and data engineering. Advocate for data best practices within the business entity.

- Right approach / tool choice: Produce solutions that are focused on solving the needs of the customer(s), e.g. service design / product acumen. Deep understanding of several commonly available data engineering approaches and tools across all data elements (e.g. streaming, structured and unstructured data), with the ability to select the right ones for the problem (avoiding "complexity / technology for its own sake"; "simple first").

- Architectural Design: Firm understanding of the architectural design of systems in the business entity's / enabler area's scope. Articulate the advantages and shortcomings of any proposed solution, taking future requirements into consideration.

- Citizenship: Engage in honest conversations and encourage team members to ask questions and actively listen to concerns. Be inclusive and collect a diverse set of opinions and ideas. Leverage your own skills and experience to upskill others. Work with business partners across the wider team to implement continuous improvement opportunities to create safer operations, reduce cost, or realize other opportunities with data.

- Autonomy in problem identification and problem breakdown: Independently identify problems to be solved in the team / sub-area as a result of in-depth technical and business understanding. Understand vague and broad problems and break them down into actionable sub-problems and tasks.

- Stakeholder Management: Identify new project opportunities as a result of a deep understanding of stakeholder needs and pain points. Suggest ideas for improving the service to stakeholders.

- Communication Skills: Effectively communicate orally and in writing to a technical and non-technical audience.

- Business impact: Apply data strategies relevant to the business entity / enabler area.

- Commercial Acumen: Utilize deep understanding of business operations and commercial factors to support decision-making in immediate scope.

- Values: Foster a culture that promotes and enables bp values and behaviours.

AWS experience is also necessary: AWS MQTT (IoT), AWS Glue, Kinesis, and Redshift or Amazon EMR, together with a very strong Python and SQL background.

• Hands-on AWS experience

• Experience of AWS Architecture, AWS CloudFront, AWS S3, AWS Lambda, DynamoDB, SQS, SNS, AWS CloudFormation and other AWS Tools (ideally with some AWS certification or training)

• Experience with Git and source code management best practices

• Experience with Windows and Linux operating systems

• Experience working with AWS S3 and native managed services, along with Informatica BDM, Python/Spark, Sqoop and Scala

• Experience in AWS Kinesis (and other data streaming technologies)

• Experience in encryption configuration, e.g. at rest, in transit and tokenization (see the short sketch after this list)

• Experience working on AWS, Informatica tools

• Experience in distributed cloud-based services

• Working understanding of the roles and responsibilities within an Agile/Scrum development methodology
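
To give a flavour of the day-to-day work the Kinesis and encryption bullets describe, here is a minimal Python/boto3 sketch. It is illustrative only, not taken from the client's codebase; the stream, bucket and key names are hypothetical, and it assumes AWS credentials are already configured in the environment.

```python
import json

import boto3  # AWS SDK for Python

# Hypothetical names for illustration only.
STREAM = "cdl-migration-events"
BUCKET = "cdl-migration-landing"

kinesis = boto3.client("kinesis", region_name="eu-west-1")
s3 = boto3.client("s3", region_name="eu-west-1")

# Streaming ingestion: publish one record to a Kinesis data stream.
record = {"sensor_id": "pump-42", "reading": 17.3}
kinesis.put_record(
    StreamName=STREAM,
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["sensor_id"],  # groups related records onto one shard
)

# Encryption at rest: land an extract in S3 with SSE-KMS server-side
# encryption; encryption in transit comes from the HTTPS endpoint boto3 uses.
s3.put_object(
    Bucket=BUCKET,
    Key="extracts/2021-03-29/readings.json",
    Body=json.dumps([record]).encode("utf-8"),
    ServerSideEncryption="aws:kms",
)
```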

Please apply if you have the relevant skills.
