A Senior DevOps Engineer is required by my truly amazing public sector client in their London/Glasgow office, on a one-year fixed-term contract.
You will be responsible for the configuration and ongoing management of PaaS/IaaS, working with Big Data platforms such as Hortonworks, built on Hadoop clusters deployed in data centres such as those of Azure/AWS. You will ensure availability of the platform to services teams and their service users. You will also carry out continuous improvements to the platform, including system development, process design, automation and diagnostics improvements. Assurance of platform security, privacy and resilience will be your greatest concern.
• Collaborating with teams who are provisioning infrastructure in a DevOps environment
• Ensuring principles of privacy, security and resilience are assessed and designed into solutions
• Implementing and carrying out ongoing administration of Hadoop infrastructure
• Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
• Working with delivery teams to set up new Hadoop users, including creating accounts and Kerberos principals and testing HDFS and Hive access for new users
• Monitoring and maintaining Hadoop ecosystem connectivity and security to guarantee confidentiality, integrity and availability
• Working with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
• Experience with Agile software engineering practices such as CI/CT/CD
• Eagerness to prioritise collaboration and partnering within the team, with experience defining and implementing effective data engineering working practices in agile teams and across teams
• Experience with the architecture/engineering of cloud-based distributed systems (such as Azure)
o Experience of Microservices Architecture and APIs
o Strong knowledge of Azure and its offerings and the Hadoop ecosystem using Hortonworks (preferred) or Cloudera
o Cloudbreak, Hive, Ambari, Sqoop, Oozie, Spark, Atlas, Ranger, HBase, HDFS, YARN and ELK
• Hands-on experience connecting cloud infrastructure to on-prem networks
o VPNs, tunnelling, AD connectors, routing
o Knowledge of network protocols such as TCP, UDP, HTTP/HTTPS, SSL/TLS, and APIs
o Authentication technologies, e.g. LDAP, OAuth, 2FA, SAML, and Kerberos
o Bash, Python, Java or Ruby
• Experienced with security controls
o VPNs, encryption & key/certificate management, endpoint protection, virtual firewalls/ACLs/NSGs, setting up bastion nodes, etc.
• Background in distributed systems: databases, security, networking & load balancing, monitoring, scripting, automation
o A deep understanding of distributed system design and dependency management
A fantastic opportunity!