Hadoop (HDP/HDF) Administrator

Location: Atlanta, GA
Date Posted: 01-10-2019
Responsibilities include providing operational support for data science, business intelligence, and ETL/streaming workloads running on Hortonworks Data Platform (HDP) and Hortonworks DataFlow (HDF). Operational support covers platform installation and upgrades, troubleshooting, documentation, performance tuning, root-cause analysis, and issue resolution. The position also involves managing security and change control for HDP and HDF.
The position works closely with Hadoop Team Leads, other Hadoop Administrators, Data Engineers, Application Developers, and Data Scientists, and includes participation in an on-call rotation for 24/7 support.
4-year college degree, preferably in Information Systems, Computer Science, Engineering, or a related field
·       Work directly with client technical and business resources to devise and recommend solutions based on the understood requirements
·       Analyze complex distributed production deployments, and make recommendations to optimize performance
·       Install and configure new Hortonworks Data Platform (HDP) and Data Flow (HDF) clusters.
·       Upgrade existing HDP/HDF clusters to newer versions of the software
·       Analyze and apply SmartSense recommendations
·       Work with development and business stakeholders to set up new HDP users. This includes setting up Kerberos principals and testing HDFS, Hive, HBase and YARN access for the new users
·       Perform HDP security configurations with Ranger, Kerberos, Knox and Atlas.
·       Work closely with vendor Support to address support tickets
·       Monitor and optimize cluster utilization, performance and operations. 
·       Write technical documentation, administration runbooks, and knowledge-base articles.
·       Keep current with Hadoop and big data ecosystem technologies
·       Be part of an on-call rotation.
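The user-onboarding duty above (creating Kerberos principals and testing HDFS, Hive, HBase and YARN access for new users) is commonly scripted. The sketch below is one possible runbook fragment, assuming an MIT KDC, a kerberized HDP cluster, and illustrative user, realm, hostname, and keytab-path values — none of these come from this posting, and real commands must be adapted to the cluster at hand.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: onboard a new HDP user on a kerberized cluster.
# User, realm, hosts, and keytab paths below are placeholders.
set -euo pipefail

NEW_USER=jdoe
REALM=EXAMPLE.COM

# 1. Create the Kerberos principal and export a keytab (run on the KDC host).
kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"
kadmin.local -q "ktadd -k /etc/security/keytabs/${NEW_USER}.keytab ${NEW_USER}@${REALM}"

# 2. Create the user's HDFS home directory as the hdfs superuser.
sudo -u hdfs kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
sudo -u hdfs hdfs dfs -mkdir -p /user/${NEW_USER}
sudo -u hdfs hdfs dfs -chown ${NEW_USER}:${NEW_USER} /user/${NEW_USER}

# 3. Smoke-test access as the new user.
kinit -kt /etc/security/keytabs/${NEW_USER}.keytab ${NEW_USER}@${REALM}
hdfs dfs -ls /user/${NEW_USER}                    # HDFS access
beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@${REALM}" \
        -e "show databases;"                      # Hive access
echo "list" | hbase shell -n                      # HBase access
yarn jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar \
        pi 1 10                                   # YARN job submission
```

In practice, access failures in step 3 usually trace back to missing Ranger policies or HDFS permissions rather than Kerberos itself, which is why each service is tested separately.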
·       Hortonworks HDP Administration certification
·       Demonstrated experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments.
·       At least 2 years of HDP installation and administration experience in multi-tenant production environments, with experience on HDP 2.6.x/HDF 3.x versions.
·       Experience designing and deploying production large-scale Hadoop architectures 
·       Strong experience implementing software and/or solutions in enterprise Linux or Unix environments, including strong experience in shell scripting.
·       Strong experience with various enterprise security solutions such as LDAP and Active Directory
·       Strong experience with Ambari, Ranger, Kerberos, Knox and Atlas
·       Strong experience with Hive
·       Strong experience with NiFi
·       Sound experience with RDBMSs such as Oracle and MS SQL Server.
·       Sound experience with source code control methodology and version control tools such as PVCS, TFS, or Git.
·       Sound experience with the Hortonworks DataPlane Service (DPS) product.
·       Strong understanding of network configuration, devices, protocols, speeds and optimizations
·       Strong understanding of Java development, debugging & profiling
·       Good troubleshooting skills and an understanding of HDP capacity planning, bottlenecks, memory utilization, CPU usage, OS, storage, and networks
·       Understanding of on-premises and cloud network architectures
·       Excellent verbal and written communication skills