Title: Senior Hadoop Engineer/Architect
Location
City/Town: National
State/Province: Any State
Description
As a Senior Hadoop Engineer/Architect, you will solve problems for clients using Big Data and Hadoop ecosystem technologies. You will help them select the appropriate Hadoop, NoSQL, and search components, design a solution to their business and technical problems, present your solution to the client, and lead its implementation.
Preferred home base locations include Dallas/DFW, Austin, St. Louis, Chicago, Denver or Minneapolis.  However, opportunities are open to candidates in all major US cities.  Candidates will travel to various client sites based on needs (~50% travel). 
 
Direct Hire (W-2) and Contract (1099) opportunities available.  No C2C.

Representative Job Duties:
 
  • Lead clients and colleagues through analysis and solution engineering of Big Data systems
  • Design and build Hadoop clusters and provide expert assistance to clients doing the same
  • Select the right technology approach based on data type and latency requirements for batch, interactive and streaming data analytics
  • Train clients in the usage and features of big data technologies through in-person training sessions, conference talks, and blog posts
  • Demonstrate best practices in the design, planning, testing, deployment, security and administration of high-performance computing platforms
  • Assist sales and account management efforts with presentations, estimates, demonstrations, mentoring, and planning
  • Perform structured data analytics on Hadoop using various methods for SQL on Hadoop including Hive, Impala and Spark SQL - experience with other Spark components is a plus
  • Integrate Hadoop with existing RDBMS systems to implement two-tiered EDW & ETL solutions, offloading data and processing to achieve the best cost-benefit for the overall system
  • Implement and train machine learning algorithms on big data to help clients understand and use large datasets
  • Consult with clients to address data governance and security requirements
  • Tune system and query performance
  • Perform POC deployments and conversions
  • Install technical security controls
  • Administer solutions for clients
 
Required Experience: 
  • Bachelor’s degree in Computer Science, Information Systems or other related field
  • A minimum of ten years’ work experience designing, developing and deploying applications and systems using multiple paradigms and development methodologies
  • A minimum of three years’ work experience developing systems using Hadoop, NoSQL and Search technologies
 
Required Technical Skills:
  • Deep Hadoop development and administration experience
  • Deep Java or Scala development experience
  • Knowledge of scripting languages (Python, Bash, Ruby, Perl) and at least one SQL variant
  • Knowledge and use of ETL tools and methods
  • Ability to install and configure components in the Unix/Linux environment
  • Experience working with formal Software Development Lifecycles, preferably Agile methodologies
  • Use of version control systems such as Git or SVN
  • Basic comprehension of distributed systems and how they work
  • Basic understanding of networking and data centers
  • One or more technical certifications, such as:
    • Certified Information Systems Security Professional (CISSP)
    • International Information System Security Certification Consortium ((ISC)²)
    • Microsoft Certified Technology Specialist (MCTS)
    • Windows 7 Administration
    • Hortonworks Apache Hadoop 2.0 Certified Developer
    • Hortonworks Certified Hadoop 2.X Administrator
    • Cloudera Certified Professional (CCP)
      • CCP Data Engineer
    • Cloudera Certified Associate (CCA)
      • CCA Spark and Hadoop Developer
      • CCA Data Analyst
      • CCA Administrator
 
Knowledge, Skills and Abilities:
  • Intellectual curiosity and demonstrated critical thinking and creative problem solving ability
  • Track record of learning new technologies and methods quickly
  • Ability to architect designs and solutions based on client problems
  • Ability to program a solution based on a provided design
  • Ability to see pros/cons for approaches and reason about them
  • Proven experience explaining complex topics to others and communicating effectively with clients
  • Ability to work independently and as a part of a team of other consultants and/or client representatives
  • Experience working in a consultative role with external or internal clients
  • Ability to travel up to 50%

PRINCIPALS ONLY.  NO AGENCIES PLEASE.