Security Clearance
- An active government clearance, background investigation, and polygraph are required for this position
Description
- Install, configure, test, administer, monitor, troubleshoot, and sustain the operating system and application software for large, clustered systems based on cloud technologies including COPILOT, ACCUMULO, PIG/PIGLET, SPARK, and Hadoop (HDFS and MapReduce).
- Develop, integrate, and test software to provide encryption/decryption and wrapping/unwrapping services for the TDF; integrate same with a variety of IC GovCloud Discovery, Storage, and Utility Cloud components.
- Experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, the Hadoop Distributed File System (HDFS), etc.
- Experience with distributed, scalable Big Data stores (NoSQL) such as HBase, Accumulo, Bigtable, etc.
- Experience with serialization/data-interchange formats such as JSON and/or BSON.
- Experience in the design and development of at least one object-oriented system.
- Experience developing solutions integrating and extending FOSS/COTS products.
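For candidates unfamiliar with the MapReduce programming model named above, a minimal sketch follows. It illustrates the model's two phases with the classic word-count example, written in plain Java streams rather than the Hadoop API so it runs without any cluster or dependencies; the class and method names are illustrative only, not part of any system referenced in this posting.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative sketch of the MapReduce model (word count) using plain Java
// streams; a real Hadoop job would express the same phases as Mapper and
// Reducer classes running over HDFS splits.
public class WordCountSketch {

    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                // "Map" phase: split each input record into individual words.
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(word -> !word.isEmpty())
                // "Shuffle" + "Reduce" phase: group identical words and sum their counts.
                .collect(Collectors.groupingBy(word -> word, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> lines = List.of("big data big cluster", "data pipeline");
        System.out.println(wordCount(lines));
    }
}
```

In Hadoop terms, the `flatMap` step corresponds to the map tasks emitting `(word, 1)` pairs, and the `groupingBy`/`counting` step corresponds to the shuffle and reduce tasks aggregating by key.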
Qualifications
- Six (6) years of experience using the Java programming language. A Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required. Four (4) years of additional experience on projects with similar requirements may be substituted for a bachelor's degree.
Not sure if you're a fit? Submit your resume and we will contact you via email to let you know!