
Hadoop Administrator/Development
Location: US-TX-Irving

Role: Hadoop Administrator/Development

Location: Irving, TX (remote during COVID)

Job Description:

We are seeking a Hadoop Administrator/Developer with at least 6 years of experience to join our Datalake team. This person will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure the Data Platform runs 24/7.

Possess extensive analysis, design and development experience in Hadoop and AWS Big Data platforms
Able to critically inspect and analyze large, complex, multi-dimensional data sets in Big Data platforms
Experience with Big Data technologies and distributed file systems, including Hadoop, HDFS, Hive, and HBase
Define and execute appropriate steps to validate various data feeds to and from the organization
Collaborate with business partners to gain in-depth understanding of data requirements and desired business outcomes
Create scripts to extract, transfer, transform, load, and analyze data residing in Hadoop and RDBMS including Oracle and Teradata (see the illustrative sketch after this list)
Design, implement, and load table structures in Hadoop and RDBMS including Oracle and Teradata to facilitate detailed data analysis
Participate in user acceptance testing in a fast-paced Agile development environment
Troubleshoot data issues and work creatively and analytically to solve problems and design solutions
Create documentation to clearly articulate designs, use cases, test results, and deliverables to varied audiences
Create executive-level presentations and status reports
Under general supervision, manage priorities for multiple projects simultaneously while meeting published deadlines
Bachelor's degree or Master's degree in Computer Science or equivalent work experience
Highly proficient in, and extensively experienced with, relational databases, particularly Oracle and Teradata
Excellent working knowledge of UNIX-based systems
Excellent interpersonal, written, and verbal communication skills
Very proficient in the use of Microsoft Office or G Suite productivity tools
Experience with designing solutions and implementing IT projects
Exposure to DevOps, Agile Methodology, CI/CD methods and tools, e.g. JIRA, Jenkins, is a huge plus
Prior work experience in a telecommunications environment is a huge plus
Experience with Spark, Scala, R, and Python is a huge plus
Experience with BI visualization tools such as Tableau and Qlik is a plus
Background in financial reporting, financial planning, budgeting, ERP (Enterprise Resource Planning) is a plus
Exposure to advanced analytics tools and techniques, e.g. machine learning and predictive modeling, is a plus
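
By way of illustration only (not part of the original posting), the following is a minimal Python sketch of the kind of feed-validation script the responsibilities above describe, assuming the cx_Oracle and PyHive client libraries are available; the hosts, credentials, and the billing_feed table and feed_date column are invented placeholders.

"""Hypothetical feed-validation sketch (not from the posting): compare row counts
for one business date between a source Oracle table and its Hive landing table."""

import cx_Oracle                  # Oracle client library (assumed to be installed)
from pyhive import hive           # HiveServer2 client library (assumed to be installed)

FEED_DATE = "2021-06-30"          # example business date to reconcile


def oracle_count(feed_date):
    # Count source rows for the feed date in Oracle (DSN and credentials are placeholders).
    conn = cx_Oracle.connect(user="etl_user", password="***", dsn="oradb-host/ORCLPDB1")
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT COUNT(*) FROM billing_feed WHERE feed_date = TO_DATE(:d, 'YYYY-MM-DD')",
            d=feed_date,
        )
        return cur.fetchone()[0]
    finally:
        conn.close()


def hive_count(feed_date):
    # Count loaded rows in the matching Hive table (host and table name are placeholders).
    conn = hive.Connection(host="hive-gateway", port=10000, username="etl_user", database="datalake")
    try:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM billing_feed WHERE feed_date = %s", (feed_date,))
        return cur.fetchone()[0]
    finally:
        conn.close()


if __name__ == "__main__":
    src, tgt = oracle_count(FEED_DATE), hive_count(FEED_DATE)
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{FEED_DATE}: oracle={src} hive={tgt} -> {status}")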


MUST HAVE SKILLS (Most Important):


Experience with Hadoop data platforms
Experience with relational databases like Oracle
SQL and PL/SQL
Unix shell scripting
Cron jobs
Hive
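
As a hedged illustration of how these must-have skills typically combine (again, not part of the original posting), here is a minimal Python sketch of a cron-scheduled Hive maintenance job, assuming the PyHive client library; the host, table names, and crontab entry are invented placeholders.

"""Hypothetical housekeeping job (not from the posting): a small Python script meant
to be scheduled from cron that re-syncs partition metadata for a few external Hive
tables after upstream feeds land files directly on HDFS."""

import logging

from pyhive import hive           # HiveServer2 client library (assumed to be installed)

# External tables whose partitions arrive via upstream feeds (placeholder names).
TABLES = ["datalake.billing_feed", "datalake.usage_feed"]

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")


def repair_partitions():
    conn = hive.Connection(host="hive-gateway", port=10000, username="etl_user")
    try:
        cur = conn.cursor()
        for table in TABLES:
            # MSCK REPAIR TABLE registers partitions added directly on HDFS.
            cur.execute("MSCK REPAIR TABLE " + table)
            logging.info("repaired partitions for %s", table)
    finally:
        conn.close()


if __name__ == "__main__":
    repair_partitions()

# Example crontab entry (illustrative): run nightly at 01:30 and append output to a log.
#   30 1 * * * /usr/bin/python3 /opt/etl/repair_partitions.py >> /var/log/etl/repair_partitions.log 2>&1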

DESIRED SKILLS:


Python
Java
Hive and Spark cluster environments
Qlik Sense


JOB DUTIES:


Install and configure Hadoop clusters and Sqoop, Python, and Spark packages
Expertise in administration of Hive, Kafka, Python, HBase, Spark, and Sqoop
Manage Hadoop, Kafka, HBase, Sqoop, Hive, and Spark cluster environments
Apply proper architecture guidelines to ensure highly available services
Plan and execute major platform software and operating system upgrades and maintenance across physical environments
Develop and automate processes for maintenance of the environment
Implement security measures for all aspects of the cluster (SSL, disk encryption, role-based access)
Ensure proper resource utilization between the different development teams and processes
Design and implement a toolset that simplifies provisioning and support of a large cluster environment
Review performance stats and query execution/explain plans and recommend tuning changes (an illustrative sketch follows this list)
Create and maintain detailed, up-to-date technical documentation
Shell scripting on Linux
Integration with other Hadoop platforms
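
As a final illustration of the explain-plan review duty (not part of the original posting), here is a minimal Python sketch that pulls a Hive EXPLAIN plan for offline review, assuming the PyHive client library; the host and the sample query are invented placeholders.

"""Hypothetical tuning aid (not from the posting): fetch the EXPLAIN output of a
candidate query from HiveServer2 so the plan can be reviewed before promotion."""

from pyhive import hive           # HiveServer2 client library (assumed to be installed)

# Illustrative query; table and column names are placeholders.
CANDIDATE_QUERY = """
SELECT region, COUNT(*) AS orders
FROM datalake.billing_feed
WHERE feed_date = '2021-06-30'
GROUP BY region
"""


def explain(query):
    conn = hive.Connection(host="hive-gateway", port=10000, username="etl_user")
    try:
        cur = conn.cursor()
        # Hive returns the plan as rows of text, one line per row.
        cur.execute("EXPLAIN " + query)
        return "\n".join(row[0] for row in cur.fetchall())
    finally:
        conn.close()


if __name__ == "__main__":
    print(explain(CANDIDATE_QUERY))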

EDUCATION/CERTIFICATIONS:

Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or Mathematics preferred.

Please send your resume to veena.r@techgene.com

Techgene Solutions LLC
