Hadoop Developer - Charlotte, NC - Full-time (Permanent)
(H1b Transfer Workable) Job Description:
Experience Required: 10+ Years
Core Skills:
- Big Data & Hadoop: Hadoop, Big Data Processing, Distributed Data Systems, Large Data Volume Handling
- Data Warehousing: Data Warehouse, Data Warehousing Concepts, Dimensional Data Modeling, Data Architecture
- ETL Development: ETL Development, ETL Processing, Data Integration, Data Pipelines
- Database Technologies: Oracle, Exadata, Oracle SQL, PL/SQL, SQL Performance Tuning
- Performance Optimization: Performance Tuning, Query Optimization, Data Processing Optimization
- Design & Architecture: High-Level Design, Detailed Design, Technical Solution Design, Information Architecture
- Development & Delivery: Production Support, Code Development, Technical Deliveries, SDLC, Quality Standards
- Collaboration & Coordination: Business Analysts Coordination, Offshore/Nearshore Coordination, QA Collaboration, Cross-Functional Team Collaboration
- Leadership & Problem Solving: Technical Leadership, Analytical Skills, Logical Thinking, Problem Solving, Thought Leadership
- Methodologies & Best Practices: Project Lifecycle, Process Adherence, Best Practices, Quality Assurance
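The ETL development skills listed above can be illustrated in outline. The following is a minimal, hypothetical sketch of the extract-transform-load pattern in plain Python (not taken from this posting; all record fields and values are invented for illustration):

```python
# Hypothetical ETL sketch: extract raw records, transform them
# (filter incomplete rows, derive and normalize fields), and load
# the result into a target store. Field names are invented.

def extract(source_rows):
    """Extract: read raw records from a source (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Transform: drop incomplete records and derive a full_name field."""
    out = []
    for row in rows:
        if row.get("first") and row.get("last"):
            out.append({
                "full_name": f"{row['first']} {row['last']}".title(),
                "amount": round(float(row.get("amount", 0)), 2),
            })
    return out

def load(rows, target):
    """Load: append transformed rows into the target store."""
    target.extend(rows)
    return len(rows)

raw = [
    {"first": "ada", "last": "lovelace", "amount": "100.456"},
    {"first": "", "last": "turing", "amount": "5"},  # dropped: missing first name
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In a production Data Warehouse setting the same three stages would typically run against Oracle/Exadata sources and targets rather than in-memory lists, with the transform step expressed largely in SQL or PL/SQL.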
Key Responsibilities:
- Design and develop ETL solutions in a Data Warehouse environment.
- Work on multiple concurrent projects and manage technical deliveries.
- Collaborate with business analysts to understand requirements and design solutions.
- Support production issues and implement scalable technical solutions.
- Work with Information Architecture and development teams on data solutions.
- Handle large data volumes and optimize system performance.
- Coordinate with offshore, QA, L2 support, and upstream/downstream teams.
- Ensure adherence to quality standards, processes, and best practices.
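The "handle large data volumes" responsibility above usually comes down to processing data in bounded batches rather than loading everything at once. A hypothetical sketch (batch size and record shape are invented for illustration):

```python
# Hypothetical sketch: stream a large input in fixed-size batches so
# memory stays bounded. The record shape and batch size are invented.
from itertools import islice

def batches(iterable, size):
    """Yield successive lists of at most `size` items from any iterable."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def process(batch):
    """Stand-in transform: aggregate one field per batch."""
    return sum(rec["value"] for rec in batch)

records = ({"value": i} for i in range(10))  # stands in for a large streamed source
totals = [process(b) for b in batches(records, 4)]
```

The same chunking idea underlies partitioned processing in Hadoop jobs and batched commits in ETL loads against Oracle.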