We have the following urgent need for our direct client.
TITLE: Big Data Architect (Hadoop / ELT / BI)
Location: San Francisco, CA
Compensation: Competitive (base + stock options + bonus & benefits)
* Local candidates are strongly preferred
* All visa holders are OK, including H1B; the client will sponsor visas
The Big Data Architect is responsible for the design and management of the data warehouse and the processes that support the enterprise's Big Data initiatives. Candidates must have a deep understanding of Big Data technologies such as Hadoop, Pig, Hive, Solr, NoSQL databases, ETL/ELT, MDM, and BI tools. This person must be able to understand complex data relationships and business requirements and to formulate efficient, reliable solutions to difficult problems. This role will also be responsible for the development and maintenance of the critical ETL/ELT processes that support the data warehouse and other aspects of the Big Data initiatives.
REQUIRED EXPERIENCE AND SKILLS
* 8+ years of experience architecting database, ETL, data modeling, storage, and other back-end services in bare-metal or virtualized environments
* A proven industry leader in Big data architecture and operations
* Strong experience with one or more ETL/ELT tools.
* Deep expertise in Hadoop ecosystem and NoSQL database technologies.
* Comprehensive knowledge of and experience in process improvement, normalization/denormalization, data extraction, data cleansing, and data manipulation
* Practical experience with data modeling (dimensional and relational models; star and snowflake schemas with facts and dimensions)
* Established experience with automated, elastic scaling of cloud services, automated deployment, and automated remediation
* Experience with implementation of Cloud application fabric technologies such as Cloud Foundry
* Experience working effectively in Agile SDLC
RESPONSIBILITIES
* Work collaboratively with all levels of business stakeholders to architect, implement, and test Big Data-based analytical solutions drawing on disparate sources
* Demonstrate an understanding of the concepts, best practices, and functions needed to implement a Big Data solution in a corporate environment
* Design scalable Big Data architectures and solutions
* Translate business requirements into technical designs
* Implement some or all of the big data systems in distributed cloud environments
* Deliver database, storage, and other back-end services in fully virtualized environments
* Ensure scalability, high availability, fault tolerance, and elasticity
* Implement security and encryption for Big Data environments
* Architect, design, and maintain high-performing ELT/ETL processes
* Provide monitoring, alerting, and automated recovery
* Participate in an Agile SDLC to deliver new cloud platform services and components
* Champion best practices for Linux administration and security in the delivery of cloud services
* Set architectural vision and direction across a matrix of teams