
Sr. Data Engineer

San Bruno, CA

One of our direct clients is urgently looking for a Sr. Data Engineer in San Bruno, CA.
 
TITLE: Sr. Data Engineer

LOCATION: San Bruno, CA
Duration: 6 to 12+ Months
Rate: DOE

Job Duties:


Client Note:
Looking for a backend data engineer with Apache Spark, Hive, and Airflow knowledge.

Must-haves
  1. Excellent knowledge of SQL and Hive (HiveQL)
  2. Ability to build data pipelines using PySpark, Hive, and Scala Spark
  3. Proficiency in a programming language, preferably Python, Java, or Scala
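The must-haves above center on SQL-driven pipeline stages. As a minimal sketch of that pattern (using Python's built-in sqlite3 as a stand-in for Hive; the table and column names `raw_orders` and `daily_revenue` are hypothetical, not from this posting):

```python
import sqlite3

# Illustrative only: sqlite3 stands in for Hive here; a real pipeline
# would run an equivalent HiveQL statement via PySpark or Beeline.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", 7.5)],
)

# A typical pipeline stage: roll raw orders up into a daily-revenue
# table with an aggregate query.
conn.execute(
    """CREATE TABLE daily_revenue AS
       SELECT order_date, SUM(amount) AS revenue
       FROM raw_orders
       GROUP BY order_date
       ORDER BY order_date"""
)
rows = conn.execute("SELECT * FROM daily_revenue").fetchall()
print(rows)  # [('2024-01-01', 15.0), ('2024-01-02', 7.5)]
```

The same create-table-as-select shape is how intermediate tables are usually materialized between Hive pipeline jobs.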


Good-to-haves
  • Tableau, Looker, and Power BI experience
  • Knowledge of Google Cloud Platform (GCP)
  • Good understanding of Adobe Analytics or Google Analytics
  • Prior experience with Marketing datasets and building Marketing reporting


Description:
Position Summary
• Very strong engineering skills, with an analytical approach and good programming skills.
• Provide business insights, while leveraging internal tools and systems, databases and industry data
• Minimum of 5 years’ experience. Experience in the retail business is a plus.
• Excellent written and verbal communication skills for varied audiences on engineering subject matter
• Ability to document requirements, data lineage, and subject matter in both business and technical terminology.
• Guide and learn from other team members.
• Demonstrated ability to transform business requirements to code, specific analytical reports and tools
• This role will involve coding, analytical modeling, root cause analysis, investigation, debugging, testing and collaboration with the business partners, product managers and engineering teams.

Must Haves
Must be able to develop data pipelines with PySpark, Hive and SQL
Proficient with Python and developing Python scripts
Ability to build Tableau dashboards

Good to Have
Experience in architecting data pipelines, from the data model to the jobs and their sequencing
Ability to build dashboards with Tableau or Looker
Software Engineering knowledge – ability to build web applications using Java and AngularJS or ReactJS tech stacks

Technical Requirements
• Knowledge/experience on Teradata Physical Design and Implementation, Teradata SQL Performance Optimization
• Advanced SQL
• Experience working with large data sets, experience working with distributed computing (MapReduce, Hadoop, Hive, Pig, Apache Spark, etc.).
• Strong Hadoop scripting skills to process petabytes of data
• Experience in Unix/Linux shell scripting or similar programming/scripting knowledge
• Experience in ETL processes
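The distributed-computing requirement above (MapReduce, Hadoop, Spark) boils down to the map/shuffle/reduce pattern. A single-process Python sketch of that model, word count being the standard illustration (in Hadoop or Spark these phases would run distributed across a cluster):

```python
from collections import defaultdict
from itertools import chain

# Illustrative, single-process sketch of the MapReduce model.
def map_phase(line):
    # Map: emit one (word, 1) pair per word in the line.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group values by key, as the framework would between phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts for each key.
    return {key: sum(values) for key, values in grouped.items()}

lines = ["spark hive sql", "hive sql", "sql"]
pairs = chain.from_iterable(map_phase(line) for line in lines)
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'spark': 1, 'hive': 2, 'sql': 3}
```

The same three-phase decomposition is what Hive compiles HiveQL queries into when it targets a MapReduce or Spark execution engine.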

Education
BS degree in a technical field such as computer science, math, or statistics preferred
